Thursday, August 25, 2011

Is Facial Recognition A Top Privacy Issue of Our Time?

According to privacy stalwarts like the Privacy Rights Clearinghouse (PRC), the ACLU, the Electronic Frontier Foundation, and EPIC, the answer to this question appears to be a resounding "yes". I speak in particular of an op-ed by Amber Yoo of PRC in today's California Progress Report entitled "Facial Recognition: A Top Privacy Issue of Our Time" that lays out in detail, with accompanying links to other groups' work on this topic, just why this burgeoning "security" technology is such a threat.

I have touched on this subject in the past on this blog, particularly when discussing what are called Biometrics. So before I get to some choice clips of Amber's article (and a number of others), let me refresh everyone on the concept of biometric identifiers - like fingerprint, facial, and/or iris scans. These essentially match an individual's personal characteristics against an image or database of images. Initially, the system captures a fingerprint, picture, or some other personal characteristic and transforms it into a small computer file (often called a template). The next time someone interacts with the system, it creates another computer file and compares it against the stored template (or a whole database of templates) to decide whether there is a match.
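
To make the template idea concrete, here is a minimal sketch (my own toy illustration, not any vendor's actual algorithm) of the enroll-then-compare flow. The feature extractor is left out entirely, and the templates are just short lists of invented numbers:

```python
# A minimal sketch of the enroll-then-compare flow (illustration only).
# Templates here are plain lists of numbers standing in for whatever a
# real system extracts (facial landmarks, fingerprint minutiae, etc.);
# the extractor itself is assumed and not shown.
import math

def distance(template_a, template_b):
    # Euclidean distance between two fixed-length templates.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(template_a, template_b)))

def is_match(enrolled, candidate, threshold=0.6):
    # Verification: a freshly captured template is compared against the
    # one stored at enrollment; below the threshold counts as a match.
    return distance(enrolled, candidate) < threshold

# Enrollment: the system keeps this small template, not the raw image.
enrolled_template = [0.12, 0.85, 0.33, 0.47]

# A later capture of (supposedly) the same person.
new_capture = [0.10, 0.88, 0.30, 0.45]

print(is_match(enrolled_template, new_capture))  # True for these toy numbers
```

A real system obviously uses far richer templates and carefully tuned thresholds, but this enroll-store-compare pattern is the core of it.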

There are a number of reasons why such technological identifiers should concern us. So let's be real clear: creating a database with millions of facial scans and thumbprints raises a host of surveillance, tracking, and security questions - never mind the cost. And as you might expect, such identifiers are being utilized by entities ranging from Facebook to the FBI. In fact, the ACLU of California is currently asking for information about law enforcement's use of information gathered from facial recognition technology (as well as from social networking sites, book providers, GPS tracking devices, automatic license plate readers, and public video surveillance cameras).


As for Facebook, consider the ramifications: there are over 600 million members...and each day members upload over 200 million photos - with the network hosting over 90 billion photos in total. Each time a photo is "tagged," Facebook's facial recognition technology learns more about what that person looks like.
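
To illustrate the point (this is purely a toy sketch of my own, not Facebook's actual pipeline; the names and embedding numbers are invented), here is roughly how repeated tagging could sharpen a per-person face template:

```python
# Toy sketch only: every "tag" pairs a name with a face embedding, and
# averaging the tagged examples yields an increasingly accurate template
# for that person. The embeddings are invented numbers; the model that
# would produce them is not shown.
from collections import defaultdict

tagged_photos = [
    ("alice", [0.9, 0.1, 0.4]),   # each tag = (name, face embedding)
    ("alice", [0.8, 0.2, 0.5]),
    ("bob",   [0.1, 0.9, 0.7]),
]

profiles = defaultdict(list)
for name, embedding in tagged_photos:
    profiles[name].append(embedding)

# Each person's template sharpens as more tagged photos accumulate.
templates = {
    name: [sum(vals) / len(vals) for vals in zip(*embeddings)]
    for name, embeddings in profiles.items()
}

print(templates["alice"])  # roughly [0.85, 0.15, 0.45] after two tags
```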

As PC World noted in a recent article on the subject, "Even if you happen to "opt out" of the facial recognition tagging, Facebook's technology can surely use the tagged photos of you (hey, perhaps even the tagged photos of you that you end up un-tagging) to figure out what you look like. Right now Facebook is using this technology to help people tag photos. But once they have an accurate facial recognition database of several hundred million people?"


Facial recognition technology – especially as the technology becomes more sophisticated – may be one of the gravest privacy threats of our time. It has the potential to remove the anonymity Americans expect in crowds and most public places. There are the obvious “chilling effects” it could have on political demonstrations and speech, concerns being monitored by civil liberties advocates like the ACLU, EPIC, and EFF. However, this technology will also very likely be used in greater capacity in the commercial sector to further target consumers for advertising and discriminatory pricing purposes.

Earlier this month, Carnegie Mellon University researchers released a study detailing three experiments that reveal the possibility of identifying people, both online and in the real world, who may otherwise believe they are anonymous. The researchers took photos of people walking on campus and used facial recognition technology and information publicly available online to figure out their name, age, place of birth and, in some cases, even their Social Security number. Many individuals share a tremendous amount of information about themselves online, and the study demonstrates how easy it is to link this online information to a person using facial recognition technology.

...

In his book Niche Envy, Joseph Turow, a professor at the University of Pennsylvania, explains how companies are using increasingly sophisticated market segmentation methods to offer different prices to different people, a practice known as price discrimination. The more detailed the profile a company can build on someone, the more accurately it can estimate how much that person is willing to spend on a product.

Professor Turow focused primarily on online data collection, but as the Carnegie Mellon study illustrates, facial recognition technology makes it possible to connect someone’s offline identity with his or her online identity without obtaining consent. As facial recognition technology advances and the number of consumers using social media continues to increase, it’s not far-fetched to imagine a scenario where a consumer walks into a store and is treated differently or even sees different prices based on the combination of this biometric data and personal information publicly available online. 

A further concern is the unwanted identification of individuals with sensitive circumstances - such as victims of domestic violence, stalking victims, and law enforcement officers.

  
This is yet another CLEAR example of technology outpacing regulation, and of the need for increased privacy protections for consumers. It will of course take more than just laws to protect us; it will also take knowledge and personal choice...as in the choice NOT to shop at or use products sold by companies that are using such facial recognition technologies.
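
To see how crude the price-discrimination scenario from the excerpt above could be in practice, here is a purely hypothetical sketch; every field, rule, and number is invented for illustration:

```python
# Purely hypothetical illustration of profile-based pricing: a shopper
# identified by facial recognition is matched to an online profile and
# quoted a price based on what that profile suggests they will tolerate.
# Every field, rule, and number here is invented.
BASE_PRICE = 100.0

def quoted_price(profile):
    price = BASE_PRICE
    if profile.get("income_bracket") == "high":
        price *= 1.25          # estimated willingness to pay more
    if profile.get("comparison_shopper"):
        price *= 0.90          # discount to keep a price-sensitive buyer
    return round(price, 2)

# Profile assembled from a face match plus publicly available online data.
walk_in_customer = {"income_bracket": "high", "comparison_shopper": False}

print(quoted_price(walk_in_customer))  # 125.0
```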

Of course, the use of this technology goes FAR beyond commercial interests. As the Electronic Frontier Foundation's Jennifer Lynch detailed just a couple months ago, the FBI is pursuing the next generation of Biometrics as I write this...with the Patriot Act no doubt serving as the agency's firewall of protection when it violates our civil liberties and privacy.

Lynch writes:

...the Center for Constitutional Rights (CCR) posted documents from a FOIA lawsuit that expose the concerted efforts of the FBI and DHS to build a massive database of personal and biometric information. This database, called “Next Generation Identification” (NGI), has been in the works for several years now. However, the documents CCR posted show for the first time how FBI has taken advantage of the DHS Secure Communities program and both DHS and the State Department’s civil biometric data collection programs to build out this $1 billion database.

Unlike some government initiatives, NGI has not been a secret program. The FBI brags about it on its website (describing NGI as “bigger, faster, and better”), and both DHS and FBI have, over the past 10+ years, slowly and carefully laid the groundwork for extensive data sharing and database interoperability through publicly-available privacy impact assessments and other records. However, the fact that NGI is not secret does not make it OK. Currently, the FBI and DHS have separate databases (called IAFIS and IDENT, respectively) that each have the capacity to store an extensive amount of information—including names, addresses, social security numbers, telephone numbers, e-mail addresses, fingerprints, booking photos, unique identifying numbers, gender, race, and date of birth. Within the last few years, DHS and FBI have made their data easily searchable between the agencies. However, both databases remained independent, and were only “unimodal,” meaning they only had one biometric means of identifying someone—usually a fingerprint.


...

So why should we be worried about a program like NGI, which the FBI argues will “reduce terrorist and criminal activities”? Well, the first reason is the sheer size of the database. Both DHS and FBI claim that their current biometrics databases (IDENT and IAFIS, respectively) are each the “largest biometric database in the world.” IAFIS contains 66 million criminal records and 25 million civil records, while IDENT has over 91 million individual fingerprint records.

Once these records are combined into one database and once that database becomes multimodal, as we discussed in our 2003 white paper on biometrics, there are several additional reasons for concern. Three of the biggest are the expanded linking and tracking capabilities associated with robust and standardized biometrics collection systems and the potential for data compromise.


Already, the National Institute of Standards and Technology, along with other standards-setting bodies, has developed standards for the exchange of biometric data. FBI, DHS and DoD’s current fingerprint databases are interoperable, indicating their systems have been designed (or re-designed) to read each other’s data. NGI will most certainly improve on this standardization. While this is good if you want to check to see if someone applying for a visa is a criminal, it has the potential to be very bad for society. Once data is standardized, it becomes much easier to use as a linking identifier, not just in interactions with the government but also across disparate databases and throughout society. This could mean that instead of being asked for your social security number the next time you apply for insurance, see your doctor, or fill out an apartment rental application, you could be asked for your thumbprint or your iris scan.

This is a big problem if your records are ever compromised because you can’t change your biometric information like you can a unique identifying number such as an SSN. And the many recent security breaches show that we can never fully protect against these kinds of data losses.

The third reason for concern is at the heart of much of our work at EFF. Once the collection of biometrics becomes standardized, it becomes much easier to locate and track someone across all aspects of their life. As we said in 2003, “EFF believes that perfect tracking is inimical to a free society. A society in which everyone's actions are tracked is not, in principle, free. It may be a livable society, but would not be our society.”

Click here to read more.
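
To see why a standardized biometric makes such a powerful linking identifier, as the EFF excerpt describes, here is a hypothetical sketch; the databases, field names, and the "fp_hash" value are all invented:

```python
# Hypothetical sketch of the linking concern: once a biometric is stored
# in a standardized form, it can act as a join key across otherwise
# unrelated databases. All records and the "fp_hash" value are invented.
insurance_db = [
    {"fp_hash": "A1B2", "insurer": "Acme Health", "premium": 410},
]
rental_db = [
    {"fp_hash": "A1B2", "address": "123 Main St", "rent": 1500},
]
visa_db = [
    {"fp_hash": "A1B2", "passport_country": "CA"},
]

def link_by_biometric(fp_hash, *databases):
    # Merge every record that shares the same biometric identifier.
    profile = {}
    for db in databases:
        for record in db:
            if record["fp_hash"] == fp_hash:
                profile.update(record)
    return profile

# One standardized identifier stitches the separate records into a profile.
print(link_by_biometric("A1B2", insurance_db, rental_db, visa_db))
```

The point of the sketch is the join key itself: swap the invented "fp_hash" for a standardized template and the same lookup works across any databases that store it.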

The ACLU put together an excellent Q&A on facial recognition technology, which, in answering why it represents a threat to privacy, states, "One threat is the fact that facial recognition, in combination with wider use of video surveillance, would be likely to grow increasingly invasive over time. Once installed, this kind of a surveillance system rarely remains confined to its original purpose. New ways of using it suggest themselves, the authorities or operators find them to be an irresistible expansion of their power, and citizens' privacy suffers another blow. Ultimately, the threat is that widespread surveillance will change the character, feel, and quality of American life.

Another problem is the threat of abuse. The use of facial recognition in public places like airports depends on widespread video monitoring, an intrusive form of surveillance that can record in graphic detail personal and private behavior. And experience tells us that video monitoring will be misused. Video camera systems are operated by humans, after all, who bring to the job all their existing prejudices and biases. In Great Britain, for example, which has experimented with the widespread installation of closed circuit video cameras in public places, camera operators have been found to focus disproportionately on people of color, and the mostly male operators frequently focus voyeuristically on women. 

While video surveillance by the police isn't as widespread in the U.S., an investigation by the Detroit Free Press (and followup) shows the kind of abuses that can happen. Looking at how a database available to Michigan law enforcement was used, the newspaper found that officers had used it to help their friends or themselves stalk women, threaten motorists, track estranged spouses - even to intimidate political opponents.  The unavoidable truth is that the more people who have access to a database, the more likely that there will be abuse. 

Facial recognition is especially subject to abuse because it can be used in a passive way that doesn't require the knowledge, consent, or participation of the subject. It's possible to put a camera up anywhere and train it on people; modern cameras can easily view faces from over 100 yards away. People act differently when they are being watched, and have the right to know if their movements and identities are being captured. 

And, just to drive this whole post home for you, I found an article on MSNBC yesterday entitled "Post 9/11, surveillance cameras everywhere" (with the subhead, "Security industry boomed for years, but terror is rarely a focus").

Here are a few clips from the piece:

Market research firm IMS Research estimates that more than 30 million surveillance cameras have been sold in the United States in the past decade. Video surveillance alone is a $3.2 billion industry, representing about one-third of the overall security market, according to 2007 figures from the Security Industry Association, a trade group. That was the last time they gathered such data, a spokesman said.

...

Although advanced security measures are now commonplace, they are rarely being used to nab would-be terrorists. Instead, security cameras often serve other purposes, such as catching students or workers who are misbehaving, or tracking down common criminals...The increasing prevalence of security cameras, often assisted these days by facial recognition software, has raised thorny privacy questions as Americans find their images captured with increasing regularity.

...

Privacy advocates warn that we may be too complacent about the fact that our pictures are being taken everywhere from the department store checkout counter to the high school hallway, as well as shared freely on social networks. That data can potentially be used by everyone from marketers to police investigators. “I do think it’s really important when we think about that question of where those data go in the world of social media,” said David Lyon, a professor of surveillance studies at Queen’s University in Canada. 

Click here to read more.

As I have written here NUMEROUS times, what concerns me are the side effects of living in a society without privacy. Where are we left when the power of corporate or government interests to monitor everything we do is absolute?

Whether it's the knowledge that everything we do on the internet is followed and stored, that we can be wiretapped for no reason and without a warrant or probable cause, that smart grid systems monitor our daily in-home habits and actions, that our emails can be intercepted, that our naked bodies must be viewed at airports and stored, that our book purchases can be accessed (particularly if Google gets its way and everything goes electronic), that street corner cameras are watching our every move, and that RFID tags and GPS technology allow for the tracking of clothes, cars, and phones (and the list goes on)...what is certain is that privacy itself is on life support in this country...and without privacy there is no freedom. I also fear how such a surveillance society stifles dissent and discourages grassroots political/social activism that challenges government and corporate power...something that we desperately need more of in this country, not less.
