
Tuesday, March 27, 2012

Banning Employers from Requesting Employee Social Media User Names/Passwords

Yes, you read that headline correctly: employers are now asking, in increasing numbers, for employees' social media (e.g., Facebook) user names and passwords. If that doesn't send chills down the spine of every American who proclaims to believe in a free country, or even in the concept of privacy, I don't know what will.

Let's begin with what we already know about increasing intrusiveness from both government and corporate/employer interests: As of two years ago, Facebook was reportedly receiving up to 100 demands from the government each week for information about its users. AOL reportedly receives 1,000 demands a month. In 2006, a U.S. Attorney demanded the book purchase records of 24,000 Amazon.com customers. Sprint recently disclosed that law enforcement made 8 million requests in 2008 alone for its customers' cell phone GPS data for purposes of locational tracking.


Now let's get to the corporate side of this privacy creep. It was Facebook itself, no less - a known enemy of privacy and the world's biggest social networking site - that came out just a few days ago with a statement claiming it was alarmed by reports that some businesses ask potential employees for passwords in order to view private posts and pictures as part of the job-application process.

Before I get to California State Senator Leland Yee's bill, proposed this week to ban this practice, let me continue with the initial reaction of two U.S. Senators - New York Senator Charles Schumer and Connecticut Senator Richard Blumenthal - to this hair-raising practice. They have asserted that the practice could violate federal anti-hacking statutes and have also, thankfully, asked the U.S. Equal Employment Opportunity Commission to examine the practice.


Blumenthal said that by requiring job applicants to provide login credentials, employers could gain access to protected information that would be impermissible for them to consider when making hiring decisions. Those include religious affiliation and sexual orientation, which are protected categories under federal law.

Facebook said on March 23 that accessing such information also could expose businesses to discrimination lawsuits. The company said it might ask policy makers to take action to stop the practice.

...

Facebook and other sites are already used by some potential employers seeking additional background on job applicants because of the personal information posted there. As Facebook has given users additional ways to protect that information from public view, reports have surfaced of employers asking job applicants to voluntarily give them access by providing personal login credentials.
...

The lawmakers also asked the department to investigate whether the practice violates the Stored Communications Act, which prohibits intentional access to electronic information without authorization or in excess of authorization.

This reminds me a lot of the legislation that we (the Consumer Federation of California) supported last year - and that was signed into law by Governor Brown - banning the practice of employers checking prospective employees' credit reports. Before I remind people a little more about why that was such a HUGE victory for both privacy and economic fairness, let me get to Senator Leland Yee's legislation here in California.

His legislation would stop employers from formally requesting or demanding that employees or job applicants provide their social media usernames and passwords.

As Yee rightly states, “It is completely unacceptable for an employer to invade someone’s personal social media accounts. Not only is it entirely unnecessary, it is an invasion of privacy and unrelated to one’s work performance or abilities. These outlets are often for the purpose of individuals to share private information with their closest friends and family. Family photos and non-work social calendars have no bearing on a person’s ability to do their job and therefore employers have no right to demand to review it.”

Rather than formally requesting passwords and usernames, some employers have demanded that applicants and employees sit down with managers to review their social media content, or that they print out their social media pages in full.

Yee's bill will also prohibit this practice.  

As I argued in defense of AB 22 (Mendoza) regarding such so-called "requests" - and thus an employee's "choice" to say yes or no - when you're trying to get a job, especially in this economy, it's not exactly "voluntary" when you're being coerced by an employer that can fire you, or choose not to hire you.

As I pointed out last year - and it appears the same is beginning to happen with these kinds of employer requests - people's credit ratings (which have suffered due to the Great Recession), also NOT a good indicator of a person's trustworthiness or work ethic, were being demanded by employers with increasing frequency (in fact, a whopping 40% of the time!!).

Evidence also suggested that some supervisors factor credit scores into decisions regarding promotion and evaluation of current workers. Could the same be said for Facebook account content?

In the case of credit ratings, there was also the consideration of the role credit agency fraud played in the housing bubble burst, subsequent economic crisis and the reduced credit scores suffered by so many Americans. In that context, for an employer to discriminate against someone with a less than stellar credit record is unconscionable. Wall Street excesses and Congress’ weak response have built plenty of barriers between the jobless and their prospects for future employment. Allowing employers to use credit checks to deny employment only serves as another obstacle to getting Californians back to work.

And to top it all off, credit reports are often inaccurate, and correcting mistaken information is a tedious, time-consuming process; in the meantime, the job applicant is harmed by the credit reporting entities' errors.

That was a great victory for California privacy and basic economic fairness...and so would this latest legislation from Leland Yee be, ending the practice of employers demanding and/or requesting access to employee Facebook pages.

If interested, here's an interview I did on the Rick Smith Show last year regarding AB 22:

Tuesday, March 20, 2012

California Legislation to Address Police Tracking/Storing License Plate Info and Driver Locations

California privacy stalwart - State Senator Joe Simitian - is back again with another critically important bill. SB 1330 will address what has become one of the fastest-growing trends in law enforcement - and in private industry: monitoring and compiling license-plate records (via license plate recognition technology, or LPR) on both innocent and criminal drivers, which can then be searched by police.

It goes without saying that this locational tracking of potentially every driver on the road is a threat to privacy. To date, the courts have only begun to address whether investigators can secretly attach a GPS monitoring device to cars without a warrant (the Supreme Court just ruled they can't).

This ruling hasn't, however, deterred police across the country - or companies like Vigilant Video - from utilizing high-tech scanners mounted on the exterior of their cars to take a picture of every passing license plate and automatically compare it against databases of outstanding warrants, stolen cars and wanted bank robbers.

As alluded to, these scanners are employed by a variety of law enforcement agencies, asset recovery companies and financial institutions, among other organizations. While they are admittedly a valuable resource for law enforcement, they are also valuable to private entities wishing to acquire or sell data about people’s movements and habits. 

In fact, we have learned that some private entities utilize “scout cars” whose sole purpose is to acquire LPR data; such entities possess millions of LPR data points, and claim to scan 40 percent of vehicles in the country on an annual basis.

This volume of LPR data can provide a roadmap to an individual’s personal life including his or her movements, activities, medical conditions, friendships, religious practices, vocation, political beliefs, etc. This poses a serious risk to Californians’ constitutional right to privacy, especially since LPR data is acquired without an individual’s knowledge or consent.

Senator Simitian's bill offers a critical safeguard for Californians’ constitutional right to privacy by modeling itself on existing state law governing 1) the use of LPR scanners and data by the California Highway Patrol, and 2) the disclosure of information acquired by transportation agencies through electronic toll collection systems (the subject of another bill Senator Simitian recently authored). Most importantly, the bill would limit the time law enforcement agencies in California can retain data captured by these license-plate scanners to 60 days, except when the information is being used in felony investigations.

As reported in California Watch:

Simitian said in an interview that there’s a critical distinction between consumers who voluntarily choose to turn over private information to Internet companies like Facebook and technologies that quietly collect information on drivers.

He helped hammer out the guidelines in place for the highway patrol and said balancing privacy protections enshrined in the state’s constitution with the tools police need to improve public safety is part of the legislative process. “I don’t think the two are mutually exclusive,” Simitian said.

Lee Tien of the Electronic Frontier Foundation, a digital and privacy rights group based in San Francisco, said it’s “a good attempt at beginning to address the issue.” The foundation so far plans to support the legislation, Tien said.

The bill also would prohibit police from turning the data over to entities that are not engaged in law enforcement, such as private companies.

Simitian’s proposal comes after California Watch reported in January that a Livermore-based company called Vigilant Video had amassed more than a half-billion bits of information on drivers from license-plate scanners. The data come both from police who agree to turn it over for nationwide searches and auto-repossession companies that help banks track down debtors who are delinquent on their car payments.

A company sales manager previously told California Watch that about 1,200 new law enforcement users are signed up every month to search the database, known as the National Vehicle Location Service. While using the devices to nab wanted suspects in real time has a clear value for police, storing historical data from the units is equally alluring to police who are aware of its powerful intelligence value.

Simitian’s bill also would restrict companies like Vigilant, limiting the amount of time data can be held to 60 days, barring them from selling it or giving the data to anyone who is not a law enforcement officer, and making data available to police only when a search warrant has established probable cause. Vigilant says only approved law enforcement officials can sign up to search the National Vehicle Location Service. 


Senator Simitian's legislation will be AGGRESSIVELY supported by a broad coalition of privacy and consumer advocates as it strikes a balance between law enforcement’s legitimate use of LPR scanners for public safety purposes, and Californians’ right to privacy.

Wednesday, March 14, 2012

5 Ways To Protect Online Privacy

Due to serious time constraints I'm going to refrain from much personal pontificating today and go straight to a great piece by Alternet's David Rosen entitled "You Are Being Tracked Online: Here Are 5 Ways to Protect Your Privacy". Suffice it to say, he lays out a number of the issues I've been covering on this blog, including ways that you can protect your own privacy, but more importantly, as I often argue, what kinds of rules and protections are needed to make this task easier - and give people more power over their data and what's done with it.

I think his general analysis of the President's Consumer Privacy Bill of Rights is on point too...namely, that while conceptually it's got a lot of good stuff, there's not a lot of reason to be optimistic that it will end up being very strong, due to deference to Congress and/or appeasement of big business interests when the time comes to fight for what's most important.

He also delves into the detrimental effects on privacy of media consolidation, as well as the shift from paper-based media to digital....which forces these companies to find new ways (like behavioral tracking) to raise revenue to stay afloat.

With that said, here are a few of the most important passages from his piece in case you don't have the time to read the whole thing:

Overlooked by the media, the Federal Trade Commission issued a warning earlier in February over apparent violations of children’s privacy rights involving the operating systems of the Apple iPhone and iPad as well as Google’s Android and their respective apps developers. Its report, "Mobile Apps for Kids," examined 8,000 mobile apps designed for children and found that parents couldn’t safeguard the personal information the app maker collected.

To illustrate how pernicious this practice is, one iPhone app, Path, offered by a Singapore developer, downloaded an iPhone user's entire address book without alerting them. Prodded by a letter from Congressmen Henry Waxman (D-CA) and G.K. Butterfield (D-NC), Apple’s CEO Tim Cook said the company will ensure that app developers get permission before downloading a user's address book.

The battle over your personal data is principally about ad spending.
The mass media is witnessing a shift from “broadcast” media like newspapers, radio and TV to “targeted” media like website ads, search capabilities and social networks. The consequences for newspapers and magazines are clear; TV is fighting to hold onto every ad dollar with a new “social TV” initiative. And your personal information is what enables targeted advertising.
...

Two industries, advertising and data brokers, principally drive the colonization of digital personal information. Traditional online usage practices such as monitoring of sites visited, ad click-throughs and email keywords are the bread and butter of information capture.

At a Senate hearing in September 2007 reviewing Google’s acquisition of DoubleClick, Sen. Herb Kohl warned, "The antitrust laws were written more than a century ago out of a concern with the effects of undue concentrations of economic power for our society as a whole, and not just merely their effects on consumers’ pocketbooks. No one concerned with antitrust policy should stand idly by if industry consolidation jeopardizes the vital privacy interests of our citizens so essential to our democracy."

The merger of these two ad-serving businesses set the stage for greater integration of personal information gathering and the online ad industry.

According to Forrester Research, total online advertising will more than double over the next five years, jumping from the 2011 estimate of $34.5 billion to $76.6 billion by 2016. Giving some texture to these numbers, eMarketing estimates that the top five online services control more than 70 percent of all monies spent. These five (and their relative market share) are: Google (43.5%), Yahoo! (11.9%), Facebook (7.7%), Microsoft (5.4%) and AOL (2.8%).

Facebook collects two types of information: (i) personal details provided by a user and (ii) usage data collected automatically as the user spends time at the site clicking around. When joining Facebook, a user discloses such information as name, email address, telephone number, address, gender and schools attended. In addition, it records a user’s online usage patterns, including the browser they use, the user's IP address and how long they spend logged into the site.

...

More pernicious, your personal Social Security number, phone numbers, credit card numbers, medical prescriptions, shopping habits, political affiliations and sexual orientation are now fodder for both corporate and government exploitation.

Both the ad agencies and data brokers have information capture down to a bad science. They track your every keystroke, your every order and bill payment, words and phrases in your emails and your every mobile movement.

And your personal information is pretty cheap as the following examples illustrate: address - $0.50; phone number - $0.25; unpublished phone number - $17.50; cell phone number - $10; Social Security number - $8; drivers license - $3; marriage/divorce - $7.95; education background - $12; employment history - $13; credit history - $9; bankruptcy information - $26.50; shareholder information - $1.50; lawsuit history – $2.95; felony record - $16; sex offender status - $13; and voter registration - $0.25. [Source: www.turbulence.org]

...

1. Privacy needs to be made a right.

“Privacy” is an implied – as distinguished from an explicit – right guaranteed by the Constitution. For all the rights suggested in the White House’s white paper, no new real right to privacy is proposed.

...

2. Regulation should replace voluntary compliance.

The White House program is based on the various interested parties, particularly online advertising companies, adopting a voluntary compliance commitment to safeguard people’s online privacy. But will self-regulation work?

...

3. Data vendors should be held accountable.


The White House document calls for data brokers to permit consumers reasonable access to the data they collect. It encourages the collectors to provide a mechanism for review, revision and limits to its use.

...

4. Bar federal agencies from buying private data.


The white paper fails to address the federal government’s growing reliance on information gathered by private data collectors, whether the information is accurate or not.

...

5. There’s a need for a global personal privacy standard.

The U.S. and Europe are moving in two opposing directions with regard to data privacy rules. The White House plan emphasizes mutual recognition of privacy approaches, an international role for codes of conduct and enforcement cooperation to safeguard personal privacy. Yet, the U.S. model is in keeping with its long tradition of putting the interest of business before its citizens; the Europeans are developing an online privacy program that places the interests of citizens first.


Click here to read the article in its entirety.

Wednesday, March 7, 2012

Blogger Exposes Hole in Body Scanner Technology

I've written about this issue extensively on this blog; in fact, you can check out an op-ed I penned over a year ago entitled "A Hobson's Holiday Travel Choice: Digital Strip Search or Get Groped?" if you want to get a real good feel for what I think about these airport body scanning machines.

For today's purposes, I'm just going to take you straight to a video posted by a blogger demonstrating yet another hole in the "security" these machines provide.

Before I post the video, here's a clip from the post: A blogger on Tuesday published a video showing how he had snuck a small metal case through the Transportation Security Administration's (TSA) "billion dollar fleet" of so-called nude body scanners.

Engineer Jonathan Corbett, who runs the blog TSA Out of Our Pants, explained that the problem lies in how the scanner uses dark colors to highlight potential threats like weapons or explosives.

"Again that’s light figure, black background, and BLACK threat items," he explained. "Yes that’s right, if you have a metallic object on your side, it will be the same color as the background and therefore completely invisible to both visual and automated inspection."

"To put it to the test, I bought a sewing kit from the dollar store, broke out my 8th grade home ec skills, and sewed a pocket directly on the side of a shirt. Then I took a random metallic object, in this case a heavy metal carrying case that would easily alarm any of the 'old' metal detectors, and walked through a backscatter x-ray at Fort Lauderdale-Hollywood International Airport."


Again at Cleveland-Hopkins International Airport, Corbett successfully carried his small, empty metal case through the scanners.

"While I carried the metal case empty, by one with mal-intent, it could easily have been filled with razor blades, explosives, or one of Charlie Sheen’s infamous 7 gram rocks of cocaine," he warned. "With a bigger pocket, perhaps sewn on the inside of the shirt, even a firearm could get through."


Wednesday, February 29, 2012

New Google Privacy Policy and "Do Not Track"

Being that it's been such a disastrous few weeks for Google in the privacy violation department, I thought I'd go back to the topic of its new privacy rules, as well as get into some of the important technicalities associated with Do Not Track protections in light of the President's proposed Privacy Bill of Rights.

First, let's go to reigning anti-privacy global champion Google, which is changing its privacy policies this week, placing 60 of its 70 existing product privacy policies under one blanket policy and breaking down the identity barriers between them (to accommodate its new Google+ social network software) as well. In other words, Google will combine data from all its services, so when users are signed in, Google may combine identity information users provided for one service with information from other services. The goal is to treat each user as one individual across all Google products, such as Gmail, Google Docs, YouTube and other Web services. You can read more about this policy in a recent post of mine.

Then we find out that Google has been bypassing the privacy settings in Apple's Safari browser. This is of particular concern and importance because that system, and those users, are specifically INTENDING that such monitoring be BLOCKED.

So that was the "Google" backdrop for a few other related stories. First, the President proposed a Consumer Privacy Bill of Rights that has some potential, though numerous pitfalls (I'll get to that later). And second, while Google has agreed to offer a kind of "Do Not Track" mechanism in Chrome, this didn't stop the Electronic Privacy Information Center (EPIC) from attempting to make Google obtain its users' permission BEFORE sharing their private information as a result of its new privacy policy.

Unfortunately, U.S. District Judge Amy Berman Jackson said the court had no authority to force the FTC to keep Google in check. As detailed by Courthouse News, this isn't Google's first brush with the law: In June 2011, a federal judge approved an $8.5 million class action settlement brought by 31 million Gmail users who sued Google for exposing their personal information through its recently discontinued email feature, Google Buzz. In their lawsuit, users called the feature, which automatically shared their information with their email contacts, an "indiscriminate bludgeon" that could reveal the names of doctors' patients or lawyers' clients, or even the contacts of a gay person "who was struggling to come out of the closet and had contacted a gay support group."

The judge also made it clear that her ruling should not be taken as an endorsement of Google's privacy policies or her opinion on whether they violate the consent order.

So what does Google's new policy mean to you and what are some ways to better protect your privacy?

CNN.com suggests - in an article entitled "How to prepare for Google's privacy changes"- the following:

Don't sign in

This is the easiest and most effective tip. Many of Google's services -- most notably search, YouTube and Maps -- don't require you to sign in to use them. If you're not logged in, via Gmail or Google+, for example, Google doesn't know who you are and can't add data to your profile.

But to take a little more direct action ...

Removing your Google search history

Eva Galperin of the Electronic Frontier Foundation has compiled a step-by-step guide to deleting and disabling your Web History, which includes the searches you've done and sites you've visited.
It's pretty quick and easy:

-- Sign in to your Google account
-- Go to www.google.com/history
-- Click "Remove all Web History"
-- Click "OK"

As the EFF notes, deleting your history will not prevent Google from using the information internally. But it will limit the amount of time that it's fully accessible. After 18 months, the data will become anonymous again and won't be used as part of your profile.


Clearing your YouTube history

Similarly, users may want to remove their history on YouTube. That's also pretty quick and easy.
-- Sign in on Google's main page
-- Click on "YouTube" in the toolbar at the top of the page
-- On the right of the page, click your user name and select "Video Manager"
-- Click "History" on the left of the page and then "Clear Viewing History"
-- Refresh the page and then click "Pause Viewing History"
-- You can clear your searches on YouTube by going back and choosing "Clear Search History" and doing the same steps.


Interestingly, just as the White House pushes its privacy bill of rights and new online privacy legislation for Congress to consider, Google (in the wake of its privacy invasions) decided to get behind "Do Not Track" for Google Chrome. As Computerworld defines it - and how such a mechanism is eventually defined and operated is critical to its usefulness - "Do Not Track" is a "technology that relies on information in the HTTP header, part of the requests and responses sent and received by a browser as it communicates with a website, to signal that the user does not want to be tracked by online advertisers and sites.

In the browsers that now support the Do Not Track header, a user tells sites he or she does not want to be tracked by setting a single option. In Mozilla's Firefox, for instance, that's done through the Options (on Windows) or Preferences (Mac) pane by checking a box marked, 'Tell web sites I do not want to be tracked.'" Just how well it does that - and how - is, of course, the million dollar question.
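For those curious what that actually looks like under the hood, here's a quick sketch of my own (not from the Computerworld piece): when the option is turned on, the browser simply adds one extra line - "DNT: 1" - to every request it sends, and a page's scripts can also read the user's preference directly (the exact property name and values varied a bit between early browser implementations):

```typescript
// What the browser sends on the wire once Do Not Track is switched on
// (one extra line in the HTTP request header block):
//
//   GET /article.html HTTP/1.1
//   Host: www.example.com
//   User-Agent: Mozilla/5.0 ...
//   DNT: 1            <-- "do not track me"
//
// A page script can also inspect the user's preference. Early implementations
// disagreed on the exact values ("1", "yes", null), so this check is loose.
function userHasOptedOutOfTracking(): boolean {
  const dnt =
    (navigator as any).doNotTrack ??   // most browsers
    (window as any).doNotTrack ??      // some older IE versions
    (navigator as any).msDoNotTrack;   // IE9/10
  return dnt === "1" || dnt === "yes";
}

if (userHasOptedOutOfTracking()) {
  console.log("User has asked not to be tracked; skip analytics/ad beacons.");
}
```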

So what did Google just agree to by adding its support for Do Not Track to its Chrome browser? Computerworld has more:

So, when I tell my browser to send the Do Not Track request, no one will monitor my movements? 

Hold on there, pardner. Thursday's commitment by Google to support Do Not Track in Chrome may have been a clear win for the specific way that request is communicated, but there's no such clarity on what websites do -- or don't do -- when they receive that signal.

"On the technology side, this is an unambiguous win, but on the policy side there is still a lot of work to be done," Mayer said yesterday. The Electronic Frontier Foundation (EFF), an online privacy advocacy organization, said much the same. "While today was a great advancement on the Do Not Track technology, it did not meaningfully move the ball forward on the Do Not Track policy," said Rainey Reitman, the EFF's activism director, in a Thursday blog.

What have sites agreed to do with Do Not Track?  

They'll stop using cookies to craft targeted ads, the kind pointed at you based on your past surfing and other online behavior. 

But the companies that lined up Thursday to support Do Not Track -- the ad networks, websites and corporations who belong to the latest online ad industry trade group, the Digital Advertising Alliance (DAA) -- haven't promised to actually stop tracking users' Web movements. Instead, they've pledged to not use tracking data to serve targeted ads -- which the DAA calls "behavioral advertising" -- or use that tracking information "for the purpose of any adverse determination concerning employment, credit, health treatment or insurance eligibility, as well as specific protections for sensitive data concerning children." (IDG, the parent company of Computerworld, is a member of DAA, according to the association's list of participating companies and ad networks. Other media firms that will hew to the DAA's behavioral ad guidelines around Do Not Track include Conde Nast, ESPN, Forbes and Time.)

What? So Do Not Track doesn't mean just that? 

Right, which is why privacy groups are pushing for a stricter interpretation. The EFF, for one, is leery of the advertising industry's sincerity.

"Historically, the DAA has eschewed providing users with powerful mechanisms for choices when it comes to online tracking," said EFF's Reitman. "The self-regulatory standards for behavioral advertising have offered consumers a way to opt out of viewing behaviorally targeted ads without actually stopping the online tracking, which is the root of the privacy concern."

Reitman worried that the DAA would mess with the simplicity of Do Not Track, and try to turn it into "slippery legalese that doesn't promise to do much of anything about tracking."

Anything else about the Do Not Track promises made by the advertising industry I should know? 

Yep, one interesting aspect: The DAA said it would not honor the setting if "any entity or software or technology provider other than the user exercises such a choice." EFF's Reitman interpreted that as a pre-emptive strike against browser makers that may want to turn on Do Not Track by default. (None do at this point.... It's off in Firefox, IE9 and Safari until the user manually changes the setting.)

Click here for more.

With that, let me take you to the New York Times article that delves deeply into the Do Not Track concept and where the battle lines will likely be drawn: between those who want privacy, and more control over their own data, and those who want to profit off violating that privacy and selling that data.

The issue of digital privacy, especially how users’ data is collected online and then employed to show those users ads tailored to them, has been hotly debated for years. The announcements represent the attempt to satisfy consumer privacy concerns while not stifling the growth of online advertising, which is seen as the savior of media and publishing companies as well as the advertising industry. According to the Interactive Advertising Bureau, digital advertising revenues in the United States were $7.88 billion for the third quarter of 2011, a 22 percent increase over the same period in 2010. 

The industry’s compromise on a “Do Not Track” mechanism is one result of continuing negotiations among members of the Federal Trade Commission, which first called for such a mechanism in its initial privacy report; the Commerce Department; the White House; the Digital Advertising Alliance; and consumer privacy advocates. 

Until now, methods for opting out of custom advertising varied depending on the privacy settings of a user’s browser or whether a user clicked on the blue triangle icons in the corners of some digital ads. Under the new system, browser vendors will build an option into their browser settings that, when selected, will send a signal to companies collecting data that the user does not want to be tracked. 

The agreement covers all the advertising alliance’s members, including Google, Yahoo, AOL, Time Warner and NBCUniversal. 

Privacy advocates complain that the mechanism does not go nearly far enough in part because it affects only certain marketers. Many publishers and search engines, like Google, Amazon or The New York Times, are considered “first-party sites,” which means that the consumer goes to these Web pages directly. First-party sites can still collect data on visitors and serve them ads based on what is collected. 

...

Some consumer privacy advocates, while offering measured praise for the new privacy option, saw the move as an attempt to thwart a more restrictive stance on data collection. Jeffrey Chester, the head of the Center for Digital Democracy, which is pushing for more restrictions on data collection, called the move a win for the advertising lobby. 

In a statement, Mr. Chester said: “We cannot accept any ‘deal’ that doesn’t really protect consumers, and merely allows the data-profiling status quo to remain. Instead of negotiations, C.D.D. would have preferred the White House to introduce new legislation that clearly protected consumers online.” 

But advertisers have plenty to fear if consumers use Do Not Track in large numbers. “If there’s a high rate of opt-out, it’s an issue,” said George Pappachen, the chief privacy officer of the Kantar Group, the research and consultant unit of WPP. “Our position is data should flow,” Mr. Pappachen said, adding that data helps drive innovation and newer commercial models.

...

And there are still unresolved technical issues regarding Do Not Track, including what defines tracking and how that would apply to first-party and third-party Web sites. Over the last few months the World Wide Web Consortium, an international group that sets voluntary technical standards for the Web, has been working with representatives from companies like Microsoft, Google and Nielsen, along with academics, privacy advocates, legislators and digital advertising groups, to define the technical standard of Do Not Track. 

The consortium is also considering whether sites like Facebook, whose “like” button is used across multiple Web sites, would be considered first-party or third-party sites. “I do think you will see a lot of contention going forward about what Do Not Track means,” said Thomas Roessler, the technology and society domain leader at the consortium.

Whether any companies should be allowed to collect data and follow users online, regardless of who they are, remains “the million-dollar question,” said Alex Fowler, the global privacy leader at Mozilla, the nonprofit organization that created the Firefox browser. Firefox was one of the first to include a Do Not Track option. “When you look at user testing, the expectation for the user for Do Not Track means, don’t behaviorally target me and also don’t collect information on me,” Mr. Fowler said.


Stay tuned...

Monday, February 27, 2012

Obama Administration's Consumer Privacy Bill of Rights

By now most anyone who has come to this blog knows, at least in general terms, what is called behavioral targeting. This massive, growing, multi-billion dollar industry is built on tracking you on the internet - and EVERYTHING you do on it - and then compiling, storing and selling that data to third-party advertisers (while it is also accessed by government when requested...which we know is a lot).


This rise in behavioral tracking has made it possible for consumer information to be misused, increases the threat of identity theft, and is a fundamental violation of privacy. Oftentimes, such behavioral tracking is particularly targeted at vulnerable consumers for high-price loans, bogus health cures and other potentially harmful products and services. To date, to the extent that "Do Not Track" rights exist at all, they have amounted to a voluntary request to industry - which borders on pointless.

Now to my cautious optimism regarding the Obama Administration's announcement last week that it supported a Consumer Privacy Bill of Rights. The proposal lays out seven principles of privacy protections, including the right to exercise control over the dissemination of one’s data and the right to transparent privacy policies. The bill of rights is not legislation, acting more as a framework and statement of principles, but it does at least sound like the Administration "gets it" in a way we haven't heard before.

Consumers deserve the kinds of broad rights to protect their own information online that the President is advocating - particularly the fundamental right to control how our personal data is used, and the right to avoid having our information collected and used for multiple unknown purposes. We also DESERVE the right to make sure our information is held securely, and not KEPT for long periods of time. And of course, we must have the right to hold those who are handling or misusing our personal data accountable when things go wrong.
To be sure, it's an outline, and it still needs to make it through the legislative process (though the administration has threatened to bypass Congress...which is also a good sign) - meaning a GOP-controlled House will have an opportunity to destroy, as it does with all public policy, anything it gets its hands on if that serves the profit motives of big business.

Clearly, when you talk about companies like Google, Apple, Facebook and Microsoft...we're talking about some big-time heavy hitters with LARGE checkbooks and hordes of high-priced lobbyists. In other words, the devil will be in the details...and what will matter most might just be whether there are real, enforceable rules that punish these giants for breaking them.

But before I go into more detail on why tough legislation is needed - and how privacy on the web can be better protected - let's get to some of the details released.


Companies responsible for the delivery of nearly 90 percent of online behavioral advertisements — ads that appear on a user’s screen based on browsing and buying habits — have agreed to comply when consumers choose to control online tracking, the consortium said on Wednesday.

But even if a click of a mouse or a touch of a button can thwart Internet tracking devices, there is no guarantee that companies won’t still manage to gather data on Web behavior. Compliance is voluntary on the part of consumers, Internet advertisers and commerce sites.

"The real question is how much influence companies like Google, Microsoft, Yahoo and Facebook will have in their inevitable attempt to water down the rules that are implemented and render them essentially meaningless,” John M. Simpson, privacy project director for Consumer Watchdog, said in response to the administration’s plan. "A concern is that the administration’s privacy effort is being run out of the Commerce Department.”

“It’s critical that government enact strong privacy regulations whose protections will remain with consumers as they interact on their home computer, cell phones, PDAs or even at the store down the street. Clear rules will help consumers understand how their information is used, obtained and tracked,” said Amina Fazlullah of U.S. Public Interest Research Group. “In the event of abuse of consumer information, this legislation could provide consumers a clear pathway for assistance from government agencies or redress in the courts.”

...

The new privacy outline brings together several efforts to develop and enforce privacy standards that have been progressing for the last couple of years on parallel tracks, under the direction of advertisers, Internet commerce sites and software companies.

The next step will be for the Commerce Department to gather Internet companies and consumer advocates to develop enforceable codes of conduct aligned with a “Consumer Privacy Bill of Rights” released as part of the administration’s plan on Wednesday. The bill of rights sets standards for the use of personal data, including individual control, transparency, security, access, accuracy and accountability. 


I'm a big supporter of limiting the tracking of our online activities, not just in the commercial sphere, but also of protecting that information from a government that increasingly demands it from private companies. Similarly, there's a long, clear record that self-regulation doesn't work - so creating rules and laws to protect people's privacy on the internet is critical, and now possible.

In principle, the proposal does look good...so what I'll be watching for is just how watered down this legislation becomes over time...and that we don't forget some of the key protections necessary, as recently outlined by a coalition of consumer groups, including: 

· Sensitive information should not be collected or used for behavioral tracking or targeting.
· No behavioral data should be collected or used from anyone under age 18 to the extent that age can be inferred.
· Web sites and ad networks shouldn’t be able to collect or use behavioral data for more than 24 hours without getting the individual’s affirmative consent.
· Behavioral data shouldn’t be used to unfairly discriminate against people or in any way that would affect an individual's credit, education, employment, insurance, or access to government benefits.


Here are a couple responses from privacy advocates to the Administration's proposal worth noting here:

“The devil is going to be in the details,” acknowledges Paul Stephens, director of policy and advocacy for the nonprofit group Privacy Rights Clearinghouse. “It is a framework that certainly represents a decent start, but the key is going to be in three components,” he says, which include the legislation and regulations that grow out of it, and the enforcement thereof.

On paper, then, it looks fine as a work in progress, though Stephens does acknowledge that at least one provision – the “Respect for Context” clause, which says companies “will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data” – seems somewhat subjective and open for interpretation. As such, consumers concerned about their privacy will have to wait and see how this vague language of the bill of rights will translate into actionable regulation.

...

“Anybody can stand behind some broad principles about respecting privacy rights,” Reitman says. “Whether it’s enforceable is still a far-off issue....even without legislation, the administration will convene multistakeholder processes that use these rights as a template for codes of conduct that are enforceable by the Federal Trade Commission.”
...

“The way it is right now … it’s historically been self-enforcing,” says Rainey Reitman, activism director for the digital rights advocacy group the Electronic Frontier Foundation. “The White House statement today changes that, so it will be under the umbrella of FTC enforcement.”


Ellen Bloom, a senior director of policy for Consumers Union, was at the press conference today in Washington, D.C., where the "Consumer Privacy Bill of Rights" was unveiled. She said consumers are very concerned about Internet companies passing along their private information to third parties. And she is happy that the administration is taking steps with the "Consumer Privacy Bill of Rights" to protect consumers. But she said the group will continue to educate and advocate to make sure privacy protections are strong enough to do the job.

"We are glad that the FTC and the advertising industry will breathe new life into the Do Not Track rules," she said. "This is a welcome first step toward providing a single simple tool to opt out of being tracked online. We are encouraged that we're on the right track. But we are not ready to rest."


More Backdrop on Behavioral Tracking
To get an even better understanding of why this matters - and of what's happening to you and your information every time you get on the net - check out this congressional testimony from a year or two ago by Jeff Chester of the Center for Digital Democracy...most of what follows is from the testimony and the group's press release...and it should clarify some of this obviously complicated issue.

“As with our financial system, privacy and consumer protection regulators have failed to keep abreast of developments in the area they are supposed to oversee,” he explained. “In order to ensure adequate trust in online marketing—an important and growing sector of our economy—Congress must enact sensible policies to protect consumers.”

“Whether using a search engine, watching an online video, creating content on a social network, receiving an email, or playing an interactive video game, we are being digitally shadowed online....Our travels through the digital media are being monitored, and digital dossiers on us are being created—and even bought and sold.” 

Singling out behavioral and “predictive” targeting for their violations of user privacy, Chester noted that the “consumer profiling and targeted advertising take place largely without our knowledge or consent, and affects such sensitive areas as financial transactions and health-related inquiries. Children and youth, among the most active users of the Internet and mobile devices, are especially at risk in this new media-marketing ecosystem.”

“Americans shouldn’t have to trade away their privacy and accept online profiling and tracking as the price they must pay in order to access the Internet and other digital media,” Chester declared, adding that far from being an impediment to continued growth in the online sector, meaningful privacy safeguards will actually stimulate the digital economy.

“The uncertainty over the loss of privacy and other consumer harms will continue to undermine confidence in the online advertising business,” he explained. “That’s why the online ad industry will actually greatly benefit from privacy regulation. Given a new regulatory regime protecting privacy, industry leaders and entrepreneurs will develop new forms of marketing services where data collection and profiling are done in an above-board, consumer-friendly fashion.”

Privacy is a fundamental right in the United States. For four decades, the foundation of U.S. privacy policies has been based on Fair Information Practices: collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability.

Those principles ensure that individuals are able to control their personal information, help to protect human dignity, hold accountable organizations that collect personal data, promote good business practices, and limit the risk of identity theft. Developments in the digital age urgently require the application of Fair Information Practices to new business practices. Today, electronic information from consumers is collected, compiled, and sold; all done without reasonable safeguards.

Consumers are increasingly relying on the Internet and other digital services for a wide range of transactions and services, many of which involve their most sensitive affairs, including health, financial, and other personal matters. At the same time many companies are now engaging in behavioral advertising, which involves the surreptitious tracking and targeting of consumers.

Click by click, consumers’ online activities – the searches they make, the Web pages they visit, the content they view, the videos they watch and their other interactions on social networking sites, the content of emails they send and receive, how they spend money online, their physical locations using mobile Web devices, and other data – are logged into an expanding profile and analyzed in order to target them with more "relevant" advertising.

This is different from the "targeting" used in contextual advertising, in which ads are generated by a search that someone is conducting or a page the person is viewing at that moment. Behavioral tracking and targeting can combine a history of online activity across the Web with data derived offline to create even more detailed profiles. The data that is collected through behavioral tracking can, in some cases, reveal the identity of the person, but even when it does not, the tracking of individuals and the trade of personal or behavioral data raise many concerns.

Let's hope this Administration's actions match its words, that industry power won't weaken these principles beyond their usefulness, and of course, let's hope Congress is bypassed, as they serve NO PURPOSE (esp. the House) except to protect corporations and undermine people.

More to come....

Tuesday, February 21, 2012

Google Secretly Bypassing Safari Privacy Settings

The global tech giant Google, a company becoming an increasingly giant information-control monopoly, has done it again. I speak, of course, of its long, sordid, and adversarial relationship with privacy, and of this weekend's news that it, and several other advertising companies, have been bypassing the privacy settings in Apple's Safari browser. This is of particular concern and importance because that system, and those users, are specifically INTENDING that such monitoring be BLOCKED.


Let's remember, it was just two weeks ago that a bit of a firestorm was sparked by Google changing its privacy policies rather abruptly, while making it nearly impossible to opt out of the massive amount of data sharing that will take place once it folds 60 of its 70 existing product privacy policies into one blanket policy and breaks down the identity barriers between them (to accommodate its new Google+ social network software).

Let us also remember that we know, for instance - and they have been sued for it - that companies like Google, Yahoo, Microsoft and other Internet companies track and profile users and then auction off ads targeted at individual consumers in the fractions of a second before a Web page loads.

We should also consider that sordid history of privacy and Google I mentioned at the start: from Google Books, to the loss of "Locational Privacy," to the company's lobbying efforts in Congress, to its cloud computing, to its increasing usage and expansion of behavioral marketing techniques, to Google StreetView cars gathering private information from unaware local residents, to the company teaming with the National Security Agency (the agency responsible for such privacy violation greatest hits as warrantless wiretapping) "for technical assistance," to the infamous Google Buzz, to the company's recent admission that it gets THOUSANDS of requests from the government for information about its users, to claims that the company manipulates its search results to favor its own products.

Before I get to some obvious solutions and responses to this latest Google controversy (like data retention limits and Do Not Track options), let me get to the story:

The Stanford study was written by Jonathan Mayer, a graduate student in law and computer science who has cranked out a growing body of headline-generating literature on online privacy. In his paper, he noted that unlike every other browser vendor, Apple's Safari automatically blocks tracking cookies generated by websites that users visit. Apple's Safari is one of the most popular browsers for mobile devices, and the default browser on Macs.


These cookies can collect information about where users go online and what they do - data that advertisers treasure. There are exceptions to Safari's cookie blocking, however. For instance, it allows what are known as "first-party cookies," those that sites like Facebook or Google drop onto devices so users don't have to sign in every time they visit. Certain carve-outs also allow Facebook users to "like" things on third-party sites.

Unlike Facebook, the problem for Google was that its social and ad networks run on different domains from its main one, Google.com. That prevents it from allowing a user of the Google+ social network to give a virtual thumbs up (or "+1") to an ad on another site, a step that makes such ads more valuable. That would require a "third-party cookie" that is blocked by Safari.
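To make that first-party/third-party distinction concrete, here's a quick sketch of my own (the domains are invented for illustration) of the kind of cookie Safari blocks by default:

```typescript
// Invented domains, purely to illustrate the first-party / third-party
// cookie distinction described above.

// Script running on the page the user actually visited (news.example.com):
// this is a FIRST-party cookie, and Safari accepts it (it's what keeps you
// signed in to the sites you use).
document.cookie = "session_id=abc123; path=/";

// Script running inside an ad iframe served from ads.adnetwork.example,
// embedded in that same news page:
//
//   <iframe src="https://ads.adnetwork.example/banner.html"></iframe>
//
// The user never visited ads.adnetwork.example directly, so the cookie below
// is a THIRD-party cookie. Safari's default policy silently drops it - which
// is exactly what stood between Google's +1/ad cookies and Safari users.
document.cookie = "adnet_id=xyz789; max-age=31536000; path=/";
```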


But it turns out there are a few ways for companies to get around these limitations. The one that Mayer's paper focused on involved inserting code to place tracking cookies within Safari. He found four companies doing this: Google, Vibrant Media, Media Innovation Group and PointRoll.
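Press coverage of Mayer's findings at the time described the workaround as exploiting Safari's exception for cookies set after a form submission. The sketch below is my own simplified reconstruction of that general approach - not actual code from Mayer's paper or from any of the companies named:

```typescript
// Simplified, hypothetical reconstruction of the reported workaround.
// Safari made an exception for cookies set in response to a form the user
// submitted, so a script inside an ad iframe could fake that interaction:

function fakeFormSubmission(trackerUrl: string): void {
  // 1. Build an invisible iframe so nothing changes on screen.
  const frame = document.createElement("iframe");
  frame.style.display = "none";
  document.body.appendChild(frame);

  const frameDoc = frame.contentDocument;
  if (!frameDoc) return;

  // 2. Inside it, create an empty form pointing at the tracker's domain
  //    and submit it automatically - no user action involved.
  const form = frameDoc.createElement("form");
  form.method = "POST";
  form.action = trackerUrl;
  frameDoc.body.appendChild(form);
  form.submit();

  // 3. Safari treated the submission as the user "interacting" with that
  //    domain, so the response was allowed to set a third-party cookie -
  //    and once one cookie was in, further advertising cookies could follow.
}

// Hypothetical invocation with a made-up URL:
fakeFormSubmission("https://tracker.adnetwork.example/submit");
```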


But in an interview, Mayer said Google had unilaterally decided that privacy permissions for its products superseded the privacy restrictions those users had enacted - implicitly or explicitly - by choosing to use Safari. "The user is giving up some privacy in exchange for lining Google's pockets," he said.


Meanwhile, an even bigger problem occurred. Once Google tweaked the way Safari functions, other Google advertising cookies could be installed on the devices. "We didn't anticipate that this would happen, and we have now started removing these advertising cookies from Safari browsers," Whetstone said. Why this problem was spotted by a Stanford graduate student and not by a major corporation that's been under continual privacy scrutiny is a fair question that Google has yet to answer.

"I think that's a pretty big 'oops,' and it raises pretty big questions," Mayer said. Chris Hoofnagle, a digital privacy expert at UC Berkeley's law school, said there's a corporate tone-deafness within the engineering-centric culture of Google that leads to these sorts of mistakes. "To the engineer, cookie blocking appears to be a technical error that they should try to solve," he said. "It's very difficult for them to accept the frame that some people do not want this tracking."


Let me say, I don't know whether Google is telling the truth or not - my instincts say they aren't, given their track record on this issue. Regardless, this latest privacy breach - and violation of consumer desires and expectations - proves yet again that our regulations and rights have not caught up with technological advancements in the digital realm.

As I have often written here, an issue like this raises some particularly important questions: What kind of control should we have over our own data? What kind of tools should be available for us to protect it? What about ownership of our data? Should we be compensated for the billions of dollars being made by corporations from their tracking of us? And of course, what of the government's access to this new world of data storage?

The argument from privacy advocates has largely been that this massive and stealth data collection apparatus threatens user privacy and regulators should compel (not hope that) companies to obtain express consent from consumers before serving up "behavioral" ads based on their online history.

More to the point is the simple, unavoidable fact that consumers should have MORE control, not less, over what information of ours is used, shared, and profited from.

Again, for first-time or occasional readers of this blog, I would also point to the consistent dichotomy between the now-proven HIGH LEVEL of concern about privacy on the internet among users and the fact that they tend to do very little to actually protect it (which of course is related to how hard and complicated doing so can be). That, in my mind, is what makes easy-to-use, clear options to protect privacy so paramount. Once people are given such a choice, not only will more people choose not to be tracked, I think more people will become more AWARE of just how pervasive the monitoring of nearly everything we do has become.

So, let's get to some OBVIOUS solutions to this growing online tracking problem, magnified by Google's latest violation of consumer trust. That we CLEARLY have next to no privacy standards keeping pace with these technological innovations and trends is disturbing, and more than enough reason for legislation like California's SB 761 (Do Not Track).

The Do Not Track flag is a rather simple concept that's already been built into Firefox and IE9. If users choose to turn on the option, every time they visit a web page the browser will send a message to the site, saying “do not track.”

SB 761 (Lowenthal) would offer consumers such a mechanism, something the bill's sponsor describes as "one of the most powerful tools available to protect consumers' privacy." The mechanism will allow anyone online to send Websites the message that they do not want their online activity monitored.
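And for the technically curious, honoring that message on the receiving end is trivial. Here's a minimal sketch of my own - purely illustrative, not language from SB 761 - of a website that checks the header and simply declines to set its tracking cookie when it sees "DNT: 1":

```typescript
import * as http from "http";

// Hypothetical illustration of a site honoring the Do Not Track header by
// not setting its usual tracking cookie. Nothing here is mandated by SB 761
// itself; it just shows how technically simple honoring the signal would be.
const server = http.createServer((req, res) => {
  const doNotTrack = req.headers["dnt"] === "1";

  if (!doNotTrack) {
    // Normal behavior: tag the visitor with a persistent identifier.
    res.setHeader("Set-Cookie", "visitor_id=abc123; Max-Age=31536000; Path=/");
  }
  // When DNT: 1 is present, no identifier is set and no profile is built.

  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end(doNotTrack ? "Not tracking you.\n" : "Welcome back, visitor abc123.\n");
});

server.listen(8080);
```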

Obviously, this legislation happens to be something I'll be working on here in Sacramento this year - but a federal version would be most useful.

To be sure, there is no magic bullet when it comes to protecting privacy from digital tracking. Another solution, advocated by such privacy experts as Chris Hoofnagle, is data-retention limits. As he recently stated in an interview in the San Francisco Chronicle, "We know from behavioral economics that most people won't turn on do-not-track features, so if you're serious about protecting privacy, if you think there's a value here, you should protect it by default. It would require no user intervention. You would impair the ability of companies and law enforcement to create long-term profiles about people."

Similarly, Hoofnagle points out that there are limitations to the "opt out" option alone: "Under self-regulatory programs, they allowed people to opt out of targeted advertising if they wanted. But people figured out that what that meant is these companies could still track you, they just couldn't show you online behavioral advertising. They could still choose to target you in another channel (like direct-mail marketing or telemarketing.) And if you look at all the tracking they do, they can identify you in a fairly trivial way. Our study also found over 600 third-party hosts of cookies, most of which are not members of any self-regulatory organization (and thus aren't bound to the rules of opt-out programs). They're not even necessarily advertisers, they could be governments. We really don't know who they are."

The need for such consumer-friendly and empowering solutions to this exploding data-mining industry and its tracking capabilities is clear, because we KNOW marketers will stop at NOTHING to ensure they can monitor online behavior...so we can be better profiled by the government and marketed to by advertisers.

As a recent Berkeley study found, "Seven of the top 100 sites appeared to be using what's known as HTML5 local storage to back up standard cookies, and two were found to be respawning cookies....Third-party advertisers on the site were still employing the flash cookies, along with another type that takes advantage of the browser's cache, where online data is stored on the computer so it can be delivered faster. This ETag tracking allows advertisers to monitor users, even when they block all cookies and use a private browsing mode."
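To make that "respawning" trick concrete, here's a stripped-down, hypothetical sketch (my own illustration, not code from any actual tracker) of how a script can keep an identifier alive by storing it in both a regular cookie and HTML5 local storage - delete one copy, and the surviving copy quietly restores the other:

```typescript
// Illustrative sketch only: the "respawning" pattern the Berkeley researchers
// describe. The identifier is written both to a cookie and to localStorage;
// whichever copy survives deletion is used to recreate the deleted one.
const COOKIE_NAME = "tracking_id";        // hypothetical names
const STORAGE_KEY = "tracking_id_backup";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  // Persist for roughly one year.
  document.cookie = `${name}=${encodeURIComponent(value)}; path=/; max-age=${60 * 60 * 24 * 365}`;
}

function getOrRespawnId(): string {
  const fromCookie = readCookie(COOKIE_NAME);
  const fromStorage = localStorage.getItem(STORAGE_KEY);

  // If the user cleared cookies but not localStorage (or vice versa),
  // the surviving copy silently "respawns" the deleted one.
  const id = fromCookie ?? fromStorage ?? Math.random().toString(36).slice(2);

  writeCookie(COOKIE_NAME, id);
  localStorage.setItem(STORAGE_KEY, id);
  return id;
}
```

ETag-based tracking works on the same backup principle, except the identifier hides in the browser's cache rather than in a storage area the user thinks to clear.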

In other words, it's time we consumers regained control of our privacy and our personal information - through law, not through hope and polite requests to industries that don't care about you or your privacy, only their bottom line.

Tuesday, February 7, 2012

Privacy Threats The Constitution Can't Protect You From

I just read yet another fascinating and disturbing article by Alternet's Tana Ganeva - someone I've sourced on this blog before. The article in question, which certainly connects to many of the issues I've written on here over the years, is entitled "7 Privacy Threats the Constitution Can't Protect You Against".

Now, let's go through each, and I'll mix in some of what I have written on these topics in the past (and others I've cited), along with what Tana does in her article.

Interestingly, she begins with the recently decided Supreme Court case regarding warrantless GPS tracking of suspects - an issue I've covered here in detail for over a year now. She and I see this case, and the very limited (though correct) decision made by the court, in a similar fashion. As I wrote just a couple of weeks ago in response to the decision, "The fourth amendment isn’t completely dead after all! While this fundamental right to privacy is admittedly in tatters, the Supreme Court ruled yesterday that police must have a warrant in order to track someone using a GPS device....Unfortunately, the government will likely continue to insist that tracking the location of cell phones is unaffected by this ruling."

Certainly, the stand out Justice was Sonia Sotomayor, who went much further than her colleagues on the issue of privacy in the digital age - even making a case for revision of the “third-party” doctrine (i.e. we lose Fourth Amendment protection when we disclose certain information). She wrote, “More fundamentally, it may be necessary to reconsider the premise that an individual has no reasonable expectation of privacy in information voluntarily disclosed to third parties. This approach is ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks. People disclose the phone numbers that they dial or text to their cellular providers; the URLs that they visit and the e-mail addresses with which they correspond to their Internet service providers; and the books, groceries, and medications they purchase to online retailers.”

As you can see, my concern immediately went to the question of law enforcement tracking things like cell phones - which wasn't addressed in the decision (except implicitly by Sotomayor). This matters because Sprint disclosed in 2009 that it had received 8 million law enforcement requests for its customers' GPS location data in roughly a single year. And that is just what we already KNOW, let alone the full scope of the problem.

Tana Ganeva goes deeper, as the cell phone is only one way we can be tracked now, writing:

The Jones case itself presents an outdated problem, because police don't really have to bother with the clumsy task of sneaking a device onto a car; at this point, private companies have shoehorned location trackers in most "smart" gadgets. Justice Alito pointed out that the more than 332 million phones and wireless devices in use in the US contain technology that transmits the user's location. Many cars feature GPS as well, thanks to OnStar navigation. 


Location is just the start. There has probably not been a single week since 2005 without a story about Facebook, or Google, or Verizon, or AT&T terrifying consumers and privacy advocates with some new way to collect too much information and then share it with other companies or authorities. The problem is that the law does not adequately address private information that has been shared with third parties, like credit card companies or Google, Facebook and the telecoms, Tien says.

As Sotomayor put it, "I for one doubt that people would accept without complaint the warrantless disclosure to the Government of a list of every Web site they had visited in the last week, or month, or year. " 

Now let's get to Tana's second "threat", which she calls "Cameras everywhere: License plate readers, movement tracking on cameras." 

This too is something I have tackled here on this blog, writing about the ever expanding reach of video surveillance cameras. Certainly, polls are also not on my side, as large majorities of Americans seem generally fine with having every movement of their existence on tape, and watched by someone. Of course, we know that cameras DON'T in fact reduce crime and we also know that governments and law enforcement DO abuse our civil liberties when given such authority to monitor us. Those are two BIG strikes in my mind.

I'm still not convinced, however, that this general support for such technological surveillance is a done deal, or that the argument in favor of FEWER cameras in FEWER locations is a lost one. I believe this for a couple of reasons. One, most Americans have no concept of just how often they are being watched or, worse, for what purposes. Two, few Americans have any idea of the level of abuse such "watchers" are capable of...and if the Bush Administration taught us anything, it's that we can't trust government when it is given more power than it knows what to do with. My guess is we are just scratching the surface on issues ranging from wiretapping to surveillance to monitoring, and when that surface is broken, public opinion might just change on this topic.

What people may also not fully comprehend is that advanced monitoring systems such as the one at the Statue of Liberty are proliferating around the country. State-of-the-art surveillance is increasingly being used in everyday settings - by local police and businesses, in banks, schools, and stores. There are an estimated 30 million surveillance cameras now deployed in the United States, shooting 4 billion hours of footage a week. Americans are being watched - all of us, almost everywhere.
 
Now let's get to what Tana has to say on this:

Thanks in part to a decade of Homeland Security grants, America's cities are teeming with cameras -- they're on subways, on buses, on store fronts, in restaurants, in apartment complexes, and in schools.  In New York, the NYCLU found a five-fold increase in the number of security cameras in one area of New York between 1998 and 2005, and that was before the Bloomberg administration -- inspired by London, most heavily surveilled city in the world -- pledged to install 3,000 cameras in lower Manhattan as part of the Lower Manhattan Security Initiative (this plan was expanded to midtown Manhattan as well). The cameras, which stream footage to a centralized location, are equipped with video analytics that can alert police to "suspicious" activity like loitering. The NYPD, and municipalities all over the country and world also make generous use of license plate readers (LPR) that can track car movement. 

Tana's third example is biometrics - another topic I've covered here in depth. A few months back I posted a pretty extensive piece on facial recognition technology and the threat it poses to individual privacy. As I've done in the past, because I know not everyone can read every post, I'll repeat a few of my thoughts here today. For some backdrop on biometrics, you can check out a past post I did about another article, also by Tana, entitled 5 Unexpected Places You Can Be Tracked With Facial Recognition Technology. As I wrote then, this issue has particular interest to me due to a recent California fight that we (the Consumer Federation of California) were deeply involved in - over whether biometric identifiers should be used by the DMV (we were able, with a host of other groups, to stop them).

As for the larger concern over facial recognition technology, groups from the Privacy Rights Clearinghouse (PRC) to the ACLU to the Electronic Frontier Foundation to EPIC have all been very active in making the case that there is a very real threat to privacy at stake in determining just how, and when, this technology can be used.

Again, going back to a prior post, I wrote: "First, let me refresh everyone on the concept of biometric identifiers - like fingerprints, facial, and/or iris scans. These essentially match an individual’s personal characteristics against an image or database of images. Initially, the system captures a fingerprint, picture, or some other personal characteristic, and transforms it into a small computer file (often called a template). The next time someone interacts with the system, it creates another computer file." That second file is then compared against the stored template to decide whether the two belong to the same person.
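To illustrate that matching step in the plainest possible terms, here's a toy sketch (my own simplification, not any vendor's actual algorithm): treat each template as a list of numbers, measure how far apart two templates are, and call them a "match" if the distance falls under a tuned threshold.

```typescript
// Toy sketch of the matching step described above. Real biometric systems use
// proprietary feature extractors, but the core idea is the same: enrollment
// produces a template (here, a numeric feature vector), and each new capture
// is compared against stored templates using a distance and a threshold.
type Template = number[]; // e.g., measurements derived from a face or fingerprint image

function euclideanDistance(a: Template, b: Template): number {
  if (a.length !== b.length) throw new Error("Templates must be the same length");
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// A smaller distance means a closer match; the threshold is a tuning knob
// that trades false matches against false rejections.
function isMatch(stored: Template, candidate: Template, threshold = 0.5): boolean {
  return euclideanDistance(stored, candidate) < threshold;
}

// Hypothetical usage: compare a fresh capture against an enrolled template.
const enrolled: Template = [0.12, 0.84, 0.33, 0.57];
const freshCapture: Template = [0.10, 0.86, 0.30, 0.55];
console.log(isMatch(enrolled, freshCapture)); // true - a distance this small counts as the same person
```

The privacy problem, of course, is not the arithmetic - it's who holds the database of templates and what else they do with it.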

Now, here's Tana on this privacy creep:
After 9/11 many cities and airports rushed to boost their camera surveillance with facial recognition software. The tech proved disappointing, and after testing that hit a paltry 60 percent accuracy rate in one case (that's pretty bad if you're trying to figure out identity), many programs were abandoned. In the years since then, both private companies and university research labs funded with government grants have made vast improvements in facial recognition and iris scans, like 3-D face capture and "skinprint" technology (mapping of facial skin patterns). Iris scans can allegedly tell identical twins apart.

Many private companies shill these products directly to local law enforcement agencies, a business strategy that police tend to be pretty enthusiastic about. One such success story is the MORIS device, a gadget attached to an iPhone that can run face recognition software, take digital fingerprints and grab an iris scan at a traffic stop. Starting last fall, the MORIS device has been in use in police departments all over the country. 

Tana's next example is that of ever-expanding government databases and the incredible amounts of private data they are accumulating on us. In this instance, I'll go straight to the article. She writes:

Privacy advocates point out that novel types of biometric technology like facial recognition and iris scans can be an unreliable form of ID in the field, but that has not discouraged government agencies from embarking on grand plans to hugely expand their biometric databases. The FBI's billion-dollar "Next Generation Identification" system (NGI) will house iris scans, palm prints, measures of voice and gait, records of tattoos, and scars and photos searchable with facial recognition technology when it's complete in 2014. The bulk of this information is expected to come from local law enforcement. 

There are a number of reasons why such technological identifiers should concern us. So let's be real clear: creating a database with millions of facial scans and thumbprints raises a host of surveillance, tracking and security questions - never mind the cost. And as you might expect, such identifiers are being utilized by entities ranging from Facebook to the FBI. In fact, the ACLU of California is currently asking for information about law enforcement's use of information gathered from facial recognition technology (as well as from social networking sites, book providers, GPS tracking devices, automatic license plate readers, and public video surveillance cameras).

Next up on Tana's list of 7 privacy threats is a new one for me, called "FAST (Future Attribute Screening Technology)". She writes, "Then there's the tech that's supposed to peer inside your head. In 2008, the Department of Homeland Security lab tested a program called Future Attribute Screening Technology (FAST), designed to thwart criminal activity by predicting "mal-intent." Unsavory plans are supposed to reveal themselves through physiological tells like heart rate, pheromones, electrodermal activity, and respiratory measurements, according to a 2008 privacy impact assessment. 

"The 2008 privacy assessment, though, only addressed the initial laboratory testing of FAST's prophesying sensors on volunteers. According to a report in the journal Nature, sometime last year DHS also tested the technology in a large, undisclosed area in the northeastern US."

Tana's 6th threat is none other than those mechanical war criminals called Drones! Apparently, they do more than just bomb innocent women and children around the world; they are, in fact, perfect domestic spying devices too. She writes, "An ACLU report from December says that local law enforcement officials are pushing for domestic use of the new technology, as are drone manufacturers. As Glenn Greenwald points out, drone makers "continuously emphasize to investors and others that a major source of business growth for their drone products will be domestic, non-military use." 

Right now drones range in size from giant planes to hummingbird-sized, the ACLU report says, with the technology improving all the time. Some can be operated by only one officer, and others by no one at all. The report points to all the sophisticated surveillance technology that can take flight on a drone, including night vision, video analytics ("smart" surveillance that can track activities, and with improvements in biometrics, specific people), massive zoom, and the creepy see-through imaging, currently in development. 

And finally, Tana's 7th privacy threat is what she terms "Super drones that know who you are!" She goes on to explain, writing:

In September, Wired reported that the military has given out research grants to several companies to spruce up their drones with technology that lets them identify and track people on the move, or "tagging, tracking, and locating" (TTL). Noah Shachtman writes:

Perhaps the idea of spy drones already makes you nervous. Maybe you’re uncomfortable with the notion of an unblinking, robotic eye in the sky that can watch your every move. If so, you may want to click away now. Because if the Army has its way, drones won’t just be able to look at what you do. They’ll be able to recognize your face — and track you, based on how you look. If the military machines assemble enough information, they might just be able to peer into your heart.

One company claims it can equip drones with facial recognition technology that lets them build a 3-D model of a face based on a 2-D image, which would then allow the drone to ID someone, even in a crowd. They also say that if they can get a close enough look, they can tell twins apart and reveal not only individuals' identity but their social networks, reports Wired. That's not all. Shachtman continues: 

The Army also wants to identify potentially hostile behavior and intent, in order to uncover clandestine foes. Charles River Analytics is using its Army cash to build a so-called “Adversary Behavior Acquisition, Collection, Understanding, and Summarization (ABACUS)” tool. The system would integrate data from informants’ tips, drone footage, and captured phone calls. Then it would apply “a human behavior modeling and simulation engine” that would spit out “intent-based threat assessments of individuals and groups.” In other words: This software could potentially find out which people are most likely to harbor ill will toward the U.S. military or its objectives. Feeling nervous yet?

To answer that final question: yes, I do feel nervous. I've written a lot on this blog about what it means to live in a society without ANY privacy. As I have said, such a society - one we are rapidly approaching - has ramifications that go far deeper than simply "being watched" or feeling uneasy. What we are talking about is freedom itself...and the way such an all-seeing surveillance state stifles dissent and disempowers citizens.

As I have written here before, "Whether it's the knowledge that everything we do on the internet is followed and stored, that we can be wiretapped for no reason and without a warrant or probable cause, that smart grid systems monitor our daily in-home habits and actions, that our emails can be intercepted, that our naked bodies must be viewed at airports and stored, that our book purchases can be accessed (particularly if Google gets its way and everything goes electronic), that street corner cameras are watching our every move (and perhaps drones too), and that RFID tags and GPS technology allow for the tracking of clothes, cars, and phones (and the list goes on)...what is certain is privacy itself is on life support in this country...and without privacy there is no freedom. I also fear how such a surveillance society stifles dissent and discourages grassroots political/social activism that challenges government and corporate power...something that we desperately need more of in this country, not less."

As Bruce Schneier, a security and privacy expert, once wrote, "...lack of privacy shifts power from people to businesses or governments that control their information. If you give an individual privacy, he gets more power…laws protecting digital data that is routinely gathered about people are needed. The only lever that works is the legal lever...Privacy is a basic human need…The real choice then is liberty versus control."

We would do well - sooner rather than later - to recognize the inherent and fundamental value that privacy provides ANY claimed democracy. Without one there cannot be the other...