Wednesday, April 7, 2010

Bruce Schneier on Facebook and Google: "These companies say privacy erosion is inevitable--but they're making it so"

If you read my stuff regularly, you'll know I quote Bruce Schneier whenever I get the chance. I guess the reason is self-explanatory: the guy REALLY knows what he's talking about.

So it was with great pleasure that I came across his op-ed in Forbes magazine, which calls out both Google and Facebook (as I often do here) for their anti-privacy policies.

For a backdrop on the history of these behemoths' abysmal record on privacy and their continuous opposition to it, I suggest you check out some of my past posts. I posted a fairly extensive rundown of Facebook on Friday, and here's a fairly recent one I wrote on Google.

Before I get to the article, which I will post nearly in full, let me highlight a recent study that tells the story of what Facebook and other social networking sites are doing to undermine privacy (and which could be stopped immediately without affecting the general enjoyment one gets from using them).

The study found that the 43 leading sites made privacy control settings difficult to find and to understand, and that the defaults were almost always set to allow maximum dispersal of data. That's just a taste of where the problem lies...check out my posts for more.

As I stated to the PUC two weeks ago, with Google lobbyists in the room no doubt, "One Google product after another - from Google Buzz to Google Books - has been a virtual privacy train wreck. The company's refusal to make public how often information about its users is demanded by, or disclosed to, the government is all the more disconcerting."

Google's CEO, Eric Schmidt, recently stated: "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."

As you let that sink in, he also said:

"… the reality is that search engines including Google do retain this information for some time, and it's important, for example, that we are all subject in the United States to the Patriot Act. It is possible that that information could be made available to the authorities."

Finally, I also quoted Schneier to the Public Utilities Commission, a quote I think deserves repeating: “…lack of privacy shifts power from people to businesses or governments that control their information. If you give an individual privacy, he gets more power…laws protecting digital data that is routinely gathered about people are needed. The only lever that works is the legal lever...Privacy is a basic human need…The real choice then is liberty versus control.”

With that, let's get to the man himself. Schneier writes in Forbes:

In January Facebook Chief Executive, Mark Zuckerberg, declared the age of privacy to be over. A month earlier, Google Chief Eric Schmidt expressed a similar sentiment. Add Scott McNealy's and Larry Ellison's comments from a few years earlier, and you've got a whole lot of tech CEOs proclaiming the death of privacy--especially when it comes to young people.

It's just not true. People, including the younger generation, still care about privacy. Yes, they're far more public on the Internet than their parents: writing personal details on Facebook, posting embarrassing photos on Flickr and having intimate conversations on Twitter. But they take steps to protect their privacy and vociferously complain when they feel it violated. They're not technically sophisticated about privacy and make mistakes all the time, but that's mostly the fault of companies and Web sites that try to manipulate them for financial gain.

People's relationship with privacy is socially complicated. Salience matters: People are more likely to protect their privacy if they're thinking about it, and less likely to if they're thinking about something else. Social-networking sites know this, constantly reminding people about how much fun it is to share photos and comments and conversations while downplaying the privacy risks. Some sites go even further, deliberately hiding information about how little control--and privacy--users have over their data. We all give up our privacy when we're not thinking about it.

Here's the problem: The very companies whose CEOs eulogize privacy make their money by controlling vast amounts of their users' information. Whether through targeted advertising, cross-selling or simply convincing their users to spend more time on their site and sign up their friends, more information shared in more ways, more publicly means more profits. This means these companies are motivated to continually ratchet down the privacy of their services, while at the same time pronouncing privacy erosions as inevitable and giving users the illusion of control.

You can see these forces in play with Google's launch of Buzz. Buzz is a Twitter-like chatting service, and when Google launched it in February, the defaults were set so people would follow the people they corresponded with frequently in Gmail, with the list publicly available. Yes, users could change these options, but--and Google knew this--changing options is hard and most people accept the defaults, especially when they're trying out something new. People were upset that their previously private e-mail contacts list was suddenly public. A Federal Trade Commission commissioner even threatened penalties. And though Google changed its defaults, resentment remained.

Facebook tried a similar control grab when it changed people's default privacy settings last December to make them more public. While users could, in theory, keep their previous settings, it took an effort. Many people just wanted to chat with their friends and clicked through the new defaults without realizing it.

Facebook has a history of this sort of thing. In 2006 it introduced News Feeds, which changed the way people viewed information about their friends. There was no true privacy change in that users could not see more information than before; the change was in control--or arguably, just in the illusion of control. Still, there was a large uproar. And Facebook is doing it again; last month, the company announced new privacy changes that will make it easier for it to collect location data on users and sell that data to third parties.

With all this privacy erosion, those CEOs may actually be right--but only because they're working to kill privacy. On the Internet, our privacy options are limited to the options those companies give us and how easy they are to find. We have Gmail and Facebook accounts because that's where we socialize these days, and it's hard--especially for the younger generation--to opt out. As long as privacy isn't salient, and as long as these companies are allowed to forcibly change social norms by limiting options, people will increasingly get used to less and less privacy.

There's no malice on anyone's part here; it's just market forces in action. If we believe privacy is a social good, something necessary for democracy, liberty and human dignity, then we can't rely on market forces to maintain it. Broad legislation protecting personal privacy, by giving people control over their personal data, is the only solution.

Click here for the rest of the article (though I admittedly nearly posted all of it).

I can't agree with him more. I used to work on environmental issues (and still occasionally do, and will again in the future), and the dilemma is very similar. People want, if enabled, to recycle, to drive more efficient cars, and even to use solar energy in their homes. BUT not if it's made overly difficult or costly; if "systems" aren't in place, usually through law, that make it easy, logical, and practical, they won't.

The good news is that it takes very little to create these "systems" that enable us to make better life choices, for ourselves as individuals and for society as a whole. The fight is never over the practicality of the laws or the system itself; it's with the special, usually corporate, interests fighting change...because that change might undercut their profits.

In the case of the environment, it's the automakers, big oil, big coal, nuclear, and others. When it comes to privacy, it's the HUGE money that can be made off our data, both through selling it and through utilizing it for marketing.

So the fact is, privacy is similar. If we are given an easy-to-use and easy-to-understand "system" (i.e., laws) that allows us to protect our data and share it only by opting in, then we will. If doing so is like solving a Rubik's Cube, we won't. So we come down to the choice that Schneier articulates: if we value privacy as a social good (which it is), and as a fundamental liberty and right, then we MUST put rules and laws in place that protect it as such.