Tuesday, February 21, 2012

Google Secretly Bypassing Safari Privacy Settings

The global tech giant Google, a company that looks more and more like an information-control monopoly, has done it again. I speak of course of its long, sordid, and adversarial relationship with privacy, and this weekend's news that it and several other advertising companies have been bypassing the privacy settings in Apple's Safari browser. This is of particular concern and importance because that browser, and those users, specifically INTEND for such monitoring to be BLOCKED.


Let's remember, it was just two weeks ago that a bit of a firestorm was sparked by Google abruptly changing its privacy policies: folding some 60 of its 70 existing product privacy policies into one blanket policy and breaking down the identity barriers between its services (to accommodate its new Google+ social network), while making it nearly impossible to opt out of the massive amount of data sharing that will take place as a result.

Let us also remember what we already know (and what these companies have been sued over): Google, Yahoo, Microsoft, and other Internet companies track and profile users, then auction off ads targeted at individual consumers in the fraction of a second before a Web page loads.

We should also consider that sordid privacy history of Google's I mentioned at the start: from Google Books, to the loss of "locational privacy," to the company's lobbying efforts in Congress, to its cloud computing, to its increasing use and expansion of behavioral marketing techniques, to Google Street View cars gathering private information from unaware local residents, to the company teaming with the National Security Agency (the agency responsible for such privacy-violation greatest hits as warrantless wiretapping) "for technical assistance," to the infamous Google Buzz, to the company's recent admission that it gets THOUSANDS of requests from the government for information about its users, to claims that the company manipulates its search results to favor its own products.

Before I get to some obvious solutions and responses to this latest Google controversy (like data-retention limits and Do Not Track options), here's the story:

The Stanford study was written by Jonathan Mayer, a graduate student in law and computer science who has cranked out a growing body of headline-generating literature on online privacy. In his paper, he noted that unlike every other major browser, Apple's Safari automatically blocks tracking cookies set by third-party sites that users never directly visit. Safari is one of the most popular browsers for mobile devices, and the default browser on Macs.


These cookies can collect information about where users go online and what they do - data that advertisers treasure. There are exceptions to Safari's cookie blocking, however. For instance, it allows what are known as "first-party cookies," those that sites like Facebook or Google drop onto devices so users don't have to sign in every time they visit. Certain carve-outs also allow Facebook users to "like" things on third-party sites.

The problem for Google, unlike Facebook, was that its social and ad networks run on different domains from its main one, Google.com. That prevents it from allowing a user of the Google+ social network to give a virtual thumbs up (or "+1") to an ad on another site, a step that makes such ads more valuable. That would require a "third-party cookie," which is blocked by Safari.
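
To make the first-party/third-party distinction concrete, here is a minimal sketch of a hypothetical ad server handing out a tracking cookie (TypeScript on Node; the port, cookie name, and values are my own placeholders). Loaded directly, the cookie is first-party and browsers accept it; loaded from an ad tag or iframe embedded on someone else's page, it is third-party, and Safari's default settings refuse it.

```typescript
// Minimal sketch of a hypothetical ad server handing out a tracking cookie.
// All names, ports, and values are illustrative placeholders.
import * as http from "http";

const server = http.createServer((req, res) => {
  // In a first-party context (the user navigated here directly), browsers
  // accept this cookie. In a third-party context (this response is an ad
  // pixel embedded on another site), Safari's default policy rejects it.
  res.setHeader("Set-Cookie", "track_id=abc123; Path=/; Max-Age=31536000");
  res.setHeader("Content-Type", "image/gif");
  res.end(); // an invisible 1x1 "tracking pixel" body would be returned here
});

server.listen(8080, () => {
  console.log("hypothetical ad server listening on port 8080");
});
```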


But it turns out there are a few ways for companies to get around these limitations. The one that Mayer's paper focused on involved inserting code to place tracking cookies within Safari. He found four companies doing this: Google, Vibrant Media, Media Innovation Group and PointRoll.
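
As I understand the reporting, the workaround exploited an exception in Safari's policy: if a user "interacts" with third-party content, for example by submitting a form to it, Safari then accepts cookies from that domain. The sketch below is a hypothetical, simplified illustration of that kind of invisible-form trick (TypeScript against the browser DOM); the URL and names are my own placeholders, not anyone's actual code.

```typescript
// Hypothetical sketch of the invisible-form trick reported at the time.
// It would run inside an ad iframe; every name and URL here is a placeholder.
function submitInvisibleForm(): void {
  // A hidden iframe receives the form submission so the visible page never changes.
  const sink = document.createElement("iframe");
  sink.name = "cookie_sink";
  sink.style.display = "none";
  document.body.appendChild(sink);

  // A form that posts to the ad network's own domain.
  const form = document.createElement("form");
  form.action = "https://ads.example-network.invalid/register"; // placeholder URL
  form.method = "POST";
  form.target = "cookie_sink";
  document.body.appendChild(form);

  // Submitting the form programmatically made Safari treat the user as having
  // "interacted" with that third-party domain, so its cookies were then accepted.
  form.submit();
}

submitInvisibleForm();
```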


But in an interview, Mayer said Google had unilaterally decided that privacy permissions for its products superseded the privacy restrictions those users had enacted - implicitly or explicitly - by choosing to use Safari. "The user is giving up some privacy in exchange for lining Google's pockets," he said.


Meanwhile, an even bigger problem occurred. Once Google tweaked the way Safari functions, other Google advertising cookies could be installed on the devices. "We didn't anticipate that this would happen, and we have now started removing these advertising cookies from Safari browsers," Google communications executive Rachel Whetstone said. Why this problem was spotted by a Stanford graduate student and not by a major corporation that's been under continual privacy scrutiny is a fair question that Google has yet to answer.

"I think that's a pretty big 'oops,' and it raises pretty big questions," Mayer said. Chris Hoofnagle, a digital privacy expert at UC Berkeley's law school, said there's a corporate tone-deafness within the engineering-centric culture of Google that leads to these sorts of mistakes. "To the engineer, cookie blocking appears to be a technical error that they should try to solve," he said. "It's very difficult for them to accept the frame that some people do not want this tracking."


Let me say, I don't know whether Google is telling the truth or not - my instincts, given its track record on this issue, say it isn't. Regardless, this latest privacy breach, and violation of consumer desires and expectations, proves yet again that our regulations and rights have not caught up with technological advancements in the digital realm.

As I have often written here, an issue like this once again raises some particularly important questions: What kind of control should we have over our own data? What kind of tools should be available for us to protect it? What about ownership of our data? Should we be compensated for the billions of dollars being made by corporations from their tracking of us? And of course, what of the government's access to this new world of data storage?

The argument from privacy advocates has largely been that this massive and stealthy data-collection apparatus threatens user privacy, and that regulators should compel companies (not merely hope they will) to obtain express consent from consumers before serving up "behavioral" ads based on their online history.

More to the point is the simple, unavoidable fact that consumers should have MORE control, not less, over how our information is used, shared, and profited from.

Again, for first-time or occasional readers of this blog, I would also point to the consistent dichotomy between the now-proven HIGH LEVEL of concern among users about privacy on the Internet and the fact that they tend to do very little to actually protect it (which of course is related to how hard and complicated it can be to do so). Which, in my mind, makes easy-to-use, clear options to protect privacy so paramount. Once people are given such a choice, not only will more of them choose "not to be tracked," I think more will become AWARE of just how pervasive such monitoring of nearly everything we do has become.

So, let's get to some OBVIOUS solutions to this growing online tracking problem, magnified by Google's latest violation of consumer trust. That we CLEARLY have next to no privacy standards governing these technological innovations and trends is disturbing, and more than enough reason for legislation like California's SB 761 (Do Not Track).

The Do Not Track flag is a rather simple concept that's already been built into Firefox and IE9. If users choose to turn on the option, every time they visit a web page the browser will send a message to the site, saying “do not track.”

SB 761 (Lowenthal) would offer consumers such a mechanism, something the bill's sponsor describes as "one of the most powerful tools available to protect consumers' privacy." The mechanism would allow anyone online to send Websites the message that they do not want their online activity monitored.
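
For the curious, what "sending Websites the message" actually looks like under the hood is trivially simple: a browser with the option turned on adds a "DNT: 1" header to every request, and it is then up to the site (or, with a bill like SB 761, the law) to honor it. Here is a minimal sketch of a server checking for that header - the responses are hypothetical placeholders:

```typescript
// Minimal sketch of a server honoring the Do Not Track request header.
// The responses and cookie values here are hypothetical placeholders.
import * as http from "http";

const server = http.createServer((req, res) => {
  // Browsers with the option enabled send "DNT: 1" on every request;
  // Node lowercases header names.
  const doNotTrack = req.headers["dnt"] === "1";

  if (doNotTrack) {
    // Honor the preference: serve the page without setting tracking cookies
    // or logging a behavioral profile.
    res.end("Page served without tracking.");
  } else {
    res.setHeader("Set-Cookie", "track_id=abc123; Path=/"); // illustrative only
    res.end("Page served with tracking cookie.");
  }
});

server.listen(8080);
```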

Obviously, this legislation happens to be something I'll be working on here in Sacramento this year - but a federal version would be most useful.

To be sure, there is no magic bullet when it comes to protecting privacy from digital tracking. Another solution, advocated by such privacy experts as Chris Hoofnagle, is data-retention limits. As he recently stated in an interview in the San Francisco Chronicle, "We know from behavioral economics that most people won't turn on do-not-track features, so if you're serious about protecting privacy, if you think there's a value here, you should protect it by default. It would require no user intervention. You would impair the ability of companies and law enforcement to create long-term profiles about people."

Similarly, as Hoofnagle points out, there are limitations to the "opt out" option alone, stating, "Under self-regulatory programs, they allowed people to opt out of targeted advertising if they wanted. But people figured out that what that meant is these companies could still track you, they just couldn't show you online behavioral advertising. They could still choose to target you in another channel (like direct-mail marketing or telemarketing). And if you look at all the tracking they do, they can identify you in a fairly trivial way. Our study also found over 600 third-party hosts of cookies, most of which are not members of any self-regulatory organization (and thus aren't bound to the rules of opt-out programs). They're not even necessarily advertisers, they could be governments. We really don't know who they are."

The need for such consumer-friendly and empowering solutions to this exploding data-mining and tracking industry is clear, because we KNOW marketers will stop at NOTHING to ensure they can monitor online behavior...so we can be better profiled by the government and marketed to by advertisers.

As a recent Berkeley study found, "Seven of the top 100 sites appeared to be using what's known as HTML5 local storage to back up standard cookies, and two were found to be respawning cookies....Third-party advertisers on the site were still employing the flash cookies, along with another type that takes advantage of the browser's cache, where online data is stored on the computer so it can be delivered faster. This ETag tracking allows advertisers to monitor users, even when they block all cookies and use a private browsing mode."
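
ETag tracking deserves a quick illustration, because it shows how monitoring can survive even when every cookie is blocked: the server attaches a unique ETag caching header to a resource (say, an invisible image), and the browser dutifully echoes it back in an If-None-Match header on every revisit - effectively a cookie that cookie-blocking settings never touch. A rough sketch, with hypothetical identifiers:

```typescript
// Rough sketch of ETag-based tracking: the ETag caching header is abused
// as a per-browser identifier. All identifiers here are hypothetical.
import * as http from "http";
import { randomUUID } from "crypto";

const server = http.createServer((req, res) => {
  // On a revisit, the browser echoes the ETag it cached earlier.
  const returningId = req.headers["if-none-match"];

  if (returningId) {
    console.log(`recognized returning browser: ${returningId}`);
    res.statusCode = 304; // "Not Modified" keeps the identifier cached
    res.end();
    return;
  }

  // First visit: issue a unique ETag that the browser stores with the
  // cached resource and sends back on every future request for it.
  const newId = randomUUID();
  console.log(`tagged new browser: ${newId}`);
  res.setHeader("ETag", `"${newId}"`);
  res.setHeader("Content-Type", "image/gif");
  res.end(); // an invisible tracking image would be returned here
});

server.listen(8080);
```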

In other words, it's time consumers regained control of our privacy and our personal information - through law, not through hope and polite requests to industries that don't care about you or your privacy, only their bottom line.
