Thursday, March 17, 2011

Digital Privacy, Data Mining, and the Future

Granted, that's a cryptic title...but sometimes privacy in the digital age is a cryptic subject. What are the real threats? How do we quantify them? Or is it all just paranoia - and we should give ourselves up to the Matrix that is Facebook, Google, and all the rest?

None of these questions are easy to answer. But, I do have some thoughts, and I do want to share with you two really comprehensive recent articles, one from MSNBC, and the other in TIME magazine, tackling the larger subject of digital privacy and data mining.

I want to largely provide key pieces of those articles, rather than get overly wordy myself. But, let me open with a few thoughts (yes, I realize I have said these things before), and then we'll get to the articles.

First, just what is "behavioral marketing" - because that's what this is really all about. I've found the description by the Center for Digital Democracy particularly useful:

Perhaps the most powerful - but largely invisible - force shaping our digital media reality is the role of interactive advertising and marketing. Much of our online experience, from websites to search engines to social networks, is being shaped to better serve advertisers. Increasingly, individuals are being electronically "shadowed" online, our actions and behaviors observed, collected, and analyzed so that we can be "micro-targeted." Now a $24 billion a year industry [2008 estimates] in the U.S., with expected dramatic growth to $80 billion or more by 2011, the goal of interactive marketing is to use the awesome power of new media to deeply engage you in what is being sold: whether it's a car, a vacation, a politician or a belief. An explosion of digital technologies, such as behavioral targeting and retargeting, "immersive" rich media, and virtual reality, are being utilized to drive the market goals of the largest brand advertisers and many others.

A major infrastructure has emerged to expand and promote the interests of this sector, including online advertising networks, digital marketing specialists, and trade lobbying groups.
  • The role which online marketing and advertising plays in shaping our new media world, including at the global level, will help determine what kind of society we will create.
  • Will online advertising evolve so that everyone's privacy is truly protected?
  • Will there be only a few gatekeepers determining what editorial content should be supported in order to better serve the interests of advertising, or will we see a vibrant commercial and non-commercial marketplace for news, information, and other content necessary for a civil society?
  • Who will hold the online advertising industry accountable to the public, making its decisions transparent and part of the policy debate?
  • Will the more harmful aspects of interactive marketing - such as threats to public health - be effectively addressed?
To give you an idea of how important this whole issue is, just last year privacy advocates - including the Center for Digital Democracy, U.S. PIRG, and the World Privacy Forum - filed a complaint with federal regulators against the tracking and profiling practices used by Google, Yahoo, Microsoft and other Internet companies to auction off ads targeted at individual consumers in the fractions of a second before a Web page loads.

The complaint charged that a "massive and stealth data collection apparatus threatens user privacy," and asked regulators to compel companies to obtain express consent from consumers before serving up "behavioral" ads based on their online history.

For instance, internet companies would be asked to acknowledge that the data they collect about a person's online movements - through "cookies" embedded in a Web browser - allows advertisers to know details about that person, even if those cookies don't have a name attached.
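To make the cookie mechanics concrete, here's a toy sketch (my own illustration, not from the complaint - the class names and example sites are made up) of how a third-party ad network can use a single anonymous cookie ID to stitch visits to unrelated sites into one behavioral profile:

```python
import uuid

class Browser:
    """Toy browser: just a place to store cookies."""
    def __init__(self):
        self.cookies = {}

class AdNetwork:
    """Toy third-party ad network whose ad tag is embedded on many sites."""
    def __init__(self):
        self.profiles = {}  # cookie_id -> list of sites where this browser was seen

    def serve_ad(self, browser, site):
        # Set an ID cookie on first contact; read the same one back thereafter.
        cookie_id = browser.cookies.setdefault("adnet_id", str(uuid.uuid4()))
        self.profiles.setdefault(cookie_id, []).append(site)

network = AdNetwork()
me = Browser()
for site in ["news.example", "shoes.example", "travel.example"]:
    network.serve_ad(me, site)

# One anonymous cookie ID now links all three visits into a single
# profile -- no name ever attached, exactly as the complaint describes.
```

The point of the sketch is that the network never needs to know who you are; the random ID is enough to accumulate a detailed history.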

Privacy advocates have long argued that when people are given the means to protect their privacy and control their data, they will do so - BUT not if it's made difficult, confusing, or time consuming. And this is why new rules and laws are so desperately needed for cyberspace...we need "systems" that allow users to control their information in an easy, logical, and practical way.

More generally, on the issue of privacy on the internet, as I have written here before, the fact that we have next to no privacy standards governing these technological innovations and trends is disturbing, and more than enough reason for some of the bills now being offered.

This leads to a number of important questions, like: What kind of control should we have over our own data? And, what kind of tools should be available for us to protect it? What about ownership of our data? Should we be compensated for the billions of dollars being made by corporations from their tracking of us? And of course, what of the government's access to this new world of data storage?

The argument by some, such as Mark Zuckerberg, is that all information should be public, and as time goes on we'll only be sharing more of it. In addition, we all will benefit from this communal sharing of private information in ways yet to even be discovered. Already, from this sharing, we forge more online friendships and connections, old friends are reconnected, distant parents see pictures of their kids' day-to-day activities, jobs might be more easily found due to our profiles being more public, internet services improve as companies like Facebook and Google learn about peoples' Web browsing histories, sites are able to tailor content to the user, and so on, and so forth.

What concerns me - and some of these concerns are mentioned in both articles I'm going to feature today - is this: what are the side effects of living in a society without privacy?

I don't think it's by accident that we are told by the same interests that profit off our information that privacy is dead and people don't care about it anymore. Well, that's easy to say when you are the ones designing the complicated, hard-to-find privacy settings consumers have to deal with.

On that note, let's get to some of the key sections of the article by Bob Sullivan of MSNBC entitled "Why should I care about digital privacy?":

Welcome to the world of privacy experts like Larry Ponemon and Alessandro Acquisti. Their chosen field of work is an area where research can be pretty depressing. Consumer behavior shows, repeatedly, that people just don't care about privacy, no matter how much lip service they might give to the topic. Ponemon's research shows that most U.S. adults — 60 percent — claim they care about privacy but will barely lift a finger in an effort to preserve it. They don't alter Facebook privacy settings, they don't complain when supermarkets demand their phone numbers and they certainly don't insist on encrypted e-mail. LosHuertos' experiment underscores this point well. Even people who have experienced a "privacy mugging" often don't change their behavior.

...

The usual way to grab attention to the topic is to trot out privacy nightmares, such as the secret dossiers that hundreds of companies keep on you (they do), the man who was accused of arson because his grocery store records showed he purchased fire starters (he was), or the idea that a potential employer may one day pass on you because your musical tastes suggest you will be late to work three times per week (they could). But privacy nightmares are beginning to feel a bit like the boy who cried wolf. Cyber experts have warned about both a Digital Pearl Harbor and an information Three Mile Island for more than a decade now; doesn't the absence of that kind of disaster show that perhaps privacy is no big deal?

...

For many, he thinks, there is a sense of learned helplessness — the feeling that their privacy is lost anyway, so why go through the hassle of faking a supermarket loyalty card application? For others, the decision tree is so complex that it's no surprise they usually take the easier option.


"There are so many mental steps we have to go through," he said. "Do I even know there is a potential privacy risk? If I do, do I know I there are alternative strategies, such as adjusting privacy settings? Do I know, or at least feel, that these will be effective, or are they a waste of time? And then, if they are effective, are they too costly in terms of time or effect? After all that, I may very well decide not to take those steps."
...

For starters, people almost always engage in "hyperbolic discounting" when faced with a privacy choice — they overvalue present benefits and undervalue future costs. You probably do that every day when you convince yourself that an extra cookie or scoop of ice cream is worth the bargain with your waistline. In the realm of privacy, judging such bargains can be impossible. What's the future cost of sharing your phone number with a grocery store? It could be nothing. It could be annoying phone calls or junk mail. It could be intense profiling by a marketer. It could ultimately be an increase to your health premium, as a medical insurance company one day decides you buy too much ice cream every month.
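The hyperbolic-discounting idea above can be sketched in a few lines (my own toy illustration with made-up parameters, not from the article). The key property is that a hyperbolic discounter deflates anything delayed much more steeply than a constant-rate "rational" discounter would, so the small benefit now can outweigh the larger cost later:

```python
def exponential_value(amount, delay_years, rate=0.05):
    # "Rational" discounting: a constant rate compounded per year.
    return amount / ((1 + rate) ** delay_years)

def hyperbolic_value(amount, delay_years, k=2.0):
    # Hyperbolic discounting (Mazur's form): perceived value collapses
    # steeply for any delay at all, so immediate rewards dominate.
    return amount / (1 + k * delay_years)

# A $5 grocery discount today vs. a hypothetical $50 privacy cost
# (say, a higher premium) five years out:
benefit_now = hyperbolic_value(5, 0)    # felt at full strength: 5.0
cost_later = hyperbolic_value(50, 5)    # heavily deflated: ~4.5

# The hyperbolic discounter takes the deal (5.0 > ~4.5), even though a
# constant-rate discounter would still value the future cost at ~39.2
# and refuse it.
rational_cost = exponential_value(50, 5)
```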


Despite recent rhetoric to the contrary, long ago America decided that there are realms where it's not OK to let consumers make decisions that are guaranteed to cause self-harm. We don't let people eat in restaurants that fail health inspections; we don't let people buy buildings near fault lines that aren't earthquake proof; we don't let them buy cars without seat belts — even if all these options were cheaper, or somehow more enjoyable. Why? It's impossible for consumers to really understand the consequences of such actions at the time of the choice. We wouldn't expect every San Francisco home buyer to become an expert seismologist, or every eater to become a biologist. Even if you care nothing for personal safety, it would be a terribly inefficient way to run an economy.

Acquisti thinks it's time that society erected some strict safety rules around privacy issues and ended the charade of 27-page end user license agreements that no one — not even Acquisti — reads. The right answer for the majority of Americans who care about privacy but don't know what to do about it is for leaders to make some tough choices.

There are some efforts under way in that direction. There are no fewer than seven pieces of privacy-related legislation that have either been introduced in the U.S. House of Representatives, or soon will be. The most significant is Do Not Track legislation, which would authorize the Federal Trade Commission to create a regime forcing companies to allow users to opt out of various data collection efforts. It would also give consumers a "right of access" to personal information stored by any company — a right Europeans have enjoyed for years. While the law is meant to evoke the very popular Do Not Call list, critics worry that few consumers would take the time required to opt out.

The Financial Information Privacy Act of 2011 would prevent banks from sharing customer information with third parties unless consumers opt in, a significant step further along in privacy protection. Banks would then have to sell people on the idea of information sharing.

Timid as they are, virtually all these bills have run up against ferocious industry lobbying. Facebook, among many other firms, has told the FTC it's worried that the Do Not Track initiative would stifle innovation.


...

Ponemon doesn't see Facebook as a panopticon — yet. But it doesn't have to go that far to put a serious dent in the American dream, he worries. People no longer expect to keep secrets, Ponemon said, which means that every stupid thing you do in high school will follow you around for the rest of your life. He is scared about the implications of that.

"The end of privacy is the end of second chances," Ponemon said. "Some people may think I'm just being a cranky old guy ... but the thing about what made this country great is our ancestors came with nothing. They didn't have a reputation, positive or negative. They could, like my dad, go to Arizona and become a dentist, something he couldn't do in his home country. The ability to reinvent ourselves has made great fortunes. The ability to do that today is significantly diminished because of all the information that is attached to us. Could we have another Thomas Edison now, who dropped out of elementary school in his first year (at age 7)? Maybe not."

Acquisti isn't just worried about the American way of life; he's worried about humanity itself.

"What I fear is the normalization of privacy invasions in a world where we become so adjusted to being public in everything that it is normal," he said. "I fear that world will be a world where we will be less human. Part of being human is having a private sphere and things you only share with special people, or with no one. I fear for the future of that world."

Acquisti, despite his exhaustive research on the subject, said he has no desire to persuade others to change their privacy-related behaviors. People make rational choices every day to share themselves with others, and to great benefit — they form relationships, find work and in extreme cases use social networking tools to fight for freedom, he said. People who want to share everything with everyone have the freedom to do so.

But it's freedom he's most interested in preserving — the freedom of some people to keep their lives private in a world where the costs of doing so keep rising.

"It will become increasingly costly not to be on a social network, just as not having a mobile phone now is," he said. "It will dramatically cut people off from professional and personal life opportunities. The more people who join the social networks, the more costly it becomes for others to be loyal to their views."

In economics, it's called an "externality" — the costs of your choices go up because of factors that have nothing to do with you. On the Internet, it's called the network effect. In reality, it means that someone who has no interest in being on Facebook is now the last to know about last-minute parties, new romances, even weddings and funerals. (We've all heard at least once: "Didn't you see my Facebook post?")
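The externality described above can be sketched as a toy model (my own illustration with made-up numbers, not from the article): the holdout's cost is driven entirely by how many *other* people have joined, even though the holdout's own behavior never changes.

```python
def opt_out_cost(network_share, events_per_year=100):
    # Assume invitations, news, and job leads migrate onto the network
    # in proportion to the share of your contacts who are on it; each
    # migrated event is one the holdout simply never hears about.
    return events_per_year * network_share

for share in (0.1, 0.5, 0.9):
    print(f"{share:.0%} of contacts on the network -> "
          f"{opt_out_cost(share):.0f} missed events/year")
```

Running it shows the punishment for being disconnected climbing from a nuisance to near-total exclusion as adoption grows - which is the network effect Acquisti is pointing at.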

As the network effect deepens, and the majority speeds down its road toward a completely open second life in the virtual world, society must work to preserve the right of the minority to stay private in the first life — not unlike efforts we make today to preserve the rights of other minority groups, such as the handicapped, Acquisti said.

"Freedom means making sure people have the option to stay off the grid; the more people surrender, the deeper the network effect, the more the punishment for being disconnected," Acquisti says.


Click here for more...and be sure to read the parts about things you can do to protect your privacy!

Now let's get to the TIME magazine piece by Joel Stein entitled "Data Mining: How Companies Now Know Everything About You," which goes into more detail about HOW your data is mined and by whom:


The Creep Factor
There is now an enormous multibillion-dollar industry based on the collection and sale of this personal and behavioral data, an industry that Senator John Kerry, chair of the Subcommittee on Communications, Technology and the Internet, is hoping to rein in. Kerry is about to introduce a bill that would require companies to make sure all the stuff they know about you is secured from hackers and to let you inspect everything they have on you, correct any mistakes and opt out of being tracked. He is doing this because, he argues, "There's no code of conduct. There's no standard. There's nothing that safeguards privacy and establishes rules of the road." 

At Senate hearings on privacy beginning March 16, the Federal Trade Commission (FTC) will be weighing in on how to protect consumers. It has already issued a report that calls upon the major browsers to come up with a do-not-track mechanism that allows people to choose not to have their information collected by companies they aren't directly doing business with. Under any such plan, it would likely still be O.K. for Amazon to remember your past orders and make purchase suggestions or for American Express to figure your card was stolen because a recent purchase doesn't fit your precise buying patterns. But it wouldn't be cool if they gave another company that information without your permission.

Taking your information without asking and then profiting from it isn't new: it's the idea behind the phone book, junk mail and telemarketing. Worrying about it is just as old: in 1890, Louis Brandeis argued that printing a photograph without the subject's permission inflicts "mental pain and distress, far greater than could be inflicted by mere bodily harm." Once again, new technology is making us weigh what we're sacrificing in privacy against what we're gaining in instant access to information. Some facts about you were always public — the price of your home, some divorce papers, your criminal records, your political donations — but they were held in different buildings, accessible only by those who filled out annoying forms; now they can be clicked on. Other information was not possible to compile pre-Internet because it would have required sending a person to follow each of us around the mall, listen to our conversations and watch what we read in the newspaper. Now all of those activities happen online — and can be tracked instantaneously. 

Part of the problem people have with data mining is that it seems so creepy. Right after I e-mailed a friend in Texas that I might be coming to town, a suggestion for a restaurant in Houston popped up as a one-line all-text ad above my Gmail inbox. But it's not a barbecue-pit master stalking me, which would indeed be creepy; it's an algorithm designed to give me more useful, specific ads. And while that doesn't sound like all that good a deal in exchange for my private data, if it means that I get to learn when the next Paul Thomas Anderson movie is coming out, when Wilco is playing near my house and when Tom Colicchio is opening a restaurant close by, maybe that's not such a bad return. 

I deeply believe that, but it's still too easy to find our gardens. Your political donations, home value and address have always been public, but you used to have to actually go to all these different places — courthouses, libraries, property-tax assessors' offices — and request documents. "You were private by default and public by effort. Nowadays, you're public by default and private by effort," says Lee Tien, a senior staff attorney for the Electronic Frontier Foundation, an advocacy group for digital rights. "There are all sorts of inferences that can be made about you from the websites you visit, what you buy, who you talk to. What if your employer had access to information about you that shows you have a particular kind of health condition or a woman is pregnant or thinking about it?" Tien worries that political dissidents in other countries, battered women and other groups that need anonymity are vulnerable to data mining. At the very least, he argues, we have a responsibility to protect special groups, just as Google Street View allows users to request that a particular location, like an abused-women's shelter, not be photographed.

Other democratic countries have taken much stronger stands than the U.S. has on regulating data mining. Google Street View has been banned by the Czech Republic. Germany — after protests and much debate — decided at the end of last year to allow it but to let people request that their houses not be shown, which nearly 250,000 people had done as of last November. E.U. Justice Commissioner Viviane Reding is about to present a proposal to allow people to correct and erase information about themselves on the Web. "Everyone should have the right to be forgotten," she says. "Due to their painful history in the 20th century, Europeans are naturally more sensitive to the collection and use of their data by public authorities." 

After 9/11, not many Americans protested when concerns about security seemed to trump privacy. Now that privacy issues are being pushed in Congress, companies are making last-ditch efforts to become more transparent. New tools released in February for Firefox and Google Chrome browsers let users block data collecting, though Firefox and Chrome depend on the data miners to respect the users' request, which won't stop unscrupulous companies. In addition to the new browser options, an increasing number of ads have a little i (an Advertising Option Icon), which you can click on to find out exactly which companies are tracking you and what they do. The technology behind the icon is managed by Evidon, the company that provides the Ghostery download. Evidon has gotten more than 500 data-collecting companies to provide their info.
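The voluntary nature of those browser tools is worth spelling out (a toy illustration of my own, not any company's actual code): the browser merely sends a "DNT: 1" request header, and each tracker decides for itself whether to honor it - which is exactly why unscrupulous companies can ignore the request.

```python
def should_track(request_headers, honors_dnt=True):
    """Toy server-side decision: DNT is advisory, not enforced."""
    dnt = request_headers.get("DNT")  # browsers send "DNT: 1" to opt out
    if honors_dnt and dnt == "1":
        return False   # a well-behaved tracker backs off
    return True        # everyone else just keeps collecting

# A compliant tracker respects the header...
assert should_track({"DNT": "1"}, honors_dnt=True) is False
# ...but nothing in the protocol stops a non-compliant one.
assert should_track({"DNT": "1"}, honors_dnt=False) is True
# And with no header at all, tracking is the default.
assert should_track({}) is True
```

The asymmetry in the last two assertions is the whole policy debate in miniature: the opt-out signal exists, but honoring it is left to the data miners.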

They're not even moving that much faster with the generation that grew up with the Internet. While young people expect more of their data to be mined and used, that doesn't mean they don't care about privacy. "In my research, I found that teenagers live with this underlying anxiety of not knowing the rules of who can look at their information on the Internet. They think schools look at it, they think the government looks at it, they think colleges can look at it, they think employers can look at it, they think Facebook can see everything," says Sherry Turkle, a professor at MIT who is the director of the Initiative on Technology and Self and the author of Alone Together: Why We Expect More from Technology and Less From Each Other. "It's the opposite of the mental state I grew up in. My grandmother took me down to the mailbox in Brooklyn every morning, and she would say, 'It's a federal offense for anyone to look at your mail. That's what makes this country great.' In the old country they'd open your mail, and that's how they knew about you."
 
Data mining, Turkle argues, is a panopticon: the circular prison invented by 18th century philosopher Jeremy Bentham where you can't tell if you're being observed, so you assume that you always are. "The practical concern is loss of control and loss of identity," says Marc Rotenberg, executive director of the Electronic Privacy Information Center. "It's a little abstract, but that's part of what's taking place."

The Facebook and Google Troves
Our identities, however, were never completely within our control: our friends keep letters we've forgotten writing, our enemies tell stories about us we remember differently, our yearbook photos are in way too many people's houses. Opting out of all those interactions is opting out of society. Which is why Facebook is such a confusing privacy hub point. Many data-mining companies made this argument to me: How can I complain about having my Houston trip data-mined when I'm posting photos of myself with a giant mullet and a gold chain on Facebook and writing columns about how I want a second kid and my wife doesn't? Because, unlike when my data is secretly mined, I get to control what I share. Even narcissists want privacy. "It's the difference between sharing and tracking," says Bret Taylor, Facebook's chief technology officer. 

Since targeted ads are so much more effective than nontargeted ones, websites can charge much more for them. This is why — compared with the old banners and pop-ups — online ads have become smaller and less invasive, and why websites have been able to provide better content and still be free. Besides, the fact that I'm going to Houston is bundled with the information that 999 other people are Houston-bound and is auctioned by a computer; no actual person looks at my name or my Houston-boundness. Advertisers are interested only in tiny chunks of information about my behavior, not my whole profile, which is one of the reasons M. Ryan Calo, a Stanford Law School professor who is director of the school's Consumer Privacy Project, argues that data mining does no actual damage.

"We have this feeling of being dogged that's uncomfortable," Calo says, "but the risk of privacy harm isn't necessarily harmful. Let's get serious and talk about what harm really is." The real problem with data mining, Calo and others believe, arises when the data is wrong. "It's one thing to see bad ads because of bad information about you. It's another thing if you're not getting a credit card or a job because of bad information," says Justin Brookman, the former chief of the Internet bureau of the New York attorney general's office, who is now the director of the Center for Democracy and Technology, a nonprofit group in Washington. (Comment on this story.)

...

In 1989 I augmented some technology at a major financial services company that would track offers made to prospects to become customers, and I remained involved in this industry for 16 more years. I can tell you that most activity of this kind is innocuous, and for the most part designed to send targeted advertising offers that will make people happy. However, there are definitely dark sides. Identity theft: Social Security numbers are not supposed to be released except for bona fide activities such as evaluating credit risk. I have to think it's a violation of law if a financial services company has released your credit card number to a marketing company. Track down the source of your Social Security number residing at a marketing company and I believe you will find a violator. In addition, marketing companies have no real business seeking your Social Security number, so that practice should be outlawed.

Politics: Companies like Acxiom supply consolidated personal data to political campaigns so that politicians can craft targeted messages to various demographic groups. Since there are no "truth in politics" laws, messages crafted as lies are another misuse of this data.

Consolidated personal data is also used by the FBI. This can be bad or good, depending on the FBI's intentions.

I think you get the gist...check out the whole piece here.

While much of this kind of data mining is innocuous and won't do any specific damage, I would still argue it's important to give people more control - or, better, to force companies to get our permission (i.e., opt-in) before our information is bought and sold. I'd also point out that, almost by definition, the more information about us that is stored, the easier it becomes for that information to be stolen or accessed by people we'd never choose. And finally, because this has been a mammoth post as it is, I worry, again, about the very meaning of privacy, and about the ramifications of it dissolving completely.

As Bruce Schneier noted, “…lack of privacy shifts power from people to businesses or governments that control their information. If you give an individual privacy, he gets more power…laws protecting digital data that is routinely gathered about people are needed. The only lever that works is the legal lever...Privacy is a basic human need…The real choice then is liberty versus control.”
