Digital Privacy, Data Mining, and the Future
Granted, that's a cryptic title...but sometimes privacy in the digital age is a cryptic subject. What are the real threats? How do we quantify them? Or is it all just paranoia, and should we simply give ourselves up to the Matrix that is Facebook, Google, and all the rest?
None of these questions are easy to answer. But, I do have some thoughts, and I do want to share with you two really comprehensive recent articles, one from MSNBC, and the other in TIME magazine, tackling the larger subject of digital privacy and data mining.
I want to largely provide key pieces of those articles, rather than get overly wordy myself. But, let me open with a few thoughts (yes, I realize I have said these things before), and then we'll get to the articles.
A major infrastructure has emerged to expand and promote the interests of the online marketing and advertising sector, including online advertising networks, digital marketing specialists, and trade lobbying groups.
- The role which online marketing and advertising plays in shaping our new media world, including at the global level, will help determine what kind of society we will create.
- Will online advertising evolve so that everyone's privacy is truly protected?
- Will there be only a few gatekeepers determining what editorial content should be supported in order to better serve the interests of advertising, or will we see a vibrant commercial and non-commercial marketplace for news, information, and other content necessary for a civil society?
- Who will hold the online advertising industry accountable to the public, making its decisions transparent and part of the policy debate?
- Will the more harmful aspects of interactive marketing - such as threats to public health - be effectively addressed?
The charge was that a "massive and stealth data collection apparatus threatens user privacy," and regulators were asked to compel companies to obtain express consent from consumers before serving up "behavioral" ads based on their online history.
For instance, internet companies would be asked to acknowledge that the data they collect about a person's online movements through software "cookies" embedded in a Web browser allows advertisers to know details about them, even if those cookies don't have a person's name attached.
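To make that mechanism concrete, here is a toy Python sketch of how a third-party cookie can stitch a browsing history together without ever containing a name. It is entirely hypothetical; the ToyAdNetwork class, the cookie name, and the example URLs are invented for illustration and are not any real ad network's code.

```python
# Toy simulation of third-party cookie tracking (illustrative only; the
# class name, cookie name, and URLs below are invented, not a real system).
import uuid
from collections import defaultdict


class ToyAdNetwork:
    """Pretend ad server whose tracking pixel is embedded on many sites."""

    def __init__(self):
        # cookie id -> list of pages where our pixel saw that browser
        self.profiles = defaultdict(list)

    def serve_pixel(self, cookie_jar, page_url):
        """Called each time a browser loads a page containing our pixel."""
        visitor_id = cookie_jar.get("track_id")
        if visitor_id is None:
            # First sighting: hand the browser an anonymous but unique ID
            # (this stands in for a Set-Cookie response header).
            visitor_id = uuid.uuid4().hex
            cookie_jar["track_id"] = visitor_id
        # The profile grows with every site that embeds the pixel.
        self.profiles[visitor_id].append(page_url)
        return visitor_id


# One browser's cookie jar, shared across every site it visits.
browser_cookies = {}
network = ToyAdNetwork()
for url in ("news-site.example/politics",
            "shoe-store.example/running-shoes",
            "health-site.example/diabetes-symptoms"):
    network.serve_pixel(browser_cookies, url)

# No name anywhere, yet all of those interests are linked to one ID.
print(network.profiles[browser_cookies["track_id"]])
```

The point of the sketch is simply that the identifier, not the name, is what makes the profile possible.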
Privacy advocates have long argued that, when enabled to protect their privacy and control their data, people will do so. BUT not if it's made difficult, confusing, or time-consuming. And this is why new rules and laws are so desperately needed for cyberspace...we need "systems" that allow users to control their information in an easy, logical, and practical way.
More generally, on the issue of privacy on the internet, as I have written here before, the fact that we have next to no privacy standards covering these technological innovations and trends is disturbing, and more than enough reason for some of the bills being offered here.
This leads to a number of important questions, like: What kind of control should we have over our own data? And, what kind of tools should be available for us to protect it? What about ownership of our data? Should we be compensated for the billions of dollars being made by corporations from their tracking of us? And of course, what of the government's access to this new world of data storage?
The argument by some, such as Mark Zuckerberg, is that all information should be public, and that as time goes on we'll only be sharing more of it. In addition, we will all benefit from this communal sharing of private information in ways yet to be discovered. Already, from this sharing, we forge more online friendships and connections, old friends are reconnected, distant parents see pictures of their kids' day-to-day activities, jobs may be easier to find because our profiles are more public, internet services improve as companies like Facebook and Google learn about people's Web browsing histories, sites are able to tailor content to the user, and so on, and so forth.
What concerns me, and some of these concerns are raised in both articles I'm featuring today, is this: what are the side effects of living in a society without privacy?
...
The usual way to grab attention for the topic is to trot out privacy nightmares, such as the secret dossiers that hundreds of companies keep on you (they do), the man who was accused of arson because his grocery store records showed he purchased fire starters (he was), or the idea that a potential employer may one day pass on you because your musical tastes suggest you will be late to work three times per week (they could). But privacy nightmares are beginning to feel a bit like the boy who cried wolf. Cyber experts have warned about both a Digital Pearl Harbor and an information Three Mile Island for more than a decade now; doesn't the absence of that kind of disaster show that perhaps privacy is no big deal?
...
For many, he thinks, there is a sense of learned helplessness — the feeling that their privacy is lost anyway, so why go through the hassle of faking a supermarket loyalty card application? For others, the decision tree is so complex that it's no surprise they usually take the easier option.
"There are so many mental steps we have to go through," he said. "Do I even know there is a potential privacy risk? If I do, do I know I there are alternative strategies, such as adjusting privacy settings? Do I know, or at least feel, that these will be effective, or are they a waste of time? And then, if they are effective, are they too costly in terms of time or effect? After all that, I may very well decide not to take those steps."
...
For starters, people almost always engage in "hyperbolic discounting" when faced with a privacy choice — they overvalue present benefits and undervalue future costs. You probably do that every day when you convince yourself that an extra cookie or scoop of ice cream is worth the bargain with your waistline. In the realm of privacy, judging such bargains can be impossible. What's the future cost of sharing your phone number with a grocery store? It could be nothing. It could be annoying phone calls or junk mail. It could be intense profiling by a marketer. It could ultimately be an increase to your health premium, as a medical insurance company one day decides you buy too much ice cream every month.
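For anyone who wants the mechanics behind that term, hyperbolic discounting is usually written as V = A / (1 + kD): a cost (or benefit) of size A that is D periods away only feels like V today. The dollar amounts and the discount rate in the short sketch below are made-up illustrations, not figures from the article, but they show how a distant privacy cost can shrink until it feels smaller than a trivial discount offered right now.

```python
# Minimal sketch of hyperbolic discounting: perceived value V = A / (1 + k*D).
# The dollar amounts and the discount rate k are illustrative assumptions.
def perceived_value(amount, delay_days, k=0.05):
    """How big a cost or benefit `delay_days` away feels today."""
    return amount / (1 + k * delay_days)


benefit_today = 2.00   # an immediate loyalty-card discount, say
future_cost = 50.00    # a hypothetical eventual cost of the shared data

for delay in (30, 365, 3650):  # a month, a year, a decade away
    felt = perceived_value(future_cost, delay)
    print(f"${future_cost:.0f} cost {delay:>4} days out feels like ${felt:5.2f} "
          f"-> take the $2 now? {benefit_today > felt}")
```

With these toy numbers, the $50 cost still outweighs the $2 discount when it is a month or a year away, but a decade out it feels like pocket change, which is exactly the bias Acquisti describes.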
Despite recent rhetoric to the contrary, America decided long ago that there are realms where it's not OK to let consumers make decisions that are guaranteed to cause self-harm. We don't let people eat in restaurants that fail health inspections; we don't let people buy buildings near fault lines that aren't earthquake proof; we don't let them buy cars without seat belts — even if all these options were cheaper, or somehow more enjoyable. Why? It's impossible for consumers to really understand the consequences of such actions at the time of the choice. We wouldn't expect every San Francisco home buyer to become an expert seismologist, or every eater to become a biologist. Even if you care nothing for personal safety, it would be a terribly inefficient way to run an economy.
Acquisti thinks it's time that society erected some strict safety rules around privacy issues and ended the charade of 27-page end user license agreements that no one — not even Acquisti — reads. The right answer for the majority of Americans who care about privacy but don't know what to do about it is for leaders to make some tough choices.
There are some efforts under way in that direction. No fewer than seven pieces of privacy-related legislation have either been introduced in the U.S. House of Representatives or soon will be. The most significant involves the creation of Do Not Track legislation, which would authorize the Federal Trade Commission to create a regime forcing companies to allow users to opt out of various data collection efforts. It would also give consumers a "right of access" to personal information stored by any company — a right Europeans have enjoyed for years. While the law is meant to evoke the very popular Do Not Call list, critics worry that few consumers would take the time required to opt out.
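On the technical side, browsers already ship an opt-out signal of roughly this shape: the "DNT: 1" request header. What a compliant site would actually be required to do depends on the final legislation, but a minimal sketch of honoring such a signal server-side might look like this (the helper names and cookie value below are hypothetical):

```python
# Hypothetical sketch of honoring a browser opt-out signal such as "DNT: 1".
# What real compliance requires would depend on the final rules; the helper
# names and cookie value here are invented for illustration.
def tracking_allowed(request_headers):
    """Return False when the visitor has sent a do-not-track signal."""
    return request_headers.get("DNT") != "1"


def build_response_headers(request_headers):
    headers = {"Content-Type": "text/html"}
    if tracking_allowed(request_headers):
        # Only assign a tracking cookie when no opt-out signal is present.
        headers["Set-Cookie"] = "track_id=abc123; Max-Age=31536000; HttpOnly"
    return headers


print(build_response_headers({"DNT": "1"}))  # opt-out honored: no cookie
print(build_response_headers({}))            # no signal: tracking cookie set
```

The hard part, of course, isn't the few lines of code; it's getting companies to respect the signal at all, which is what the legislation is meant to force.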
The Financial Information Privacy Act of 2011 would prevent banks from sharing customer information with third parties unless consumers opt in, a significant step further along in privacy protection. Banks would then have to sell people on the idea of information sharing. (A detailed look at these proposals.)
Timid as they are, virtually all these bills have run up against ferocious industry lobbying. Facebook, among many other firms, has told the FTC it's worried that the Do Not Track initiative would stifle innovation.
...
Ponemon doesn't see Facebook as a panopticon — yet. But it doesn't have to go that far to put a serious dent in the American dream, he worries. People no longer expect to keep secrets, Ponemon said, which means that every stupid thing you do in high school will follow you around for the rest of your life. He is scared about the implications of that.
"The end of privacy is the end of second chances," Ponemon said. "Some people may think I'm just being a cranky old guy ... but the thing about what made this country great is our ancestors came with nothing. They didn't have a reputation, positive or negative. They could, like my dad, go to Arizona and become a dentist, something he couldn't do in his home country. The ability to reinvent ourselves has made great fortunes. The ability to do that today is significantly diminished because of all the information that is attached to us. Could we have another Thomas Edison now, who dropped out of elementary school in his first year (at age 7)? Maybe not."
Acquisti isn't just worried about the American way of life; he's worried about humanity itself.
"What I fear is the normalization of privacy invasions in a world where we become so adjusted to being public in everything that it is normal," he said. "I fear that world will be a world where we will be less human. Part of being human is having a private sphere and things you only share with special people, or with no one. I fear for the future of that world."
Acquisti, despite his exhaustive research on the subject, said he has no desire to persuade others to change their privacy-related behaviors. People make rational choices every day to share themselves with others, and to great benefit — they form relationships, find work and in extreme cases use social networking tools to fight for freedom, he said. People who want to share everything with everyone have the freedom to do so.
"It will become increasingly costly not to be on a social network, just as not having a mobile phone now is," he said. "It will dramatically cut people off from professional and personal life opportunities. The more people who join the social networks, the more costly it becomes for others to be loyal to their views."
In economics, it's called an "externality" — the costs of your choices go up because of factors that have nothing to do with you. On the Internet, it's called the network effect. In reality, it means that someone who has no interest in being on Facebook is now the last to know about last-minute parties, new romances, even weddings and funerals. (We've all heard at least once: "Didn't you see my Facebook post?")
As the network effect deepens, and the majority speeds down its road toward a completely open second life in the virtual world, society must work to preserve the minority's right to stay private in the first life — not unlike efforts we make today to preserve the rights of other minority groups, such as the handicapped, Acquisti said.
"Freedom means making sure people have the option to stay off the grid; the more people surrender, the deeper the network effect, the more the punishment for being disconnected," Acquisti says.
Click here for more...and be sure to read the parts about things you can do to protect your privacy!
Now let's get to the TIME magazine piece by Joel Stein, entitled "Data Mining: How Companies Now Know Everything About You," which goes into more detail about HOW your data is mined and by whom:
There is now an enormous multibillion-dollar industry based on the collection and sale of this personal and behavioral data, an industry that Senator John Kerry, chair of the Subcommittee on Communications, Technology and the Internet, is hoping to rein in. Kerry is about to introduce a bill that would require companies to make sure all the stuff they know about you is secured from hackers and to let you inspect everything they have on you, correct any mistakes and opt out of being tracked. He is doing this because, he argues, "There's no code of conduct. There's no standard. There's nothing that safeguards privacy and establishes rules of the road."
Our identities, however, were never completely within our control: our friends keep letters we've forgotten writing, our enemies tell stories about us we remember differently, our yearbook photos are in way too many people's houses. Opting out of all those interactions is opting out of society. Which is why Facebook is such a confusing privacy hub point. Many data-mining companies made this argument to me: How can I complain about having my Houston trip data-mined when I'm posting photos of myself with a giant mullet and a gold chain on Facebook and writing columns about how I want a second kid and my wife doesn't? Because, unlike when my data is secretly mined, I get to control what I share. Even narcissists want privacy. "It's the difference between sharing and tracking," says Bret Taylor, Facebook's chief technology officer.
Politics. Companies like Acxiom supply consolidated personal data to political campaigns so that politicians can craft targeted messages for various demographic groups. Since there are no "truth in politics" laws, targeted messages built on deliberate lies are another misuse of this data.
Consolidated personal data is also used by the FBI. This can be bad or good, depending on the FBI's intentions.
I think you get the gist...check out the whole piece here.
While much of this kind of data mining is innocuous and won't do any specific damage, I would still argue it's important to give people more control, or better, to force companies to get our permission (i.e., opt in) before our information is bought and sold. I'd also point out that, by definition, the more information about us that is stored, the easier it becomes for that information to be stolen or accessed by people we don't want to have it. And finally, because this has been a mammoth post as it is, I worry, again, about the very meaning of privacy, and about the ramifications of it dissolving completely.
As Bruce Schneier noted, “…lack of privacy shifts power from people to businesses or governments that control their information. If you give an individual privacy, he gets more power…laws protecting digital data that is routinely gathered about people are needed. The only lever that works is the legal lever...Privacy is a basic human need…The real choice then is liberty versus control.”