"The future of privacy: How privacy norms can inform regulation"
Most of the conversation surrounding privacy tends to focus on the market, the technology, and the law. All of these domains tend to invoke social norms to justify their positions without actually understanding the nuances of those norms and without taking the opportunity to learn from what people actually do to manage privacy. I believe that if you dive into and understand social norms, you will be able to develop more innovative and appropriate policies, technologies, and business models.
Teens are fully aware of how difficult achieving privacy is. Many complain non-stop about the impossibility of obtaining privacy at home, talking relentlessly about parents who are always "in their business." But we don't give teens enough credit. They're creative, and they use all sorts of tactics to achieve privacy, online and off.
Interestingly, the "privacy features" are usually the least reliable for teens. I don't care how "simplified" Facebook says those settings are; the teens that I'm meeting can't make heads or tails of what those settings mean. They read the notification at the top of the page that says that Facebook has taken extra precautions for minors, and they hope that Facebook's settings are good enough. They've fiddled with the settings some but have no sense of whether or not they're doing the job. Luckily for Facebook, all that teens expect those settings to do is to keep out the "creepers." Teens have given up trusting Facebook to help them limit how far a photo can be seen or to restrict access to a status update. The problems that they face are more systemic. No matter what tools Facebook builds, nothing will keep their mother from looking over their shoulder at home. And nothing will stop their ex-best-friend from re-posting a photo. Besides, because they don't understand how behavioral ads work, the advertisements connected to their content convince them that, somehow, people are looking in. They are convinced their content is being read by people, and they don't like it, but they don't know how to stop it. And being in the place where all other teens are is too socially important to worry too much about it anyways. So they stomach the surveillance and look for different tactics.
Part of what's shaping these dynamics has to do with audience awareness. Teens from more privileged backgrounds _know_ that their parents are lurking and have completely accepted the message that college admissions officers will read their profiles. They're expecting it, and they're performing for them. I met teens who purposely composed their Facebook profiles with college admissions officers in mind. One teen crafted his photos page to look like that of the All-American teen. He had sports photos that made him look talented and friend photos that made him look popular and friendly. He knew that getting into a top school required looking his best, and he used his Facebook as part of that performance. That is the marker of a highly privileged, highly strategic teen who has been taught how to navigate adult worlds and to manipulate them based on knowing what kinds of future students those adults will like. While I see variations of this among elite teens, I see nothing of the sort among working-class teens.
Social media isn't being used to even the playing field - it's being used to replicate pre-existing structural dynamics in a more public forum. All teens are being surveilled - by governmental agencies and corporations - but the teens who are developing strategies to cope are those responding to surveillance by people they see every day: their parents. Most other teens only feel the surveillance when something goes terribly awry.
I do think that companies that have data about consumers should be required to make that data available back to them. Consumers should have the right to know what companies know about them. And consumers should have the right to know who has the ability to access that data, including which employees and third parties. When companies monetize user data, consumers should have access to the list of partners the companies work with when they "match" data. They should have the right to know how the information is matched and what happens when they click on links. Don't get me wrong - most users won't bother looking. But transparency is key to accountability when we're talking about monetizing user-generated content.
Link:
http://www.danah.org/papers/talks/2010/PrivacyGenerations.html