Face-to-Face with Facebook

Saturday, June 19, 2010

NEW YORK – Long ago, when Facebook founder Mark Zuckerberg was in grade school, I wrote a book (Release 2.1: A Design for Living in the Digital Age) in which I lauded something called “P3” (now P3P), the Platform for Privacy Preferences.

I was sure that people would start using P3 or something like it to control access to data about themselves. Of course, I was wrong...for about 10 years.

Now, at last, it’s starting to happen – though not exactly the way I envisioned it. Nor is it exactly the way Zuckerberg envisioned it...

While many people are up in arms about Facebook’s shifting privacy policies, millions of others are calmly managing their reputations online, using the tools that Facebook and other social Web sites provide. In fact, Facebook has helped them learn how to do that.

Statistics recently published by the Pew Research Center’s Internet & American Life Project (http://pewinternet.org/Reports/2010/Reputation-Management.aspx) indicate that young people manage their reputations online more carefully than do older people.

I don’t think that means that the young care more about privacy; in fact, I think Zuckerberg is right when he says, “People have really gotten comfortable not only sharing more information and different kinds, but more openly and with more people. That social norm is just something that has evolved over time.”

The younger generation is comfortable with sharing precisely because they know how to control it. Zuckerberg is right that they don’t necessarily want privacy or what analyst Danah Boyd calls “seclusion,” but they do want “control.”

They won’t blindly follow defaults. Generally, young people post more information of an edgy nature about themselves and about their friends than do older adults, so they are quickly learning the need for specific controls.

By contrast, older users (1) are less likely to do stupid things in the first place, and (2) are less likely to do so in range of a camera. (Of course, there are notable exceptions.) But they, too, don’t want default settings that leave them feeling exposed. They want to “evolve” on their own, rather than be pushed (or led unwittingly) by Facebook.

In short, the market is working, slowly, and consumers are educating themselves, slowly. Facebook has been a big part of that: it has provided the tools to do so, and is constantly tweaking them.

It has also publicized those tools, often inadvertently, by setting overly public defaults.

But in its rush to make money, the company seems to have been greedy – and pretty tone-deaf – in its response to critics. The critics may not represent the majority of users, but they deserve a polite and considered response.

And when the company does screw up – as with Beacon and more recently its plans for user data-sharing with favored vendors such as Pandora – it needs to show more humility.

That’s not a moral judgment, but a business one. If the market is working towards a world where users exercise the power to manage their own reputations, it will ultimately punish companies that become too arrogant.

But that is only part of the privacy discussion. The comforting thing about the kind of data that Facebook primarily deals with is that it’s public. If your friends and other people can see it, so can you.

More troubling is the data you don’t even know about – the kind of data about your online activities collected by ad networks and shared with advertisers and other marketers, and sometimes correlated with offline data from other vendors.

By and large, that’s information you can’t see – what you clicked on, what you searched for, which pages you came from and went to – and neither can your friends, for the most part. But that information is sold and traded, manipulated with algorithms to classify you and to determine what ads you see, what e-mails you receive, and often what offers are made to you. Of course, some of that information could go astray...

Personally, I don’t really mind, but there are many people who would, if they understood what was going on in the first place. I predict that, whether it’s this year or in ten years, this will become a much greater issue than what information people share openly with friends.

The challenge is to make this hidden sharing of information less confusing, more explicit, and more transparent before the majority of people discover it for themselves in a way that leaves them feeling deceived. I regularly attend advertising-industry events and raise this issue, and the answers I get run along the lines of “what people don’t know won’t hurt them,” or “all we’re doing is giving them ads better targeted to their interests.”

But people have a way of objecting to being manipulated like that. They like to be treated as individuals, not to be put into buckets. I keep telling marketers that this is not a threat, but an opportunity.

Just as Facebook has educated people, clumsily, about privacy controls, so marketers must educate people, ideally more elegantly, about tracking controls. I hope it won’t take ten years and a lot of bad publicity for advertisers to figure that out.

From my own perspective, as I look at the emerging market for people generating and using their own health/behavior data, this commotion is useful. People who share health data are likely to be much more adept at managing it than they would have been just a few years ago.

They will also be fairly skeptical about the companies they entrust it to.

That ultimately argues for transparency and paid models, so that consumers don’t wonder what other allegiances companies may have. “I pay you to manage my data” is pretty straightforward. “I give you my data and you give me a free service” leaves the customer wondering what else the company may do with the data.

Esther Dyson, chairman of EDventure Holdings, is an active investor in a variety of start-ups around the world. Her interests include information technology, health care, private aviation, and space travel.

Copyright: Project Syndicate, 2010.

www.project-syndicate.org