By Frank Spillers

Facebook has dominated the tech news this month since its April 2010 launch of the Open Graph. The problem stems from the new platform changes extending Facebook's ongoing bad habit of violating users' actual or perceived privacy.

Facebook's problem with privacy has been both a business problem and a design problem. Today, Facebook held a press conference announcing major changes to its privacy policy, along with what looks like a significant change in Facebook leadership's attitude toward user privacy.

In this post, I'll explore the implications of Facebook's failure to heed privacy concerns and what it means for designing privacy into the social networking user experience.

Some background:

Privacy groups have long been outspoken about privacy violations (as have government bodies in countries outside the US with consumer-friendly legislation to uphold). This time, a noticeable backlash started in the tech community, with speculation that Facebook was dying and Gizmodo offering ten compelling reasons to quit Facebook. A Quit Facebook Day was even designated for 31st May 2010. In addition, a surge in the Google query "How Do I Delete My Facebook Account" was reported.

Three new open-source social network competitors also emerged, each with an over-arching mission to protect users' privacy by design.

Does privacy matter to users?

As I pointed out in a previous post, privacy is your #1 user experience challenge. Numerous studies over the past decade have shown that privacy ranks as a top concern, and in ecommerce *the* top concern with online shopping. Not only is privacy a concern for users, it is a legal requirement in many countries, and soon to be in the US (probably a major reason that Facebook moved so strongly on today's "once and for all" privacy design changes). The other reason for what look like real changes is one I cover in my post, Is Usability a Public Relations problem?

Unhelpful perception #1: Users don't seem to mind

One of the defenses Facebook uses to argue that privacy is not such a concern is that registrations have not dropped since the Open Graph announcement. This is where you have to take off your technical IT-professional hat and think like a user: most users are not paying attention to this story and cannot even have a discussion about it. That does not mean users do not value their privacy. It means you know far too much about how the Internet works with regard to privacy, security, databases and so on!

This explains why a few in the Internet technical community have come out in defense of Facebook this past week, taking an apologist position on privacy protection. I've also noticed a little of the blame-the-"stupid users" game being played, phrased as "it's your own fault if you share that or don't change your settings", both from Facebook and from a small but influential minority of the tech community: Robert Scoble called the erosion of privacy scary but good (for the discovery of like interests), and Tim O'Reilly expressed mixed feelings, praising Facebook for giving us all the opportunity to explore the privacy issue.

Why would technical folks ignore Facebook's eroding privacy policy timeline?

Probably because IT people get that the Web's architecture is "open" and non-secure by default. Users do not. Nor can they even conceptualize these issues well enough to judge their merit or efficacy. Even if Facebook is as protective of users as it claims, perception of privacy is what rules the day. Cavalier attitudes influence how users feel about the 'open Web' Facebook is trying so hard to foster.

Unhelpful perception #2: Users didn't understand what we told them

The other defense, penned by Facebook's public relations director in the New York Times this week, is that Facebook failed to communicate all these privacy changes to its users. But the problem was not a communication breakdown: this is the "user education" argument. See my post on How Relying on User Education is a Failed Strategy.

Facebook has argued that users understand how to manage their privacy settings. The design decision was to provide granular control of privacy settings (yes, that intimidating collection of check boxes and screens). Judging from the screen shots, the new privacy controls might help Facebook regain its users' loyalty and trust.
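To make that design decision concrete, here is a minimal, purely hypothetical sketch (in Python) of what a granular privacy model asks of users: every content type and profile field carries its own independent audience setting. The category names and the visibility rule are my own illustration, not Facebook's actual schema.

```python
from enum import Enum

class Audience(Enum):
    ONLY_ME = 0
    FRIENDS = 1
    FRIENDS_OF_FRIENDS = 2
    EVERYONE = 3

# Dozens of independent toggles (one per content type, application and
# directory field) are what produce the "intimidating" settings screens.
granular_settings = {
    "status_updates": Audience.FRIENDS,
    "photos": Audience.FRIENDS_OF_FRIENDS,
    "birthday": Audience.ONLY_ME,
    "hometown": Audience.EVERYONE,
    "wall_posts_by_friends": Audience.FRIENDS,
    # ...and so on, one entry for every field and feature
}

def visible_to(setting: Audience, viewer_tier: Audience) -> bool:
    """A viewer sees an item if its setting is at least as open as the
    closest tier the viewer belongs to (friend, friend-of-friend, etc.)."""
    return setting.value >= viewer_tier.value

# Each item has to be understood, checked and configured separately.
print(visible_to(granular_settings["photos"], Audience.FRIENDS))   # True
print(visible_to(granular_settings["birthday"], Audience.FRIENDS)) # False
```

The burden scales with every new feature shipped: each addition means another setting the user has to find and understand.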

What's wrong with that (privacy-lax) business decision? Plenty.

Mark Zuckerberg, Facebook's CEO, offended many industry insiders earlier this year by declaring that privacy was disappearing as a social trend, as if openness and privacy were not close cousins. Today Zuckerberg made a major retraction of that controversial belief, eating his own words.

Facebook relied on what seems like an arrogant view of its users. It doesn't help that Zuckerberg started the Facebook experiment with an attitude, joking about how naive users are about online privacy.

Historical footnote: It's important to recognize that Facebook encountered (and largely ignored) similar backlash from its community, privacy advocacy groups and government bodies around the world with many of its previous feature releases (Beacon, Connect), which took user control away and left user sharing behavior vulnerable to security hacks, social embarrassment or simply predatory "openness". This cartoon sums up the mood perfectly!

Facebook also did not account for the privacy shock many Americans are still recovering from after learning that the US government has been covertly spying on its citizens:

While torture and aggressive war may have been the most serious crimes which the Bush administration committed, its warrantless eavesdropping on American citizens was its clearest and most undeniable lawbreaking. -Salon.com

Stories like that of a recent Facebook application that exposed users' phone numbers don't help build trust in the ever-changing Facebook platform either (a platform that has included sneaky changes to the privacy policy and to data sharing). Nor is it reassuring to learn that Facebook is deepening its ties to, and lobbying of, US intelligence (spy) agencies.

Worse, Facebook ignored Google's regular beatings from privacy groups (Google being one of the worst privacy offenders in the industry, which in 2007 earned it "worst company in the world" status for privacy violations). Google's shame caused it to shorten the "spying" period of its hostile persistent cookie. Google seems to have the same privacy blindness as Facebook: it has been "forced to change" after the releases of Wave, Buzz and almost every other product it has launched (and the complaints that followed) in the past two years.

Are we learning from Google's FAIL yet? Check out Danah Boyd's thoughts...

Yahoo! and Microsoft have both taken a privacy-protection position as a business decision. In 2008, Yahoo! demonstrated leadership in protecting user privacy by scrubbing search logs after 90 days. Google changed its "expires in 2038" search log record, now keeping your searches for a two-year period. Microsoft has even followed Yahoo!'s lead in the area of privacy.

Up to now, Facebook has ignored the warning signs, instead seemingly relying on the "stupid user" theory of handling privacy in what is now the world's largest social network.

At issue: Perception of Privacy

One of the interesting back-tracks from the Facebook team this week is its claim of naivete about its users ("we missed the mark, users didn't get it"). However, the unhelpful perceptions mentioned above do not account for the perception of privacy, let alone cultural aspects of privacy or the limits of configuration behavior.

In a recent study, Consumer Reports found that 52% of users are over-sharing without realizing it. Read Write Web concluded:

"That means that a good many of the 73% of Facebook adults who think they're sharing just with friends are sadly mistaken".

Consumer Reports places the blame on poor privacy controls, not stupid users. The New York Times analysis corroborates this, revealing a whopping 50 settings with 170 options involved in managing privacy on Facebook. Also see the Facebook privacy settings infographic...

Bottom line: Users do not study the technical workings of the Internet. Who can blame them? An empirical study at Carnegie Mellon of how people perceive online advertising found that:

Many participants have a poor understanding of how Internet advertising works, do not understand the use of first-party cookies, let alone third-party cookies, did not realize that behavioral advertising already takes place, believe that their actions online are completely anonymous unless they are logged into a website, and believe that there are legal protections that prohibit companies from sharing information they collect online. We found that participants have substantial confusion about the results of the actions they take within their browsers, do not understand the technology they work with now, and clear cookies as much out of a notion of hygiene as for privacy.
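The mechanism behind that confusion is worth spelling out. The toy Python below is not any real ad network's code (the cookie value and site names are invented); it simply shows why a single third-party cookie, set by an ad server embedded on unrelated sites, is enough to link visits the user believes are separate:

```python
# Toy model of third-party tracking: the ad network's script is embedded on
# many publisher pages, so every page view arrives with the same cookie.
ad_network_profiles = {}  # cookie id -> list of pages where the ad loaded

def serve_ad(third_party_cookie, embedding_page):
    """Called each time the ad network's content loads on a publisher page."""
    ad_network_profiles.setdefault(third_party_cookie, []).append(embedding_page)

# The user never visits or logs in to the ad network directly, yet it still
# accumulates a cross-site browsing profile keyed on its cookie.
serve_ad("cookie-abc123", "news-site.example/politics")
serve_ad("cookie-abc123", "health-site.example/symptoms")
serve_ad("cookie-abc123", "shopping-site.example/strollers")

print(ad_network_profiles["cookie-abc123"])
# ['news-site.example/politics', 'health-site.example/symptoms',
#  'shopping-site.example/strollers']
```

Nothing here requires being "logged in" anywhere, which is exactly the mental model the study found users holding onto.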

Should we be requiring users to understand the technicalities of online privacy in order to use our social networking features? That sounds absurd, but without designing privacy controls in from the start, we are essentially requiring users to do so. Calling users dumb for lacking technical knowledge can only last so long before it provokes a backlash. The tolerance for privacy violation (or perceived violation, a usability and design issue) in the online user experience is declining, or, as the Washington Post puts it, privacy is having its day.

Facebook's actions today reveal that "open" is not the trend in social networks; control of privacy is:

Today we're starting to roll out some changes that will make all of these controls a lot simpler. We've focused on three things: a single control for your content, more powerful controls for your basic information and an easy control to turn off all applications. -Mark Zuckerberg blog post.
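The design shift described in that announcement is easy to illustrate. The sketch below is my own interpretation of a "single control", not Facebook's implementation: one master audience choice fans out across every content category, so users make one decision instead of dozens and can still override individual items if they choose.

```python
# Hypothetical "single control": apply one audience choice everywhere at once.
CONTENT_CATEGORIES = [
    "status_updates", "photos", "wall_posts_by_friends",
    "birthday", "hometown", "likes_and_interests",
]

def apply_master_control(audience):
    """Fan a single audience choice (e.g. 'friends', 'friends_of_friends',
    'everyone') out to every content category."""
    return {category: audience for category in CONTENT_CATEGORIES}

settings = apply_master_control("friends")   # one decision, not fifty
settings["hometown"] = "everyone"            # granular override still possible
```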

In the next post on this topic, I'll explore research-based, proven techniques for designing privacy as an enhancer of sociability, including strategies for designing a better privacy user experience.

Best Wishes,
Frank Spillers, MS