Lessons from Facebook and Designing Privacy

Facebook has dominated the tech news this month since its April 2010 launch of the Open Graph. The problem stems from the new platform changes extending Facebook’s ongoing bad habit of violating users’ actual or perceived privacy.

Facebook’s privacy problem has been both a business problem and a design problem. Today, Facebook held a press conference announcing major changes to its privacy policy and what looks like a significant shift in Facebook leadership’s attitude toward user privacy. In this post, we’ll explore the implications of Facebook’s failure to heed privacy concerns and what it means to design for privacy in user experience.

Some background:

Privacy groups have long been outspoken on privacy violations (as have government bodies in countries outside the US with consumer-friendly legislation to uphold). This time, a noticeable backlash started in the tech community, with speculation that Facebook was dying and Gizmodo providing ten compelling reasons to quit Facebook. A Quit Facebook Day was even designated for 31st May 2010. In addition, a surge in the Google query “How Do I Delete My Facebook Account” was reported.

Three new open-source social network competitors also emerged with an overarching mission to protect users’ privacy by design:

  • Diaspora: an open-source social network under development (see Computerworld story)
  • Noserub: a decentralized social network
  • Pip.io: a conversation-driven, privacy-friendly community (see NYT story)

Does privacy matter to users?

As I mentioned in a previous post, privacy is your #1 user experience challenge. Over the past decade, numerous studies have shown that privacy ranks as a top concern for users, and in e-commerce it is *the* top concern with online shopping. Not only is privacy a concern for users, it is a legal requirement in many countries, and soon will be in the US (probably a major reason that Facebook moved so decisively on today’s “once and for all” privacy design changes).

Also see Is your UX a Public Relations Problem?

Unhelpful perception #1: Users don’t seem to mind

One of the defenses Facebook has used to argue that privacy is not such a concern is that it has not seen a drop in user registration since announcing the Open Graph. This is where you have to take off your technical IT professional hat and think like a user: most users are not paying attention to this story and cannot even have a discussion about it. That does not mean users do not value their privacy. It means you know far too much about how the Internet works with regard to privacy, security, databases, and so on!

This explains why a few in the Internet technical community have come out “in defense of Facebook” this past week and taken an apologist position on privacy protection. I’ve also noticed a little of the “blame the stupid users” game being played, phrased as “it’s your own fault if you share that or don’t change your settings,” both from Facebook and from a small but influential minority of the tech community: Robert Scoble called the erosion of privacy scary but good (for the discovery of like interests), and Tim O’Reilly indicated mixed feelings, praising Facebook for giving us all the opportunity to explore the privacy issue.

Why would technical folks ignore Facebook’s eroding privacy policy timeline?

Probably because IT people get that the Web’s architecture is “open” and non-secure by default. They take privacy for granted. Users do not. Nor can they even conceptualize these issues well enough to judge their merit or efficacy. Even if Facebook is as protective of users as it claims, the perception of privacy is what rules the day. Cavalier attitudes influence how users feel about the ‘open Web’ Facebook is trying so hard to foster.

Unhelpful perception #2: Users didn’t understand what we told them

The other defense, penned by Facebook’s public relations director in the New York Times this week, is that Facebook failed to communicate all these privacy changes to its users. But the problem was not a communication breakdown; this is the “user education” argument.

See How Relying on User Education is a Failed Strategy

Facebook has argued that users understand how to manage their privacy settings. The design decision was to provide granular control of privacy settings (yes, that intimidating maze of checkboxes and screens). The screenshots suggest that the new privacy controls might help Facebook regain user loyalty and trust.
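To make the design tension concrete, here is a minimal sketch of the alternative approach privacy advocates argue for: privacy by default, where nothing is shared until the user explicitly opts in. None of these names reflect Facebook’s actual API or data model; they are hypothetical, purely to illustrate the principle.

```python
# Hypothetical sketch of a privacy-by-default settings model.
# A user who never touches a settings screen shares nothing publicly;
# wider visibility is always an explicit, per-field opt-in action.

PRIVATE, FRIENDS, EVERYONE = "private", "friends", "everyone"

class ProfileSettings:
    def __init__(self):
        # Every field starts at the most restrictive audience.
        self.audience = {
            "posts": PRIVATE,
            "photos": PRIVATE,
            "friend_list": PRIVATE,
            "likes": PRIVATE,
        }

    def share(self, field, audience):
        # Widening visibility requires a deliberate user action.
        self.audience[field] = audience

    def visible_to_everyone(self):
        # What a non-friend (or search engine) could see.
        return [f for f, a in self.audience.items() if a == EVERYONE]

settings = ProfileSettings()
settings.share("posts", FRIENDS)       # an explicit opt-in
print(settings.visible_to_everyone())  # nothing public unless chosen
```

The point of the sketch: with safe defaults, the 170-option settings screen becomes optional fine-tuning rather than a prerequisite for not over-sharing.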

What’s wrong with that (privacy lax) business decision? Plenty.

Mark Zuckerberg offended many industry insiders earlier this year by declaring that “privacy was disappearing as a social trend”. Today, Zuckerberg made a major retraction of that controversial belief, effectively eating his own words.

Facebook relied on what seems like an arrogant view of users. It doesn’t help that Zuckerberg started the Facebook experiment by joking about how naive users are about online privacy.

Historical footnote: It’s important to recognize that Facebook encountered (and largely ignored) similar backlash from its community, privacy advocacy groups, and government bodies around the world with many of its previous feature releases (Beacon, Connect) that took user control away and left user sharing behavior vulnerable to security hacks, social embarrassment, or simply predatory “openness”.

Facebook also did not account for the privacy shock from which many Americans are still recovering: learning that the US government has been covertly spying on its citizens:

While torture and aggressive war may have been the most serious crimes which the Bush administration committed, its warrantless eavesdropping on American citizens was its clearest and most undeniable lawbreaking. source: Salon.com

Stories like the recent Facebook application that exposed users’ phone numbers don’t help build trust in the ever-changing Facebook platform either (including sneaky changes to its privacy policy and data sharing). Nor does it provide much reassurance to learn that Facebook is deepening its ties with, and lobbying, US intelligence (spy) agencies.

Worse, Facebook ignored the regular beatings Google has taken from privacy groups (Google being one of the worst privacy offenders in the industry, a record that in 2007 earned it a ranking as the worst company in the world for privacy). Google’s shaming led it to shorten the “spying” period of its hostile persistent cookie. Google seems to have the same privacy-violating bias as Facebook: it has been “forced to change” after complaints with the releases of Wave, Buzz, and almost every other product it has launched in the past two years.

Yahoo! and Microsoft have both taken a privacy-protection position as a business decision. In 2008, Yahoo! demonstrated leadership in protecting user privacy by scrubbing search logs after 90 days. Google changed its “expires in 2038” search log retention, now keeping your searches for a two-year period. Microsoft has followed Yahoo!’s lead in the area of privacy. Up to now, Facebook has ignored the warning signs, instead seemingly relying on the “stupid user” theory of handling privacy in what is now the world’s largest social network.

At issue: Perception of Privacy

One of the interesting backtracks by the Facebook team this week is a claim of naivete about its users (“We missed the mark, users didn’t get it”). However, these unhelpful perceptions (mentioned above) do not account for the perception of privacy, let alone cultural aspects of privacy or the limits of configuration behavior.

See Configuration Hell– the case for plug n play user experience

Consumer Reports, in a recent study, found that 52% of users are over-sharing without realizing it. Read Write Web concluded:

“That means that a good many of the 73% of Facebook adults who think they’re sharing just with friends are sadly mistaken”.

Consumer Reports places the blame on poor privacy controls, not stupid users. The New York Times analysis of Facebook privacy settings (infographic) corroborates this, revealing a whopping 50 settings with 170 options involved in managing privacy on Facebook.

Bottom line: Users do not study the technical workings of the Internet. Who can blame them? An empirical study of how people perceive online advertising at Carnegie Mellon found that:

Many participants have a poor understanding of how Internet advertising works, do not understand the use of first-party cookies, let alone third-party cookies, did not realize that behavioral advertising already takes place, believe that their actions online are completely anonymous unless they are logged into a website, and believe that there are legal protections that prohibit companies from sharing information they collect online. We found that participants have substantial confusion about the results of the actions they take within their browsers, do not understand the technology they work with now, and clear cookies as much out of a notion of hygiene as for privacy.

Conclusion

We should not require users to understand the technicalities of online privacy. Yet when we fail to design privacy controls in from the start, that is exactly what we require of them. Calling users dumb for lacking technical knowledge can only last so long before it provokes a backlash.
