
It’s no accident that Facebook is so addictive

August 6, 2018
The Facebook logo appears on screens at the Nasdaq MarketSite in New York’s Times Square. (Richard Drew/AP)

Siva Vaidhyanathan is the Robertson professor of media studies and director of the Center for Media and Citizenship at the University of Virginia. He is also the author of a new book, “Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.” I asked him questions about the book.

HF: Your book suggests that Facebook uses the same kinds of techniques to keep you coming back as a casino does. What does this mean?

SV: Facebook engineers were for many years influenced by a strain of thought that emerged from Stanford University, where in the early 2000s scholars of human-computer interaction, design and behavioral economics were promoting the idea that games could generate “stickiness” among users: giving users just enough positive feedback to keep them returning to the game, while withholding enough pleasure that they never feel satiated. As technology consultant Nir Eyal explains in his revealing and, frankly, frightening book, “Hooked: How to Build Habit-Forming Products,” this idea spread quickly through Silicon Valley, uniting game designers, application engineers, advertising professionals and marketing executives.

Facebook played this game better than most. It’s perfectly designed, like a slot machine in a casino, to give us a tiny sliver of pleasure when we use it and to introduce a small measure of anxiety when we do not. A Facebook user says, “What am I missing out on? Did anyone ‘like’ my joke?” A casino patron says, “I wonder if THIS is my lucky moment or lucky pull of the lever.”
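The “stickiness” mechanic Vaidhyanathan describes is, at bottom, variable-ratio reinforcement: rewards that arrive unpredictably, and just often enough to keep the user pulling the lever. A minimal sketch shows the shape of it (the reward probability and function names here are invented for illustration, not drawn from any platform’s actual code):

```python
# Toy simulation of variable-ratio reinforcement, the slot-machine
# reward schedule described above. The probability is invented for
# illustration; this is not any platform's real code.
import random

REWARD_PROBABILITY = 0.25  # just enough wins to keep the user checking back

def check_notifications() -> bool:
    """One 'pull of the lever': did anyone 'like' my joke this time?"""
    return random.random() < REWARD_PROBABILITY

random.seed(0)
session = [check_notifications() for _ in range(20)]
# Rewards land unpredictably, something like "..*....*.*...**....*",
# the intermittent pattern behavioral designers exploit for stickiness.
print("".join("*" if hit else "." for hit in session))
```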

HF: For Facebook’s model to work, you suggest that users can’t have real control over their personal information. Why is this so, and what consequences does it have for politics?

SV: From Facebook’s point of view, users shouldn’t know all the ways that Facebook uses and distributes their basic data and records of interactions, for two reasons. One, it’s just too vast a collection of uses to describe; two, users might get turned off if their experience on the surface of Facebook — all the “likes,” clicks, videos, comments and messages — [is] interrupted by the reality of what’s behind the wizard’s curtain. So Facebook keeps assuring us we “have control” of our information. But that control is limited to superficial choices about the audiences within Facebook with which we share posts.

I can limit a post to friends, friends of friends, or everybody on Facebook. I can exclude certain Facebook users from seeing certain posts. But it takes vigilance to manage all that. And it only controls the flow among Facebook users. All the back-end data, the really valuable and sensitive stuff, gets mined by Facebook and used to help target ads and content at you. And for more than five years, Facebook gave away our valuable back-end data to thousands of applications and developers who ranged from scientists working for Cambridge Analytica to the Obama campaign to the people who made Words With Friends.
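A hypothetical sketch can make this asymmetry concrete. All names here are invented (this is not Facebook’s API): the audience setting gates who sees a post in the feed, while a back-end event log records the post regardless of that setting.

```python
# Hypothetical sketch of the asymmetry described above: audience settings
# control front-end visibility, but every interaction still lands in a
# back-end log available for targeting. Invented names, not Facebook code.

EVENT_LOG: list[dict] = []  # the "really valuable and sensitive stuff"

def share_post(author: str, text: str, audience: str, excluded: set[str]) -> dict:
    post = {"author": author, "text": text,
            "audience": audience, "excluded": excluded}
    # Back-end collection happens no matter what the privacy setting says.
    EVENT_LOG.append({"action": "post", **post})
    return post

def can_see(viewer: str, post: dict, friends: dict[str, set[str]]) -> bool:
    """Front-end audience control: the only lever the user actually holds."""
    if viewer in post["excluded"]:
        return False
    if post["audience"] == "everyone":
        return True
    if post["audience"] == "friends":
        return viewer in friends[post["author"]]
    if post["audience"] == "friends_of_friends":
        return (viewer in friends[post["author"]]
                or any(viewer in friends[f] for f in friends[post["author"]]))
    return False

friends = {"alice": {"bob"}, "bob": {"alice", "carol"}, "carol": {"bob"}}
p = share_post("alice", "my weekend photos", "friends", excluded=set())
print(can_see("carol", p, friends))  # False: audience control works here...
print(len(EVENT_LOG))                # ...but the log captured it anyway.
```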

The consequences for politics are stunning: In 2012, the head of state of a country with massive surveillance and military power had sensitive personal data on millions of Americans, and no one cared. When my colleagues in the social media scholarship world and privacy world tried to raise this issue, no one responded with interest. We could not get reporters to pay attention or editors to run op-eds. The Obama campaign was seen as this supercool digital pioneer, a happy, friendly phenomenon. No one thought about what might happen if a not-so-friendly campaign got the same sort of information on millions of Americans. Then it happened in 2016.

HF: Facebook’s algorithms, like YouTube’s and others’, try to maximize user “engagement.” What does this focus on engagement mean for Facebook’s broader business model and impact on society?

SV: Facebook is in the social engineering business. It constantly tries to manipulate our experience and, thus, our perspective on our friends, issues and the world. At first it seems to do so haphazardly and incoherently. But, in fact, there is a coherent driving force. Facebook wants to maximize something close to “happiness.” It has fallen under the sway of those who believe one can measure affective states and make changes that can increase satisfaction or joy. It turns out that Bentham’s Panopticon was not his major influence on 21st-century digital culture. It was the idea of maximizing happiness by counting “hedons,” or units of pleasure. Well, you can only dial up something you can count. You can’t really count happiness. So you count a proxy.

For Facebook, that proxy is “engagement,” the number of clicks, shares, “likes” and comments. If a post or a person generates a lot of these measurable actions, that post or person will be more visible in others’ News Feeds. You can already see how this could go wrong. Unsurprisingly, items advocating hatred and bigotry, conspiracy theories or wacky health misinformation generate massive reactions — both positive and negative. A false post about the danger of vaccines would generate hundreds of comments, most of them arguing with the post. But the very fact of that “engagement” would drive the post to more News Feeds. That’s why you can’t argue against the crazy. You just amplify the crazy. Such are algorithms and feedback mechanisms.
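A toy ranker makes that feedback mechanism concrete (the names and structure are hypothetical, not Facebook’s actual News Feed code): because the score counts hostile comments the same as approving ones, arguing with a false post pushes it higher in the feed.

```python
# Toy illustration of engagement-as-proxy ranking. Hypothetical, not
# Facebook's actual algorithm: every measurable reaction, approving
# or hostile, raises a post's score and thus its visibility.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int = 0
    shares: int = 0
    comments: int = 0  # includes comments that *dispute* the post

    def engagement(self) -> int:
        # The proxy for "happiness": a raw count of measurable actions.
        return self.likes + self.shares + self.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # More engagement -> more visibility -> more engagement.
    return sorted(posts, key=Post.engagement, reverse=True)

hoax = Post("Vaccines are dangerous!", likes=12, shares=5)
news = Post("City council passes budget", likes=40, shares=8, comments=3)

hoax.comments += 300  # hundreds of users push back on the hoax...

for post in rank_feed([hoax, news]):
    print(post.engagement(), post.text)
# ...and the pushback itself drives the hoax to the top: 317 vs. 51.
```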

HF: You are skeptical about some of the claims that have been made about, e.g., Cambridge Analytica’s use of Facebook data, but you are also worried about how Facebook could reshape politics. Why should we be worried?

SV: Facebook’s ability to precisely target voters allows for massive amounts of political communication to occur without oversight or an opportunity to respond. It removes political communication from the gaze of the public. It’s ephemeral and coded. Political communication moves even further from the Habermasian or Jeffersonian ideal of public conversation about matters of policy and more toward motivation. Healthy republics need both motivation and deliberation.

HF: Social media like Facebook had consequences for India’s recent election. How did [Prime Minister Narendra] Modi’s BJP use Facebook to try to achieve its political ends?

SV: Modi is the most liked politician in the world on Facebook. He and his party and its more radical affiliate movements like the RSS and Shiv Sena use Facebook and WhatsApp to harass, threaten and undermine the reputations of critics, activists, journalists and scholars. That effort is on top of the more common uses of Facebook to target political ads at slivers of the electorate to appeal to their provincial concerns and inclinations. India is already the world’s most fractured democracy. Modi’s campaign style unites the nation in fear and hatred rather than a vision of a better future.

This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the network is responsible for the article’s specific content. Other posts can be found here.