‘Russian propaganda did not inject unfamiliar thought into American politics. It just amplified latent radicalism already flowing at the margins of society.’ Photograph: Niall Carson/PA

Facebook is ripe for exploitation – again – in 2020


Facebook claims to have cleaned up its act. But the platform remains vulnerable to the same sorts of divisive propaganda

We won’t need Russia in 2020. We will hijack our democracy ourselves. And Facebook is sure to be a major factor in that hijacking – once again.

The platform is ripe for further exploitation by domestic forces bent on distorting the political conversation and stirring up irrational passions in a way sure to benefit Donald Trump’s re-election efforts. The continued proliferation of white supremacists on Facebook, and its refusal to block a heavily doctored video of the House speaker, Nancy Pelosi, are just the latest demonstrations of Facebook’s cowardice.


Despite scrutiny in the three years since Facebook’s troublesome role in Trump’s 2016 election – embedding Facebook staff in the campaign itself, hosting millions of dollars of targeted ad spending, and distributing false messages sponsored by Russia and meant to divide the United States and promote Trump – Facebook remains vulnerable to the sorts of divisive propaganda that motivate nationalist and authoritarian movements. This was evident in recent elections in Brazil, Italy and India, where nationalist forces assumed power with the aid of Facebook-centric election campaigns filled with vitriol and conspiracy theories.

Such propaganda starts with a concerted effort using platforms other than Facebook, such as Reddit, YouTube, state-sponsored systems like Russia’s RT, or private media like Fox News in the US. The messages then migrate to Facebook, with its 220 million American users and 2.4 billion users worldwide. Once there, Facebook’s algorithms take over, amplifying extremist content and connecting susceptible people who might never otherwise find each other. It’s a complex ecosystem that can’t be examined properly by isolating its elements. What happens on Reddit and Fox changes Facebook, and what happens on Facebook changes Reddit and Fox.

US intelligence agencies and the Mueller investigation have extensively documented Russian efforts to spread divisive and pro-Trump propaganda through Facebook. But much of the same sort of content came from domestic sources, such as the National Rifle Association or groups supporting Texas independence. Russian propaganda did not inject unfamiliar thought into American politics. It just amplified and extended latent radicalism already flowing at the margins of society.

Facebook is certainly on guard to limit overt Russian influence in American politics, but it is unable to stem the flow of homegrown white nationalism. It is willing and able to block very specific and literal expressions of white supremacy or white nationalism. But its definitions of both are too narrow and easily evaded.

Trump himself regularly pumps his white supremacist and anti-democratic agenda into the media ecosystem. More than 25 million people follow and often repost content from Trump’s public figure Facebook page. Whether he calls for violence against people or pushes policies explicitly designed to keep white people in power, Facebook won’t do anything about it.

The locus of much of the most noxious content on Facebook is sure to be large private “Groups”. Mark Zuckerberg has characterized Groups as promoting what he calls (but does not understand) “community”. But radical rightwing content that bubbles up through private groups can easily jump into the newsfeeds of innocent people. We have already seen fringe figures use Groups to sidestep Facebook bans. Journalists recently discovered a secret Facebook group in which members of the US border patrol shared racist and sexist content and joked about migrants dying.

As with all content on Facebook, the more extreme the message, the farther it will travel. Facebook’s algorithms are designed to amplify items that generate strong emotions. Trump’s messages of nationalism, xenophobia and general resentment provoked his followers to share ads and campaign videos on their own timelines, creating momentum. Hillary Clinton’s plan for universal childcare was always unlikely to inspire such passion, so it sank on Facebook. The platform is wonderful for motivation but terrible for deliberation.

After the 2016 election, criticism of Facebook came swiftly. Besides transmitting a flood of divisive and misleading propaganda, Facebook political ads offered no accountability. It was impossible for Facebook users to tell whether an advertisement came from a political campaign or from some individual or group acting independently, such as a hate group or a foreign entity. Ads also disappeared almost instantly, never to be seen again by anyone outside the targeted audience, so it was impossible for candidates to respond to attacks or for journalists to interrogate an ad’s factual claims.

Facebook has made some cosmetic changes for 2020. Now the platform requires all advertisers that invoke political themes to register and disclose their affiliations, lending a bit of transparency to a massive and confusing system. In 2016, Facebook stationed staff inside the Trump campaign, as it had with the campaign of Rodrigo Duterte in the Philippines earlier that year, to help the campaign target ads more effectively. Facebook has promised to stop that practice.

But much has not changed, and some changes at Facebook might make American democracy worse.

In addition to encouraging users to spend more time in Groups, Zuckerberg would like users to make more use of group chats on his encrypted messaging service WhatsApp. As we saw in Brazil and India, this is an invitation to a more chaotic and unstable public sphere, one that amplifies extreme political messages yet removes much of the content even further from public scrutiny. Most significantly, however, the Facebook advertising system, which allows advertisers to target users extraordinarily precisely, is still massive and effective.

Much of this is just Facebook being Facebook. No company would willingly abandon or dismantle such a lucrative advertising system.

But there is one thing Facebook could do to limit its pernicious influence on democracy: restrict targeted advertising for political candidates.

Facebook could change its policies so that, for example, political candidates could limit the audiences of their ads to the geographic area of their constituency, but not target users in smaller segments by gender, income, education, interests, political ideology, race or any other category.

Under such a system, the entire electorate would be able to see what a candidate has to say about the issues and the other candidates. Opponents and citizens could then respond. Candidates would have to construct messages with broad appeal. Campaigns would have to treat the electorate as an electorate, not as a collection of distinct interests to be manipulated, misled or pandered to.

Of course, Facebook will probably never do this voluntarily – so the US Congress must require it. Disclosure alone is not enough. Restricting targeting would not limit free expression or even total spending; it would merely ensure that ads are more visible and scrutable, so that candidates can be held accountable. In fact, far from promoting free political expression, targeted ads constrain the speech of citizens, interest groups, journalists and opposition candidates by preventing them from responding to claims and accusations.

A healthy democracy is not just one in which nationalist demagogues like Trump lose. A healthy democracy is one in which citizens have access to the best information about candidates and issues; candidates face scrutiny; and the losing side retains trust in the fairness of the system and the expression of the will of the electorate.

It’s unlikely the United States will be such a healthy democracy in 2020. But if the world’s richest, most powerful and oldest continuous democracy does not take serious steps very soon, it will be no kind of democracy at all.

  • Siva Vaidhyanathan is a professor of media studies at the University of Virginia and the author of Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (Oxford University Press)
