
Perspective
https://doi.org/10.1038/s41562-020-0889-7

How behavioural sciences can promote truth, autonomy and democratic discourse online

Philipp Lorenz-Spreen1 ✉, Stephan Lewandowsky2,3, Cass R. Sunstein4 and Ralph Hertwig1

Public opinion is shaped in significant part by online content, spread via social media and curated algorithmically. The current online ecosystem has been designed predominantly to capture user attention rather than to promote deliberate cognition and autonomous choice; information overload, finely tuned personalization and distorted social cues, in turn, pave the way for manipulation and the spread of false information. How can transparency and autonomy be promoted instead, thus fostering the positive potential of the web? Effective web governance informed by behavioural research is critically needed to empower individuals online. We identify technologically available yet largely untapped cues that can be harnessed to indicate the epistemic quality of online content, the factors underlying algorithmic decisions and the degree of consensus in online debates. We then map out two classes of behavioural interventions—nudging and boosting—that enlist these cues to redesign online environments for informed and autonomous choice.

To the extent that a “wealth of information creates a poverty of attention” (p. 41)1, people have never been as cognitively impoverished as they are today. Major web platforms such as Google and Facebook serve as hubs, distributors and curators2; their algorithms are indispensable for navigating the vast digital landscape and for enabling bottom-up participation in the production and distribution of information. Technology companies exploit this all-important role in pursuit of the most precious resource in the online marketplace: human attention. Employing algorithms that learn people’s behavioural patterns3–6, such companies target their users with advertisements and design users’ information and choice environments7. The relationship between platforms and people is profoundly asymmetric: platforms have deep knowledge of users’ behaviour, whereas users know little about how their data is collected, how it is exploited for commercial or political purposes, or how it and the data of others are used to shape their online experience.

These asymmetries in Big Tech’s business model have created an opaque information ecology that undermines not only user autonomy but also the transparent exchange on which democratic societies are built8,9. Several problematic social phenomena pervade the internet, such as the spread of false information10–14—which includes disinformation (intentionally fabricated falsehoods) and misinformation (falsehoods created without intent, for example, poorly researched content or biased reporting)—or attitudinal and emotional polarization15,16 (for example, polarization of elites17, partisan sorting18 and polarization with respect to controversial topics19,20). Some disinformation and misinformation involve public health and safety; some of it undermines processes of self-governance.

We argue that the behavioural sciences should play a key role in informing and designing systematic responses to such threats. The role of behavioural science is not only to advance active scientific debates on the causes and reach of false information21–25 or on whether mass polarization is increasing26–28; it is also to find new ways to promote the Internet’s potential to bolster rather than undermine democratic societies29. Solutions to many global problems—from climate change to the coronavirus pandemic—require coordinated collective solutions, making a democratically interconnected world crucial30.

Why behavioural sciences are crucial for shaping the online ecosystem
More than any traditional media, online media permit and encourage active behaviours31 such as information search, interaction and choice. These behaviours are highly contingent on environmental and social structures and cues32. Even seemingly minor aspects of the design of digital environments can shape individual actions and scale up to notable changes in collective behaviours. For instance, curtailing the number of times a message can be forwarded on WhatsApp (thereby slowing large cascades of messages) may have been a successful response to the spread of misinformation in Brazil and India33.

To a substantial degree, social media and search engines have taken on a role as intermediary gatekeepers between readers and publishers. Today, more than half (55%) of global internet users turn to either social media or search engines to access news articles2. One implication of this seismic shift is that a small number of global corporations and Silicon Valley CEOs have significant responsibility for curating the general population’s information34 and, by implication, for interpreting discussions of major policy questions and protecting civic freedoms. Facebook’s recent decision to declare politicians’ ads off-limits to their third-party fact checkers illustrates how corporate decisions can affect citizens’ information ecology and the interpretation of fundamental rights, such as freedom of speech. The current situation, in which political content and news diets are curated by opaque and largely unaccountable third parties, is considered unacceptable by a majority of the public35,36, who continue to be concerned about their ability to discern online what is true and what is false2 and who rate accuracy as a very important attribute for social media sharing37.

How can citizens and democratic governments be empowered38 to create an ecosystem that “values and promotes truth” (p. 1096)14?

1Center for Adaptive Rationality, Max Planck Institute for Human Development, Berlin, Germany. 2School of Psychological Science and Cabot Institute, University of Bristol, Bristol, UK. 3School of Psychological Science, University of Western Australia, Perth, Australia. 4Harvard Law School, Cambridge, MA, USA. ✉e-mail: [email protected]

Table 1 | Overview of challenges, cues and potential targets of nudging and boosting interventions in three online contexts

Context: Online articles
Challenges: Information overload and fragmentation of sources
Cues: Cues to epistemic quality, like cited references
Nudging: …to pay attention to epistemic cues and external evidence.
Boosting: …procedures to systematically check epistemic cues.

Context: Algorithmic curation
Challenges: Asymmetry of knowledge and opaque manipulation
Cues: Transparent recommendation and sorting criteria
Nudging: …awareness of factors that shape recommendations and the news feed.
Boosting: …self-nudging towards quality information.

Context: Social media
Challenges: Lack of global network information and false consensus effects
Cues: Global social cues that include base rates and passive behaviour
Nudging: …to consider global social cues and accuracy before sharing.
Boosting: …to infer credibility from social context and history of content.

The answers must be informed by independent behavioural research, which can then form the basis both for improved self-regulation by the relevant companies and for government regulation39,40. Regulators in particular face three serious problems in the online domain that underscore the importance of enlisting the behavioural sciences. The first problem is that online platforms can leverage their proprietary knowledge of user behaviour to defang regulations. An example comes from most of the current consent forms under the European Union (EU) General Data Protection Regulation: instead of obtaining genuinely informed consent, the dialogue boxes influence people’s decision-making through self-serving forms of choice architecture (for example, consent is assumed from pre-ticked boxes or inactivity)41,42. This example highlights the need for industry-independent behavioural research to ensure transparency for the user and to avoid opportunistic responses by those who are regulated. The second problem is that the speed and adaptability of technology and its users exceed that of regulation directly targeting online content. If uninformed by behavioural science, any regulation that focuses only on the symptoms and not on the actual human–platform interaction could be quickly circumvented. The third problem is the risk of censorship inherent in regulations that target content; behavioural sciences can reduce that risk as well. Rather than deleting or flagging posts based on judgements about their content, we focus here on how to redesign digital environments so as to provide a better sense of context and to encourage and empower people to make critical decisions for themselves43–45.

Our aim is to enlist two streams of research that illustrate the promise of behavioural sciences. The first examines the informational cues that are available online31 and asks which can help users gauge the epistemic quality of content or the trustworthiness of the social context from which it originated. The second stream concerns the use of meaningful and predictive cues in behavioural interventions. Interventions can take the form of nudging46, which alters the environment or choice architecture so as to draw users’ attention to these cues, or boosting47, which teaches users to search for them on their own, thereby helping them become more resistant to false information and manipulation, especially but not only in the long run.

Digital cues and behavioural interventions for human-centred online environments
The online world has the potential to provide digital cues that can help people assess the epistemic quality of content48–50—the potential of self-contained units of information (here we focus on online articles and social media posts) to contribute to true beliefs, knowledge and understanding—and the public’s attitudes to societal issues51,52. We classify those cues as endogenous or exogenous53.

Endogenous cues refer to the content itself, like the plot or the actors and their relations. Modern search engines use natural language-processing tools that analyse content54. Such tools have considerable virtues and promise, but current results rarely afford nuanced interpretations55. For example, these methods cannot reliably distinguish between facts and opinions, nor can they detect irony, humour or sarcasm56. They also have difficulty differentiating between extremist content and counter-extremist messages57, because both types of messages tend to be tagged with similar keywords. A more general shortcoming of current endogenous cues of epistemic quality is that their evaluation requires background knowledge of the issue in question, which often makes them non-transparent and potentially prone to abuse for censorship purposes.

By contrast, exogenous cues are easier to harness as indicators of epistemic quality. They refer to the context of information rather than the content, are relatively easy to quantify and can be interpreted intuitively. A famous example of the use of exogenous cues is Google’s PageRank algorithm, which takes centrality as a key indicator of quality. Well-connected websites appear higher up in search results, irrespective of their content. Exogenous cues can indicate how well a piece of information is embedded in existing knowledge or the public discourse.
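As an illustration of the centrality idea behind PageRank, the following minimal Python sketch iterates rank scores over a toy link graph until convergence. The example graph, damping factor and convergence threshold are illustrative assumptions; this is not the production algorithm used by any search engine.

```python
# Minimal PageRank sketch: link centrality as an exogenous quality cue.
# The toy graph and parameters are illustrative, not a real search engine's system.

def pagerank(links, damping=0.85, tol=1e-6, max_iter=100):
    """links maps each page to the pages it links to."""
    pages = sorted(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(max_iter):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                       # dangling page: spread its rank evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:                                  # pass rank along outgoing links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        if max(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank
    return rank

toy_web = {
    "news-site": ["encyclopedia", "blog"],
    "encyclopedia": ["news-site"],
    "blog": ["encyclopedia"],
    "obscure-page": ["blog"],
}
print(pagerank(toy_web))   # well-linked pages accumulate higher scores
```

Pages that receive links from other well-connected pages accumulate higher scores, which is what allows centrality to serve as an exogenous cue that requires no analysis of a page’s content.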
From here on we focus on exogenous cues and how they can be enlisted by nudging46 and boosting47. Let us emphasize that a single measure will not reach everyone in a heterogeneous population with diverse motives and behaviours. We therefore propose a range of measures that differ in their scope and in the level of user engagement required. Nudging interventions shape behaviour primarily through the design of choice architectures and typically require little active user engagement. Boosting interventions, in contrast, focus on creating and promoting cognitive and motivational competences, either by directly targeting competences as external tools or indirectly by enlisting the choice environment. They require some level of user engagement and motivation. Both nudging and boosting have been shown to be effective in various domains, including health58,59 and finances60. Recent empirical results from research on people’s ability to detect false news indicate that informational literacy can also be boosted61. Initial results on the effectiveness of simple nudging interventions that remind people to think about accuracy before sharing content37 also suggest that such interventions can be effective in the online domain62. While empirical tests and evidence are urgently needed, the first step is to outline the conceptual space of possible interventions and make specific proposals. Table 1 examines three online contexts: articles from newspapers or blogs, algorithmic curation systems that automatically suggest products or information (for example, search engines or algorithmic curation of news feeds), and social media that display information about the behaviour of others (for example, shared posts or social reactions such as comments or ‘likes’). Each is associated with a unique set of challenges, cues and potential interventions. Next, we review the challenges and cues in Table 1 and detail some interventions in the subsequent sections.

Online articles: information overload and epistemic cues
The capacity to transfer information online continues to increase exponentially (average annual growth rate: 28%)63.


[Fig. 1 panels: a, three axes of platform knowledge (provided versus inferred, individual versus collective, recent versus past), spanning a range from ‘transparent + non-manipulative’ to ‘opaque + manipulative’; b, a user’s perceived group sizes (for example, 4 vs. 2) contrasted with the global sizes (5 vs. 8).]

Fig. 1 | Challenges in automatically curated environments and on social media platforms. a, Dimensions of knowledge that platforms can acquire with information technology, which make their recommendations continuously opaque and manipulative. b, Perceived group sizes versus the actual global sizes, from the viewpoint of one user (head icon in the centre) in a homophilic social network.

Content can be distributed more rapidly and reaches an audience faster64. This increasing pace has consequences. In 2013, a hashtag on Twitter remained in the top 50 most popular hashtags worldwide for an average of 17.5 h; by 2016, a hashtag’s time in the limelight had dropped to 11.9 h. The same declining half-life has been observed for Google queries and movie ticket sales65. This acceleration, arguably driven by the finite limits of attention available for the ever-increasing quantity of topics and content66 alongside an apparent thirst for novelty, has significant but underappreciated psychological consequences. Information overload makes it harder for people to make good decisions about what to look at, spend time on, believe and share67,68. For instance, longer-term offline decisions such as choosing a newspaper subscription (which then constrains one’s information diet) have evolved into a multitude of online microdecisions about which individual articles to read from a scattered array of numerous sources. The more sources crowd the market, the less attention can be allocated to each piece of content and the more difficult it becomes to assess the trustworthiness of each—even more so given the demise and erosion of classic indicators of quality69 (for example, name recognition, reputation, print quality, price). For this reason, new cues for epistemic quality that are readily accessible even under information overload are necessary. Exogenous cues can highlight the epistemic quality of individual articles, in particular by showing how an article is embedded in the existing corpus of knowledge and public discourse. These cues include, for instance, a newspaper article’s sources and citation network (i.e., sources that cite the article or are cited by it), references to established concepts and topical empirical evidence, and even the objectivity of the language.

Algorithmic curation: asymmetry of knowledge and transparency
To help users navigate the overabundance of information, search engines automatically order results70,71, and recommender systems72 guide users to content they are likely to prefer73. But this convenience exacts a price. Because user satisfaction is not necessarily in line with the goals of algorithms—to maximize user engagement and screen time74—algorithmic curation often deprives users of autonomy. For instance, feedback loops are created that can artificially reinforce preferences75–78, and recommender systems can eliminate context in order to avoid overburdening users. To stay up-to-date and engaging, algorithms can trade recency for importance79 and, by optimizing on click rates, trade ‘clickbait’ for quality. Similarly, aggregated previous user selections make targeted commercial nudging—and even manipulation—possible80,81. For example, given just 300 Facebook likes from one person, a regression model can better predict that person’s personality traits than friends and family82. There are at least three dimensions of knowledge where platforms can far exceed individual human capabilities (Fig. 1a): data that reaches further back in time (for example, years of location history on Google Maps), information about behaviour on a collective rather than an individual level (for example, millions of Amazon customers with similar interests can be used to recommend further products to an individual) and knowledge that is inferred from existing data using machine-learning methods (for example, food preferences inferred from movement patterns between restaurants).
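To make the predictive logic concrete, the following sketch fits a least-squares model that maps binary ‘like’ indicators onto a numerical trait score. The data are synthetic and the setup is deliberately tiny; it only illustrates, under these stated assumptions, how sparse behavioural traces can carry a predictive signal of the kind reported in ref. 82. It is not the model used in that study or by any platform.

```python
# Sketch: predicting a trait score from binary like indicators with least squares.
# Synthetic data for illustration only; real studies use many likes per person.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_likes = 200, 50

likes = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)   # 1 = page liked
true_weights = rng.normal(0, 1, size=n_likes)                        # hidden relationship
trait = likes @ true_weights + rng.normal(0, 0.5, size=n_users)      # e.g., an openness score

X = np.hstack([likes, np.ones((n_users, 1))])        # add an intercept column
coef, *_ = np.linalg.lstsq(X, trait, rcond=None)     # fit the regression

predicted = X @ coef
corr = np.corrcoef(predicted, trait)[0, 1]
print(f"correlation between predicted and actual trait: {corr:.2f}")
```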
Moving further along these dimensions, it becomes more difficult for a user to comprehend the wealth and predictive potential of this knowledge. Automatic customization of online environments that is based on this knowledge can therefore be opaque and manipulative (Fig. 1a). Recent surveys in the USA and Germany found that a majority of respondents consider such data-driven personalization of political content (61%), social media feeds (57%) and news diets (51%) unacceptable, whereas they are much more accepting of it when it pertains to commercial content35,36. To rebalance the relationship between algorithmic and human decision making and to allow for heterogeneous preferences across different domains, a two-step process is required. First, steps should be taken toward the design and implementation of more transparent algorithms. They should offer cues that clearly represent the data types and the weighting that led to a system’s suggestions as well as offer information about the target audience. Second, users should be able to adapt these factors to their personal preferences in order to regain autonomy.

Social media: network effects and social cues
More than two thirds of all internet users (around 3 billion people) actively use social media83. These platforms offer information about the behaviour of others (for example, likes and emoticons)84 and new opportunities for interaction (for example, follower relationships and comment sections).

[Fig. 2 panels: a, a social media post annotated with example cues such as ‘This article cites two sources’ (with links), ‘First posted 12/05/2018’, ‘Posted 874 times’, ‘Promoted elsewhere by 14 accounts’, ‘seen by 125,000 people’, ‘1,657 comments’, ‘Shared 125 times’ and ‘456 likes’; b, a news feed in which each post’s sort score is shown as a combination of recency, total likes, number of references cited and friends’ engagement, with an option to add more criteria.]

Fig. 2 | Nudging interventions that modify online environments. a, Examples of exogenous cues and how they could appear alongside a social media post. b, Example of a transparently organized news feed on social media. Types of content are clearly distinguished, sorting criteria and their values are shown with every post, and users can adjust weightings.

[Fig. 3 panels: a, a sharing cascade unfolding over several days, with example statistics (depth: 4; max. breadth: 5; size: 12); b, a decision tree asking ‘Is the source identifiable (e.g., via clear URL or the “about” page)?’, ‘Are hyperlinks to external references provided?’, ‘Do these links lead to relevant articles from reputable sources?’ and ‘Can you find other reports of the story in an independent internet search?’, with each answer leading to ‘reliable’, ‘unreliable’ or the next cue.]

Fig. 3 | Illustrations of boosting interventions as they could appear within an online environment or as external tools. a, Visualization of a sharing cascade. Alongside metrics, like the depth or the breadth of a cascade, a pop-up window on social media can provide a simple visualization of a sharing cascade that shows who (if the profile is public) and when others have shared content before it reached the user. b, A fast-and-frugal decision tree as an example of a boosting intervention. A pop-up or an external tool can show a fast-and-frugal decision tree alongside an online article that helps a reader check criteria to evaluate the article’s reliability, where the criteria were adapted from professional fact checkers and primarily point to checking external information90.

However, these signals and interactions are often one-dimensional, represent only a user’s immediate online neighbourhood and do not distinguish between different types of connections85. These limitations can have drastic effects, such as dramatically changing a user’s perception of group sizes86,87 and giving rise to false-consensus effects (i.e., the majority opinion in an individual’s neighbourhood leads people wrongly to believe it reflects the actual majority opinion; Fig. 1b). When people associate with like-minded others from a globally dispersed online community, their self-selected social surroundings (known as a homophilic social network) and the low visibility of the global state of the network88,89 can create the illusion of broad support90 and reinforce opinions or even make them more extreme91,92. For instance, even if only a tiny fraction (for example, one in a million) of the more than two billion Facebook users believe that the Earth is flat, they could still form an online community of thousands, thereby creating a shield of like-minded people against corrective efforts93–96.

Although large social media platforms routinely aggregate information that would foster a realistic assessment of societal attitudes, they currently do not provide a well-calibrated impression of the degree of public consensus97. Instead, they show reactions from others as asymmetrically positive—there typically is no ‘dislike’ button—or biased toward narrow groups or highly active users98 to maximize user engagement. This need not be the case. The interactive nature of social media could be harnessed to promote diverse democratic dialogue and foster collective intelligence. To achieve this goal, social media needs to offer more meaningful, higher-dimensional cues that carry information about the broader state of the network rather than just the user’s direct neighbourhood, which can mitigate biased perceptions caused by the network structure99. For instance, social media platforms could provide a transparent crowd-sourced voting system100 or display informative metrics about the behaviour and reactions of others (for example, including passive behaviour, like the total number of people who scrolled over a post), which might counter false-consensus effects. We note that some platforms have taken steps in the directions we suggest.
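The kind of global base-rate cue proposed here can be computed in a few lines. The sketch below contrasts the share of supporters a user sees among their immediate contacts with the share across the whole platform; the network, names and opinion counts are invented purely for illustration.

```python
# Sketch: perceived (neighbourhood) support versus actual global support.
# Toy data: opinions and friendship lists are invented for illustration.

opinions = {                      # True = holds the minority opinion
    "ana": True, "ben": True, "cleo": True,
    "dan": False, "eva": False, "finn": False, "gus": False, "hana": False,
}
friends_of = {
    "ana": ["ben", "cleo"],       # a homophilic neighbourhood: mostly like-minded
}

def support_share(people):
    return sum(opinions[p] for p in people) / len(people)

perceived = support_share(friends_of["ana"])
global_share = support_share(list(opinions))

print(f"ana's neighbourhood suggests {perceived:.0%} support")   # 100%
print(f"the platform-wide base rate is {global_share:.0%}")      # 38%
```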
Nudging interventions to shape online environments
Nudging interventions can alter choice architectures to promote the epistemic quality of information and its spread. One type of nudge, educative nudging, integrates epistemic cues into the choice environment primarily to inform behaviour (as opposed to actively steering it). For instance, highlighting when content stems from few or anonymous sources (as used by Wikipedia) can remind people to scrutinize content more thoroughly101,102 and simultaneously create an incentive structure for content producers to meet the required criteria. Such outlets can be made more transparent, for example by disclosing the identity of their confirmed owners. Similarly, pages that are run by state-controlled media might be labelled as such103. Going a step further, adding prominent hyperlinks to vetted reference sources for important concepts in a text could encourage a reader to gain context by perusing multiple sources—a strategy used by professional fact checkers104.

Nudges can also communicate additional information about what others are doing, thereby invoking the steering power of descriptive social norms105: For instance, contextualizing the number of likes by expressing them against the absolute frequency of total readers (for example, ‘4,287 of 1.5 million readers liked this article’) might counteract false-consensus effects that a number presented without context (‘4,287 people liked this article’) may otherwise engender. Transparent numerical formats have already been shown to improve statistical literacy in the medical domain106. Similarly, displaying the total number of readers and their average reading time in relation to the potential total readership could help users evaluate the content’s epistemic quality: if only a tiny portion of the potential readership has actually read an article, whereas the majority spent just a few seconds on it, it might be clickbait. The presentation of many other cues, including ones that reach into the history of a piece of content, could be used to promote epistemic value on social media. Figure 2a shows a nudging intervention that integrates several exogenous cues into a social media news feed. Similarly, users could be discouraged from sharing low-quality information without resorting to censorship by introducing ‘sludge’ or ‘friction’—for instance, by making the act of sharing slightly more effortful107. In this case, sharing low-quality content may require a further mouse click in a pop-up warning message, alongside additional information about which of the above cues are missing or have critical values.
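The arithmetic behind such a contextualized like count is simple; the short sketch below performs the conversion using the reader numbers from the example in the text.

```python
# Sketch: presenting likes against the total readership instead of in isolation.
likes = 4_287
total_readers = 1_500_000

share = likes / total_readers
print(f"{likes:,} people liked this article")                                  # without context
print(f"{likes:,} of {total_readers:,} readers ({share:.1%}) liked this article")
# -> 4,287 of 1,500,000 readers (0.3%) liked this article
```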
Another type of nudge targets how content is arranged in browsers. The way a social media news feed sorts content is crucial in shaping how much attention is devoted to particular posts. Indeed, news feeds have become one of the most sophisticated algorithmically driven choice architectures of online platforms7,108. Transparent sorting algorithms for news feeds (such as the algorithm used by Reddit) that show the factors that determine how posts are sorted can help people understand why they see certain content; at the very least this nudging intervention would make the design of the feed’s architecture more transparent. Relatedly, platforms that clearly differentiate between types of content (for example, ads, news, or posts by friends) can make news feeds more transparent and clearer (Fig. 2b).
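One way to read the transparent feed in Fig. 2b is as a weighted sum of named criteria whose breakdown is displayed with each post and whose weights the user can adjust. The sketch below implements that reading; the criteria values and weights are illustrative assumptions, not any platform’s actual ranking formula.

```python
# Sketch of a transparent news-feed sort: every post's score is a weighted sum
# of named criteria, and the breakdown can be shown next to the post (cf. Fig. 2b).
# Criteria values and weights are illustrative, not a real platform's formula.

posts = [
    {"id": "celebrity-news", "recency": 8, "likes": 2, "references": 7, "friend_engagement": 2},
    {"id": "friend-update",  "recency": 6, "likes": 2, "references": 0, "friend_engagement": 9},
    {"id": "breaking-news",  "recency": 7, "likes": 0, "references": 4, "friend_engagement": 0},
]

weights = {"recency": 1.0, "likes": 1.0, "references": 1.0, "friend_engagement": 1.0}  # user-adjustable

def score(post):
    return sum(weights[c] * post[c] for c in weights)

for post in sorted(posts, key=score, reverse=True):
    breakdown = " + ".join(f"{weights[c]}*{post[c]} ({c})" for c in weights)
    print(f"{post['id']}: score {score(post):.1f} = {breakdown}")
```

Because the criteria are named and their weights are exposed, the same code path that ranks the feed can also generate the explanation shown to the user, which is what makes the sorting legible rather than opaque.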
Boosting interventions to foster user competences
Boosting seeks to empower people in the longer term by helping them build the competences they need to navigate situations autonomously (for a conceptual map of boosting interventions online, see also ref. 109). These interventions can be integrated directly into the environment itself or be available in an app or browser add-on. Unlike some nudging interventions, boosting interventions will ideally remain effective even when they are no longer present in the environment, because they have become routinized and have instilled a lasting competence in the user.

The competence of acting as one’s own choice architect, or self-nudging, can be boosted110. For instance, when users can customize how their news feed is designed and sorted (Fig. 2b), they can become their own choice architects and regain some informational autonomy. For instance, users could be enabled or encouraged to design information ecologies for themselves that are tailored toward high epistemic quality, making sources of low epistemic quality less accessible. Such boosting interventions would require changes to the online environment (for example, transparent sorting algorithms or clear layouts; see previous section and Fig. 2b) and the provision of epistemic cues.

Another competence that could be boosted to help users deal more expertly with information they encounter online is the ability to make inferences about the reliability of information based on the social context from which it originates111. The structure and details of the entire cascade of individuals who have previously shared an article on social media have been shown to serve as proxies for epistemic quality112. More specifically, the sharing cascade contains metrics such as the depth and breadth of dissemination by others, with deep and narrow cascades indicating extreme or niche topics and breadth indicating widely discussed issues113. A boosting intervention could provide this information (Fig. 3a) to display the full history of a post, including the original source, the friends and public users who disseminated it, and the timing of the process (showing, for example, if the information is old news that has been repeatedly and artificially amplified). Cascade statistics involve concepts that may take some practice to read and interpret, and one may need to experience a number of cascades to learn to recognize informative patterns.
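The cascade metrics referred to here and in Fig. 3a can be derived directly from the repost tree. The sketch below computes depth, maximum breadth and size for a small invented cascade; the tree structure is hypothetical and serves only to show how the cues could be calculated.

```python
# Sketch: computing sharing-cascade metrics (cf. Fig. 3a) from a repost tree.
# depth = longest chain of reshares, max_breadth = widest reshare level, size = all posts.
from collections import Counter, deque

cascade = {                      # parent post -> posts that reshared it (toy data)
    "original": ["a", "b", "c"],
    "a": ["d", "e"],
    "d": ["f"],
}

def cascade_metrics(tree, root="original"):
    levels = {root: 0}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for child in tree.get(node, []):
            levels[child] = levels[node] + 1
            queue.append(child)
    counts = Counter(levels.values())
    depth = max(levels.values())                                    # longest reshare chain
    breadth = max((counts[d] for d in counts if d > 0), default=0)  # widest reshare level
    return {"depth": depth, "max_breadth": breadth, "size": len(levels)}

print(cascade_metrics(cascade))   # {'depth': 3, 'max_breadth': 3, 'size': 7}
```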
Yet another competence required for distinguishing between sources of high and low quality is the ability to read laterally104. Lateral reading is a skill developed by professional fact checkers that entails looking for information on sites other than the information source in order to evaluate its credibility (for example, ‘who is behind this website?’ and ‘what is the evidence for its claims?’) rather than evaluating a website’s credibility by using the information provided there. This competence can be boosted with simple decision aids such as fast-and-frugal decision trees114,115. Employed in a wide range of areas (for example, medicine, finance, law, management), fast-and-frugal decision trees can guide the user to scrutinize relevant cues. For example, users can respond to prompts in a pop-up window (for example, ‘are references provided?’), with each answer leading either to an immediate decision (for example, ‘unreliable’) or to the next cue until a final judgment about content reliability is reached (for example, ‘reliable’; Fig. 3b)116.
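A fast-and-frugal decision tree of the kind shown in Fig. 3b can be written as a short chain of cue checks, each of which either exits with a verdict or passes control to the next cue. The sketch below encodes a simplified, strictly sequential variant of the lateral-reading cues from Fig. 3b; the cue order, wording and example answers are illustrative and would in practice come from the reader or from a browser tool.

```python
# Sketch of a fast-and-frugal decision tree (cf. Fig. 3b): each cue either
# triggers an immediate exit ('unreliable') or defers to the next cue.
# Answers would be supplied by the reader or a browser add-on, not by this script.

def assess_article(answers):
    """answers: dict of True/False responses to the four lateral-reading cues."""
    if not answers["source_identifiable"]:
        return "unreliable"                 # exit at the first cue
    if not answers["external_references_provided"]:
        return "unreliable"
    if not answers["references_lead_to_reputable_sources"]:
        return "unreliable"
    if not answers["independent_reports_found"]:
        return "unreliable"
    return "reliable"                       # all cues passed

example = {
    "source_identifiable": True,
    "external_references_provided": True,
    "references_lead_to_reputable_sources": False,   # hypothetical reader judgement
    "independent_reports_found": True,
}
print(assess_article(example))   # -> unreliable
```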

Decision trees can also enhance the transparency of third-party decisions. If reliability is judged by third-party fact checkers or via an automated process, users could opt to see the decision tree and follow the path that led to the decision, thereby gaining insight that will be useful in the long term. Eventually, fast-and-frugal decision trees may help people establish a habit of checking epistemic cues when reading content even in the absence of a pop-up window suggesting they do so47.

Finally, the competence of understanding what makes intentionally false information so alluring (for example, novelty and the element of surprise) can be boosted by mental inoculation techniques. Being informed about manipulative methods before encountering them online enables an individual to detect parasitic imitations of trustworthy sources and other sinister tactics117,118. Making people aware of such strategies or of their own personal vulnerabilities leaves them better able to identify and resist manipulation. For instance, having people take on the role of a malicious influencer in a computer game has been demonstrated to improve their ability to spot and resist misinformation61,119. This inoculation technique can be used in a range of contexts online; for example, learning about the target group of an advertisement can increase people’s ability to detect advertising strategies.

Conclusion
Any attempt to regulate or manage the digital world must begin with the understanding that online communication is already regulated, to some extent by public policy and laws but primarily by search engines and recommender systems whose goals and parameters may not be publicly known, let alone subject to public scrutiny. The current online environment has given rise to opaque and asymmetric relationships between users and platforms, and it is reasonable to question whether the industry will take sufficient action on its own to foster an ecosystem that values and promotes truth. The interventions we propose are aimed primarily at empowering individuals to make informed and autonomous decisions in the online ecosystem and, through their own behaviour, to foster and reinforce truth. The interventions are partly conceptualized on the basis of existing empirical findings. However, not all interventions have been tested in the specific context in which they may be deployed. It follows that some of the interventions that we have recommended, and others designed to promote the same goals, should be subject to further empirical testing. Current results identify some interventions as effective37,119 while also indicating that others are less promising120. Both sets of results will inform the design of more effective interventions.

In our view, the future task for scientists is to design interventions that meet at least three selection criteria. They must be transparent and trustworthy to the public; standardisable within certain categories of content; and, importantly, hard to game by bad-faith actors or those with vested interests contrary to those of users or society as a whole. We also emphasize the importance of examining a wide spectrum of interventions, from nudges to boosts, to reach different types of people, who have heterogeneous preferences, motivations and online behaviours. These interventions will not completely prevent manipulation or active dissemination of false information, but they will help users recognise when malicious tactics are at work. They will also permit producers of quality information to differentiate themselves from less trustworthy sources. Behavioural interventions in the online ecology can not only inform government regulations, but also signal a platform’s commitment to truth, epistemic quality and trustworthiness. Platforms can indicate their commitment to these values by providing their users with exogenous cues and boosting and nudging interventions, and users can choose to avoid platforms that do not offer them these features.

For this dynamic to gain momentum, it is not necessary that all or even the majority of users engage with nudging or boosting interventions. As the first Wikipedia contributors have proven, a critical mass may suffice to allow positive effects to scale up to major improvements. Such a dynamic may counteract a possible drawback of the proposed interventions; namely, widening information gaps between users if only empowered consumers are able to recognise quality information. If a critical mass is created, nudging and boosting interventions might well help to mitigate gaps currently arising from disparities in education or in the ability to pay for quality content. In light of the high stakes—for health, safety and self-governance itself—we err on the side of adopting interventions that empower as many people as possible.

Received: 10 July 2019; Accepted: 23 April 2020; Published online: 15 June 2020

References
1. Simon, H. A. Designing organizations for an information-rich world. in Computers, Communications and the Public Interest (ed. Greenberger, M.) 37–72 (1971).
2. Newman, N., Fletcher, R., Kalogeropoulos, A. & Nielsen, R. Reuters Institute Digital News Report 2019 https://ora.ox.ac.uk/objects/uuid:18c8f2eb-f616-481a-9dff-2a479b2801d0 (Reuters Institute for the Study of Journalism, 2019).
3. Kosinski, M., Stillwell, D. & Graepel, T. Private traits and attributes are predictable from digital records of human behavior. Proc. Natl Acad. Sci. USA 110, 5802–5805 (2013).
4. Boerman, S. C., Kruikemeier, S. & Zuiderveen Borgesius, F. J. Online behavioral advertising: a literature review and research agenda. J. Advert. 46, 363–376 (2017).
5. Ruths, D. & Pfeffer, J. Social media for large studies of behavior. Science 346, 1063–1064 (2014).
6. Tufekci, Z. Engineering the public: big data, surveillance and computational politics. First Monday https://doi.org/10.5210/fm.v19i7.4901 (2014).
7. Harris, T. How technology is hijacking your mind—from a magician and Google design ethicist. Thrive Global https://thriveglobal.com/stories/how-technology-is-hijacking-your-mind-from-a-magician-and-google-design-ethicist/ (18 May 2016).
8. Persily, N. The 2016 US election: can democracy survive the internet? J. Democracy 28, 63–76 (2017).
9. Habermas, J. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (MIT Press, 1991).
10. Vosoughi, S., Roy, D. & Aral, S. The spread of true and false news online. Science 359, 1146–1151 (2018).
11. Mocanu, D., Rossi, L., Zhang, Q., Karsai, M. & Quattrociocchi, W. Collective attention in the age of (mis)information. Comput. Human Behav. 51, 1198–1204 (2015).
12. Rich, M. D. Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life (RAND Corporation, 2018).
13. Vargo, C. J., Guo, L. & Amazeen, M. A. The agenda-setting power of fake news: a big data analysis of the online media landscape from 2014 to 2016. New Media Soc. 20, 2028–2049 (2018).
14. Lazer, D. M. J. et al. The science of fake news. Science 359, 1094–1096 (2018).
15. Baldassarri, D. & Gelman, A. Partisans without constraint: political polarization and trends in American public opinion. Am. J. Sociol. 114, 408–446 (2008).
16. Abramowitz, A. I. & Saunders, K. L. Is polarization a myth? J. Polit. 70, 542–555 (2008).
17. McCarty, N., Poole, K. T. & Rosenthal, H. Polarized America: The Dance of Ideology and Unequal Riches (MIT Press, 2006).
18. Fiorina, M. P. & Abrams, S. J. Political polarization in the American public. Annu. Rev. Polit. Sci. 11, 563–588 (2008).
19. McCright, A. M. & Dunlap, R. E. The politicization of climate change and polarization in the American public’s views of global warming, 2001–2010. Sociol. Q. 52, 155–194 (2011).
20. Cota, W. et al. Quantifying echo chamber effects in information spreading over political communication networks. EPJ Data Sci. 8, 35 (2019).
21. DiMaggio, P., Evans, J. & Bryson, B. Have American’s social attitudes become more polarized? Am. J. Sociol. 102, 690–755 (1996).
22. Fletcher, R., Cornia, A., Graves, L. & Nielsen, R. K. Measuring the reach of “fake news” and online disinformation in Europe. Reuters Institute Digital News Publication http://www.digitalnewsreport.org/publications/2018/measuring-reach-fake-news-online-disinformation-europe/ (2018).

23. Cinelli, M., Cresci, S., Galeazzi, A., Quattrociocchi, W. & Tesconi, M. The limited reach of fake news on Twitter during 2019 European elections. Preprint at arXiv https://arxiv.org/abs/1911.12039 (2020).
24. Guess, A. M., Nyhan, B. & Reifler, J. Exposure to untrustworthy websites in the 2016 US election. Nat. Hum. Behav. https://doi.org/10.1038/s41562-020-0833-x (2020).
25. Barberá, P., Jost, J. T., Nagler, J., Tucker, J. A. & Bonneau, R. Tweeting from left to right: is online political communication more than an echo chamber? Psychol. Sci. 26, 1531–1542 (2015).
26. Evans, J. H. Have Americans’ attitudes become more polarized?—An update. Soc. Sci. Q. 84, 71–90 (2003).
27. Lelkes, Y. Mass polarization: manifestations and measurements. Public Opin. Q. 80, 392–410 (2016).
28. Del Vicario, M. et al. The spreading of misinformation online. Proc. Natl Acad. Sci. USA 113, 554–559 (2016).
29. Watts, D. J. Should social science be more solution-oriented? Nat. Hum. Behav. 1, 15 (2017).
30. Larson, H. J. The biggest pandemic risk? Viral misinformation. Nature 562, 309–310 (2018).
31. Sundar, S. The MAIN model: a heuristic approach to understanding technology effects on credibility. in Digital Media, Youth, and Credibility (eds Metzger, M. J. & Flanagin, A. J.) 73–100 (MIT Press, 2007).
32. Gigerenzer, G., Hertwig, R. & Pachur, T. Heuristics: The Foundations of Adaptive Behavior (Oxford University Press, 2011).
33. de Freitas Melo, P., Vieira, C. C., Garimella, K., de Melo, P. O. V. & Benevenuto, F. Can WhatsApp counter misinformation by limiting message forwarding? in International Conference on Complex Networks and Their Applications 372–384 (2019).
34. Baron-Cohen, S. Keynote address at ADL’s 2019 Never Is Now Summit on anti-Semitism and hate. Anti-Defamation League https://www.adl.org/news/article/sacha-baron-cohens-keynote-address-at-adls-2019-never-is-now-summit-on-anti-semitism (accessed 7 December 2019).
35. Kozyreva, A., Herzog, S., Lorenz-Spreen, P., Hertwig, R. & Lewandowsky, S. Artificial Intelligence in Online Environments: Representative Survey of Public Attitudes in Germany (Max Planck Institute for Human Development, 2020).
36. Smith, A. Public Attitudes Toward Computer Algorithms (Pew Research Center, 2018).
37. Pennycook, G. et al. Understanding and reducing the spread of misinformation online. Preprint at PsyArXiv https://psyarxiv.com/3n9u8/ (2019).
38. Zuboff, S. Surveillance capitalism and the challenge of collective action. New Labor Forum 28, 10–29 (2019).
39. Klein, D. & Wueller, J. Fake news: a legal perspective. J. Internet Law https://ssrn.com/abstract=2958790 (2017).
40. Assemblée Nationale. Proposition de loi relative à la lutte contre la manipulation de l’information, No. 799 [Proposed Bill on the Fight Against the Manipulation of Information, No. 799] http://www.assemblee-nationale.fr/15/ta/tap0190.pdf (accessed 26 June 2019).
41. van Ooijen, I. & Vrabec, H. U. Does the GDPR enhance consumers’ control over personal data? An analysis from a behavioural perspective. J. Consum. Policy 42, 91–107 (2019).
42. Nouwens, M., Liccardi, I., Veale, M., Karger, D. & Kagal, L. Dark patterns after the GDPR: scraping consent pop-ups and demonstrating their influence. in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems 1–13 https://doi.org/10.1145/3313831.3376321 (2020).
43. Hertwig, R. When to consider boosting: some rules for policy-makers. Behav. Public Policy 1, 143–161 (2017).
44. Epstein, Z., Pennycook, G. & Rand, D. Will the crowd game the algorithm? Using layperson judgments to combat misinformation on social media by downranking distrusted sources. in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems 1–11 https://doi.org/10.1145/3313831.3376232 (2020).
45. Britt, M. A., Rouet, J. F., Blaum, D. & Millis, K. A reasoned approach to dealing with fake news. Policy Insights Behav. Brain Sci. 6, 94–101 (2019).
46. Thaler, R. H. & Sunstein, C. R. Nudge: Improving Decisions about Health, Wealth, and Happiness (Yale University Press, 2008).
47. Hertwig, R. & Grüne-Yanoff, T. Nudging and boosting: steering or empowering good decisions. Perspect. Psychol. Sci. 12, 973–986 (2017).
48. Griffiths, K. M. & Christensen, H. Website quality indicators for consumers. J. Med. Internet Res. 7, e55 (2005).
49. Nickel, M., Murphy, K., Tresp, V. & Gabrilovich, E. A review of relational machine learning for knowledge graphs. Proc. IEEE 104, 11–33 (2015).
50. Dong, X. et al. Knowledge Vault: a web-scale approach to probabilistic knowledge fusion. in Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 601–610 (2014).
51. Shu, K., Sliva, A., Wang, S., Tang, J. & Liu, H. Fake news detection on social media: a data mining perspective. SIGKDD Explor. 19, 22–36 (2017).
52. Klašnja, M., Barberá, P., Beauchamp, N., Nagler, J. & Tucker, J. Measuring public opinion with social media data. in The Oxford Handbook of Polling and Survey Methods (eds Atkeson, L. R. & Alvarez, R. M.) https://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780190213299.001.0001/oxfordhb-9780190213299-e-3 (2017).
53. Dong, X. L. et al. Knowledge-based trust: estimating the trustworthiness of web sources. Proceedings VLDB Endowment 8, 938–949 (2015).
54. Hull, J. Google Hummingbird: where no search has gone before. Wired https://www.wired.com/insights/2013/10/google-hummingbird-where-no-search-has-gone-before/ (accessed 9 July 2019).
55. Luo, H., Liu, Z., Luan, H. & Sun, M. Online learning of interpretable word embeddings. in Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing 1687–1692 (2015).
56. Schmidt, A. & Wiegand, M. A survey on hate speech detection using natural language processing. in Proceedings of the Fifth International Workshop on Natural Language Processing for Social Media 1–10 (2017).
57. Schmitt, J. B., Rieger, D., Rutkowski, O. & Ernst, J. Counter-messages as prevention or promotion of extremism?! The potential role of YouTube recommendation algorithms. J. Commun. 68, 780–808 (2018).
58. Arno, A. & Thomas, S. The efficacy of nudge theory strategies in influencing adult dietary behaviour: a systematic review and meta-analysis. BMC Public Health 16, 676 (2016).
59. Kurvers, R. H. et al. Boosting medical diagnostics by pooling independent judgments. Proc. Natl Acad. Sci. USA 113, 8777–8782 (2016).
60. Lusardi, A. & Mitchell, O. S. The economic importance of financial literacy: theory and evidence. J. Econ. Lit. 52, 5–44 (2014).
61. Roozenbeek, J. & van der Linden, S. Fake news game confers psychological resistance against online misinformation. Palgrave Commun. 5, 65 (2019).
62. Pennycook, G. & Rand, D. G. Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188, 39–50 (2019).
63. Hilbert, M. & López, P. The world’s technological capacity to store, communicate, and compute information. Science 332, 60–65 (2011).
64. Rosa, H. Social Acceleration: A New Theory of Modernity (Columbia University Press, 2013).
65. Lorenz-Spreen, P., Mønsted, B. M., Hövel, P. & Lehmann, S. Accelerating dynamics of collective attention. Nat. Commun. 10, 1759 (2019).
66. Wu, F. & Huberman, B. A. Novelty and collective attention. Proc. Natl Acad. Sci. USA 104, 17599–17601 (2007).
67. Hills, T. T., Noguchi, T. & Gibbert, M. Information overload or search-amplified risk? Set size and order effects on decisions from experience. Psychon. Bull. Rev. 20, 1023–1031 (2013).
68. Hills, T. T. The dark side of information proliferation. Perspect. Psychol. Sci. 14, 323–330 (2019).
69. American Society of News Editors (ASNE). ASNE statement of principles. ASNE.org https://www.asne.org/content.asp?pl=24&sl=171&contentid=171 (accessed 27 May 2019).
70. Epstein, R. & Robertson, R. E. The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. Proc. Natl Acad. Sci. USA 112, E4512–E4521 (2015).
71. Lazer, D. The rise of the social algorithm. Science 348, 1090–1091 (2015).
72. Resnick, P. & Varian, H. R. Recommender systems. Commun. ACM 40, 56–58 (1997).
73. Bakshy, E., Messing, S. & Adamic, L. A. Exposure to ideologically diverse news and opinion on Facebook. Science 348, 1130–1132 (2015).
74. Martens, B., Aguiar, L., Gomez-Herrera, E. & Mueller-Langer, F. The digital transformation of news media and the rise of disinformation and fake news. Digital Economy Working Paper 2018-02, Joint Research Centre Technical Reports https://ssrn.com/abstract=3164170 (2018).
75. Cosley, D., Lam, S. K., Albert, I., Konstan, J. A. & Riedl, J. Is seeing believing? How recommender system interfaces affect users’ opinions. in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 585–592 (2003).
76. Pan, B. et al. In Google we trust: users’ decisions on rank, position, and relevance. J. Comput. Mediat. Commun. 12, 801–823 (2007).
77. Bozdag, E. Bias in algorithmic filtering and personalization. Ethics Inf. Technol. 15, 209–227 (2013).
78. Sunstein, C. R. Republic.com (Princeton University Press, 2002).
79. Chakraborty, A., Ghosh, S., Ganguly, N. & Gummadi, K. P. Optimizing the recency-relevancy trade-off in online news recommendations. in Proceedings of the 26th International Conference on World Wide Web 837–846 (2017).
80. Zuboff, S. Big other: surveillance capitalism and the prospects of an information civilization. J. Inf. Technol. 30, 75–89 (2015).
81. Matz, S. C., Kosinski, M., Nave, G. & Stillwell, D. J. Psychological targeting as an effective approach to digital mass persuasion. Proc. Natl Acad. Sci. USA 114, 12714–12719 (2017).

82. Youyou, W., Kosinski, M. & Stillwell, D. Computer-based personality judgments are more accurate than those made by humans. Proc. Natl Acad. Sci. USA 112, 1036–1040 (2015).
83. Ortiz-Ospina, E. The rise of social media. Our World in Data https://ourworldindata.org/rise-of-social-media (accessed 5 December 2019).
84. Porten-Cheé, P. & Eilders, C. The effects of likes on public opinion perception and personal opinion. Communications https://doi.org/10.1515/commun-2019-2030 (2019).
85. Dandekar, P., Goel, A. & Lee, D. T. Biased assimilation, homophily, and the dynamics of polarization. Proc. Natl Acad. Sci. USA 110, 5791–5796 (2013).
86. Lee, E. et al. Homophily and minority-group size explain perception biases in social networks. Nat. Hum. Behav. 3, 1078–1087 (2019).
87. Stewart, A. J. et al. Information gerrymandering and undemocratic decisions. Nature 573, 117–121 (2019).
88. Ross, L., Greene, D. & House, P. The “false consensus effect”: an egocentric bias in social perception and attribution processes. J. Exp. Soc. Psychol. 13, 279–301 (1977).
89. Colleoni, E., Rozza, A. & Arvidsson, A. Echo chamber or public sphere? Predicting political orientation and measuring political homophily in Twitter using big data. J. Commun. 64, 317–332 (2014).
90. Leviston, Z., Walker, I. & Morwinski, S. Your opinion on climate change might not be as common as you think. Nat. Clim. Chang. 3, 334–337 (2013).
91. Baumann, F., Lorenz-Spreen, P., Sokolov, I. & Starnini, M. Modeling echo chambers and polarization dynamics in social networks. Phys. Rev. Lett. (in the press).
92. Sunstein, C. R. The law of group polarization. J. Polit. Philos. 10, 175–195 (2002).
93. Sunstein, C. R. Conspiracy Theories and Other Dangerous Ideas (Simon and Schuster, 2014).
94. Van der Linden, S. The conspiracy-effect: exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Pers. Individ. Dif. 87, 171–173 (2015).
95. Lewandowsky, S., Oberauer, K. & Gignac, G. E. NASA faked the moon landing—therefore, (climate) science is a hoax: an anatomy of the motivated rejection of science. Psychol. Sci. 24, 622–633 (2013).
96. Scheufele, D. A. & Krause, N. M. Science audiences, misinformation, and fake news. Proc. Natl Acad. Sci. USA 116, 7662–7669 (2019).
97. Lewandowsky, S., Cook, J., Fay, N. & Gignac, G. E. Science by social media: attitudes towards climate change are mediated by perceived social consensus. Mem. Cognit. 47, 1445–1456 (2019).
98. Muchnik, L., Aral, S. & Taylor, S. J. Social influence bias: a randomized experiment. Science 341, 647–651 (2013).
99. Alipourfard, N., Nettasinghe, B., Abeliuk, A., Krishnamurthy, V. & Lerman, K. Friendship paradox biases perceptions in directed networks. Nat. Commun. 11, 707 (2020).
100. Pennycook, G. & Rand, D. G. Fighting misinformation on social media using crowdsourced judgments of news source quality. Proc. Natl Acad. Sci. USA 116, 2521–2526 (2019).
101. Ecker, U. K., Lewandowsky, S. & Tang, D. T. Explicit warnings reduce but do not eliminate the continued influence of misinformation. Mem. Cognit. 38, 1087–1100 (2010).
102. Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N. & Cook, J. Misinformation and its correction: continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
103. Rosen, G., Harbath, K., Gleicher, N. & Leathern, R. Helping to protect the 2020 US elections. Facebook https://about.fb.com/news/2019/10/update-on-election-integrity-efforts/ (accessed 22 January 2020).
104. Wineburg, S. & McGrew, S. Lateral reading: reading less and learning more when evaluating digital information. Working Paper No. 2017.A1, Stanford History Education Group https://ssrn.com/abstract=3048994 (2017).
105. Schultz, P. W., Nolan, J. M., Cialdini, R. B., Goldstein, N. J. & Griskevicius, V. The constructive, destructive, and reconstructive power of social norms. Psychol. Sci. 18, 429–434 (2007).
106. Hoffrage, U., Lindsey, S., Hertwig, R. & Gigerenzer, G. Communicating statistical information. Science 290, 2261–2262 (2000).
107. Tucker, J. A., Theocharis, Y., Roberts, M. E. & Barberá, P. From liberation to turmoil: social media and democracy. J. Democracy 28, 46–59 (2017).
108. Facebook for Business. Capturing attention in feed: the science behind effective video creative. https://www.facebook.com/business/news/insights/capturing-attention-feed-video-creative (accessed 8 December 2019).
109. Kozyreva, A., Lewandowsky, S. & Hertwig, R. Citizens versus the internet: confronting digital challenges with cognitive tools. Preprint at PsyArXiv https://psyarxiv.com/ky4x8/ (2019).
110. Reijula, S. & Hertwig, R. Self-nudging and the citizen choice architect. Behav. Publ. Policy https://doi.org/10.1017/bpp.2020.5 (2020).
111. Noriega-Campero, A. et al. Adaptive social networks promote the wisdom of crowds. Proc. Natl Acad. Sci. USA 117, 11379–11386 (2020).
112. Vosoughi, S. Automatic detection and verification of rumors on Twitter. Doctoral dissertation, Massachusetts Institute of Technology (2015).
113. Zhou, X. & Zafarani, R. Fake news: a survey of research, detection methods, and opportunities. Preprint at arXiv https://arxiv.org/abs/1812.00315 (2018).
114. Martignon, L., Katsikopoulos, K. V. & Woike, J. K. Categorization with limited resources: a family of simple heuristics. J. Math. Psychol. 52, 352–361 (2008).
115. Phillips, N. D., Neth, H., Woike, J. K. & Gaissmaier, W. FFTrees: a toolbox to create, visualize, and evaluate fast-and-frugal decision trees. Judgm. Decis. Mak. 12, 344–368 (2017).
116. Banerjee, S., Chua, A. Y. & Kim, J. J. Don’t be deceived: using linguistic analysis to learn how to discern online review authenticity. J. Assoc. Inf. Sci. Technol. 68, 1525–1538 (2017).
117. Cook, J., Lewandowsky, S. & Ecker, U. K. H. Neutralizing misinformation through inoculation: exposing misleading argumentation techniques reduces their influence. PLoS ONE 12, e0175799 (2017).
118. Roozenbeek, J. & van der Linden, S. The fake news game: actively inoculating against the risk of misinformation. J. Risk Res. 22, 570–580 (2018).
119. Basol, M., Roozenbeek, J. & van der Linden, S. Good news about bad news: gamified inoculation boosts confidence and cognitive immunity against fake news. J. Cognition 3, 2 (2020).
120. Dias, N., Pennycook, G. & Rand, D. G. Emphasizing publishers does not effectively reduce susceptibility to misinformation on social media. Harvard Kennedy School Misinformation Review https://doi.org/10.37016/mr-2020-001 (2020).

Acknowledgements
We thank A. Kozyreva and S. Herzog for their helpful comments and D. Ain for editing the manuscript. R.H. and S.L. acknowledge support from the Volkswagen Foundation. The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author contributions
P.L.S., S.L. and R.H. conceptualized the project; P.L.S., S.L., C.R.S. and R.H. wrote the manuscript.

Competing interests
C.R.S. has served as a paid consultant on a few occasions for Facebook.

Additional information
Correspondence should be addressed to P.L.-S.
Peer review information Primary handling editors: Mary Elizabeth Sutherland and Stavroula Kousta.
Reprints and permissions information is available at www.nature.com/reprints.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© Springer Nature Limited 2020
