
Office of the New York State

Attorney General Letitia James

Investigative Report
on the role of online platforms
in the tragic mass shooting
in Buffalo on May 14, 2022

OCTOBER 18, 2022


Table of Contents

I. Executive Summary................................................................................................................................................................... 1

II. Background.............................................................................................................................................................................. 6
A. The OAG’s Investigation into Online Platforms...................................................................................................................... 6
1. 4chan, 8kun, and their Derivatives................................................................................................................................... 6
2. Discord................................................................................................................................................................................ 8
3. Reddit.................................................................................................................................................................................. 8
4. Twitch.................................................................................................................................................................................. 9
B. The Shooting in Buffalo............................................................................................................................................................ 9
C. The Shooting’s Impact on the Community............................................................................................................................ 11
1. Personal Stories from Victims’ Families............................................................................................................................ 11
2. Effect on the Community.................................................................................................................................................. 12
3. Residents’ Views on the Responsibility of Social Media Companies........................................................................... 14
D. The Buffalo Shooter’s Ideology: “White Genocide,” the “Great Replacement” Theory, and Accelerationism.............. 15
E. The Christchurch Shooter and the Model of Virality............................................................................................................ 17
F. The Influence of Prior Extremist Writings on the Buffalo Shooter’s Writings....................................................................... 19
G. The Buffalo Shooter’s Content Doubles as an Inspirational Guide
and Instructional Manual for the Next Mass Shooter......................................................................................................... 20

III. The Role of Online Platforms in the Buffalo Shooting...................................................................................................... 23


A. Online Memes Helped the Buffalo Shooter Learn about the Great Replacement Theory
and Express his Views in his Manifesto................................................................................................................................. 23
B. Online Platforms Cited by Buffalo Shooter as Formative to Ideology of Hate................................................................. 24
1. 4chan “Politically Incorrect”............................................................................................................................................. 24
2. Reddit.................................................................................................................................................................................25
C. Online Platform Used by the Shooter to Plan the Details of His Attack.............................................................................27
D. Online Platforms Used by the Shooter to Equip His Arsenal.............................................................................................. 28
1. YouTube............................................................................................................................................................................. 28
2. 4chan /k/ Board............................................................................................................................................................... 29
3. Discord.............................................................................................................................................................................. 29
4. Reddit................................................................................................................................................................................ 30
E. Online Platform Used by the Shooter to Broadcast His Violence....................................................................................... 31

IV. Online Platforms’ Response to the Buffalo Shooting....................................................................................................... 34


A. The Spread of Graphic Content Related to the Buffalo Shooting..................................................................................... 34
B. The Global Internet Forum to Counter Terrorism’s Response to the Tragedy in Buffalo................................................. 34
C. Mainstream Platforms’ Moderation of Graphic Content Related to the Shooting......................................................... 35
D. Improved Moderation Policies Reduced Prevalence of Violent Content, but Additional Improvement is Needed......37
E. Online Platforms’ Advertisements and Monetization of Graphic Content Related to the Shooting............................. 39

V. Recommendations.................................................................................................................................................................41
A. Legislative Proposals............................................................................................................................................................... 41
1. New Affirmative Liability................................................................................................... 41
2. Reforming CDA Section 230............................................................................................................................................ 43
3. Restrictions on Livestreaming......................................................................................................................................... 45
B. Recommendations for Industry............................................................................................................................................. 45
1. Online Platforms Should Strengthen and Share Their Moderation Technologies..................................................... 45
2. Online Platforms Should Increase Transparency......................................................................................................... 46
3. Internet Service and Infrastructure Providers Can and Should Do More................................................................... 47

I. Executive Summary

The mass shooting in and around the Tops grocery store in Buffalo, New York on May 14, 2022 that claimed
the lives of ten individuals and injured three others was all the more horrific because of the white supremacist
ideological motivation that fueled it and the shooter’s meticulous planning. The disturbing reality is that
this attack is part of an epidemic of mass shootings often perpetrated by young men radicalized online by
an ideology of hate. This report details what my office has learned about how the Buffalo shooter was first
indoctrinated and radicalized through online platforms, and how he used these and other platforms to plan,
implement, and promote these acts of terror.1 The report assesses the strengths and weaknesses of the response
of various online platforms in the wake of the Buffalo shooting. Readers should be cautioned that this report
contains graphic textual descriptions of bigotry and violence, including quotes from the shooter’s own writing
that, in our opinion, are necessary to contextualize and explain this story.

No discussion of the Buffalo shooting can ignore the pressing need for stronger gun laws, as I have called for
several times. New York has been at the forefront of enacting state-wide laws to protect its residents from
gun violence,2 but the national problem of gun violence demands national solutions and federal partnership.
This should include repeal of existing federal laws that protect firearms manufacturers from liability, reduce
transparency into gun sales, and inhibit effective law enforcement;3 implementation of universal federal
background checks and state-wide background checks for ammunition purchases; institution of a federal
assault weapons ban; the use of federal extreme risk protection orders; closure of the so-called Charleston
Loophole allowing guns to be sold if the FBI does not complete a background check within three days; universal
law enforcement data-sharing; stronger regulation of imitation guns; the large-scale expansion of public health
and violence-interruption funding; and significantly increased funding for both the Bureau of Alcohol, Tobacco,
Firearms and Explosives, and the oversight of gun dealers. Each of these actions will help reduce the amount of
gun violence and unnecessary gun-related death and injury in New York and around the country.

1
The individual referenced throughout as the Buffalo shooter has pleaded not guilty to criminal charges arising from the attack, and the
criminal charges remain allegations as of the date of this report. This report reflects my office’s finding of facts in connection with the
investigation described in detail below. The defendant is presumed innocent unless and until found guilty at trial or by plea.
2
Recent measures include prohibiting undetectable firearms and unserialized “ghost guns,” N.Y. Pen. L. §§ 265.50, 265.55, 265.60-61, and
the “unfinished” frames and receivers that criminals use to build them without any background check. Id. §§ 265.63-64. On July 1, 2022,
New York passed the Concealed Carry Improvement Act, a package of reforms designed to comply with the Supreme Court’s decision
in NYSRPA v. Bruen, 142 S. Ct. 2111 (2022) by, among other things, revising New York’s licensing laws, strengthening background checks,
and specifying the list of sensitive locations such as schools, parks, and health care facilities where guns do not belong. See generally
N.Y. Pen. L. §§ 400.00, 265.01-d, 265.01-e; N.Y. Exec. Law § 228. The Office of the Attorney General continues to defend New York’s gun
safety laws against political attacks, and to bring enforcement actions against those who would violate them.
3
In particular, Congress should begin by repealing the Protection of Lawful Commerce in Arms Act, the Tiahrt Amendments,
and the Firearm Owners Protection Act.

In addition, we must make investments in our healthcare infrastructure to address mental health issues,
including those that arise from the isolation of the recent pandemic and from the countless hours many
young people spend online.

In this report, however, consistent with the directive from Governor Hochul that authorized and initiated the
underlying investigation, my office’s focus is on the role of social media and other online platforms in radicalizing
people to commit acts of hate-inspired violence. We know that online platforms and their business models,
which are generally supported by advertising, create incentives to keep users glued to their service for as many
hours as possible. The psychological consequences of spending more and more time online and away from
engaging with physical reality are significant, and it is hard to ignore the correlation between the rise in mass
shootings perpetrated by young men and the prevalence of online platforms where racist ideology and hate
speech flourish, in some cases by design. However, the focus of our report is not on the overall ecosystem, but
the particulars: what precise mechanisms enabled the shooter to see prior mass shootings, plan his own mass
shooting, and correctly expect that his manifesto and video would be widely shared.

The Buffalo shooter authored hundreds of written pages in which he draws the connection between those
platforms and his own radicalization and decision to commit a mass attack animated by racism, anti-Semitism,
and other execrable bigotry. And he makes clear that he intended to use those platforms to further promote his
racist ideology with his writings, and to inspire new acts of violence through the preservation and dissemination
of the videos and images of his own attack.

The report details several key findings:

Violent White Supremacist Radicalization Traumatizes Vulnerable Communities: The Buffalo shooter
killed ten Black people and wounded several other individuals. The victims leave behind dozens of loved
ones whose grief is compounded by the fear that another attack could similarly target them and their
community. That fear is grounded, in part, in the ease with which the shooter planned and enacted his
attack, and the lack of options to hold accountable any of the individuals or entities who may have been
complicit in his radicalization or otherwise enabled it. Community members described the extra layer of
shock and hardship caused by the shooter’s decision to target a grocery store in an underserviced area
that is otherwise a food desert. The Buffalo Black community has consistently demonstrated its strength
and resilience in the wake of this horror, but we must acknowledge the role that systemic racism and
poverty played in leaving the community vulnerable.

Fringe Platforms Fuel Radicalization: Anonymous, virtually unmoderated websites and platforms
radicalized the shooter. By his own account, the Buffalo shooter’s path towards becoming a white
supremacist terrorist began upon viewing on the 4chan website a brief clip of a mass shooting at a
mosque in Christchurch, New Zealand. His radicalization deepened further through engagement with
virulent racist and antisemitic content posted by users on 4chan. The anonymity offered by 4chan
and platforms like it, and their refusal to moderate content in any meaningful way, ensure that these
platforms are and remain breeding grounds for racist hate speech and radicalization. In the wake of the
Buffalo shooting, graphic video and images of the shooting proliferated on 4chan more than on any
other site viewed by my office. When discussing its policy on such content, a head moderator said that
“it’s not even against the rules” because “the footage itself isn’t illegal, any more than footage of any act
of violence is illegal.” In the absence of changes to the law, platforms like 4chan will not take meaningful
action to prevent the proliferation of this kind of content on their sites.

Livestreaming Has Become a Tool for Mass Shooters and Requires Regulation: The Buffalo shooter was
galvanized by his belief that others would be watching him commit violence in real-time. Livestreaming
has become a tool of mass shooters to instantaneously publicize their crimes, further terrorizing the
public and the communities targeted by the shooter. Livestreaming is also used by shooters as a
mechanism to incite and solicit additional violent acts. Twitch, the platform used to livestream this
atrocity, disabled the livestream within two minutes of the onset of violence, an improvement over
Facebook’s response to the livestream of the Christchurch attack, where the video was only removed
after the attack ended. But two minutes is still too long. Even this relatively short video is enough for the
horrific content to spread widely and to inspire future shooters.

Online Platforms Currently Lack Accountability: By his own account, the shooter’s path to
radicalization was cemented by explicitly racist, bigoted, and violent content he viewed online on 4chan,
Reddit, and elsewhere. He used the platform Discord to keep a private journal for months, where he
wrote down his hateful beliefs and developed specific plans for equipping himself and perpetrating his
massacre. He livestreamed his attack through both Twitch and Discord. In the wake of the attack, other
users disseminated graphic video of his attack throughout the internet, everywhere from fringe websites
to mainstream platforms like Facebook, Instagram, Twitter, and others. The First Amendment has no
categorical exception for hate speech; most of the content the shooter viewed is rankly offensive, but its
creation and distribution cannot, constitutionally, be unlawful. Moreover, even when a user posts content
that is unlawful, Section 230 of the Communications Decency Act of 1996 (CDA), codified at 47 U.S.C.
§ 230, largely insulates platforms from liability for claims related to their content moderation decisions.

Voluntary Commitments are Limited: In the wake of the Christchurch shooting, many large, established
online platforms adopted reforms under a framework that did improve the responsiveness of those
platforms and increase their effectiveness in identifying and removing content depicting the Buffalo
shooting, including the video of the shooting and the shooter’s manifesto. This effort has been led in part
by the Global Internet Forum to Counter Terrorism (GIFCT), of which most of these mainstream platforms
are members. But GIFCT’s current framework is an ineffective deterrent against white supremacist
extremists. It is voluntary and lacks an enforcement mechanism. Moreover, the platforms themselves
define what constitutes objectionable content, guided by their private financial or ideological interests, a
process that lacks sufficient transparency and public input.

These and other findings detailed in this report demonstrate that more must be done to prevent future
violence arising from online radicalization. We must also reduce the ability of perpetrators of these crimes to
promote their criminal acts and their hateful ideologies in the process, using the attention that
accompanies violence as a method of dissemination. My office has several recommendations set forth in this
report, both for lawmakers and for industry participants:

Criminal Liability for Perpetrator-Created Homicide Videos: We recommend that New York (and other
states) impose criminal liability on the perpetrator of a homicide, or anyone acting in concert with the
perpetrator, for creating images or videos depicting the homicide. Such videos are an extension of
the original criminal act and serve to incite or solicit additional criminal acts.

Civil Liability for Platforms and Individuals: In addition, we recommend the creation of new civil
liability—with significant deterrent penalties—for the transmission or distribution of video or images
captured by or created by the perpetrator of a homicide depicting the homicide. Such liability should
include sufficient exemptions to avoid overinclusion of content with social, educational, or historical
value. In combination with reform of Section 230, this liability would extend to online platforms that do
not take reasonable steps to prevent this content from appearing on their platforms.

Reform Section 230: Under Section 230 of the CDA, online platforms are generally spared liability for the
content posted by their users, regardless of their moderation practices. We recommend that Congress
reform the law to require an online platform to take reasonable steps to prevent unlawful violent criminal
content (and solicitation and incitement thereof) from appearing on the platform in order for it to reap
the benefits of Section 230.

Reasonable Steps Must Include Limits on Livestreaming: Livestreaming was used as a tool by the
Buffalo shooter, like previous white supremacist shooters, to instantaneously document and transmit
his violent acts in an effort to secure a measure of fame within that milieu and radicalize others.
Livestreaming on platforms should be subject to restrictions—including verification requirements
and tape delays—tailored to identify first-person violence before it can be widely disseminated. The
protections afforded by Section 230 should not extend to any platforms that do not take reasonable
steps to restrict livestreaming in this way.

Improved Industry Transparency: Online platforms must be more transparent about their content
moderation policies and how those policies are applied in practice, including those that are aimed at
addressing hateful, extremist, racist, and violent content. This will aid accountability and give the public
more clarity regarding the nature and success of industry moderation efforts.

Additional Industry Investment: Online platforms must invest in improving industry-wide content
moderation technology and procedures, including by expanding the types of content that can be
analyzed for violations of their policies, improving detection technology, and providing more efficient
means to share information about violative content hosted off-platform, to which users of their services
attempt to direct others through links. In addition, domain registrars, web hosting companies, and other
internet infrastructure and online service providers stand between online platforms and users. These
companies should take a closer look at the platforms and
websites that repeatedly traffic in violent, hateful content, and refuse to service sites that perpetuate the
cycle of white supremacist violence.

We can no longer rely entirely on the industry to regulate itself through voluntary commitments. The cost is too
high, and we must take every step we can to prevent violence predicated on hatred and bigotry, fueled by the
unlimited consumption of vitriolic racist content online. This must, of course, be coupled with commonsense
reforms to address gun violence. We must take action now and march towards a future where online platforms
no longer enable horrific mass shootings and hate-based violence.

Letitia James
New York State Attorney General

II. Background

A. The OAG’s Investigation into Online Platforms


This report is drafted in accordance with the May 18, 2022 referral issued by Governor Kathy Hochul pursuant to
Executive Law § 63(8). Governor Hochul’s referral letter instructed Attorney General Letitia James to focus on “the
specific online platforms that were used to broadcast and amplify the acts and intentions of the mass shooting”
that occurred that day.

The Buffalo shooter belongs to a new category of extremists who become radicalized online. In his own words,
the internet was the source of his radical beliefs: “There was little to no influence on my personal beliefs by
people I met in person.” On online platforms, the Buffalo shooter found a community of hate that warped his
worldview with racist conspiracy theories, pseudoscience, and a cult of worshipping past mass murderers. These
communities thrive on all sorts of online platforms. Pursuant to the Governor’s instructions, rather than attempt a
survey of the state of white supremacy on the internet, this report focuses on the platforms known or believed to
have been central to the Buffalo shooter’s use of the internet to plan and broadcast his attack, and the platforms
that the investigation by the Office of the New York State Attorney General (OAG) found were used in some way
to spread graphic content arising from the shooting.

Some of these platforms, like Facebook, YouTube, and Twitter, are so ubiquitous as to be familiar to most readers
of this report. Others, like Discord, Reddit, and Twitch, are extremely popular and very well-trafficked, but contain
features and use lingo that may not be clearly comprehensible to those unacquainted with the services. Still
others, like 4chan and 8kun (and their derivatives), belong to a corner of the internet that may occasionally
reach the casual internet user, but likely do not have the reach to make their operation and usage widely familiar.
To assist the reader in understanding how these platforms played a role in this incident, the following is a brief
description of the platforms that were part of this investigation and that, in the OAG’s opinion, fall into the latter
two categories.

1. 4chan, 8kun, and their Derivatives


4chan is an anonymous imageboard, an online forum similar to a bulletin board system that centers around
the posting of images. 4chan is organized into a number of boards, with each board covering a topic area.
For instance, the board entitled “Politically Incorrect” is usually referred to as “/pol/”. The board dedicated to
weapons is referred to as “/k/”. Posts on 4chan are collected into “threads,” in which users may reply to the
original post. Unlike most bulletin board systems, 4chan does not have a registration system. Users are given the
option to enter a name when posting a reply, but need not do so; the default username given to any post where
the poster has not entered a name is “Anonymous.”

Each board contains ten pages of threads. Threads are “bumped” to the top of the board when a reply is
made, unless the thread has reached a set “bump limit.” When the bump limit is reached, the thread descends
through each page as new threads are created. When a new thread is created, the last thread on the last page is
“pruned,” or removed from the board index. The thread can still be accessed by users who had the thread open
on their page, but will be archived and eventually deleted after some time passes; the retention period is set
differently for each board and is often relatively short. In the case of /pol/, as of August 2021, archived
threads are deleted after 72 hours. By comparison, archived threads within the /v/ board, for discussion of video
games, are deleted after 170 hours. 4chan’s nearly complete anonymity and minimal moderation frequently
result in the posting of hate speech, conspiracy theories, and obscene and offensive images.
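
To make this lifecycle concrete, the following is a minimal sketch, in Python, of the bump-and-prune mechanics described above. It is purely illustrative, not 4chan’s actual code: the class names, board capacity, and bump limit are hypothetical stand-ins, and only the 72-hour /pol/ deletion window is drawn from the figures above.

```python
import time
from collections import deque


class Thread:
    def __init__(self, subject):
        self.subject = subject
        self.replies = []


class Board:
    """Toy model of an imageboard's bump/prune lifecycle (illustrative only)."""

    def __init__(self, max_threads=150, bump_limit=300, archive_ttl_hours=72):
        self.threads = deque()                       # leftmost = top of page 1
        self.max_threads = max_threads               # roughly ten pages of threads
        self.bump_limit = bump_limit                 # replies after which bumping stops
        self.archive_ttl = archive_ttl_hours * 3600  # e.g., 72 hours on /pol/
        self.archive = {}                            # pruned thread -> archival time

    def post_thread(self, thread):
        self.threads.appendleft(thread)
        if len(self.threads) > self.max_threads:
            pruned = self.threads.pop()              # last thread on the last page
            self.archive[pruned] = time.time()       # archived, deleted later

    def post_reply(self, thread, reply):
        thread.replies.append(reply)
        if thread in self.threads and len(thread.replies) <= self.bump_limit:
            self.threads.remove(thread)
            self.threads.appendleft(thread)          # "bump" back to the top
        # Past the bump limit, the thread only sinks as new threads are posted.

    def purge_archive(self):
        now = time.time()
        for thread, archived_at in list(self.archive.items()):
            if now - archived_at > self.archive_ttl:
                del self.archive[thread]             # permanently deleted
```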

8kun, formerly known as 8chan, is an imageboard similar to 4chan. Some differences include the ability of any
8kun user to create their own board—users on 4chan must post within pre-determined boards—and the inability
to enter a username into a reply form when posting. 8kun was created as an alternative to 4chan and is well-
known for hosting posts from the individual or group at the center of the QAnon conspiracy theory.4 4chan and
8kun have spawned several imitators, at least one of which the shooter references in his writings.

The Buffalo shooter’s indoctrination into internet hate culture is most strongly linked to his use of 4chan. That site
is also correlated with the initial spread of graphic content related to the shooting. Although initial reports linked
the shooter to use of 8kun, references to the site are few and far between in his writings; instead, he appears
to have intended to post a link to his attack on an 8kun imitator, although because of the anonymous and
temporary nature of these boards, the OAG was unable to ascertain whether he ultimately did so.

4chan monetizes its site through two primary methods. First, it collects ad revenue, although reportedly, it has
historically faced difficulty in maintaining a steady stream of revenue from advertisers. Most recently, in June 2022,
the advertising company Bid Glass appears to have discontinued a two-year-long relationship with 4chan. In 2016,
the site’s operator claimed its ad revenue was too low to sustain operations indefinitely—although, of course,
it survived.5 When it has been able to lure advertisers, it has mostly relied on advertisements for pornography,
cryptocurrency, non-fungible tokens, and online video games.6 4chan also sells “4chan Passes” to users, which
allow users to make posts without having to complete a CAPTCHA challenge.

4
See Brandy Zadrozny & Ben Collins, How Three Conspiracy Theorists Took ‘Q’ and Sparked QAnon, NBC News (Aug. 14, 2018),
https://www.nbcnews.com/tech/tech-news/how-three-conspiracy-theorists-took-q-sparked-qanon-n900531.
5
See Jay Hathaway, Is 4chan about to go totally broke?, Daily Dot (Oct. 6, 2016), https://www.dailydot.com/unclick/4chan-broke-malicious-ads-deleting-boards.
6
See Justin Ling, Who owns 4chan?, Ars Technica (May 29, 2022), https://arstechnica.com/tech-policy/2022/05/who-owns-4chan.

2. Discord
Discord is a text and voice chat platform that also allows its users to share their screen or livestream
video. The company’s history lies in connecting video game players; it is still frequently used in the gaming
community, but is now used by many kinds of communities, organizations, and individuals to connect with
each other, from law firms to university classes. The platform is organized into servers—not actually
computer servers, but rather spaces on the platform created by specific communities and groups as hubs for
communication. Servers are further organized into text and voice channels, which are often dedicated to specific
topics. The majority of servers are small and invitation-only.
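
The server-and-channel hierarchy can be pictured with a brief sketch; the names and structure below are invented for illustration and do not reflect Discord’s actual API or data model.

```python
from dataclasses import dataclass, field


@dataclass
class Channel:
    """A text or voice channel inside a server, usually devoted to one topic."""
    name: str
    kind: str  # "text" or "voice"


@dataclass
class Server:
    """Toy model of a Discord server: a community-created space, not hardware."""
    name: str
    invite_only: bool = True
    channels: list = field(default_factory=list)


# A hypothetical small, invitation-only server organized into topic channels.
server = Server("book-club", channels=[
    Channel("general", "text"),
    Channel("weekly-picks", "text"),
    Channel("discussion-night", "voice"),
])
```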

The platform is monetized entirely through a subscription model; there is no advertising. Although use of the
platform is free, users may pay a subscription fee to access additional on-platform benefits, such as the option to
customize their appearance on the platform and the ability to use custom emotes (or emojis) in any server they
join. Users may also pay a fee for a “Server Boost” which provides special perks to a server, like upgraded audio
and video quality and the ability to access more custom emotes for use within the server.

As described in greater detail below, the shooter kept a diary of sorts on a private Discord server to which he
restricted access only to himself, until he used the Discord server on the day of the shooting to rebroadcast the
livestream of his attack and invite others to watch the livestream and read his writings.

3. Reddit
Reddit is an aggregation and discussion site distinguished by the use of user-created boards referred to as
“subreddits,” each with its own subject matter. Users, identified by unique pseudonymous usernames, submit
posts to a subreddit (usually denominated with a preceding “r/”). Other users can respond to those posts as
comments and can choose to promote the content with upvotes or demote it with downvotes. Subreddits
may either be public or private. Reddit’s popularity translates into a vast amount of user-generated content.
According to the company’s own metrics, Reddit users created nearly 430 million posts and 2.7 billion comments
across the platform in 2021.7 Reddit monetizes its platform through a combination of contextual advertising and
subscription plans that permit ad-free browsing and other perks.
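
The basic voting mechanics can be sketched as follows; this is a toy model of net scoring, not Reddit’s actual ranking algorithm, which is more elaborate.

```python
from dataclasses import dataclass


@dataclass
class Post:
    """Toy model of a subreddit post and its vote tally (illustrative only)."""
    title: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Upvotes promote a post; downvotes demote it.
        return self.upvotes - self.downvotes


# Hypothetical posts in a subreddit, ordered with the highest net score first.
posts = [Post("thread A", 120, 4), Post("thread B", 15, 30), Post("thread C", 48, 9)]
front_page = sorted(posts, key=lambda p: p.score, reverse=True)
```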

As described in greater detail below, the shooter used certain subreddits to engage in hateful dialogue and
educate himself about gear and tactics for use in his attack. After the attack, Reddit was used to facilitate the
spread of graphic content related to the shooting.

7
See Reddit, 2021 Transparency Report 3 (Feb. 16, 2022), https://www.redditinc.com/assets/images/site/Reddit-Transparency-
Report-2021.pdf.

4. Twitch
Twitch is an online platform owned by Amazon and built largely for interactive livestreaming. Content creators
can stream multiple feeds at once—for example, a typical livestream of a content creator playing a video game
might include a stream of a camera pointed at the broadcaster superimposed in a corner over a stream of that
person’s computer screen. The bottom of each livestream generally includes information about the stream, and
a chat box allows viewers and the broadcaster to interact with each other.

Twitch permits some creators to monetize their livestream channels through advertising, sponsorships,
subscriptions to that creator’s content, and the purchase of digital goods by viewers. Creators are only invited to
monetize their channels upon reaching certain threshold streaming and viewership metrics, and must register
with Twitch and agree to the platform’s monetization rules.8 The platform collects a portion of these payments.

As described below, the shooter used Twitch to livestream his attack.

B. The Shooting in Buffalo


On May 14, 2022, an 18-year-old white male drove from his home in Conklin, NY, to a Tops Friendly Market on
Buffalo’s East Side, with the intention of “killing as many blacks as possible.”9 According to his own writings,
he had chosen that supermarket because of its predominantly Black clientele and had staked it out at least
three times in the months and days before the shooting.10 He was armed with a Bushmaster XM-15 assault rifle,
a 12-gauge shotgun, a loaded bolt-action rifle, and several magazines of ammunition. He wore body armor,
camouflage fatigues, and a tactical-style ballistic helmet. He had illegally modified the Bushmaster to make it
more lethal and painted it with racist slogans. He planned to storm the supermarket, kill the security guard, and
“never stop firing” at innocent Black customers.11

Shortly before 2:00 p.m., the Buffalo shooter invited several users to a chat room on the online platform Discord,
where he posted a link to a livestream and the contents of a manifesto and personal diary he had written to
justify his violence and inspire future shootings. The shooter began livestreaming using the online platform
Twitch at approximately 2:08 p.m., using a GoPro video camera attached to his helmet. The 2019 livestreamed
Christchurch, New Zealand mosque shootings served as a model for the Buffalo shooter. He hoped that
broadcasting his brutal slayings would radicalize others into copycat acts of violence.12 His livestream lasted
approximately 24 minutes; for most of that time, the livestream consisted of the shooter driving in his car as he
prepared to perpetrate the shooting. Approximately 22 minutes into the livestream, he stepped out of his car and
began shooting. Twitch stopped the livestream approximately two minutes after the first person was shot.

8
For the sake of clarity, the Buffalo shooter was never eligible to—and did not—monetize any of his streams on Twitch.
9
Buffalo shooter’s manifesto at 58 [hereinafter “Manifesto”].
10
See Gunman Kills 10 at Buffalo Supermarket in Racist Attack, N.Y. Times (May 17, 2022), https://www.nytimes.com/live/2022/05/14/
nyregion/buffalo-shooting#the-accused-gunmans-racist-manifesto-outlined-a-plan-to-kill-blacks-and-referred-to-replacement-
theory; Manifesto at 57.
11
See Manifesto at 60.
12
Id. at 5.

Video captured from the livestream shows the shooter driving to Tops with his assault rifle intermittently visible in
the passenger seat and his tactical helmet intermittently visible in the rearview mirror.13 As he arrived in the Tops
parking lot he said aloud, to himself and his streaming audience, “I just gotta go for it right? It’s the end, right
here, I’m going in.” Between 22 and 28 people watched some part of the livestream.

The shooter stopped his car directly in front of the grocery store and immediately started shooting. He shot four
Black people outside of the store, knocking all four to the ground, killing three. As he walked toward the store, he
shot one of the fallen victims in the head to ensure they were dead, just as he threatened to do in his manifesto.14
The fourth victim lay motionless and pretended to be dead, escaping only once the shooter entered Tops.15

Once inside the store, he resumed firing and killed two Black customers. He stopped to reload his weapon, using
a detachable magazine that he had modified his assault rifle to accept—in violation of New York law. He once
again shot a fallen victim in the head. He turned a corner and exchanged fire with the store’s Black security
guard, a former Buffalo police officer named Aaron Salter. The shooter had prepared for that confrontation. At
a previous visit to the store, he had taken note of the security guard and his firearm, and had purchased body
armor specifically to protect against it. Salter shot the Buffalo shooter, but the shooter’s body armor left him
unharmed. He returned fire and killed Salter.16

He then pointed his rifle at a white male Tops employee who had been wounded during the shooting, but did
not fire. He instead apologized and continued searching for more Black people to kill. The white male employee
survived.17 He moved towards the pharmacy, shooting indiscriminately; during this period, he shot a white Tops
employee. She also survived. The shooter went to the store’s checkout area where he killed one Black person. He
headed to the aisles and killed three more Black people. Before the Buffalo Police Department arrived, reportedly
less than two minutes after the shooting began, the shooter fired his weapon approximately 60 times.18

During the attack, Tops employees and customers hid wherever they could—in a stock room, conference room,
freezer, and dairy cooler. Some were able to flee through the store’s rear door.19 Tops cashier Fragrance Harris
Stanfield recalls not knowing where she was running at first while trying to escape. She remembers getting
“knocked to the side by a customer.”20 Taisiah Stewart recalls losing his sandal as he ran barefoot out of the back
exit and almost three-quarters of a mile to escape the shooting.21

13
Surviving video clips show only the final seven minutes of the shooting livestream, and little is known for certain about the first 17 minutes. Contemporaneous posts on 4chan from purported witnesses describe the shooter “hyping” himself up for the violence to come.
14
See Manifesto at 60 (“I shoot all downed blacks twice in the head”).
15
See Criminal Complaint at 4, United States v. Gendron, No. 22-cr-00109 (W.D.N.Y. June 15, 2022) [hereinafter “Complaint”].
16
See Caitlin Yilek, Buffalo Shooting Supermarket Survivor Recalls Escape, CBS News (May 16, 2022), https://www.cbsnews.com/news/buffalo-shooting-supermarket-survivor-recalls-escape.
17
See Complaint at 4-5.
18
Id. at 5.
19
Id.
20
Yilek, supra n. 16.
21
See Michael Ruiz, Buffalo Shooting Survivor Recounts Harrowing Escape After Witnessing Start of ‘Hate’-fueled Attack, Fox News (May 16, 2022), https://www.foxnews.com/us/buffalo-shooting-hate-crime-survivor-escape.


C. The Shooting’s Impact on the Community
The Buffalo shooting wrought inconceivable damage on Buffalo’s Black community. In an attempt to convey that
experience, the OAG interviewed over a dozen witnesses, including victims’ family members, Tops employees,
and Buffalo community leaders. The enormity of their pain is not easily put into words. But the witnesses shared
how the massacre upended their lives, shattered Black Buffalo’s sense of safety, and left an indelible record—
the shooter’s video—that perpetuates their suffering. These individuals bore witness for their lost loved ones to
spur social change. Their perspectives differed, but nearly every witness demanded legislative action to prevent
further atrocities. It was their hope that their stories would give policymakers the courage to act decisively
against white supremacist violence online and in the real world.

1. Personal Stories from Victims’ Families


Kimberly Salter lost her husband, Aaron, a retired police officer and Tops security guard who died defending his
community from the shooter. In her words:
My best friend was taken from me. The love of my husband was taken from me. That person had no right
to do that. God gave my husband to me, and [the shooter] took him away from me and my family. I’m
hurt, beyond words, and I grieve daily.

The shooter came to harm, kill, and destroy, but God saw differently. Many lives were saved that day
because of what God called my husband to do.

The world looks at my husband as a hero, but that’s what my husband was every day of his life. He loved
God and served his community. He did what comes naturally. You can’t put on heroism for today and
take it off tomorrow. Aaron had on a polo shirt and shorts, and he went against a man with full armor
and an assault rifle. My husband had only a Glock, but my husband engaged that man. He once told me,
“I’m not trained to run. I’m trained to engage.”

There were times when I selfishly did not want my husband to go to work. We loved our life together. But I
could do nothing but raise my hands and let him do what God called him to do. He went to work that day
and God used him, and I live with that every day.

James Young lost his mother, Pearl Young, who was one of the first victims of the shooter’s rampage. In his words:
Pearl Young was just love. She loved everybody, accepted everybody. She was that one person. It didn’t
matter if she knew you or not, she would take you in. If you were hungry, she would feed you. If you
needed clothing, she would clothe you.

If I could speak to her today, she would say, “Forgive him.” That’s the hardest thing for me. I’ve been trying
to go along with what she would want, what she would do. She would not condemn the shooter. She
would just say, “Pray for him.” She believed in God and would believe God could change even him. It hurts
most because of knowing who she was.

Garnell Whitfield Jr. lost his mother, 86-year-old Ruth Whitfield, the oldest victim of the massacre. In his words:
It’s like somebody reached inside of me and pulled the best part of me out. I felt empty. I lost my mother,
the person who loved me more than anybody in this world. I lost the person who’s been my champion, my
caretaker, my advocate, my supporter. She was everything to me. I see my mortality much clearer now
than ever before. She would have been married 68 years to her husband, and she religiously visited him
every day in the nursing home.

My father has dementia, but he definitely knows that he doesn’t see her. I don’t know what to do. We can’t
do for him what she did. We can’t make him feel the way she made him feel.

Zeneta Everheart’s son Zaire, who worked at Tops, was shot but miraculously survived. In her words:
I had just left the house that Saturday, and Zaire called me. It wasn’t a call I was expecting. I answered
the phone, and Zaire was screaming. He doesn’t scream. He’s a gentle giant. He was screaming, “Mom,
Mom, get here now, I’ve been shot.” When I got to the hospital, it was bittersweet seeing him. There were
tubes everywhere, blood everywhere. The blood was just incredible. All over the floor. All over Zaire. I
could see the blood in his hair. The shooter shot the woman Zaire was helping point blank, and her brains
were all over him. He still has a lot of shrapnel inside him, and we don’t know what the long-term effects
will be.

I know he’s different. Zaire has always been a homebody, but it’s ten times multiplied now. I’m concerned
depression is setting in, that he has survivor’s guilt.

I go to therapy, but how do you get past your child almost dying? The doctor said this much in any other
direction and he would be dead. He would have bled out right there.

2. Effect on the Community


The attack rocked a closely knit community where most people knew somebody who died. Witnesses described
East Side Buffalo’s community spirit as both a strength and a vulnerability. “Black people—we love everybody, and
to an extent we love too much. As a result, our safety was compromised,” said Tops employee Naomi JeanPierre.
Pastor Tim Brown echoed that sentiment, “African Americans tend to be very accepting of whoever comes
in.” The result, in their view, was that the shooter could visit Tops several times, and take note of its layout and
security, without raising any suspicion. “We talked to that man. We treated him like a human being. He bleeds
like we do, and we treated him like a human being,” said Tops employee Lorraine Baker. “This young white
supremacist . . . went to one of the poorest neighborhoods in Buffalo, and he scouted, like an NFL coach scouting
another football team. There’s no talk about how easily he was able to go in there and just hunt,” said Mark
Talley, son of victim Geraldine Talley.

The shooting left the community feeling that further attacks can happen at any time. “You can feel the fear.
When people talk, there’s a fear in the conversation. It’s an undercurrent along with anger. The community
doesn’t feel safe. How can this guy get away with doing this—make this plan, get everything he needed, and
carry it out?” said James Young. Latisha S. Slaughter, daughter of slain former police officer Aaron Salter, told us,
“I have anxiety just putting gas in my car. I’m always looking over my shoulder. You don’t know people’s mindset,
what they are thinking.” April N.M. Baskin, Chairwoman of the Erie County Legislature, agreed: “The shooter tore
through the lives of dozens of families. The senselessness of the attack is something that we are all still coming to
grips with. There is still a sense of fear at public gatherings.”

By attacking the Tops Market—the lone grocery store in a food desert—the shooter left the community feeling
particularly vulnerable. “Grocery stores are something we all have in common. We all need groceries to feed our
family,” said Latisha Slaughter. Pastor Brown said, “We fought for those stores for years. We fought diligently to
feed the community. On any given Saturday, any one of us could have been in this store.” Tops employee Lorraine
Baker put it more bluntly, “Saturday is the day our babies are here.” The attack shuttered the Tops Market for
months and left many unwilling to go back even when it reopened. “He left the community starving. It was the
only supermarket in the area,” said Mark Talley.

Witnesses described the role that systemic racism and poverty played in leaving the community vulnerable.
“He traumatized an already traumatized community. East Side Buffalo has been hurting for so many different
reasons,” said Zeneta Everheart. “This did not begin on May 14,” said Garnell Whitfield. “Our community, Black
Buffalo, has been struggling for a long time—intentionally under-resourced, criminalized.” Whitfield noted
that the shooter easily found a concentrated group of Black targets at the Tops Market because decades
of underinvestment had left East Buffalo a food desert, with only one grocery store. He said that without
confronting such manifestations of racism, “nothing else matters.” In his words, “If you don’t deal with the root,
it’s going to keep growing up.” According to Majority Leader Crystal Peoples-Stokes, the massacre “highlighted
for the outside world the inequities, discrimination, and racist infrastructure baked into our segregated
neighborhoods and institutional systems.”

One refrain from witnesses was a lack of resources to understand and combat hate. “The community has
a better way of dealing with someone if they know how to stop them. There’s no way to know who is being
indoctrinated now,” said James Young. “We’re at war with an enemy we can’t see,” said Dr. Lavonne Ansari of
Community Health Center. “We are not trained in domestic terrorism or in the dark websites that radicalize. We
need more transparency on what does someone who has been radicalized look like—how do you recognize
him?” Buffalo Council President Darius G. Pridgen echoed that concern, “We need to ensure that individuals and
groups are being better monitored online and investigated so we can prevent further terrorism.”

Without underplaying the lasting harm, some witnesses sounded a hopeful note, describing the community’s
resilience and strength. “Everybody has a breaking point, and this broke a lot of people. But there’s a collective
feeling in the whole Buffalo community that we’ll rebuild and get better,” said Zeneta Everheart. “In times of
crisis, our people bind together. There’s been no outpouring of hate or reciprocal violence. I’m very proud of my
community,” said Garnell Whitfield. Mark Talley started a nonprofit, Agents for Advocacy, to fight food scarcity
and other manifestations of poverty—and to keep his mother’s memory alive. His organization provides school
supplies, feeds the hungry, and promotes health initiatives in the Black community. “I’m doing this organization
to combat why she died,” he said. Majority Leader Peoples-Stokes emphasized the need to continue to fight: “I
hope that the outpouring of support will extend to asking why there was only one supermarket, why he targeted
us, and how we can stay united against hate.”

3. Residents’ Views on the Responsibility of Social Media Companies


The online nature of the massacre preoccupied many witnesses, who described the pain of knowing the
shooter’s video will exist on the internet forever. “That video was everywhere in East Buffalo, and the families
have to continue to relive the tragedy. As soon as the video started coming across it should have been shut down
immediately,” said Pastor Brown. “It hurts that there is a video like that out there of my child,” echoed Zeneta
Everheart. “He streamed this live, and it looked like a f****** video game,” said James Young. “It hurt when my
grandson saw it. It hurt me to think about him, 15 years old in high school, and your classmates find out that your
great-grandmother was one of the victims,” he continued. Tops employee Melanie Jean-Pierre said, “The video is
in the back of my head. This is something I will never forget.” Kimberly Salter, whose husband’s death can be seen
on the video, said, “To know that it was filmed and that people were watching it like a movie is disgusting.” She
seeks accountability: “I want everybody who was a part of that—who filmed it, who let him film it—I want them all
to suffer consequences for the hate display.”

Nearly every witness echoed Salter’s entreaty to increase accountability for social media companies. “We need
to have stronger laws against preaching hate on public platforms. When you preach hate, the next thing that
comes is death. Social media sites that are allowed to do that, it’s almost like we’re enabling them to set up
the next tragic death,” said James Young. “The Attorney General has the responsibility to hold social media
platforms accountable,” said Pastor Brown. “People should have their opinions and free speech. But we should
draw a line in the sand when somebody says they are going to kill people because of who they are.” Zeneta
Everheart agreed, “Any platform that allowed the viewing, sharing, tagging of this video, should all be held
responsible. This hurts.” Dr. Ansari asked, “If Congress does not act, how can we hold the social media industry
accountable for the domestic terrorism that’s hitting our community every single moment—to the point where
somebody is coming in to blow us away?” Tops employee Melanie Jean-Pierre asked succinctly, “What is the
federal government going to do to protect Black people?”

Buffalo’s elected officials also called for increased accountability for social media companies. Mayor Byron
W. Brown said, “The 5/14 mass murderer researched his Buffalo target, spread this hateful manifesto, and
livestreamed his murders with disturbing ease. Much more must be done to ensure that there is no place on the
internet for hate speech, for hate indoctrination, for spreading hate manifestos.” Congressman Brian Higgins
agreed: “For too long, mega companies have sidestepped their responsibility to content moderation and have
not been held accountable for acting as a host for dangerous and criminal activity.” In the words of State Senator
Tim Kennedy, “Social media companies need to be held accountable. They must better monitor the hate speech
that pervades feeds and platforms and commit to removing threats and dangerous, aggressive propaganda.”
And Erie County Executive Mark C. Poloncarz concurred: “Social media organizations have a moral responsibility
to identify and remove threatening content, especially indirect and more discreet threats that are often more
difficult to flag and investigate than direct threats and clear statements of intended violence.”

D. The Buffalo Shooter’s Ideology: “White Genocide,” the “Great Replacement” Theory, and Accelerationism


Shortly before the attack, the shooter shared a manifesto describing his racist beliefs with the explicit goal of
provoking future mass shootings. It offered a pastiche of different racist theories and memes largely copied from
extremist message boards and manifestos left by past mass shooters. It also included extensive discussions
of firearms, firearm modifications, and body armor as a way of giving practical advice to future killers. His
manifesto, like the manifestos of several previous white supremacist mass shooters, calls for future violence
against Black people and others perceived as non-white in order to protect an ideological white race.

The manifesto’s most prominent theme concerns the supposed decline of the white race and its replacement
by other demographic groups—what he calls “WHITE GENOCIDE.”22 The concept that white birth rates are
declining and being surpassed by those of other races—white demographic decline—has deep historical roots
reaching back to at least the 19th century. It motivated the 20th century eugenics movement in the United States
and Europe. And it has become an organizing principle of white supremacist extremists across the globe. In the
late 20th century, white supremacist extremists began propagating a theory of “white genocide” as a Jewish-
led conspiracy to lower white birthrates in order to replace white majorities in perceived white homelands with
other races.23 A version of this theory is known as the “Great Replacement” after a book by the same name.24
For a decade, this was a fringe theory touted by extremists. Recently, however, the “great replacement” has
entered mainstream political discourse through pundits such as Tucker Carlson of Fox News, who has repeatedly
referenced the theory, particularly with respect to Democratic national politics.25

22
See Manifesto at 2.
23
See, e.g., Sara Kamali, Homegrown Hate: Why White Nationalists and Militant Islamists are Waging War against the United States
113–15 (2021).
24
Renaud Camus, Le Grand Remplacement (2011).
25
See, e.g., Graig Graziosi, Video of Tucker Carlson Promoting ‘Great Replacement’ Theory Surfaces Again, Independent (May 16, 2022),
https://www.independent.co.uk/news/world/americas/us-politics/tucker-carlson-video-great-replacement-theory-b2080264.html
(collating Carlson’s references to the theory, including a September 22, 2021 clip from Carlson’s television show where he states—above
a chyron reading “An Unrelenting Stream of Immigration” and next to a photograph of President Joe Biden captioned with the phrase
“Mass Amnesty”—“In political terms this policy is called the Great Replacement, the replacement of legacy Americans with more
obedient people from faraway countries”).
Some white supremacist extremists believe that the only way to counteract “white genocide” and the “great
replacement” is through a violent confrontation between the races, sometimes referred to as “racial holy war”
or “RAHOWA.” These extremists believe that white people’s future can only be guaranteed by an apocalyptic
battle that extirpates those viewed as non-white from perceived white homelands.26 Those who embrace this
idea also embrace the notion of “acceleration”: that acts of violence, especially racially motivated murder, will
hasten racial war. The goal of such violence is to “create immediate societal panic, inspire copycat actors, and
encourage reciprocal or revenge terror acts from affected groups.”27 The ultimate aim is to put into effect the “Fourteen
Words,” coined by white supremacist David Lane: “We must secure the existence of our people and a future for
white children.”28

The Buffalo shooter’s manifesto draws from those abhorrent traditions. He explicitly touted his fear of “ethnic
replacement,” “cultural replacement,” and “racial replacement” based on a combination of declining white birth
rates and “[m]ass immigration and higher fertility rates of immigrants.”29 Like many “white genocide” conspiracy
theorists, he portrayed Jews as ultimately responsible for white decline: “The real war I’m advocating for is the
gentiles vs the Jews.”30 Yet, he writes that targeting Black people is more urgent because of their perceived high
fertility rates: “[Jews] can be dealt with in time, but the high fertility replacers will destroy us now, it is a matter
of survival we destroy them first.”31 His aim was explicitly accelerationist. He wanted to “encourage further
attacks that will eventually start the war that will save the Western world.”32 He hoped his attack would inspire
others to commit similar acts and that his followers would have an easier time doing so by following his detailed
manifesto.33

The “Great Replacement” theory has long been a motivator for white nationalists promoting violence both in the
United States and globally. At the 2017 Unite the Right rally in Charlottesville, white nationalists chanted “Jews will
not replace us,” echoing the Great Replacement Theory.34 The individual responsible for the 2019 Christchurch
shooting that left 51 people dead feared that whites were being replaced by Muslim and other non-white
immigrants. The shooter in the 2019 Walmart shooting in El Paso, Texas was motivated by fear of a “Hispanic
invasion.”35

26. Kamali, supra n. 23, at 113–14.
27. Cynthia Miller-Idriss, Hate in the Homeland: The New Global Far Right 14 (2020).
28. Kamali, supra n. 23, at 41–42.
29. Manifesto at 1–2.
30. Id. at 24.
31. Id. at 12.
32. Id. at 4.
33. See Complaint at 7; Marilyn Mayo, Senior Rsch. Fellow, Ctr. on Extremism, Anti-Defamation League, Remarks at the ADL Webinar, The Great Replacement Theory and How It Motivates Violent Extremists (June 14, 2022).
34. Jonathan Sarna, The Long, Ugly Antisemitic History of “Jews Will Not Replace Us,” Brandeis Univ. (Nov. 19, 2021), https://www.brandeis.edu/jewish-experience/jewish-america/2021/november/replacement-antisemitism-sarna.html.
35. “The Great Replacement:” An Explainer, Anti-Defamation League (Apr. 19, 2021), https://www.adl.org/resources/backgrounders/the-great-replacement-an-explainer.
E. The Christchurch Shooter and the Model of Virality
The Buffalo shooter may have acted alone, but he saw himself as following in the footsteps of others. In the past
two decades, violent white supremacists worldwide have ritualized a chilling sequence of events: commit a mass
shooting or another atrocity, publish a manifesto, and wait for the next mass-casualty attack.36 The murders these
extremists commit and the manifestos they write are honed into a single weapon to radicalize others and bring
about future violence against those perceived as undesirable.

In 2011, in Oslo and Utøya, Norway, an attacker killed 77 people, injured hundreds more, and published an
online manifesto justifying the violence as a response to Muslim “replacers” and “invaders.” The manifesto
used a chatty question-and-answer style and shared details about the shooter’s life as well as his ideology and
calls to future violence. Researchers have dubbed this the “narcissistic format” of right-wing manifestoes and
speculated that it draws inspiration from celebrity profiles in magazines.37 The Norwegian manifesto included
explicit guidance to other extremists for how best to commit future mass atrocities. And followers have heeded
his advice—in the years since those murders, white supremacists across the globe have followed this playbook.38
The Buffalo shooter, like others inspired by those killings, explicitly referenced the Norwegian killer’s manifesto in
his own writings.

In 2019, in Christchurch, New Zealand, a shooter killed 51 people and injured 40 in attacks on two mosques.
The influence of the Norway shooter was clear—not only did the Christchurch manifesto explicitly reference
that shooter (and others) while adopting the same language drawn from “Great Replacement” theory, but
the Christchurch shooter also followed the Norway shooter’s advice for committing future atrocities by joining
a gym, bulking up with steroids, joining a rifle club, and cleaning up electronic devices to limit government
detection.39 He expressed a continuity of purpose and self-consciously modeled his actions on the Norway
shooter and other past perpetrators of mass violence.

But the Christchurch shooter also changed the playbook in new, deadlier ways. He was the first white
supremacist to livestream his attack, and the video of the shootings went viral. He deliberately sought to create
an online footprint that he hoped would be galvanizing and instructional to fellow right-wing extremists. These
digital artifacts have proved to be indelible and have radicalized others, including the Buffalo shooter, who
deliberately modeled his attack on the Christchurch shooter’s.

36. See Kamali, supra n. 23, at 235–36, 240–43.
37. Graham Macklin and Tore Bjørgo, Breivik’s Long Shadow? The Impact of the July 22, 2011 Attacks on the Modus Operandi of Extreme-right Lone Actor Terrorists, 15 Perspectives on Terrorism 14, 20 (2021).
38. See Kamali, supra n. 23, at 236, 240–43 (identifying five such examples).
39. Ko Tō Tātou Kāinga Tēnei: Report of the Royal Commission of Inquiry Into the Terrorist Attack on Christchurch Masjidain on 15 March 2019 197 (2020).
On March 15, 2019, the Christchurch shooter posted links to his Facebook livestream and a written manifesto
on Twitter and the /pol/ board on 8chan. He then began his livestream using a GoPro attached to his helmet,
recording himself putting his car into gear and driving to the Al Noor Mosque. About two hundred people
watched the livestream in real time. The video of the shooting then quickly spread on both mainstream and
fringe social media sites. According to Facebook, the website removed 1.5 million uploads of the video within
the first 24 hours,40 and the video was altered at least 800 times, likely by users trying to evade detection by
moderators.41
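
That pattern of alteration exploits a basic property of exact-match detection: changing even a single byte of a file produces an entirely different cryptographic hash, so trivially edited copies of a known video slip past exact-hash filters. A minimal Python sketch of the property (illustrative only; it does not depict any particular platform’s moderation pipeline):

    # Illustrative only: a one-byte edit defeats exact-match (cryptographic)
    # hashing, which is why platforms also rely on perceptual matching and
    # human review to catch altered re-uploads.
    import hashlib

    original = b"stand-in bytes for a known video file"
    altered = original + b"\x00"  # a trivial edit, e.g., appended padding

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(altered).hexdigest())  # bears no resemblance

This fragility of exact matching is one reason the industry has moved toward perceptual hashing, discussed in connection with GIFCT below.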

As his video went viral, the shooter was lauded as a cult hero in online right-wing extremist forums. Users in
8chan’s /pol/ board praised and quoted the Christchurch shooter’s video and manifesto, and many created
memes and images celebrating the shooting. Many also likened the perpetrator to a religious figure, creating
memes that transposed his face onto images of medieval saints and referring to him as a saint.42

The Christchurch shooter’s written manifesto is littered with memes and sarcasm, mirroring the style of the
right-wing extremist forums he frequented and ultimately catered to during the attack. “Shit posting” in these
forums involves packaging white supremacy, xenophobia, and racism in a medley of crude jokes, coded cultural
references, and meaningless content. Through this format, users can disseminate more direct, explicitly bigoted,
pseudo-factual appeals to violence while maintaining a veneer that the content is “just a joke” and was never
meant to be taken seriously. For example, in a “Q&A” section, the Christchurch shooter wrote that the video
game “Fortnite trained me to be a killer and to floss on the corpses of my enemies.”43 This wink to online gaming
communities—many of which are hotbeds of right-wing radicalization—gives adherents the sense that they are in
on the joke and cultivates solidarity with him.44

40. See Chris Sonderby, Update on New Zealand, Meta (Mar. 18, 2019), https://about.fb.com/news/2019/03/update-on-new-zealand.
41. See Elise Thomas, Manifestos, Memetic Mobilization and the Chan Boards in the Christchurch Shooting, Counterterrorism Yearbook 19, 20 (2020).
42. See Graham Macklin, The Christchurch Attacks: Livestream Terror in the Viral Video Age, 12 CTC Sentinel 18, 25 (2019).
43. Id. at 19.
44. See Aja Romano, How the Christchurch Shooter Used Memes to Spread Hate, Vox (Mar. 16, 2019), https://www.vox.com/culture/2019/3/16/18266930/christchurch-shooter-manifesto-memes-subscribe-to-pewdiepie; Linda Schlegel, Extremists’ Use of Gaming (Adjacent) Platforms: Insights Regarding Primary and Secondary Prevention Measures (Sept. 21, 2021), https://home-affairs.ec.europa.eu/whats-new/publications/extremists-use-gaming-adjacent-platforms-insights-regarding-primary-and-secondary-prevention_en.
The Christchurch shooter’s calls to violence begat more killings. A month later, on the last day of Passover in 2019,
a white supremacist livestreamed his assault on a synagogue in Poway, California. The Poway shooter killed one
and injured three, including the rabbi. His online manifesto claimed the Christchurch shooter “was a catalyst
for me personally.”45 Fewer than four months later, a racist shooter in El Paso, Texas, began his manifesto with
the words, “In general, I support the Christchurch shooter and his manifesto.”46 The El Paso shooter cited the
Christchurch manifesto as the turning point at which Hispanic immigrants became his “target.”47 He killed 22
people, injured 24 others, and claimed this was “just the beginning of the fight for America and Europe.”48

F. The Influence of Prior Extremist Writings on the Buffalo Shooter’s Writings


The Buffalo shooter explicitly claimed the Christchurch shooter as his direct inspiration. He claimed that the
Christchurch shooter’s livestream “started my real research into the problems with immigration and foreigners
in our White lands, without his livestream I would likely have no idea about the real problems the West is
facing.”49 According to his manifesto, the Buffalo shooter found a video of the Christchurch livestream and the
Christchurch shooter’s manifesto on the online image board 4chan in May 2020, a year after the attack. He
sought to “follow [the Christchurch shooter’s] lead and the attacks of so many others like him.”50

Like the Christchurch shooter, the Buffalo shooter used niche cultural references in his manifesto as a signal to
fellow online right-wing extremists, displaying antisemitic and racist memes that circulate widely on right-wing
extremist forums and joking that “covid vaccine juice” could be the reason he planned to commit the shooting.51

The Buffalo manifesto plagiarized liberally from the Christchurch manifesto. It overlaps 63 percent with the
earlier document, with 23 percent of the texts matching word-for-word. The Buffalo shooter also used the
same questions in the same order as the Christchurch shooter in the Q&A portion of his writings.52 The Buffalo
manifesto is significantly longer than the Christchurch manifesto: 47,000 words compared to 17,000 words.
Researchers at the Anti-Defamation League believe the Buffalo shooter used the Christchurch manifesto as
a starting point and then added to it, notably adding a new section blaming Jews for the demise of the white
race.53

45. Kamali, supra n. 23, at 242.
46. El Paso shooter’s manifesto at 1.
47. See id.
48. Kamali, supra n. 23, at 242–43.
49. Manifesto at 8.
50. Id. at 13.
51. See id. at 10.
52. See Striking Similarities Between Gendron and Tarrant Manifestos, Anti-Defamation League (May 24, 2022), https://www.adl.org/resources/blog/striking-similarities-between-gendron-and-tarrant-manifestos.
53. Id.
Although the Buffalo manifesto does sometimes deviate from the Christchurch manifesto, these deviations often
resemble aspects of other right-wing extremist manifestos that pre-date the Christchurch shooting. For example,
his description of firearms and tactical gear mirrors the Norway attacker’s lengthy description of weapons and
body armor. Both shooters provided detailed information about the various weapons and armor available, and
even described the benefits and shortcomings of using different types of gear to carry out an attack.

His manifesto also shares striking similarities with that of the perpetrator of the 2015 Charleston AME Church
shooting. The Charleston manifesto argued that the “issues” with Jews are “not their blood,” but rooted in Jewish
religion and identity.54 The Buffalo shooter made a similar argument in his own manifesto, explaining his belief
that modern Jews are ethnically white, and that Jewish religion is the source of the “problems” with Jews, rather
than Jewish genetics.55 Additionally, both shooters express admiration for East Asians. The Charleston shooter
described his “respent (sic) for the East Asian Races,” and suggested that, if the white race were to come to an
end, East Asian people could “carry something on.”56 Similarly, the Buffalo shooter wrote that East Asians are
“quite admirable,” citing what he perceives as their “superior traditional values and genetics.”57

And the Buffalo shooter drew from previous shooters, who, like the Christchurch shooter, captured their violence
on video. When planning his Buffalo attack, he learned online that “a previous attack was recorded on Twitch
(Halle Synagogue Shooting) that lasted about 35 minutes, which for me shows that there is enough time to
capture everything important. This may not work as intended if it’s reported and taken down early.”58

G. The Buffalo Shooter’s Content Doubles as an Inspirational Guide and Instructional Manual for the Next Mass Shooter
The Buffalo shooter’s writings show he understood that prior manifestos and graphic content exerted power
in recruiting others to commit mass violence. By creating similar content, the shooter sought to situate himself
within that tradition, attempting to build his legacy and pave the way for others to follow in his footsteps.

In addition to his manifesto, he also kept a personal diary in the months before the Buffalo attack that purports
to provide an intimate window into his ideological beliefs, daily activities, relationships, and psyche. He intended
this document to be read and circulated by others alongside his manifesto as a means of perpetuating his
legacy and provoking future violence. He kept his diary on Discord, a voice and instant messaging social
platform originally designed to facilitate conversation between remote video gamers in real time. He recorded
his daily activities on a personal server that he did not invite anyone to join—access to the content was
apparently restricted only to him. Just before executing his attack, he invited select individuals to view his Discord
logs and published an edited version of these logs—his diary—for an audience that he imagined would want to
understand and possibly mimic him.

54. See Charleston shooter’s manifesto at 4.
55. See Manifesto at 26–30.
56. Charleston shooter’s manifesto at 4.
57. Manifesto at 53.
58. Id. at 142.
The transcript of the edited Discord logs is approximately 700 pages, containing original posts written by him,
links to outside content, and memes and images pasted onto his server. Much of what was written in the Discord
logs was incorporated into his manifesto, evidencing his desire to write for posterity’s sake, even in his private
diary.

Nearly one hundred pages in his manifesto detail his research into which weapons, helmets, body armor, and
other tools he planned on buying. He compares prices on different websites and meticulously discusses the pros and cons of multiple models of the same item. He shares links to weapon tutorials and websites where he bought equipment for the attack. He hoped that these detailed accounts of the materials he used, considered using, or deemed not good enough for a mass shooting would collectively help the next shooter gather the best materials in far less time.

He also meticulously recorded his plan for the attack in the Discord diary, leaving behind a manual for those
who he hoped would follow in his footsteps. The logs detail the process of funding and purchasing equipment,
selecting a target, and configuring technology for broadcasting the shooting. He funded his efforts primarily
through the sale of personal items and silver at flea markets and to buyers he met through 4chan and Reddit.59
Often he could not afford the quality of equipment he would have otherwise selected, and he wrote at length
about what he would have purchased with additional time and resources, attempting to establish himself
as a source of knowledge for future attackers. According to his own writings, his knowledge of weapons and
related equipment came from personal hunting experience, instructional YouTube videos like those of Garand
Thumb, and the 4chan weapons board “/k/”.60 He also used the /k/ board as a marketplace to purchase some
body armor, in addition to browsing gear exchanges, pawn shops, and eBay.61 He bought ammunition and the
rifle he used to commit his attack from McLain’s Sporting Goods and from Vintage Firearms, two gun shops
in New York.62 He also wrote of frequent trips to New York State public lands to practice shooting, detailing
his experiences and setbacks with various configurations of his equipment, and advocating to readers the
importance of training before an attack.

59. “I took a picture of my coins and I’m going to attempt to sell them all on r/pmsforsale tommorow; . . . r/GearTrade may also be a good option to sell equipment.” Buffalo shooter’s Discord diary at 51, Jan. 13, 2022 [hereinafter “Discord Diary”].
60. “Heres my plateland RMA rant: // Can someone give me the rundown on RMA? Are they still doing sketchy tests on their plates? Are the UHMWPE they claim actually just fiberglass? Why are their 1189 gen 1 series still available? Why do they make steel plates despite knowing that they are dangerous? Or more importantly do I have to stop listening to whatever Anonymous says on /k/ // ‘To our knowledge, our very own model 1189 level 4 hard armor plate is the strongest body armor in the world.’ And whats up with all this on their blogs and item descriptions? Who makes these? Am I schizophrenic?” Discord Diary at 356, Mar. 16, 2022.
61. “Just ordered 2 pairs of darn tough T4021 desert tan large socks on eBay, let’s see if it was worth the hype, cost me $56 though.” Discord Diary at 29, Jan. 4, 2022.
62. “Then I went back to Binghamton and went to McClain’s Sporting Goods, and looked at ammo prices for 5.56, I picked up 2 boxes of BPS 12 gauge 00 Buck for ~18 dollars and change, then I went to Vintage // Firearms and investigated the AR they had very closely.” Discord Diary at 64, Jan. 19, 2022.
The Buffalo shooter’s manifesto also appropriated, word-for-word, the Christchurch manifesto’s calls to future
violent or accelerationist action. In his section “Answers to my people/supporters[’] questions” he tells them to
make plans, form alliances, get equipped and then act.63 Doing so, he claims, will ensure that “one day our own
children can enjoy the rewards of our labor.”64

The Discord logs also provide some information on how and where he was radicalized (with the caveat that
his writings reflect his own agenda). His self-identified radicalization coincided with the start of the COVID-19 pandemic. According to his diary, it was then that he began spending time on 4chan. Through posts on 4chan, he wrote, his radical beliefs first formed.65 His indoctrination into white supremacy and belief in white genocide was furthered by his browsing of certain subreddits on Reddit and fueled by his research on eugenics, the Great Replacement Theory, and other longstanding race theories.66

63. See Manifesto at 10.
64. Id. at 166.
65. See Discord Diary at 88, Jan. 30, 2022. See also “I only really turned racist when 4chan started giving me facts that they were intellectually and emotionally inferior.” Discord Diary at 41, May 5, 2022.
66. “Of course, many of my beliefs come from reddit too. Many subreddits I joined have been banned but they show up on r/AgainstHateSubreddits all the time. One’s [sic] that are still around include r/greentext, r/4chan, r/PoliticalCompassMemes, r/SocialJusticeInAction, r/LoveForLandlords, and r/AntiHateCommunites, of which I am actually in their discord :)” Discord Diary at 41, Jan. 30, 2022.
III. The Role of Online Platforms in the Buffalo Shooting

Despite the growing use of platform-specific content moderation policies to combat hateful content and the
coordination that some platforms have undertaken to refine those policies and practices, the Buffalo shooter
describes in his own words the racist and xenophobic content available online that radicalized him. He also
turned to online platforms to seek advice on assembling his lethal arsenal, and used yet other platforms in an
effort to disseminate his violence.

A. Online Memes Helped the Buffalo Shooter Learn about the Great Replacement
Theory and Express his Views in his Manifesto
Globally, internet memes have been an effective means to mainstream white supremacist extremism and
introduce it to new audiences. Memes are “typically visual cultural elements that use jokes . . . in ways that are
shared and altered repeatedly by other users online, usually in anonymous ways.”67 By using humor, memes can
soften extremist ideas and make them more palatable to outsiders, while simultaneously creating an in-group—a
community that understands the sometimes deeply encoded in-jokes.68 Memes also give extremists cover while they incite violence because they can always claim that they were just joking around.69

Like the Christchurch shooter before him, the Buffalo shooter peppered his manifesto with memes, in-jokes,
and slang common on extremist websites and message boards. The manifesto included memes about “black
privilege,” Jewish leaders in the media, Jewish religious ritual, Jewish involvement in U.S. slavery, the creation of
the COVID-19 vaccines, and the NAACP. He emphasized the power of using “infographics, shitposts, and memes
that the White race is dying out, that blacks are disproportionately killing Whites.”70 And he directed anticipated
future mass shooters to “create memes, post memes, and spread memes. Memes have done more for the ethno-
nationalist movement than any manifesto.”71

67. Miller-Idriss, supra n. 27, at 66.
68. See id.
69. Id.
70. Manifesto at 13.
71. Id. at 169.
B. Online Platforms Cited by Buffalo Shooter as Formative to Ideology of Hate
In his writings, he identifies the websites that contributed the most to his growing hatred of nonwhites. On
January 30, 2022, he wrote in his Discord diary that his “current beliefs started when I first started to use 4chan
a few months after covid [sic] started . . . [o]f course, many of my beliefs came from reddit too. Many subreddits
I joined have been banned but they show up on r/AgainstHateSubreddits all the time.” Similarly, early on in the
Manifesto, he again points to 4chan—specifically the site’s /pol/ board—as the primary catalyst for his inculcation
into Great Replacement Theory. He cites several fringe white supremacist websites that gave him greater
“exposure” to these ideas, but notes that he was led to these sites from 4chan. Indeed, he identifies 4chan as
the site where he first watched a portion of the Christchurch shooting, via “a short gif of a man walking into a
building and shooting a shotgun through a dark hallway.”

By his own account, he began using 4chan regularly in May 2020. Because his Discord diary dates back only to November 2021, it does not document the full progression of his radicalization. Throughout
his diary he consistently uses hateful and violent language. He credits much of his ideology and knowledge
to 4chan, and those boards surely played a role in indoctrinating him into hateful, white supremacist internet
subcultures. In his own words just nine days before the attack: “These experiences didn’t make me racist against
blacks though, maybe uncomfortable around the majority of them, since I only relate them to trouble. I only
really turned racist when 4chan started giving me facts that they were intellectually and emotionally inferior.”72

1. 4chan “Politically Incorrect”


The /pol/ (“politically incorrect”) board on 4chan is nominally dedicated to political discussion, but frequently
includes white supremacist, antisemitic, and other types of hateful speech, as well as far-right extremism and
conspiracy theories. In the early stages of planning his attack, the Buffalo shooter wrote, “Every time I think
maybe I shouldn’t commit to an attack I spend 5 min of [sic] /pol/, then my motivation returns.”73 According to his
writings, most of the memes in his Discord diary are from /pol/.74 His Discord logs make it evident that he treated
the racist charts, graphs, and memes he found there as forms of news and proof of white genocide.75

To the extent that 4chan content is moderated, the work is performed by volunteer moderators who review posts, remove content violative of the website’s rules, ban users, and close threads. Additional volunteers referred to as “janitors” are authorized only to delete posts and submit user-ban requests to moderators. Moderators and janitors are selected from a pool of applicants whom 4chan occasionally solicits from among regular users.
The website has publicly available rules, which include “Global Rules” that apply across all of (or most of) the
website’s boards, and additional board-specific rules. In general, however, the site largely permits any speech
unless it violates United States law or is considered off-topic within the board in which the post is made.

72. Discord Diary at 33, May 5, 2022.
73. Id. at 21, Dec. 27, 2021.
74. Id. at 194, Feb. 23, 2022 (“most of these pictures I get from /pol/ humor threads lel”).
75. See Discord Diary at 88, Jan. 30, 2022 (“Although many threads are utter shitposts of trash quality, I feel there are many problems of the world being addressed that are not popular or talked about on regular media. White genocide is real when you look at data, but is not talked about on popular media outlets.”).
The /pol/ board on 4chan sees tens of thousands of posts each day. According to one website that tracks usage
data on 4chan, the /pol/ board is the most popular board on 4chan and receives over 115 posts every minute.
The site’s extremely permissive moderation environment, automatic expiration of user-generated content, and
default anonymity contribute to an atmosphere that allows—and even encourages—racist, hateful speech.
Indeed, although 4chan’s publicly posted “Global Rules” dictate that users may not post “Racism” outside of the
anything-goes /b/ board (known as the “random” board, where virtually all posting rules are suspended), 4chan
conceded in a letter to the OAG that this particular rule “is not applied to /pol/.”

4chan also operates outside of the efforts by other online platforms to rein in hate speech and graphic content that contributes to the cycle of white supremacist violence. The company told the OAG, for instance, that they “have not communicated with GIFCT” and “have not performed any internal investigation of the Buffalo Mass Shooting beyond that required to respond” to the OAG. Indeed, in the last six and a half years, despite repeated references in that time to the use of the website by perpetrators of mass violence, the company has performed only a single internal investigation into the use of its site to advance terrorism, violent extremism, mass shooting events, or hate crimes—begun in response to an inquiry from the Select Committee to Investigate the January 6th Attack on the United States Capitol—and none at all into the use of the website to promote or facilitate the sale of firearms, body armor, or other military weapons or equipment, or to illegally modify weapons.76 Given this hands-off attitude, it is perhaps no surprise that threads on the /pol/ board regularly include links to white supremacist, racist, and Nazi literature and ideology.

2. Reddit
Reddit is far more popular and mainstream than 4chan; according to online ranking data firm Semrush, it
is the 9th most-visited website on the internet. Reddit’s extraordinary success fosters innumerable types of
communities, some with anodyne or positive aims and vast membership. For instance, one popular subreddit,
r/AskReddit, describes itself as “the place to ask and answer thought-provoking questions” and claims 36.5
million members. Yet until 2020, the site had no clear platform-wide guidelines preventing hateful speech. As the
Stanford Internet Observatory has put it, the website’s permissive atmosphere and the posts on certain visible
communities gained the website a reputation as a “cesspool of racism.” On the same day that it implemented its
current policy prohibiting hateful content, it also removed over 2,000 subreddits for violating that policy.77

76. See Justin Ling, ‘Cheering Section’ for Violence: The Attacks That Show 4chan Is Still a Threat, Guardian (May 1, 2022), https://www.theguardian.com/technology/2022/may/01/4chan-extremist-online-forum-raymond-spencer (discussing references to 4chan made in the writings of a man in Toronto who killed several people in a “vehicle ramming rampage” in 2018, the Christchurch shooter, and a man in Washington, D.C. suspected of shooting four people in the spring of 2022).
77. See Adriana Stephan, Comparing Platform Hate Speech Policies: Reddit’s Inevitable Evolution, Stan. Freeman Spogli Inst. for Int’l Stud.: FSI News (Jul. 8, 2020), https://fsi.stanford.edu/news/reddit-hate-speech.
Content posted on Reddit is subject both to Reddit’s global Content Policy and to subreddit-specific rules.
Reddit’s platform-wide Content Policy now asserts with respect to hateful content that “[c]ommunities and
users that incite violence or that promote hate based on identity or vulnerability will be banned.” Each subreddit
may have its own specific rules, which may be different or stricter than the global Content Policy. Moderation is
conducted through several avenues: administrators enforce the Content Policy, and moderators patrol individual
subreddits and enforce the rules specific to each of those communities.78 A mixture of administrator/moderator human review, user reports, and automated tools elevates potentially problematic posts for possible deletion and
further action. Most moderation actions, however, are performed by moderators—users drawn from the Reddit
community—without involvement from Reddit itself.79 The company asserts that this “self-moderation effort
at the community level continues to be the most scalable solution we’ve seen to the challenges of moderating
content online.”80
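
In the abstract, that layered model amounts to multiple independent signals feeding a shared review queue. A hypothetical Python sketch of the pattern (the signal names, threshold values, and triage logic are illustrative assumptions, not Reddit’s actual implementation):

    # Hypothetical sketch of layered moderation signals feeding a review
    # queue; signal names and thresholds are illustrative, not Reddit's.
    def should_elevate(post: dict) -> bool:
        signals = [
            post.get("user_reports", 0) >= 3,      # community reports
            post.get("classifier_score", 0.0) >= 0.8,  # automated tooling
            post.get("mod_flagged", False),        # subreddit moderator action
        ]
        # Any one signal routes the post to human review for possible removal.
        return any(signals)

    post = {"user_reports": 4, "classifier_score": 0.2, "mod_flagged": False}
    print(should_elevate(post))  # True -> elevate for further action

A design of this kind depends heavily on the community-level signals; as the figures below suggest, content that is never reported may never enter the queue at all.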

According to Reddit, 467 subreddits were removed from the site in 2021 for engaging in hateful conduct,
representing a 93% decrease from 2020 in subreddit removal for that reason.81 That apparent success may be at
least partially the result of Reddit’s mass suspension of thousands of subreddits in 2020 upon the implementation
of the website’s hate policy guidelines. Yet of the dozens of millions of pieces of content removed by administrators and moderators in 2021, fewer than 40,000 were removed due to hateful content.82 Reddit’s own internal analysis
from the prior year suggested that 40,000 pieces of hateful content were posted to Reddit every day, and
estimated that 78% of “severe hate content” went unreported and unreviewed. Indeed, as the shooter’s Discord
logs indicate, Reddit’s own userbase has taken to policing the site in r/AgainstHateSubreddits and elsewhere.

Some of the subreddits specifically cited by the Buffalo shooter in January 2022 as sources of extremist content no longer exist; attempting to navigate to those subreddits’ URLs now leads to a page that says, “This community was banned for violating Reddit’s rule against promoting hate.” Yet the OAG’s investigation revealed, as described
below, that Reddit was an outlier among large, mainstream online platforms in how long it took the company to
remove posts linking to graphic video of the Buffalo shooting, even after the OAG alerted the site by submitting
user reports.

78. See Reddit, 2021 Transparency Report 4–8 (Feb. 16, 2022), https://www.redditinc.com/assets/images/site/Reddit-Transparency-Report-2021.pdf (Content Removals).
79. Id. at 2 (Introduction).
80. Id.
81. Id. at 13 (Content Types).
82. Id. at 10.
C. Online Platform Used by the Shooter to Plan the Details of His Attack
As described above, the Buffalo shooter maintained his own Discord server which he used to keep a
comprehensive diary. On the day of the shooting, he sent out invitations to various individuals to access the
Discord server, which he had also set up to rebroadcast his Twitch livestream. By restricting access to the Discord server only to himself until shortly before the attack, he all but ensured that his writing would not be impeded by Discord’s content moderation.

Discord’s content moderation operates on two levels: at the individual user and server level, and generally across the platform. The Buffalo shooter had no incentive to operate any server-level moderation tools against his own writing. Nor did the platform’s scalable moderation tools stop him from continuing to plan his mass violence down to every last detail.

Discord does maintain a Community Guidelines policy prohibiting users from making threats of violence or
harm, or from organizing, promoting, or supporting violent extremism.83 But without users or moderators apart
from the shooter himself to view his writings, there could be no reports to the platform’s Trust and Safety Team.
In practice, he mocked the Community Guidelines, writing in January 2022, “Looks like this server may be in
violation of some Discord guidelines,” quoting the policy prohibiting the use of the platform for the organization,
promotion, or support of violent extremism, and commenting with evident sarcasm, “uh oh.”84 He continued
to write for more than three and a half more months in the Discord server, filling its virtual pages with specific
strategies for carrying out his murderous actions. Discord’s automated tools either did not scan his writing or were not calibrated to flag the profusion of popular racist memes, inflammatory rhetoric, and precise details about a planned mass shooting.

In addition to maintaining his own private diary to discuss his racial hatred and plan his attack, he also appears
to have frequented Discord servers maintained by others. In his writings, he refers to “the people in my discord
groups”85 and to a specific subreddit “of which I am actually in their discord.”86 The OAG has not assessed the
full breadth of his Discord usage, in part because the default private, invitation-only nature of most Discord
servers potentially implicates the protections of the Stored Communications Act (SCA), 18 U.S.C. §§ 2701 et seq.,
which prohibits disclosure by technology providers of the contents of their users’ electronic communications
absent a warrant, the consent of the sender or recipient, or other enumerated exceptions.87 Some of his Discord
usage was undoubtedly unremarkable. At one point in his writings, he mentions “my old discord rust group,”
apparently referring to the video game Rust, which he describes playing several times throughout the log.88 Yet
there is also evidence that he may have used the platform to develop his knowledge of body armor, the usage

83. See Discord Community Guidelines, Discord (Feb. 25, 2022), https://discord.com/guidelines.
84. Discord Diary at 75, Jan. 21, 2022.
85. Id. at 8, May 2, 2022.
86. Id. at 88, Jan. 30, 2022.
87. For purposes of this report, the OAG is not taking a position on the applicability of the Stored Communications Act as to any online platform.
88. Discord Diary at 199, Feb. 25, 2022.
of which was a key factor in allowing him to survive the exchange of gunfire with Tops’ security guard, Aaron
Salter.89 Perhaps most notably, he sent the invitation to view the livestream through Discord directly to a group
of users, suggesting that for some reason he may have perceived at least some of them to be receptive to his
extremist message, raising the possibility that he had previously expressed his hateful views to them in some
capacity on the platform.

D. Online Platforms Used by the Shooter to Equip His Arsenal


One of the shooter’s primary uses of various online platforms in the preparation of his assault was to educate
himself about, and in some cases acquire, the arms and armaments that he would eventually bring to Buffalo
in order to maximize the carnage he could inflict in a short timespan and minimize the risk to his own health.
His Discord logs reflect months of research he conducted on rifles, body armor, ballistic helmets, and other
equipment in order to carry out his explicit goals to “kill as many blacks as possible” and “avoid dying.”

Dozens of pages of his manifesto are dedicated to distilling the information he gleaned about rifles, firearms
components, and body armor. He asserts that his online research, including browsing instructional firearms
videos on YouTube, helped instruct him on how to convert the fixed magazine in the rifle he used in his massacre
to a base that could accept a detachable magazine, allowing for swift reloading and a higher ammunition
capacity. He used 4chan’s /k/ board, which is dedicated to firearms discussions, to seek advice about his gear
and to buy firearms components and equipment to facilitate his attack. Through his use of /k/, he also found a
Discord channel named Plate Land that he used to collect advice about ballistics and protective gear, the use of
which helped him survive an early exchange of gunfire with, and ultimately kill, Tops security guard Aaron Salter.

1. YouTube
Although the shooter asserts that he received instruction from YouTube videos in how to convert an AR-15-style
rifle with a fixed magazine to accept a detachable magazine—in contravention of New York law—the YouTube
videos he cites for this knowledge do not clearly provide specific direction on how to do so. One of the videos
actually demonstrates the use of an attachment to convert a rifle to use only a fixed magazine in order to comply
with New York and other states’ assault weapons bans. The presenter just happens to mention that the product
box itself notes that the device can be removed with a drill.

Another video cited by the shooter in the context of his illegal assault rifle conversion demonstrates the
installation of a different product created for the same purpose of fixing the magazine to the weapon,
purportedly in order to comply with New York law. The shooter’s notes on the video describing how he might
uninstall the product, which mention the possibility that he might need to “heat treat the part until breaks off,” do
not come from the video itself or the presenter’s remarks.

89. See Dan Feidt, Buffalo Mass Shooter Likely Sought Combat Gear Advice on Online Chats, Unicorn Riot (May 14, 2022), https://unicornriot.ninja/2022/buffalo-mass-shooter-likely-sought-combat-gear-advice-on-online-chats/.
YouTube maintains a policy prohibiting videos that are “intended to sell firearms, instruct viewers on how to
make firearms, ammunition, and certain accessories, or instruct viewers on how to install those accessories.”
The OAG’s review of YouTube videos cited by the shooter in his writings did not uncover any videos that appear
to facially violate this policy. Rather, most of the firearms-related videos cited by the shooter concerned product
reviews, demonstrations, and maintenance, particularly for AR-15-style weapons and tactical protective gear
such as body armor, helmets, ear protectors, and tourniquets.

2. 4chan /k/ Board


4chan does not have any global policies applicable specifically to firearms. It does not address the use of its boards to facilitate illegal weaponry or sales of firearms, weapons, or components. On the contrary, the /k/
board, subtitled “Weapons,” has only one board-specific rule that makes it clear that virtually nothing related to
weapons is prohibited: “All weaponry is welcome. Military vehicles/knives/other weapons are included, this board
isn’t just for firearms!” In the course of the OAG’s investigation, 4chan indicated that it has never performed a
single investigation into the use of its site to promote or facilitate illegal weapons modifications, or the sale of
firearms, body armor, or other military weapons or equipment.

There is no indication that the shooter bought or sold any components through /k/ that are in and of themselves
illegal, or that if used would subject an otherwise legal weapon to a prohibition. Nor is there any indication
that he purchased any products through the platform in an illegal manner, such as purchasing a firearm from
an unlicensed seller or without undergoing a background check. Nevertheless, 4chan has made no public
statements of policy to dissuade others from doing exactly that.

3. Discord
The only publicly known use that the Buffalo shooter made of Discord in connection with equipment for his
attack, apart from using the platform to set out his personal thoughts on a private server, was to discuss the
merits of different kinds of protective gear in a server called Plate Land. On that server, reported to feature
discussions of guns and armor, he posted at least 83 messages in the channel known as “#bag-general” between
August 1, 2020 and January 1, 2021.90 In some of those messages, he asked specific questions about types of body
armor, for instance asking about a specific manufacturer’s steel body armor, “if it had a spall coating doesnt
[sic] it stop all fragmentations?” Other messages seemed calculated to find other sources of knowledge on the
platform, for example asking on August 1, 2020 whether there were any other Discord servers that talk about
tactical gear. Some indicated a desire to buy or sell body armor. None of the available messages indicated
anything about his intentions to cause violence or to incite racial hatred. One message used a derogatory slur
towards the LGBTQ+ community to refer to eBay, apparently because he had been reported for attempting to sell
body armor on the website and was unable to complete the sale.

90. Id.
Discord’s publicly posted Community Guidelines prohibit the sale or facilitation of sale of prohibited or potentially
dangerous goods, including firearms and ammunition—although that prohibition did not extend to body armor
or tactical gear. It is not clear whether Plate Land itself had any server-level rules about the sale of body armor
(although the shooter continued to post in the server apparently uninterrupted for months after first mentioning
his desire to sell body armor products). What is known is that the shooter’s Discord logs imply that Plate Land
was still active as of late April 2022.91

It is not against Discord policy to discuss the merits of various tactical products, even including firearms and
ammunition. It is also unclear whether the use of the slur would have violated Discord’s hate speech policy, which
states that it is “unacceptable to attack a person or a community based on [protected] attributes . . . or other
protected classifications.”92 The casual use of a slur to protest a company’s effective moderation of its platform
does not fall neatly into this rubric. In any event, the shooter continued to use Discord up until hours before the
shooting, and his account remained undisturbed through that time.

4. Reddit
Although the use of Reddit to acquire knowledge about tactical gear is not reflected in any significant way in
the Buffalo shooter’s writings, the shooter’s Reddit posts suggest that he did use the platform for that purpose.93
These posts indicate that he was active in subreddits dedicated to discussions of tactical gear and ammunition
contemporaneously with the planning of his attack.

Like the available Discord comments, most of these Reddit posts largely consist of exchanges of information about the pros and cons of certain brands and types of body armor and ammunition. They generally lack context from which it could have been apparent to a reader that the writer was planning a murderous rampage.
One comment, posted about a year ago, is chilling in retrospect; he asks with respect to dark-colored tactical
gear, “in low light situations such as before dusk after dawn and at nighttime it would provide good camouflage,
also maybe it would be also good for blending in in a city?” It is difficult to say, however, that this comment should have been flagged at the time it was made. Like other platforms, Reddit prohibits the use of the platform to solicit or facilitate any transactions in firearms or ammunition, but it does not prohibit the discussion of body armor, tactical gear, or firearms.

91. “I can obviously add more armor and such but I’ll just rely on the boys at plateland to continue it.” Discord Diary at 511, Apr. 25, 2022.
92. Discord Community Guidelines, Discord (Feb. 25, 2022), https://discord.com/guidelines.
93. Reddit disabled the account following the shooting and the posts are no longer publicly available.
E. Online Platform Used by the Shooter to Broadcast His Violence
Steeped in the toxic white supremacist online milieu, the Buffalo shooter intuitively grasped the power that video
of his shooting would lend to his call for increased racial violence. He himself had been drawn into this world
by watching video of the Christchurch shooting. In his writings, he expressly asserts that part of his plan is that
“[v]ideo will be livestreamed and manifesto published online to increase coverage and spread my beliefs.”94 He
also believed the real-time nature of his planned broadcast would embolden him before and during the attack,
writing, “I think that live streaming this attack gives me some motivation in the way that I know that some people
will be cheering for me.”95

He briefly considered livestreaming on Facebook Live, but ultimately rejected it because it appeared to him as
though only Facebook accountholders could watch a livestream on the platform (in fact, Facebook does permit
non-accountholders to watch livestreamed video). He settled instead on the livestream platform Twitch, because
it had no user login requirement; anyone with an internet connection and the link to the Twitch stream could
watch and record it.96

Another factor that the shooter considered important in selecting a livestreaming platform to use was the
efficacy of the platform’s content moderation; he cited to a 2019 shooting in Germany that targeted a synagogue
and kebab shop, noting approvingly that the livestream of that attack on Twitch “lasted about 35 minutes, which
for me shows that there is enough time to capture everything important.”97 The platform’s moderation efforts
were of great importance to the perceived success of his plan: “This may not work as intended if it’s reported
and taken down early,” he wrote.98 He continued to dwell on that concern, writing in his Discord diary, “I hope the
jannies [i.e., janitors/moderators] on twitch dont [sic] cancel my stream before I do anything interesting.”99

94. Manifesto at 59.
95. Id. at 61.
96. See id. at 142.
97. Id. at 142.
98. Id. at 142.
99. Discord Diary at 318, Mar. 11, 2022.
Like Discord, Twitch conducts moderation of user-generated content both at the platform level and at the
individual channel level (for example, broadcasters may set their own channel rules, designate moderators
within the chat box, and leverage automated tools). Insofar as Twitch was used to broadcast the livestream,
the platform-level moderation is the most relevant aspect of Twitch’s content moderation here; the shooter’s
plans would not have involved strengthening any content moderation efforts on his own stream, as that would
have been contrary to his aim of spreading graphic content. Moderation at Twitch’s platform level includes the
imposition of platform-wide policies, safety review and enforcement, user reporting, and machine detection.
The platform has a nearly four-page standalone Hateful Conduct and Harassment policy, which prohibits, among other proscriptions, the promotion, glorification, threat, or advocacy of violence or physical harm; the use of hateful slurs, symbols, and images; and speech or content that perpetuates negative stereotypes or expresses inferiority, hate, contempt, or disgust based on a protected characteristic.100 Notably, Twitch also expressly warns
users that malicious off-platform conduct, including but not limited to instances of violent extremism, terrorist
activities or recruiting, membership in a known hate group, and explicit or credible threats of mass violence, can
result in the issuance of on-platform enforcements against user accounts.101 Severe cases may involve the use of
third-party legal experts to assist in investigations of off-platform conduct.102 In furtherance of this goal, Twitch
partnered in April 2021 with a private law firm that specializes in investigations and workplace assessments.

Prior to the day of the shooting, the shooter used Twitch to stream three times, for a combined total of
approximately 19 minutes. The OAG’s investigation has found no evidence that he violated Twitch platform rules during those streams, that he was known to be a member of any offline hate group, or that any of his plans for the shooting were known by anyone else prior to the date of the shooting.

On the day of the attack, he followed through with this plan, inviting Discord users he may have perceived as
potentially receptive to his hateful cause to view the Discord channel he had restricted to himself until then;
fifteen users reportedly accepted the invitation.103 The invitation also included a direct link to the Twitch stream.104
Between 20 and 28 users viewed the Twitch stream on that platform. Seven of those users were logged into the
platform and could thus choose to report the livestream; users in the United States who were not logged in did
not have the option of reporting.

100. See Hateful Conduct & Harassment, Twitch Safety Center, https://safety.twitch.tv/s/article/Harassment?language=en_US.
101. See Off Service Conduct, Twitch Safety Center, https://safety.twitch.tv/s/article/Off-Service-Conduct?language=en_US.
102. Id.
103. See Jon Swaine & Reed Albergotti, Just Before Buffalo Shooting, 15 Users Signed into Suspect’s Chatroom, Says Person Familiar with Review, Wash. Post (May 19, 2022), https://www.washingtonpost.com/investigations/2022/05/19/payton-gendron-discord-buffalo-shootings.
104. Id.
He began broadcasting at approximately 2:08 p.m. Eastern time. The OAG’s investigation did not uncover
video of the first several minutes of the livestream. Based on contemporaneous posts from people viewing it,
the first twenty minutes or so consisted largely of the shooter’s drive to Tops, although his rifle and parts of his
tactical gear were visible at times. Viewers of the early minutes of the livestream were not logged in to Twitch.
At 2:26:59 p.m., the first viewer who was logged into Twitch entered the livestream. At 2:29:49 p.m., that user
also issued the first report about the livestream to Twitch Safety Ops, with the description “about to shoot up a
store.” The shooting began within approximately one minute of that report. Two other users submitted reports
by approximately 2:30 p.m. At 2:31:29 p.m., less than two minutes after the first user report concerning the
imminence of violence, Twitch caused the livestream to stop and permanently banned the shooter from any
future use of their platform.

Prior to the initial user report, Twitch did deploy at least some automated tools on the shooter’s stream. Because the shooter was an infrequent streamer who had not met certain qualifying streaming metrics, Twitch automatically used machine learning tools to visually inspect his broadcast periodically prior to the commencement of violence. These tools did not detect any content violative of the platform’s policies.
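
Twitch has not published the design of these tools, but periodic machine inspection of a livestream generally follows a simple pattern: sample a frame at intervals, score it with a classifier, and escalate high-scoring frames for review. A hypothetical Python sketch of that pattern (the classifier stub, sampling interval, and threshold are assumptions, not Twitch’s actual system):

    # Hypothetical sketch of periodic livestream inspection; the model,
    # interval, and threshold below are illustrative assumptions.
    import time

    def classify_frame(frame) -> float:
        # Stand-in for a vision model returning a policy-violation
        # score in [0, 1].
        return 0.0

    def flag_for_human_review(frame) -> None:
        print("frame escalated to safety review")

    def inspect_stream(get_current_frame, interval_seconds=60, threshold=0.9):
        # Sampling frames at intervals, rather than scoring every frame,
        # trades detection latency for compute cost across many streams.
        while (frame := get_current_frame()) is not None:
            if classify_frame(frame) >= threshold:
                flag_for_human_review(frame)
            time.sleep(interval_seconds)

Any sampling design of this kind leaves windows between inspections, which may help explain why, in this case, a user report rather than machine detection proved the faster signal.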

In addition to broadcasting directly through Twitch, the shooter used his Discord channel to rebroadcast the
Twitch livestream. The invitations that he sent to his perceived friends and allies on Discord to view his channel
permitted them to watch the livestream and read his writings within the Discord platform.105 However, because
the livestream on Discord was simply rebroadcasting the feed from Twitch, the actions Twitch took to stop the
livestream simultaneously ended the stream on Discord.

105. See Manifesto at 142.
IV. Online Platforms’ Response to the Buffalo Shooting

A. The Spread of Graphic Content Related to the Buffalo Shooting


The first few minutes of the attack unfolded in real time via a livestream on Twitch. Although only a small number
of Twitch users witnessed it at the time, copies of the recorded video were subsequently posted online. Notably,
the video of the shooting that has taken root online appears to have originated from a single individual using
footage recorded from the livestream through Discord. That individual, a resident of Washington State, appears
to have uploaded the video to an obscure file-sharing site and posted a link to the uploaded video on 4chan at approximately 10:45 p.m. The post was removed by a 4chan moderator approximately 30 minutes later. By
that time, however, other 4chan users had reposted the link approximately 75 times. Only a handful of these
additional posts were removed by moderators.

The link quickly spread to other websites. Approximately four minutes after the link was first posted on 4chan,
a user posted the link on another fringe site, kiwifarms.net.106 Shortly thereafter, the link began appearing on
mainstream websites, including on Twitter within 17 minutes and on Reddit within an hour. In the following days,
the link was posted and reposted on these and other sites thousands of times.

Copies of the video itself also spread quickly. Within 30 minutes of when the link to the video was first shared,
another 4chan user had downloaded the video from the file sharing site, uploaded it to a video sharing platform,
and posted a link to the video sharing platform on 4chan. A condensed version of the video was uploaded to
Reddit’s own video sharing service less than an hour and a half later. Links to these videos, as well as to other
copies of the video, similarly spread.

B. The Global Internet Forum to Counter Terrorism’s Response to the Tragedy in Buffalo
The Global Internet Forum to Counter Terrorism, or GIFCT, was founded in 2017 by Facebook, Microsoft, Twitter,
and YouTube. GIFCT describes its aim to “bring[] together the technology industry, government, civil society,
and academia to foster collaboration and information-sharing to counter terrorist and violent extremist
activity online.”107 Since its inception, GIFCT’s membership has expanded beyond these founding companies to
include over a dozen platforms.108 In response to the Christchurch attack, the founding members announced in
September 2019 that GIFCT would evolve from a consortium of technology companies to an independent non-
profit organization with its own dedicated technology, counterterrorism, and operations teams.

106. On September 3, 2022, Cloudflare—one of the largest providers of internet infrastructure services—blocked kiwifarms.net, essentially taking the website off of the internet temporarily, citing a sudden escalation in the use of the site in a manner that posed imminent threats to human life. See Matthew Prince, Blocking Kiwifarms, Cloudflare: The Cloudflare Blog (Sept. 3, 2022), https://blog.cloudflare.com/kiwifarms-blocked.
107. About, Glob. Internet Forum to Counter Terrorism, https://gifct.org/about.
108. See Membership, Glob. Internet Forum to Counter Terrorism, https://gifct.org/membership.
GIFCT’s chief role in the wake of a violent incident such as the Buffalo shooting is to serve as an information-sharing tool for member platforms. Current technology allows certain digital content to be “hashed,” or assigned a digital value that a computer can use to immediately identify the content. GIFCT maintains a database of these hashes that member platforms can use to identify content that is visually similar to content that has been removed by other member platforms.
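
To illustrate how such a hash-sharing database can support moderation, the Python sketch below checks the hash of an upload against a set of shared hash values. It is a simplified, hypothetical example: in practice members use perceptual hashing algorithms (such as the open-source PDQ algorithm for images), whose hashes are compared by Hamming distance so that near-duplicate copies still match; the toy 16-bit values and the distance threshold below are illustrative assumptions, not GIFCT parameters.

    # Hypothetical sketch of matching an upload against a shared hash
    # database; hash values and threshold are illustrative, not GIFCT's.
    def hamming_distance(a: int, b: int) -> int:
        # Number of differing bits between two equal-length hash values.
        return bin(a ^ b).count("1")

    def matches_shared_database(upload_hash: int, shared_hashes: set,
                                max_distance: int = 2) -> bool:
        # Perceptual hashes of near-duplicate media differ in only a few
        # bits, so matching is by distance rather than exact equality.
        return any(hamming_distance(upload_hash, h) <= max_distance
                   for h in shared_hashes)

    shared_hashes = {0b1011001011100001, 0b0100111100001010}  # toy 16-bit hashes
    reupload = 0b1011001011100011  # altered copy: differs by one bit
    print(matches_shared_database(reupload, shared_hashes))  # True

A match does not itself remove anything; each member platform applies its own policies to matched content, which is consistent with the varied moderation outcomes described below.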

On May 14, 2022, at 4:52 p.m. EDT, approximately 2 hours and 22 minutes after the attack, GIFCT activated its
Content Incident Protocol (CIP). GIFCT actions included alerting all GIFCT members that the CIP had been
activated, enabling members to share hashes of the perpetrator-produced content depicting the attack, and
alerting the U.S. government—as the impacted national government in this incident—that the CIP had been
activated in response to the shooting. Between when GIFCT activated the CIP and its conclusion, its members
added approximately 870 visually distinct items (740 images and 130 videos) to the GIFCT hash-sharing
database, to aid in content moderation. All of the mainstream platforms described above designated the
shooting video and the manifesto as a violation of their policies and attempted to remove it through content
moderation processes. The success of these attempts varied significantly.

C. Mainstream Platforms’ Moderation of Graphic Content Related to the Shooting
From May 20, 2022 to June 20, 2022, OAG investigators searched a number of mainstream social networks and related sites for the manifesto and video of the shooting. Despite these platforms’ moderation efforts, we repeatedly found copies of the video and manifesto, and links to both, on some of the platforms even weeks after the shooting. The OAG’s findings most likely represent a mere fraction of the graphic content actually posted, or attempted to be posted, to these platforms. For example, in the nine weeks immediately following the attacks, Meta automatically detected and removed approximately 1 million pieces of content related to the Buffalo shooting across its Facebook and Instagram platforms.109 Similarly, Twitter took action on approximately 5,500 Tweets in the two weeks following the attacks that included still images or videos of the Buffalo shooting, links to still images and videos, or the shooter’s manifesto. Of those, Twitter took action on more than 4,600 Tweets within the first 48 hours of the attack.110

109 This automated detection is inherently both underinclusive and overinclusive (for instance, the tools may take action on content that looks similar to the Buffalo shooting but is not related, and may not take action on new content that has not been determined to be related).
110 The 5,500 Tweets comprised: (i) approximately 2,700 Tweets associated with a user report that were escalated or removed for featuring portions of the video or graphic still images; (ii) approximately 2,000 Tweets that were blocked at either the point of upload or within one or two seconds of posting for containing other video of the shooting or graphic still images; and (iii) approximately 800 Tweets that were escalated or removed for including links to third-party websites that hosted the manifesto or video. Twitter also blocked links to over 75 unique URLs known to host the manifesto or video, preventing users from being redirected off-platform to those websites.

When we found graphic content as part of these efforts, we reported it through user reporting tools as a
violation of the platform’s policy. Among large, mainstream platforms, we found the most content containing
video of the shooting, or links to video of the shooting, on Reddit (17 instances), followed by Instagram (7
instances) and Twitter (2 instances) during our review period. We also found links to the manifesto on Reddit (19
instances), the video sharing site Rumble (14 instances), Facebook (5 instances), YouTube (3 instances), TikTok (1
instance), and Twitter (1 instance). Response time varied from a maximum of eight days for Reddit to take down
violative content to a minimum of one day for Facebook and YouTube to do so.

We did not find any of this content on the other popular online platforms we examined for such content, which
included Pinterest, Quora, Twitch, Discord, Snapchat, and Telegram, during our review period. That is not to
say, however, that it does not exist on those platforms. Some of those platforms offer comprehensive non-public
communications channels outside the scope of the OAG’s search for purposes of this investigation.

As demonstrated by the persistence of this content, current moderation techniques are far from perfect. User reporting relies on individual users taking action to moderate their peers. Users are not experts in content moderation and have no special training. At times, offending content may be circulated among a cohort of users who share the views of the person publishing such content and who accordingly may not want to enforce the platform’s moderation policies. In addition, user reporting is prone to error and manipulation, including the weaponization of violative-content reporting to harass protected groups.

Paid human moderation by the platforms also struggles at scale. No human content-moderation team can review and respond to all user-generated content in real time. Human review is also prone to inconsistency: moderators face a high volume of content and may struggle to apply guidelines uniformly. Human moderation carries severe psychological costs as well, sometimes resulting in significant trauma to moderators’ mental health. In 2020, for example, Facebook agreed to pay $52 million to current and former moderators to compensate them for mental health issues that the moderators alleged were caused by persistent exposure to extreme content as part of their job.111 TikTok is currently facing a similar lawsuit.112

Although automated review holds significant promise, it also has a number of limitations, including:

• Under-inclusivity: Most media is not pre-scanned prior to posting. Thus, even violative content can be posted, viewed, downloaded, and reposted before an automated review occurs.

• Difficulty in identifying altered media: Media identification is largely accomplished by assigning hash values to offending content and comparing those values against new uploads. Altering the media, such as adding or removing a few frames, can change the hash value and thus delay detection, as illustrated in the sketch following this list. For example, one video OAG investigators found on Twitter had black bars added to the frame of the video by the poster to avoid automated detection.

111 See Scola v. Facebook, Inc., No. 18-civ-05135 (Cal. Super. Ct. filed Sept. 21, 2018).
112 See Young v. Bytedance Inc., No. 3:22-cv-01883 (N.D. Cal. filed Mar. 24, 2022).

• Links to third-party sites: A link may redirect a user to content on a third-party site that has different
or less restrictive content moderation policies. Thus, the platform must scan content hosted elsewhere,
or resort to manually collecting a list of links to the violative content, largely through user reporting, for
identification and removal. We attribute the large number of links to videos found on Reddit, and the
links on Instagram, to this limitation.
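
The brittleness of exact hashing described above can be illustrated with a short sketch. The video bytes here are a stand-in; the point is that a one-byte alteration—analogous to an added or removed frame, or black bars added to the picture—produces an entirely different cryptographic digest, so an exact-match lookup misses the altered copy.

```python
# Minimal demonstration of why altered media evades exact-hash matching.
import hashlib

original = b"...video bytes..."   # stand-in for the bytes of a real video
altered = original + b"\x00"      # a trivial one-byte alteration

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())   # completely different digest
```

Perceptual hashes (like the average hash sketched earlier) change by only a few bits under small edits, which is why near-match lookups can catch some, though not all, altered copies.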

OAG investigators also found graphic video of the shooting on many fringe websites and less-trafficked platforms. Where possible, we reported the content. With one exception, the videos were not taken down and remain live on those websites.

In particular, links to the graphic video of the shooting and the shooter’s writings circulated widely on 4chan.
Between May 14, 2022, the day of the shooting, and July 8, 2022, links to video of the shooting were posted on
4chan 382 times, links to the manifesto were posted 179 times, and links to the shooter’s Discord diary were
posted 53 times. In most cases, these posts were not removed because 4chan does not prohibit violent or hateful
content. Indeed, when discussing policy on another mass shooting video, a head moderator stated that “if it’s on
a board like /b/ or /pol/ it’s not even against the rules” and “the footage itself isn’t illegal, any more than footage
of any act of violence is illegal.”

D. Improved Moderation Policies Reduced Prevalence of Violent Content, but Additional Improvement is Needed
Although our findings show a concerning persistence of extremist and violent content, some of the results are better than in prior incidents. The Christchurch shooter livestreamed on Facebook for 17 uninterrupted minutes; Facebook took action only after the livestream had ended and a user eventually reported it.113 Facebook noted the day after the Christchurch shootings that it had removed 1.5 million videos of the attack, 1.2 million of which were blocked upon upload, marking a 20% failure rate.114 Indeed, six months after the attack, investigators found over a dozen copies of the video still on Facebook.115 A year after the attack, researchers at the Counter Extremism Project identified at least 14 different websites where footage of the Christchurch terror attack could be accessed.116

113 See Christchurch Mosque Shootings: Gunman Livestreamed 17 Minutes of Shooting Terror, NZ Herald (Mar. 15, 2019), https://www.nzherald.co.nz/nz/christchurch-mosque-shootings-gunman-livestreamed-17-minutes-of-shooting-terror/BLRK6K4XBTOIS7EQCZW24GFAPM.
114 See Chris Sonderby, Update on New Zealand, Meta (Mar. 18, 2019), https://about.fb.com/news/2019/03/update-on-new-zealand.
115 See Olivia Solon, Six Months After Christchurch Shootings, Videos of Attack Are Still on Facebook, NBC News (Sept. 20, 2019), https://www.nbcnews.com/tech/tech-news/six-months-after-christchurch-shootings-videos-attack-are-still-facebook-n1056691.
116 See Christchurch Terrorist Video Remains Online, Counter Extremism Project (Mar. 13, 2020), https://www.counterextremism.com/press/christchurch-terrorist-video-remains-online.

Here, Twitch shut down the Buffalo shooter’s livestream within two minutes of the violence beginning and the first user report. It is unclear, however, how long the livestream would have continued had it not been reported until after the shooting ended, as was the case with the Christchurch shooting livestream. Moreover, the challenge is that just a single minute of violence, once broadcast, can live forever in the current online ecosystem, evading even well-intentioned content moderation policies. Even a short video surviving on the internet serves the shooter’s criminal purpose and may be used to inspire others to follow in the shooter’s footsteps.

Apart from technological advances, the relative improvements in content moderation may be partially
attributed to coordinated efforts by technology companies to rein in extremist content, such as those
spearheaded by GIFCT. As described above, GIFCT activated the CIP and alerted all GIFCT members that the
CIP had been activated. This allowed its member companies to share hashes of the perpetrator-produced
content depicting the attack, in video and image form, along with content featuring the manifesto. The CIP was
activated from approximately 4:52pm EDT on Saturday, May 14 through 6:31pm EDT on Sunday, May 15. During
this time, members added approximately 870 visually distinct items to the GIFCT hash-sharing database. These
related to approximately 740 visually distinct images and approximately 130 visually distinct videos.

The GIFCT protocols—like other voluntary coordination efforts—need improvement, particularly in the absence of legislative reform. Although the initial video of the shooting was taken down more quickly than in previous mass shootings, it is not clear that this speed can be attributed to any content moderation improvements by the platforms, and it was not quick enough to prevent proliferation of the content in the days after the shooting. Moreover, written content like the manifesto is not subject to the same GIFCT information-sharing protocols. This shortcoming is particularly problematic in the context of white supremacist violence, where attackers draw inspiration from the writings of previous mass shooters and seek to inspire others the same way. Allowing terrorist manifestos to proliferate across the internet contributes to violent extremism and heightens the risk of future attacks.

Additionally, there is no systematic sharing of links that can direct a user off-platform to violative content such as
the Buffalo shooting video. Websites are largely on their own to identify a link on their site that leads to violative
content and remove it.

Importantly, because frameworks like GIFCT’s are voluntary, their effectiveness is thwarted by nonparticipating platforms, where extremist and violent content continues to proliferate. These voluntary and self-regulatory frameworks also lack enforcement mechanisms, allowing the companies themselves to determine how to strike the balance between investment in compliance and other business objectives.

E. Online Platforms’ Advertisements and Monetization of Graphic Content
Related to the Shooting
Early reports suggested that online platforms were running advertisements next to footage of the shooting. One report noted that on Facebook, “searches for terms associated with footage of the shooting have been accompanied by ads for a horror film, clothing companies and video streaming services.”117 In some cases, Facebook recommended certain search terms about the video, noting that they were “popular now” on the platform. Facebook admitted that in the days immediately after the incident, ads could appear on the same page as video of the shooting if the content was not flagged for removal.

Facebook did eventually turn off banner advertising for searches related to the Buffalo shooting, consistent with its public-facing policy.118 After Facebook did so, search results for Buffalo content on its platforms did not include advertisements, but it is unclear how much advertising revenue Facebook generated from this horrific event before making that adjustment. Facebook also turned off the “auto-suggest” and “auto-completion” functionality within its search bars, whereby the platform suggests or offers to complete search phrases after a user has only partially filled out a search request.

However, other platforms performed even worse. For example, weeks after the attack, Twitter was “auto-
suggesting” to users a search for “buffalo live stream video” after a user entered only the partial query “buffal”
[sic], both on a desktop browser and on its mobile app:119

Figure 1: Partial screenshot of Twitter search bar, captured June 22, 2022 (mobile version)

117 Ryan Mac, Facebook Has Been Monetizing Searches for the Buffalo Shooting Video, N.Y. Times (May 19, 2022), https://www.nytimes.com/2022/05/19/technology/buffalo-shooting-facebook-ads.html.
118 See Content Monetization Policies, Meta, https://www.facebook.com/business/help/1348682518563619?id=2520940424820218.
119 See Amanda Silberling, Facebook and Twitter Still Can’t Contain the Buffalo Shooting Video, TechCrunch (May 17, 2022), https://techcrunch.com/2022/05/17/buffaloshooting-footage-facebook-twitter-moderation.

Twitter was also serving ads (with promoted content) next to search results where the Buffalo shooting video
was a result (as indicated below with a red circle):

Figure 2: Partial screenshot of Twitter search results, captured June 22, 2022 (web version)

TikTok continued for weeks to auto-suggest the shooting video as well, although without the advertising.

V. Recommendations
Part of our mandate pursuant to the Governor’s referral is to “determine whether specific companies have civil or
criminal liability for their role in promoting, facilitating, or providing a platform to plan and promote violence.”120
Our determination is that no such liability likely exists under these facts, given the present state of the law. As
this report describes in detail, the shooter made extensive use of online platforms to immerse himself in the rank
hatred of the online white supremacist milieu, to plan his attack in great detail, and to livestream his shooting.
He documented his actions for posterity, relying on the power of the internet and online platforms to perpetuate
his disturbing deeds. Indeed, within hours of the shooting, graphic videos and images of his attack were widely
disseminated across all corners of the internet. Platforms had varying degrees of success in removing that
objectionable content—where they even tried to do so at all.

Despite the integral role of online platforms in this and previous mass shootings, however, it is extremely unlikely that any of them—even the worst offenders, who enforce virtually no content moderation—can face any sort of legal liability. There are no laws on the books that directly address the conduct at issue here, not even for the distribution of a graphic, uncensored video created by an attacker killing another person in cold blood. Even if such laws existed, platforms would likely claim First Amendment and Section 230 protections to insulate themselves from any liability based on the content posted by their users. That state of affairs, however, need not remain static. Congress, the State Legislature, and platforms themselves can and must take action if anything is to change.

A. Legislative Proposals
Legislative reform is necessary to make meaningful progress in preventing online platforms from being the
vehicle for radicalization and dissemination of violent criminal activity.

1. New Affirmative Liability


We recommend that New York (and other states) impose criminal liability for the creation by the perpetrator, or
someone acting in concert with the perpetrator, of images or videos of a homicide. Such videos are an extension
of the original criminal act and serve to incite or solicit additional criminal acts. In addition, these videos are
obscene on their face. Importantly, appropriate legislation should avoid covering videos created by bystanders
or passively, such as those captured by police officers’ body-worn cameras.

120 Letter from Gov. Kathy Hochul to Att’y Gen. Letitia James 1 (May 18, 2022), https://www.governor.ny.gov/sites/default/files/2022-05/Executive_Law_63_Referral.pdf.

We also recommend imposing civil liability for the distribution and transmission of this content, including making liable online platforms that fail to take reasonable steps to prevent unlawful violent criminal content from appearing on the platform. Significant penalties, sufficient to realize the goal of deterrence, should be levied where an online platform fails both to take such reasonable steps and to prevent the transmission of content that depicts a homicide and was captured or created by the perpetrator or someone acting in concert with the perpetrator.

Some categories of objectionable content are already subject to restrictions even more stringent and wide-ranging. Tech companies have taken extraordinary measures to comply with federal law criminalizing the creation, distribution, and possession of child sexual abuse material (CSAM). They have made great strides in the effort to fully eradicate CSAM on both the mainstream internet and fringe sites, such as 4chan, that otherwise do little content moderation. Anti-CSAM laws have survived First Amendment scrutiny because they criminalize a category of speech widely viewed as obscene. The distribution of CSAM has been upheld as speech integral to illegal conduct—without a market for CSAM, there would be no motivation to create it.

A restriction on creating and distributing this material should be drafted in a manner that ensures conformity with the First Amendment. Critically, the rationale of preventing murderers from promoting their crimes—especially racially motivated mass murderers—is a governmental interest of the highest order, and there is no societal benefit to the dissemination of such videos by their perpetrators. Accordingly, any such law must be tailored specifically to reach only those videos in which such serious crime occurs. Any such law should also aim to avoid imposing any penalty on videos that have educational, historical, or social benefits. At a minimum, state attorneys general and other law enforcement agencies should be given explicit authority to enforce these provisions with steep penalties. Further consideration should be given to whether the public (or some group of directly affected individuals) should have a private right of action tailored to promote these compelling government interests.

These are critical first steps; indeed, moderators on 4chan have justified allowing video of violent crimes to remain on the platform because the videos themselves are not illegal. However, promulgating new laws that establish these categories of liability does not fully address the policy goal of breaking the cycle of online white supremacist radicalization and violence, because platforms that enable the distribution of depictions of racial violence committed by these attackers can still assert Section 230 defenses.

2. Reforming CDA Section 230
We hold manufacturers responsible for their products and subject entire industries, like telecommunications,
to complex regulatory regimes intended to rein in excesses and abuses. Yet tech companies are largely free to
create and operate online platforms without legal consequences for the negative outcomes of their products
because of Section 230. The drafters of Section 230 wanted to encourage content moderation by providing
protections to online platforms acting as “Good Samaritans” in their attempts to monitor and remove offensive
content on their platforms.121 However, the internet has changed dramatically since 1996, and in that time,
courts have found that Section 230(c)(1),122 which protects companies from liability for the under-removal of
user content, “should be construed broadly in favor of immunity.”123 Although there are some discrete areas
where Section 230 protections do not apply,124 by and large the Section 230 case law has made it difficult to
hold companies responsible for their failure to take even basic steps to monitor and ensure the safety of their
platforms.125 Therefore, we are calling on Congress to reform Section 230.

While broader reform may be necessary, our proposal focuses on the violent and extremist content at issue here.
We recommend that Congress rethink the ready availability of Section 230 as a complete defense for online
platforms’ content moderation practices. Instead, we recommend an approach that requires an online platform
that wishes to retain Section 230(c)(1)’s protections to take reasonable steps to prevent unlawful violent criminal
content from appearing on the platform.126 Significantly, this proposal changes the default. Instead of simply
being able to assert protection under Section 230, a defendant company has the initial burden of establishing
that its policies and practices were reasonably designed to address unlawful content. This reform proposal
would incentivize companies to establish robust content moderation programs, but would not penalize those
companies if violative content slips through despite such programs. Reasonableness must account for the
prevailing technology, including whether companies are making investments in and deploying the same level
of sophisticated technology to content moderation as they are for other parts of their business. This would help establish a baseline of content moderation across online platforms and ensure a safer experience for all users.

121 See Danielle Keats Citron, How to Fix Section 230, B.U. L. Rev. 7–9 (forthcoming) (Mar. 10, 2022), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4054906.
122 “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1).
123 Force v. Facebook, Inc., 934 F.3d 53, 64 (2d Cir. 2019) (cert denied); see also Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); Almeida v. Amazon.com, Inc., 456 F.3d 1316, 1321 (11th Cir. 2006); Marshall’s Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1267 (D.C. Cir. 2019); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 18 (1st Cir. 2016); Fair Hous. Council v. Roommates.Com, LLC, 521 F.3d 1157, 1174 (9th Cir. 2008); Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 408 (6th Cir. 2014).
124 Courts have declined to extend Section 230 protections when a defendant “assist[s] in the development of what ma[kes] the content unlawful.” Fed. Trade Comm’n v. LeadClick Media, LLC, 838 F.3d 158, 174 (2d Cir. 2016). Courts have also declined to grant Section 230 protections where a lawsuit seeks to hold an online platform accountable for activities, such as selling or designing a product, separate from its role as a publisher. See, e.g., Erie Insurance Co. v. Amazon.com, Inc., 925 F.3d 135, 139–40 (4th Cir. 2019) (“[T]he Communications Decency Act . . . does not protect them from liability as the seller of a defective product.”); Lemmon v. Snap, Inc., 995 F.3d 1085, 1093 (9th Cir. 2021) (“The duty to design a reasonably safe product is fully independent of Snap’s role in monitoring or publishing third-party content.”).
125 The Supreme Court recently granted certiorari to review the scope of Section 230 protection when a company makes targeted recommendations of third-party content. See Gonzalez v. Google LLC, 2 F.4th 871 (9th Cir. 2021), cert. granted, No. 21-1333, 2022 WL 4651229 (U.S. Oct. 3, 2022).
126 In this regard, our recommendation is similar to a proposal authored by Professor Danielle Citron. See supra, n. 121.

As part of this framework, Congress should authorize a federal agency, such as the Federal Trade Commission—or create a new agency dedicated to regulating online platforms—to draft regulations providing guidance on the reasonable steps that an online platform must take to obtain Section 230 protection. At a minimum, reasonable steps should include efforts to remove unlawful violent criminal content and content likely to incite or solicit violent crime, measures to prevent the platform from being used to encourage or plan acts of violence, and limits on livestreaming technology designed to prevent its use to further criminal acts and incite or solicit violent crimes.

Many of the mainstream platforms we have reviewed in preparing this report have taken important steps to
prevent such misuse of their services; they have promulgated policies prohibiting extremist and violent content,
implemented automated tools to review certain types of content, engaged large numbers of people to review
potentially violative content, and enabled their users to police their platforms in furtherance of that goal. But our
investigation has also revealed various points at which companies could have taken more robust measures to
prevent misuse of their online platforms. A key problem with today’s legal regime is that the strength of content
moderation policies is based entirely on voluntary efforts without accountability or any enforcement mechanism.

The standard for reasonableness should not simply defer to the business decisions of the largest platforms and how they have chosen to balance investment in compliance against investments in other parts of their business. Instead, an appropriate reasonableness standard should be based on the technology available and the risks that were known or should have been known to online platforms. The standard should also take into account lessons from the Buffalo shooting and this report. For example, the livestream of the Buffalo shooting was cut off within two minutes of when the violence began. Although that is an improvement over prior livestreams of mass shootings, as discussed in more detail below, two minutes is too long, and more must be done to prevent livestreaming from being used to further the goals of mass shooters. Likewise, the shooter documented his plans in detail over the course of several months, but these logs were not flagged. Drafters of reasonableness guidelines should consider whether a platform should implement automated scanning of content to identify violent plans at the moment when long-standing private content is suddenly distributed to a larger group of people. Furthermore, videos of the violence were visible on both fringe platforms and certain mainstream platforms after the shooting. At a minimum, any reasonableness standard should include robust measures to remove content depicting violent crimes and any other content that violates a platform’s content moderation policies.
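
One of the ideas above—automated scanning triggered when long-private content is suddenly shared with a larger audience—can be sketched briefly. Everything here is a hypothetical illustration: the data structures, the spike threshold, and the review hook are assumptions, not any platform’s actual system.

```python
# Hypothetical sketch: queue a private channel's backlog for automated
# review when its audience suddenly expands.
from dataclasses import dataclass, field

AUDIENCE_SPIKE = 5        # illustrative: many new members in a short window
REVIEW_QUEUE: list[str] = []

@dataclass
class PrivateChannel:
    owner: str
    messages: list[str] = field(default_factory=list)
    members: set[str] = field(default_factory=set)

def on_members_added(channel: PrivateChannel, new_members: set[str]) -> None:
    """If many users are invited at once to a long-private channel,
    send its message backlog to automated content review."""
    channel.members |= new_members
    if len(new_members) >= AUDIENCE_SPIKE:
        queue_for_review(channel.messages)

def queue_for_review(messages: list[str]) -> None:
    # Placeholder for classifiers that look for indicators of violent
    # planning (e.g., weapons lists, target reconnaissance, dates).
    REVIEW_QUEUE.extend(messages)
```

This trigger point mirrors the pattern described in this report: months of private logging in the shooter’s Discord diary, suddenly shared with others shortly before the attack.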

The fringe websites, like 4chan and others, present substantially clearer cases under a reasonableness standard.
These platforms have by and large abdicated any responsibility to moderate content. That approach to content
moderation has endured even as these sites have become havens for hateful, violent content. Such platforms
should not be entitled to claim a defense against the distribution of unlawful violent criminal content that their
absenteeism encourages.

3. Restrictions on Livestreaming
Livestreaming requires special mention for its repeated use by hate-fueled mass shooters to broadcast their massacres. Livestreaming undoubtedly has many legitimate use cases. At the same time, the future of livestreaming must grapple with how this service has been used to broadcast acts of terror, becoming an extension of the criminal act, further terrorizing the targeted community, and serving to promote the shooter’s ideology. As detailed above, the Buffalo shooter considered the instantaneous transmission of video available through livestreaming to be a centrally motivating factor in his shooting, both because of the intangible support he felt he would receive through it and because he hoped it would inspire others, just as he had been inspired by video of the Christchurch shooter. Although Twitch stopped the Buffalo shooter’s livestream within two minutes of the first gunshot, an improvement over Facebook’s response to the Christchurch shooter’s livestream, two minutes is still too long. Even a short video of a mass shooting can be used to incite others to engage in copycat crimes and serve the criminal goals of the perpetrator.

Reasonable measures to prevent criminal violence on a platform must include additional protections to limit livestreaming’s use to promote and inspire violent crimes. Platforms that permit livestreaming and want the benefit of continued Section 230 protection should be required to adopt reasonable procedures commensurate with identifiable risk factors. Such platforms should delay livestreams from users that are not verified or that fail to meet other trust factors, such as a documented history of streaming within the platform’s policies or a meaningful base of friends/followers. Such tape delays, which are standard in live television broadcasts, would permit livestreaming services to apply automated technology to detect violent crimes, including gunshot detection technology. Likewise, these platforms should be required to restrict algorithmic promotion of such livestreams, at least without conducting some affirmative review to ensure the streamer is legitimate.
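
A minimal sketch of such risk-commensurate gating follows. The trust signals and delay durations are illustrative assumptions only; an actual standard would be set by regulation or industry practice.

```python
# Hypothetical sketch: choose a tape delay for a livestream based on
# trust factors, leaving time for automated checks to run.
from dataclasses import dataclass

@dataclass
class Streamer:
    verified: bool
    follower_count: int
    prior_clean_streams: int  # streams completed without policy strikes

def broadcast_delay_seconds(user: Streamer) -> int:
    """Return the delay to apply before the stream goes out, giving
    automated tools (e.g., gunshot detection) time to intervene."""
    if user.verified and user.prior_clean_streams >= 10:
        return 0      # established, verified streamers: no delay
    if user.follower_count >= 1000 and user.prior_clean_streams >= 3:
        return 15     # some documented history: short delay
    return 60         # new or unverified users: longest delay

def allow_algorithmic_promotion(user: Streamer) -> bool:
    """Restrict promotion of livestreams absent an affirmative trust record."""
    return user.verified and user.prior_clean_streams >= 10
```

Under a scheme like this, an unverified account with no streaming history would broadcast on the longest delay, giving detection tools and human reviewers a window to act before any footage reaches viewers.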

B. Recommendations for Industry

1. Online Platforms Should Strengthen and Share Their Moderation Technologies


Simply put, online platforms need to invest even more in content moderation. In the years since the Christchurch
shooting, content moderation practices and technologies have improved. However, not all platforms have
made great strides. For example, as our investigation proceeded, we found that Reddit had difficulty effectively
identifying links to videos of the Buffalo shooting carried on other sites, even when reported by users or this office.

Mainstream platforms should devote time, energy, and money to developing alternative technologies to image/video hashing. Hashing works by assigning a number to an offending image or video and comparing that number against the hashes of new uploads. But this approach is subject to problems both at the outset of the process and as it runs. At the outset, it requires offending content to actually be identified and hashed, which typically requires human review and causes delay. And if an offending image or video is altered, the altered copy’s hash value will differ from the stored value, allowing individuals to subvert hash-based identification.

Platforms with mature content moderation practices and technologies should also share with other platforms what they have learned. Additionally, platforms should not countenance delay in foreclosing monetization opportunities arising from tragic events like the Buffalo shooting. As soon as possible after such an event, they should turn off advertising displayed in conjunction with violative content or in response to searches that appear to be looking for violative content, and should cease auto-suggesting terms that lead to searches for violative content.

Online platforms may have realized improvements in content moderation thanks to their ability to coordinate
and share information through groups like GIFCT. But as the organization itself has acknowledged, GIFCT’s
current hash-sharing database is less effective against white supremacist extremists working outside of
established terrorist organizations. It is working to close that gap: on September 20, 2022, GIFCT announced
that it has begun to add hashes of PDFs of attacker manifestos and terrorist publications to its hash-sharing
database.127 That is a positive step. Online platforms should continue to identify opportunities to improve
coordination. This should include adding text and audio to cross-platform databases, like GIFCT’s, that are
currently limited to images and videos, in order to help platforms identify terrorist manifestos and other
problematic content. Additionally, platforms should catalogue and share URLs to offending content, so that every platform can benefit from knowing which sites users across social media are linking to for such content. Most of the videos of the Buffalo shooting that the OAG identified on mainstream websites were links to content maintained on fringe websites and obscure cloud storage services. Finally, platforms should open up a “read-only” level of database access to a wider variety of platforms than GIFCT’s present membership. This could help balance any concerns about non-members who might seek to weaponize the database to censor legitimate speech against the simple fact that a greater reach is likely to lead to more effective removal of terroristic content from the visible internet.
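
As a concrete illustration of adding text to a cross-platform database, the following is a minimal sketch of hashing normalized document text. The normalization step, names, and database contents are illustrative assumptions; real deployments would likely also need near-duplicate techniques, since exact hashes miss even lightly edited copies.

```python
# Minimal sketch: exact-match text hashing for a shared database.
import hashlib
import re

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial reformatting
    (line breaks, casing, spacing) does not change the hash."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def text_hash(text: str) -> str:
    return hashlib.sha256(normalize(text).encode("utf-8")).hexdigest()

# Hypothetical shared database of hashes of known manifesto text.
shared_text_hashes = {text_hash("...known manifesto text...")}

def is_known_document(upload_text: str) -> bool:
    return text_hash(upload_text) in shared_text_hashes
```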

2. Online Platforms Should Increase Transparency.


Governments and users need greater transparency into the policies that online platforms have adopted to
address hateful, extremist, or racist content, and how those policies are applied in practice. Online platforms
should publicly disclose this information, including the amount and type of content, their policies regarding
this type of content, user reports and complaints about the content, actions the platform has taken, and any
delay in the takedown.128 This increased transparency will make it easier for interested parties to hold platforms
accountable and craft future policy.

127 Expanding our Collective Capacity: GIFCT’s Progress Continues, Glob. Internet Forum to Counter Terrorism (Sept. 20, 2022), https://gifct.org/2022/09/20/expanding-our-collective-capacity-gifcts-progress-continues.
128 In September 2022, California enacted a new law requiring certain social media companies to post terms of service and submit a detailed semiannual report to the California Attorney General describing, among other things, the platform’s approach to content moderation with respect to categories including hate speech or racism, and extremism or radicalization. See Cal. Bus. & Prof. Code § 22675 et seq.

3. Internet Service and Infrastructure Providers Can and Should Do More
Although all of the mainstream sites visited by the OAG have content moderation policies and devote resources to removal, many of the fringe sites do nothing. Indeed, much of the content that slips through mainstream-site moderation consists of links to content hosted on these fringe sites. Service providers can and should do more to foster an internet that is inhospitable to violent extremists and does not further their accelerationist aims and recruiting efforts. There are examples of private entities stepping forward to help achieve those goals. For instance, in the aftermath of the Buffalo shooting, Bid Glass, an advertising service provider, terminated service for 4chan.129 After a series of prior mass shootings, the site 8chan was taken off the visible internet when network infrastructure provider Cloudflare stopped providing its content delivery network (CDN) service. Voxility, a web services company that had been renting servers to Epik, the site’s new domain registrar, as well as to Epik’s CDN provider subsidiary BitMitigate, also terminated service.130

A similar opportunity is presented here. Service providers, not bound by any legal requirement to facilitate the
survival of websites that exist largely to promote hate and violence, should take similar actions and help prevent
future violent attacks by refusing to service sites that perpetuate the cycle of white supremacist violence.

129 See Notice, Bid Glass, https://archive.ph/Ksi9w.
130 See Graham Kates, U.S. 8chan Sputters Back to Life with New Name, CBS News (Nov. 4, 2019), https://www.cbsnews.com/news/8chan-new-name-8kun-webforum-hate-speech.
