What is Discord, the chat app used by the Buffalo shooter?

Before he allegedly killed 10 people in a Buffalo supermarket on May 14, the 18-year-old suspected shooter left a string of racist writings on Discord, a popular voice, video and text chat app.

As early as November, the teen wrote messages on Discord documenting his alleged plan to murder Black people in a mass shooting, according to a compilation of messages reviewed by The Washington Post. Discord has since said the messages were visible only to the suspect until he shared them with others the day of the attack.


Along with WhatsApp, Telegram and Signal, Discord has come up in connection with racist violence before. White supremacists who attended the "Unite the Right" rally in Charlottesville, Va., in August 2017 - where a counterprotester was killed and many others were injured - used Discord private chats to organize before the event.

But the service is much bigger - and more complicated - than those dark moments suggest. Discord developed a reputation as a haven for Generation Z and gamers after its launch in 2015. Today, many of its more than 150 million monthly active users don't talk about games at all. And some say Discord's setup allows for healthier and more engaged online communities.

Claire Bourdon, a software developer in Indiana, joined the app after someone on Reddit mentioned a Discord "server," or community chat, that helps people outside the tech industry learn to code.

Now, she visits the server every day to help newbie developers. The age range is broad, she said - from high-schoolers to people in their 60s - and a handful of moderation tools keep the vibes healthy, she said. (Bourdon's group has a "no foul language" policy, and she's seen people get immediately booted for making offensive remarks, she said.)

"It's the most wholesome place I've ever been on the internet," she said.

Q: How does Discord work?

A: Discord is a desktop, web and mobile app for messaging, usually through written chats, similar to workplace communication app Slack. The company says most servers are private, meaning you need an invite to join. Admins and moderators - the people who run a particular server - can invite new people by sending an in-app invite directly to them or by posting an invite link in a public place like a Twitter bio. Other servers are open, public communities dedicated to everything from obscure hobbies to internet personalities.

It's also common for a single Discord server to host multiple chat "channels" devoted to different topics, and many servers also have voice channels where exchanges can unfold out loud. Users can turn on their cameras and chat face to face - or something like it - in these voice channels, as well as through direct video calls to individuals and small groups.

Discord has a "17+" age rating on the Apple App Store. Discord says users must be 13 or over, but age is self-reported. As on other social apps, users don't have to verify their age and can register using pseudonyms.

To join a new server without an invite, users can browse existing public servers from the Discord homepage or create their own - among the app's suggested templates are "school club" and "study group."

Q: Do algorithms influence what people see?

A: Discord doesn't have a feed like Facebook or Twitter, and conversations unfold in real time - no algorithms involved. That means kids and teens won't encounter algorithmically amplified harmful content like the kind experts have flagged on Instagram.

But because Discord communities are generally based on shared interests, it's still possible to find content that is dangerous, hateful or age-inappropriate, and there's little preventing minors from interacting with strangers.

"We work relentlessly to keep bad actors off our service and we take the safety of all Discord users, especially our younger users, incredibly seriously," a spokeswoman for Discord said.

Q: Is Discord a destination for hate groups?

A: Conversations about white supremacy and other hate ideologies unfold across the internet, and as a platform designed for easy and anonymous interaction, Discord is no exception.

Kathleen Blee, distinguished professor of sociology at the University of Pittsburgh, said that some of the tools that helped Discord build a broad audience make it attractive to extremists. Beyond the ability to create chat servers that are largely invisible to outsiders, the fact that one server can host many conversations over multiple chat channels contributes to a sense of community - for better or worse.

If you were, for example, a white supremacist leader, "what you want is a self-reinforcing kind of spiral of hate, and that can sometimes be most effective in the smaller channels," Blee said.

Discord's focus on interest-based communities means teens could become radicalized after receiving invites from server members to join smaller, more extreme offshoots, says Sean Clifford, CEO of parenting app Canopy.

Discord says 15% of all its employees work on efforts related to safety and its team "investigates and takes prompt action when we receive a report about illegal activity or policy violations."

Q: How can parents help kids stay safe?

A: Isolation during the pandemic helped make Discord an important gathering place for young people, says Clifford.

But Discord's historically "laissez-faire" approach to content moderation made it a "wild West" for young people, he added. The company has made it clear that explicit images are allowed as long as they're shared with consent, and even its community guidelines operate more like suggestions for server moderators to enforce if they want, he said.

Discord has some safety settings, but no clear way for parents to set controls and prevent kids from changing them back. These settings live in the account tab that looks like a smiley face in the bottom right corner of the app. Tap the icon, then go to "Privacy & Safety." The "Keep me safe" setting under "Safe Direct Messaging" means the app will scan direct messages for explicit images.

Below that, teens can toggle off the setting that allows direct messages from new connections. And under "Who can add you as a friend," they'll see options for "everyone," "friends of friends" and "server members." Go with "server members" for the safest option.

To set limits for how much time kids can spend on a given app, turn to device-level settings such as Apple's parental controls.

Blee also recommends that parents sit their kids down for a nuanced conversation about the way hateful ideologies spread online.

"It should be 'There are people out there who have horrible ideas and agendas about harming other kinds of people, and that material might come to you as you sit online and talk to your friends,'" she said. " 'And it's not your fault, but you need to know how to deal with it.' "

Most importantly: Talk to your kid about why people are drawn to hate communities and what they get out of participating.

"Don't start with the rules: don't do this, don't do that. Start with: 'I'm talking to you about this because I want you to be as happy as possible,' " Clifford said. "Tell them what it might look like if they land [in a hate community] and what will happen to them over time."
