Free speech restrictions on social media could squash harm reduction and addiction recovery efforts

When Chad Sabora started working in harm reduction, he worked out of his car on the streets of St. Louis, Mo. Sabora's beat-up sedan was a familiar sight in neighborhoods frequented by people who use drugs. Sabora, an attorney and former prosecutor in Chicago, had been in recovery for years and experienced addiction firsthand. Based on decades of research and his own experience, he knew that sterile syringes prevent infectious disease transmission, that naloxone saves lives by reversing overdoses, and that a well-timed pep talk or caring gesture can profoundly help someone in the throes of addiction. He took a boots-on-the-ground approach to helping others in his hometown.

As America's unprecedented overdose crisis became a national issue, Sabora looked for ways to scale up his operation. As many people do, he took to social media, where he tried spreading the gospel of harm reduction and sharing simple strategies to help people survive their substance use disorder. Never use alone. Carry naloxone. Use new syringes. Statistically speaking, there are millions of drug users and people with addiction online. Tragically, more than 200 people die from drug overdoses every day in America, and more than 100,000 Americans died in the last year alone. But on Facebook, Sabora felt something was keeping him from reaching the masses. Then he noticed that his posts ran afoul of the almighty algorithm.

"I've been put in time-out just for posting about naloxone," Sabora said. When he created educational posts about the risks of illicit fentanyl, teaching people how to use fentanyl test strips, his account would be disabled. He realized that by mentioning drugs, his account was dinged by Facebook's automated content censors meant to curb drug sales on social media platforms. The algorithm couldn't distinguish his content from that of a suspected drug dealer. The algorithm picks up particular words, phrases, or speech patterns that are flagged and suppressed. Entire groups of harm reduction activists have disappeared, along with scores of informational posts and threads. Some accounts have been banned for life.

Sabora was confident he could use social media tools to make a difference and help educate people about harm reduction. Instead, he found himself silenced by social media censors.

An obscure law called Section 230 shields social media companies from being held liable for questionable content generated by their users. Naturally, some politicians and activists are calling for Section 230 to be rewritten in order to push tech giants to do a better job of moderating what users post. While there is unquestionably a credible argument for doing so, we must also be careful: rewriting Section 230 could backfire. Instead of ending online drug sales, new rules could further censor activists like Sabora who are trying to use social media to save lives during an overdose crisis. Congress must be cautious when crafting content moderation regulations around substance use disorder, because companies are likely to shut down all related conversations to avoid liability.

Section 230 is a decades-old law that regulates speech online and governs nearly every interaction on social media. It is part of the United States Communications Decency Act of 1996, and it protects social media platforms from being held responsible for the content users post. For example, if a QAnon group plans and enacts a traitorous insurrection in Washington, DC, the website that hosted the group has immunity: it can't be held liable for what its users posted. However, advocates have often tried to alter Section 230 to support their own political aims.

Sex trafficking is one of the most recent and thorny instances of Congress rewriting Section 230. Arguing that children and vulnerable people needed protection from abduction and trafficking, advocates pressured lawmakers to pass a package of laws known as FOSTA/SESTA. The package amended Section 230 by holding websites and online platforms responsible for user content that might facilitate "sexual exploitation." Although the Department of Justice went on record warning that FOSTA/SESTA would make it more difficult to prosecute sex trafficking cases, the package passed anyway. Disaster ensued. Websites implemented instant crackdowns, and some sites that had been safe havens where sex workers could vet their clients shut down entirely. These measures failed to slow sex trafficking. In fact, federal prosecutors have used the law only once, and even then they said they didn't really need it: existing laws had already been sufficient to prosecute sex trafficking offenses. While FOSTA/SESTA did nothing to help potential victims or catch traffickers, it had an immediate, negative effect on another vulnerable group: sex workers.

A similar crackdown could harm people who use drugs and harm reduction advocates like Sabora who are trying to broadcast lifesaving information. Just as advocates urged Congress to rewrite Section 230 to prevent sexual exploitation, a similar campaign is underway to prevent drug sales and curb America's soaring overdose death rate. Horrific stories involving young adults buying drugs on Snapchat and TikTok abound. Some parents and advocates want Section 230 rewritten to increase social media companies' liability for drug sales on their platforms. But efforts to clamp down on online drug sales through Section 230 carve-outs are somewhat misguided. Without careful consideration, these reforms would endanger the recovery community and harm reduction advocates, and threaten to stifle productive speech that is critical to combating the overdose crisis. Currently proposed 230 carve-outs could undermine access to lifesaving resources, mandate takedowns of broad categories of content, and force vulnerable populations, including those navigating supportive services, off-platform. For criminalized communities, the risk of exploitation and harm offline is significant, and support and resources can be limited.

Harm reduction efforts, and the conversations around them, are often nuanced and specific to the individual, aiming to minimize the harms of substance use. Blanket content bans, imposed without consideration of context and nuance, could punish those seeking help, hamstringing legitimate, proven approaches to combating overdoses.

Instead of broadly crushing free speech and pushing social media companies to eliminate our ability to share resources, the U.S. government should focus its efforts on what works. To save lives, policymakers must develop a realistic national strategy to combat the overdose crisis, including implementing evidence-based prevention, harm reduction, treatment, and recovery support services at the community level. Don't kill the conversation. Instead, we need to coordinate with localities to identify authentic places for support. Most leading platforms where these conversations take place already have clear rules prohibiting the online sale and promotion of drugs and controlled substances, and companies must do a better job of enforcing those rules. The federal government must work with online platforms to coordinate a more effective strategy for removing bad actors, and with law enforcement to prosecute drug traffickers.

There's a world of difference between a syringe exchange and a drug deal. Until our government, and our social media companies, recognize that, we will continue to lose friends, loved ones, neighbors, and family members to preventable overdoses. Not because they wanted to die. But because they were silenced, and separated from the people who were trying to help them.

Ryan Hampton is a nationally recognized recovery advocate, community organizer, and person in long-term recovery from addiction. He is the author of "Unsettled: How the Purdue Pharma Bankruptcy Failed the Victims of the American Overdose Crisis." Follow him on Twitter: @RyanForRecovery