Meta to hide content on self-harm, eating disorders from teens' Facebook, Instagram feeds

Meta announced Tuesday that it will implement new policies restricting access to content related to self-harm for teen users. File Photo by Terry Schmitt/UPI

Jan. 9 (UPI) -- Meta on Tuesday announced new policies to restrict teens from viewing content related to self-harm and eating disorders on Facebook and Instagram.

Beginning Tuesday, teens younger than 18 will no longer be shown content featuring self-harm and "other types of age-inappropriate content" in their feeds, even if it is posted by a user they follow.

Meta said the changes, which expand on policies that prevent such content from being recommended to teens in algorithmically suggested "Reels" and "Explore" pages, will be fully implemented "in the coming months."

The company said that while posting about one's struggles with self-harm can be helpful on the whole, content of that type is not suitable for younger users.

"While we allow people to share content discussing their own struggles with suicide, self-harm and eating disorders, our policy is not to recommend this content and we have been focused on ways to make it harder to find," the blog post read.

Meta will also automatically place teens already using the platforms into its most restrictive content control setting on both Instagram and Facebook.

Meta said teens who are new to the platforms are already automatically placed under recommendation controls that make it more difficult for "potentially sensitive content or accounts" to appear in their search results or on pages featuring suggested content.

Additionally, teens will be notified to update their safety and privacy settings and given the option to "turn on recommended settings" that will automatically restrict who can interact with their accounts by reposting their content, tagging them in posts or messaging them, while also hiding "offensive comments."

Meta added it will also roll out a change for all users "over the coming weeks" that will hide search results related to suicide, self-harm and eating disorders and direct users who search for these topics to "expert resources" where they can receive help.

The company said that while posts in which people discuss their experiences with these issues are still allowed, Facebook and Instagram will continue to attach resources from organizations such as the National Alliance on Mental Illness to that type of content.

Meta has faced criticism for the type of content that reaches teens on its platforms, and in 2021 a whistleblower came forward claiming that Meta leadership was aware of the problem and was not doing enough to stop it.

"The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes," former Meta employee Frances Haugen told Congress in 2021.

In October, a group of 42 attorneys general announced that they would sue Meta, arguing that the company's platforms are causing mental health harm to kids.