Parents told to delete social media apps to prevent kids from seeing Hamas atrocities

American and Israeli parents say they have received messages from schools, temples, synagogues and peers following the Hamas terror attack urging them to delete social media apps from their kids’ phones.

The warning came after the military wing of Hamas threatened to kill an Israeli hostage for every Israeli “targeting” of civilians in Gaza and to broadcast the executions “in audio and video.”

“It has come to our attention that deeply disturbing videos, including footage of hostages, may be spread across social media in the near future,” a principal at a public school in New York City said in an email this week, quoting from a message she said had been forwarded to her.

“These videos and images will likely be shared through Instagram, TikTok and other social media outlets,” the email went on to say. “We strongly encourage you to consider having your children delete these apps for the time being, putting up additional parental controls, and/or to assist them in exercising extreme discretion around social media.”

The messages have been sent to parents in other states that are home to sizable Jewish populations, including Maryland and New Jersey. It was not immediately clear whether a particular organization was encouraging schools to send out warnings or whether schools were acting independently.

David Lange, a resident of Israel who runs the pro-Israel advocacy group Israellycool, posted on X what he said was a screenshot of a message from parents at his daughter’s school. The message, written in Hebrew and shared via WhatsApp, warned that Hamas could soon distribute hostage videos and urged parents to remove TikTok from their kids’ devices.

The warnings extended beyond the U.S. and Israel. JFS (formerly the Jews’ Free School), a Jewish secondary school in London, sent an email this week informing parents that administrators had warned students that Hamas could release disturbing images on social media and had suggested that students delete TikTok and Instagram.

“In personal safety assemblies today, we have asked students to delete these applications from their phones and it is something you may wish to follow up at home,” said the email, which was reviewed by NBC News. (JFS did not immediately respond to a request for comment.)

The exact number of Israelis abducted by Hamas gunmen remained unclear Wednesday afternoon. Hamas militants claimed that more than 100 people had been captured; the Israel Defense Forces said Wednesday that 60 people were being held by Hamas in Gaza.

NBC News has not independently verified either of those claims.

The messages from schools and Jewish religious institutions underscore the sense of fear that has taken root worldwide since Hamas terrorists stormed into Israel on Saturday, killing hundreds of people. Israel’s counteroffensive has killed hundreds more in Gaza and the Israeli-occupied West Bank.

In the U.S. and abroad, the war between Israel and Hamas has more generally stoked anxieties about outbreaks of violence at temples, synagogues, mosques and other institutions associated with both Judaism and Islam. Police departments nationwide are ratcheting up security at potential targets.

Facebook, X, TikTok and other social media services have been filled with graphic imagery out of Israel and Gaza since the weekend’s violence, including videos of Israelis being kidnapped from their kibbutzim and photos of Palestinian civilians killed at their homes in Gaza.

In one post on TikTok, a parent said she had deleted any apps that could lead her kids to disturbing videos, including YouTube and Apple’s Safari browser.

“I had a talk with the kids, too, about why we were doing that, that there could be scary things coming at them through social media and through YouTube,” she said in the video. “If they do see something, they are not in trouble and it is not their fault, and they need to talk to us about it.”

TikTok, which has community guidelines barring violent content, plans to add another layer of protection to the platform amid the conflict, including additional moderation resources, blocks on hashtags that promote violence and proactive fact-checking of misleading narratives, according to a spokesperson for the Chinese-owned service.

X and Meta, the parent company of Facebook, did not immediately respond to emails requesting comment Wednesday.

In recent years, leading social media platforms such as Facebook and Twitter have come under intense scrutiny for hosting video, photo and audio content that might be harmful to teenagers and children, including images related to terrorism, gun violence, suicide and self-harm.

In response, some platforms have rolled out stricter parental controls, assembled moderation teams and invested in automated systems designed to quickly spot harmful content. But debates over content moderation still roil the technology industry, intensified in part by growing politicization.

This article was originally published on NBCNews.com