Social media firms have 'lost control' of self-harm material, Children's Commissioner warns

Anne Longfield accused tech firms of treating children's safety as an 'afterthought' - Jeff Gilbert

Social media firms have "lost control" of self-harm material being promoted on their platforms, the Children's Commissioner has warned.

In an open letter, Anne Longfield accused tech companies of treating young people’s safety as an “afterthought” and said they should not let children use their apps if they could not control what images and videos they saw.

As a result, the Commissioner is calling for a new independent Digital Ombudsman, financed by the tech companies, to regulate the industry. She also backed the Telegraph’s campaign for social media firms to be subject to a statutory duty of care to protect children.

Ms Longfield’s letter comes after a father accused Instagram of “helping to kill” his 14-year-old daughter, Molly Russell, after he found she had been looking at suicide and depression images before taking her own life.

Ian Russell has since called on social media companies to “stand up and take more responsibility” for what young people view on their networks.

Molly Russell had been looking at suicide posts on social media before taking her own life in 2017

In the wake of his comments, the Health Secretary Matt Hancock has written to social media and tech companies saying he is “appalled” at how easy suicide material is to find on their sites.

He also warned that Parliament could “ban” access to them if they did not act.

In her letter, Ms Longfield said: “The tragic suicide of Molly Russell and her father’s appalled response to the material she was viewing on social media before her death have again highlighted the horrific amount of disturbing content that children are accessing online.

“I do not think it is going too far to question whether even you, the owners (of tech companies), any longer have any control over their content.

“If that is the case, then children should not be accessing your services at all, and parents should be aware that the idea of any authority overseeing algorithms and content is a mirage.”

She said that in recent years she had discussed with tech firms ways they could protect children, but she was not convinced that they were taking the issue seriously.

“I have been reassured time and time again that this is an issue taken seriously,” said Ms Longfield.

“However, I believe that there is still a failure to engage and that children remain an afterthought.”

The Children’s Commissioner for England has statutory powers to protect children, including to demand data and information from public bodies.

Although her powers do not extend to private companies, Ms Longfield called on social media companies to voluntarily reveal the extent of self-harm material being published on their sites, as well as how many children and teenagers are viewing it.

Earlier this week Sir Nick Clegg, who has recently joined Instagram’s parent company Facebook as vice president of global affairs and communications, defended its approach to suicide and self-harm material.

He said the company had been advised not to take down all such images as a matter of course, and that Facebook had saved thousands of lives by highlighting users’ suicidal behaviour to mental health charities and the authorities.

Responding to the letter from the Children’s Commissioner, Facebook said: “We have a huge responsibility to make sure young people are safe on our platforms and working together with the Government, the Children's Commissioner and other companies is the only way to make sure we get this right.”

The children's charity the NSPCC welcomed the Commissioner's call for an independent regulator.

Andy Burrows, its Associate Head of Child Safety Online, said: “It is good to see that the Children’s Commissioner is backing the NSPCC’s proposal for a statutory duty of care. Social networks have repeatedly shown they are incapable of regulating themselves and that they need to be forced to protect children.

“But it is absolutely imperative we get this right, if children are going to be truly protected. The NSPCC’s Wild West Web campaign is calling for the government to bring in an independent regulator with statutory powers, which will require transparency and force social networks to take proactive steps to tackle grooming.”