Social media chiefs face jail if they refuse to hand over secret algorithms

Twitter, Instagram and Facebook apps - Matthew Vincent/PA

Social media executives could face up to two years in jail if they refuse to hand over their secret algorithms to the watchdog responsible for protecting children from online harms.

A draft bill published on Wednesday gives the watchdog Ofcom powers to require tech giants to provide information for any investigation into breaches of the new duty of care laws – including the algorithms blamed for driving harmful content to vulnerable children.

It will also be able to search their offices after securing a warrant and giving the social media giants seven days’ notice of the “raid”.

Ofcom will initially have powers to fine tech giants who refuse to cooperate or hand over any relevant information.

Fines will be capped at 10 per cent of global turnover (about £6 billion in Facebook’s case) or £18 million, whichever is higher.
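The cap is simply the larger of the two figures. A minimal sketch in Python, assuming a global turnover of roughly £60 billion for Facebook (a figure implied by the article’s £6 billion example, not stated directly):

```python
def max_fine(global_turnover_gbp: float) -> float:
    """Return the maximum Ofcom fine under the proposed regime:
    the higher of 10% of global turnover or a flat 18 million pounds."""
    return max(0.10 * global_turnover_gbp, 18_000_000)

# A firm the size of Facebook hits the turnover-based cap (~6bn pounds)...
facebook_cap = max_fine(60_000_000_000)
# ...while a small firm with 100m pounds turnover falls back to the 18m floor.
small_firm_cap = max_fine(100_000_000)
```

The £18 million floor matters for smaller platforms, where 10 per cent of turnover would otherwise produce a trivially small penalty.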

But the bill proposes that ministers should have “reserve” powers to criminalise the refusal to hand over the data.

This would mean a senior manager designated by their company as responsible for complying with the duty of care laws could be prosecuted, facing a maximum penalty of two years in jail.

The new powers on investigation are seen as critical by parents whose children have suffered after finding themselves inundated with harmful content.

In the past, tech giants have been highly reluctant to share insights into the precise workings of their powerful algorithms that decide what billions of users see in their apps.

The family of Molly Russell, the 14-year-old schoolgirl who took her life in 2017 after viewing self-harm material on Instagram and other sites, has fought a long battle to gain access to their daughter’s account.

Facebook handed over 10,000 pages of material from her Instagram account only after legal orders required it to do so, ahead of the inquest last December. Even then, the workings of the algorithms are likely to remain hidden.

Her father, Ian, warned this week that the bill would “fail” if it did not clamp down on social media algorithms that bombard users with harmful content.

Speaking to The Telegraph, he said: “If that is watered down in any way and if harmful content continues to be promoted and pushed by tech companies’ algorithms, then the bill will have failed and people will remain at risk.

“It has got to stop algorithmic exploitation, which is essentially these companies’ business model and what drives their profits. But it is also killing vulnerable young people and the bill has to do something to end this misery.”

Campaigners including the charity NSPCC are expected to push for criminal sanctions to be brought forward rather than held in reserve. The Government has so far committed only to a review of the Act, and a decision on whether to enact criminal sanctions, two years after it takes effect.

The bill sets out a tough regime to combat illegal harms such as child abuse and terrorism but it leaves open what legal but harmful material, such as self-harm and eating disorder posts, should be policed by Ofcom.

MPs are concerned that the bill leaves ministers to define those harms in secondary legislation, but digital minister Caroline Dinenage promised MPs they would have a vote.

“We want to make sure Parliament is very firmly included in this. We didn’t want to put the names of the harms in the primary legislation because they change so much,” she said.

“Ten years ago I don’t think any of us here had heard the term upskirting or deep fakes, for example, and now they are everyday parlance. That is why we need to make sure this is a bill that is going to stand the test of time.”

The draft bill will now be scrutinised by a Parliamentary committee for 12 weeks before a final bill is published around the turn of the year. Campaigners fear it may not get royal assent until 2023.