TikTok to make child accounts private by default – NSPCC calls on other social giants to follow suit

Mike Wright
TikTok - Drew Angerer /Getty Images

TikTok is to make the accounts of under 16s private by default, the tech firm has announced, as the NSPCC called on other social media giants to follow suit.

The Chinese-owned app, which has become hugely popular with teenagers for its short videos set to music, said it is now restricting who can watch videos made by those younger than 16.

The move was welcomed by child protection charities, who warned that paedophiles are exploiting the pandemic to find and groom children on social media sites.

Most major social media companies make users’ accounts public by default, meaning anyone can see them unless they actively change their profile settings.

Meanwhile, TikTok’s announcement also comes ahead of incoming child protection regulations that could see social media firms fined billions if they don’t shield child users from harm.

In September, new child data privacy laws are due to come into force and the Government has also pledged to impose a statutory duty of care on tech firms to better protect vulnerable users such as children, a measure The Telegraph has campaigned for since 2018.

TikTok said that from this week all new and existing accounts of under 16-year-olds would be made private by default, meaning only people they have approved will be able to see their profiles.

Under 16s will still be able to switch their accounts back to public if they wish, but will have to actively choose to do so.

The company is also banning people from being able to download videos posted by under 16s and restricting who can comment on their posts to just approved friends.

However, TikTok’s new measures can be circumvented by child users, as the app has no meaningful age checks.

Andy Burrows, the NSPCC’s head of child safety online policy, described TikTok’s new child privacy measures as “bold” and “hugely welcome”.

He said: “It comes as abusers are taking advantage of the pandemic to target children spending more time online and we urge other platforms to be similarly proactive rather than wait for regulation to come into effect which will place a Duty of Care on tech firms to protect users.

“We know police recorded 1,220 grooming offences in the first three months of the pandemic in England and Wales so it’s never too soon for tech firms to start making their sites safer.”

TikTok's move was also welcomed by Baroness Kidron, the architect of the Age Appropriate Design Code child privacy rules that are due to come into force in September.

The rules, which will be enforced by the Information Commissioner's Office (ICO), will see tech firms face fines running into the billions if they are found to have misused or illegally collected data on children.

The cross-bench peer and founder of 5Rights children's charity said: “Congratulations to the ICO, whose robust Children’s Code has shown TikTok how to provide greater protection for children’s privacy online.

“It is good to see them acting ahead of the September deadline. These changes are a big step forward for children, and an important step on the way to building the digital world children deserve.”

The news comes as a new report found that vulnerable children were more likely to be cyberbullied and groomed online than other children.

A study of interviews with more than 6,500 11 to 17-year-olds, conducted by Internet Matters and the charity Youthworks, found that 43 percent of children and teenagers with eating disorders had reported attempts to groom them online, compared to 3 percent of other school children.

The report also found that teenagers with eating disorders were seven times more likely to be victims of revenge porn attacks.

Carolyn Bunting, CEO of Internet Matters, said the findings showed that online education and outreach had to improve for vulnerable children. She said: “Online safety education as currently delivered doesn’t work for vulnerable children – and now we have the data to allow meaningful conversations to take place between them and trusted adults.”