Investigation finds more than 100 convicted paedophiles openly using Instagram

Instagram - Jenny Kane /AP

Instagram has been branded a “disgrace” by Britain’s top child protection police officer, after a Telegraph investigation found more than 100 convicted paedophiles openly using the social network.

Chief constable Simon Bailey accused social media companies of putting profit before their “social and moral responsibility” to protect children, as this newspaper found known abusers following young schoolchildren as well as posting topless selfies of themselves on the app.

A number of the paedophiles uncovered on Instagram, which allows users as young as 13 on the service, were among the most serious offenders, including two serving 20 years in jail for raping young children.

Instagram said it bans all sex offenders when it finds them or is notified by the authorities, and that it has trained police forces to flag known abusers once convicted.

The company also said it had removed all the accounts flagged to it by The Telegraph’s investigation.

Police are now also examining the accounts uncovered by the paper with an active investigation under way into at least one suspect as a result.

In an exclusive op-ed (below), Mr Bailey, who is the National Police Chiefs’ Council’s child protection lead as well as chief constable for Norfolk Constabulary, compared social media sites to a high street shop that invites children in but has paedophiles hiding in the corner ready to abuse them.

He said: “The fact that this paper has been able to identify and provide me with the details of 100 convicted sex offenders who have Instagram accounts contrary to the company’s policy speaks volumes about their commitment to policing their own site.

“It is a disgrace that the social and moral responsibility of these companies is simply ignored for profit and for the benefit of shareholders.”

In recent weeks, the investigation uncovered numerous accounts with the same name and photo as scores of paedophiles who have been convicted and jailed for serious offences over the last decade.

Among them was an account for Allen Cain from Chester, who was 29 in 2019 when he was jailed for 20 years for raping a 12-year-old girl whom he claimed had “seduced” him.

Another account, linked to Aaron Shelton, from Derby, who was placed on the sex offenders’ register in 2019 aged 19 for trying to groom underage children over social media, was found following more than 1,300 other people on Instagram, including young schoolgirls.

Instagram profiles of Aaron Shelton (L) and Allen Cain (R) - Instagram

One convicted offender even described himself as a “social media marketer” on his Instagram profile and followed more than 5,000 people on the network.

In its community rules, Instagram, which is owned by Facebook, asks other users to report any known paedophiles by submitting links to news or court stories to prove their offence.

Sources at Instagram said they removed all the offenders the police notified them of, but because the sex offenders’ register is confidential, the company often relies on being informed directly by forces when a paedophile is convicted.

Following the investigation, a spokesperson for Instagram said: “We do not allow convicted sex offenders on Instagram and have removed the accounts brought to our attention. We consult with specialist UK law enforcement teams to detect and ban accounts which may be used to exploit or endanger children, which includes training for the police on how to report convicted sex offender accounts directly to us.

"We take down all accounts they raise through this process. We have built industry-leading technology and a team of over 35,000 people to keep our platforms safe, and in Q4 2020 97 percent of the child nudity and exploitative content removed from Instagram was taken down before anyone reported it to us. We also report instances of child sexual exploitation to law enforcement via (the US child abuse watchdog) NCMEC.”

The investigation's findings come just days after the Government outlined plans for a tough new regime that could see tech giants fined billions or their senior executives even jailed if they fail to enforce their own rules.

On Wednesday, Culture Secretary Oliver Dowden told social media companies they have “no more excuses”, as he outlined plans to impose a statutory duty of care on them to better protect their users online.

Under the proposed regime, which The Telegraph has campaigned for since 2018, Ofcom will be given muscular powers to fine companies up to 10 per cent of their global revenue or block them from the UK if they are found to have breached the legal duty. The regulator will also be given reserve powers to jail tech executives for up to two years for serious failings or if they fail to disclose information about their secretive algorithms to Ofcom investigators.

The NSPCC said The Telegraph’s findings highlighted the urgent need for Ofcom to be quickly established as the online watchdog.

Andy Burrows, NSPCC head of child safety online policy, said: “Yet again we see another example of Facebook failing to follow their own rules and it underlines precisely why an effective Duty of Care is urgently needed.”

We can fly a drone on Mars but we cannot prevent the uploading and sharing of images of child abuse

Imagine a shop opens up on the high street designed for children to be able to go and socialise, where they share personal details about themselves and are encouraged to spend their money, but hiding in the shadows of the store are paedophiles waiting to sexually abuse them.

And available to these paedophiles in that shop, at no charge, are millions of high-definition images of children and babies being sexually abused. The outcry once the public became aware would be impossible to ignore, and public outrage would drive the store to close its doors forever.

Unfortunately, the shop I have described is a reality: it is exactly what takes place every day on the web and on social media platforms. It is a shop that never shuts its doors.

In the UK we are acknowledged as being the best in the world at tackling the online threat to children, and more than 10,000 offenders are dealt with every year for viewing indecent images and grooming children online. Yet despite that world-leading response, the number of referrals of child abuse that law enforcement receives continues to grow at an alarming rate, and there is no suggestion this trend is going to slow down.

It is to their eternal shame that the tech industry and the providers of these sites know exactly what is taking place and only pay lip service to tackling the threat. We can fly a drone on Mars but we cannot prevent the uploading and sharing of images of child abuse.

I do not believe for one minute that prevention is impossible, yet despite the industry being encouraged to do more, too little has been done, and it is certainly too late for the children who have been abused and whose 17 million images populate the Child Abuse Image Database.

The fact that this paper has been able to identify and provide me with the details of 100 convicted sex offenders who have Instagram accounts contrary to the company’s policy speaks volumes about their commitment to policing their own site.

It is a disgrace that the social and moral responsibility of these companies is simply ignored for profit and for the benefit of shareholders. It is unfortunate and sad that, as a result of the abrogation of these responsibilities, the Government has had to introduce the Online Safety Bill and say enough is enough.

Industry has ignored all our attempts to encourage it to ensure children are safe online and to create a hostile environment for paedophiles voluntarily; legislation is, I believe, the only way it will take notice and invest in the readily available technology to deliver this. It is time that the tech companies are held to account, and time for them to protect our children from harm.

Simon Bailey is chief constable of Norfolk Constabulary and child protection lead for the National Police Chiefs’ Council