EDITORIAL: Every image of child sex abuse is a depiction of a crime in progress

Mar. 28—The BDN Editorial Board operates independently from the newsroom, and does not set policies or contribute to reporting or editing articles elsewhere in the newspaper or on bangordailynews.com.

If you are concerned about a child being neglected or abused, call Maine's 24-hour hotline at 800-452-1999 or 711 to speak with a child protective specialist. Calls may be made anonymously.

Last week's arrest of two-time gubernatorial candidate Eliot Cutler sent shockwaves through Maine. Cutler, 75, was charged with possessing sexually explicit material involving a child under 12. More charges could be forthcoming.

Over the coming days, weeks and months, many more details of the case will emerge. And, already, political operatives are busy either distancing themselves from Cutler, who was a significant donor to many campaigns and charities, or trying to tie their opponents to him. The Bangor Daily News endorsed Cutler in both of his campaigns for governor.

But people shouldn't lose sight of who is most hurt by this sexual exploitation: the children. For these images to be produced and shared, children were sexually abused. Perpetrators of this abuse, including those who produce, distribute, share and view the images, are committing heinous crimes that deserve severe punishment.

In 2020, nearly 22 million reports of suspected child sexual exploitation were made to the National Center for Missing and Exploited Children's tipline, a nearly one-third increase over the previous year. These reports include the sharing of sexually explicit materials, child molestation, child sex trafficking as well as online efforts to lure children for sex, which nearly doubled between 2019 and 2020, according to the center.

The increase in child sex abuse imagery, largely fueled by the internet, is stomach churning. In 1998, there were 3,000 reports of such imagery. By 2014, there were more than 1.4 million reports. In 2018, there were 18.4 million reports, the New York Times reported in 2019 after reviewing police and court documents. That year, tech companies reported more than 45 million images and videos flagged as child sex abuse.

"Each and every image is a depiction of a crime in progress," Sgt. Jeff Swanson, a task force commander in Kansas, told the Times. "The violence inflicted on these kids is unimaginable."

The problem is global, but the U.S. is among the top countries where such images are created and viewed. Efforts to stop the production and sharing of child sex abuse imagery, including a 2008 law passed by Congress, have failed to stem the problem.

Recent studies have found that the majority of victims are girls, although when boys are victimized, the abuse is often more egregious. Prepubescent children are the most common victims. Globally, the vast majority of victims are white.

It is difficult to gather information about the perpetrators of the abuse, who often go to great lengths to keep their identities hidden. In cases where the offender's identity is known, nearly 93 percent were male.

Survivors of child sex abuse imagery report long-lasting trauma because images of their abuse often reappear, again and again.

"There is a lot I don't remember, but now I can't forget because the disgusting images of what he did to me are still out there on the Internet," one survivor said in a victim's impact statement included in a National Center for Missing and Exploited Children report. "It hurts me to know someone is looking at them — at me — when I was just a little girl being abused for the camera. I did not choose to be there, but now I am there forever in pictures that people are using to do sick things. I want it all erased. I want it all stopped. But I am powerless to stop it."

The proliferation of child sex abuse imagery is a massive problem. But we aren't powerless to stop it. Congress can pass stricter laws and appropriate enough money to ensure law enforcement can implement them. Even without stricter laws from Congress, tech companies must do more to track and remove these images from their platforms and networks.