A UK-based safety watchdog that tackles online child abuse has warned it is “overwhelmingly” dealing with images taken by children themselves, usually after being groomed.
It comes as a Northern Ireland man, who exploited more than 70 children online, was given a life sentence with a minimum of 20 years in jail on Friday.
Catfish paedophile Alexander McCartney, 26, persuaded scores of children to send him images via social media, then used these to blackmail them into sending more graphic material.
McCartney, who posed as a teenage girl to befriend young females on Snapchat, admitted 185 charges, including the manslaughter of 12-year-old Cimarron Thomas, who took her own life in May 2018.
Her father, Ben Thomas, unable to live with the loss of his daughter, also died by suicide 18 months later.
“This is not a future issue for us. This is very much a now issue,” Dan Sexton, chief technology officer of the Internet Watch Foundation (IWF), told Sky News.
“The internet has removed the barriers [to children for paedophiles]. Smartphones and generally camera-enabled web devices are in the hands of children, which has made it much easier for people to access children, coerce children, groom children,” he said.
Last year, the watchdog dealt with more than 254,000 “self-generated” child abuse images, making up 92% of the images it tackled.
Mr Sexton said he is frustrated because this kind of child abuse should be more preventable.
“There are many more points of intervention to prevent it getting to that point, in a way that is much harder with contact abuse [where an abuser is physically making the images of the child].”
He wants better safeguards by companies running social media platforms, as well as better education for teachers, parents and children.
If you’re a worried parent
The IWF’s advice to worried parents is to remember the acronym TALK.
• T: Talk to your child about online sexual abuse and listen to their concerns
• A: Agree rules around the use of technology as a family
• L: Learn about the platforms and apps that your child uses
• K: Know how to use the privacy settings and tools within those apps to make sure they’re all set correctly for your child’s safety
If you’re under 18 and worried
“The power an extortionist has is that threat of sharing imagery, saying, ‘I’m going to share your images unless you give me more imagery or provide payment’,” according to Mr Sexton.
“One of the ways of addressing that is to take that power away.”
The IWF, NSPCC and Childline launched the Report Remove tool in 2021, which gives children in the UK the power to get abusive content removed.
Childline passes the report to the IWF, which then works to get the image removed and also tagged on databases to make sure it can never be uploaded again.
“If we can do that, that takes that power away from the extortionist saying, ‘I’m going to share the imagery’. That child can know that that imagery cannot be shared because it’s been blocked,” said Mr Sexton.
What are tech platforms doing?
Nude images sent via direct message on Instagram will now be automatically blurred, and users will soon be unable to screenshot some images and videos sent via DM.
Meta has also restricted access and increased privacy for accounts of under-18s.
Apple is testing out a feature in Australia to allow children to report nude images and video being sent to them directly to the company, which could then report the messages to police.
Earlier this year, Snapchat added warnings on some messages if they come from someone reported or blocked by others.
Friend requests will also be blocked on the app if they come from users with no mutual connections and a history linked to scamming activity.
:: Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.