
Commissioner calls for ban on apps that make deepfake nude images of children
Artificial intelligence 'nudification' apps that create deepfake sexual images of children should be immediately banned, amid growing fears among teenage girls that they could fall victim, the children's commissioner for England is warning.
Girls said they had stopped posting images of themselves on social media for fear that generative AI tools could be used to digitally remove their clothes or sexualise them, according to the commissioner's report, which draws on children's experiences of the tools. Although it is illegal to create or share a sexually explicit image of a child, the technology that enables such images remains legal, the report noted.
'Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps,' the commissioner, Dame Rachel de Souza, said.
'The online world is revolutionary and quickly evolving, but there is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I'm calling on the government to take decisive action to ban them, instead of allowing them to go unchecked with extreme real-world consequences.'
De Souza urged the government to introduce an AI bill that would require developers of GenAI tools to address the risks their products pose, and to roll out effective systems to remove sexually explicit deepfake images of children. This should be underpinned by policymaking that recognises deepfake sexual abuse as a form of violence against women and girls, she suggested.
In the meantime, the report urges Ofcom to ensure that age verification on nudification apps is properly enforced and that social media platforms prevent sexually explicit deepfake tools being promoted to children, in line with the Online Safety Act.
The report cited a 2025 survey by Girlguiding, which found that 26% of respondents aged 13 to 18 had seen a sexually explicit deepfake image of a celebrity, a friend, a teacher, or themselves.
Many AI tools appear to work only on female bodies, which the report warned is fuelling a growing culture of misogyny.
One 18-year-old girl told the commissioner: 'The narrative of Andrew Tate and influencers like that … backed by a quite violent and becoming more influential porn industry is making it seem that AI is something that you can use so that you can always pressure people into going out with you or doing sexual acts with you.'
The report noted a link between deepfake abuse and suicidal ideation and PTSD, citing the case of Mia Janin, who died by suicide in March 2021.
De Souza wrote in the report that the new technology 'confronts children with concepts they cannot yet understand', and is changing 'at such scale and speed that it can be overwhelming to try and get a grip on the danger they present'.
Lawyers told the Guardian they were seeing this reflected in a rise in cases of teenage boys being arrested for sexual offences because they did not understand the consequences of what they were doing, such as experimenting with deepfakes, being in a WhatsApp chat where explicit images were circulating, or looking up pornography featuring children their own age.
Danielle Reece-Greenhalgh, a partner at the law firm Corker Binning who specialises in sexual offences and possession of indecent images, said the law was 'trying to keep up with the explosion in accessible deepfake technology', which was already posing 'a huge problem for law enforcement trying to identify and protect victims of abuse'.
She noted that app bans were 'likely to stir up debate around internet freedoms', and could have a 'disproportionate impact on young men' who were playing around with AI software unaware of the consequences.
Reece-Greenhalgh said that although the criminal justice system tried to take a 'commonsense view and avoid criminalising young people for crimes that resemble normal teenage behaviour … that might previously have happened behind a bike shed', arrests could be traumatic experiences and have consequences at school or in the community, as well as longer-term repercussions such as needing to be declared on an Esta form to enter the US or showing up on an advanced DBS check.
Matt Hardcastle, a partner at Kingsley Napley, said there was a 'minefield for young people online' around accessing unlawful sexual and violent content. He said many parents were unaware how easy it was for children to 'access things that take them into a dark place quickly', for example nudification apps.
'They're looking at it through the eyes of a child. They're not able to see that what they're doing is potentially illegal, as well as quite harmful to you and other people as well,' he said. 'Children's brains are still developing. They have a completely different approach to risk-taking.'
Marcus Johnstone, a criminal solicitor specialising in sexual offences, said he was working with an 'ever-increasing number of young people' who were drawn into these crimes. 'Often parents had no idea what was going on. They're usually young men, very rarely young females, locked away in their bedrooms and their parents think they're gaming,' he said. 'These offences didn't exist before the internet, now most sex crimes are committed online. It's created a forum for children to become criminals.'
A government spokesperson said: 'Creating, possessing or distributing child sexual abuse material, including AI-generated images, is abhorrent and illegal. Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines.
'The UK is the first country in the world to introduce further AI child sexual abuse offences, making it illegal to possess, create or distribute AI tools designed to generate heinous child sexual abuse material.'
In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.
