Britain will become the first country to introduce laws against AI tools used to generate child sexual abuse images, the government announced late Saturday.
The government will make it illegal to possess, create or distribute AI tools designed to generate sexualised images of children, punishable by up to five years in prison, interior minister Yvette Cooper revealed.
It will also be illegal to possess AI “paedophile manuals” which teach people how to use AI to sexually abuse children, punishable by up to three years in prison.
“This is a real disturbing phenomenon. Online child sexual abuse material is growing, but also the grooming of children and teenagers online. And what’s now happening is that AI is putting this on steroids,” Cooper told Sky News on Sunday.
She said AI tools were making it easier for perpetrators “to groom children, and it’s also meaning that they are manipulating images of children and then using them to draw and to blackmail young people into further abuse.
“It’s just the most vile of crimes,” she added.
The new law would include banning “some of the AI models being used for child abuse,” said the minister.
“Other countries are not yet doing this, but I hope everyone else will follow,” she added.
AI tools are being used to generate child sexual abuse images by “nudeifying” real-life images of children or by “stitching the faces of other children onto existing images,” said the government.
The new laws will also criminalise “predators who run websites designed for other paedophiles to share vile child sexual abuse content or advice on how to groom children,” punishable by up to ten years in prison, said the government.
Cooper told the BBC on Sunday that a recent inquiry had found that around 500,000 children across the UK are victims of some form of child abuse each year, “and the online aspect of that is an increasing and growing part of it”.
The measures will be introduced as part of the Crime and Policing Bill when it comes to parliament.
The Internet Watch Foundation (IWF) has warned of the growing number of sexual abuse AI images of children being produced.
Over a 30-day period in 2024, IWF analysts identified 3,512 AI child abuse images on a single dark web site.
The number of images in the most serious category also rose by 10 percent over the year, it found.