Law enforcement agencies across the U.S. are cracking down on a troubling spread of child sexual abuse imagery created through artificial intelligence technology, from manipulated photos of real children to graphic depictions of computer-generated kids. Justice Department officials say they are aggressively pursuing offenders who exploit AI tools, while state lawmakers are racing to pass legislation ensuring that people who generate “deepfakes” and other sexually explicit imagery of kids can be prosecuted under state law. With the recent significant advances in AI, it can be difficult, if not impossible, for law enforcement officials to distinguish between images of real and fake children. Governors in more than a dozen states have signed laws this year cracking down on digitally created or altered child sexual abuse imagery, according to a review by The National Center for Missing & Exploited Children.
Think Before You Share
Justice Department officials say they already have the tools under federal law to go after offenders for such imagery. Open-source AI models that users can download to their own computers are known to be favored by offenders, who can further train or modify the tools to churn out explicit depictions of children, experts say. Abusers trade tips in dark web child porn communities about how to manipulate AI tools to create such content, officials say. The law also bans providing child porn to many or unspecified individuals, or displaying it in public, on websites or by other means.
‘Toxic cocktail of risks’
Police arrested the men in nationwide raids carried out at the end of September, seizing huge amounts of material. The men, aged 43 to 69, were suspected of being the “leading figures behind the Dark Web platform”, police in the North Rhine-Westphalia region said in a statement. Investigators collected 1,517 items of evidence, including laptops, computers and mobile phones, as well as 94 boxes filled with video cassettes and DVDs. To evade detection, experts say, predators use sophisticated tools such as encryption, virtual private networks (VPNs) and cryptocurrency to cover their tracks.
It is illegal for anyone to possess, distribute or manufacture sexual content involving anyone younger than 18, even if it was meant to be shared among other young people, and even minors found distributing or possessing such images can face, and have faced, legal consequences. AI-generated child sexual abuse images can be used to groom children, law enforcement officials say. And even if they aren’t physically abused, kids can be deeply affected when their image is morphed to appear sexually explicit. The Justice Department says existing federal laws clearly apply to such content, and it recently brought what is believed to be the first federal case involving purely AI-generated imagery, meaning the children depicted are not real but virtual. In another case, federal authorities in August arrested a U.S. soldier stationed in Alaska who is accused of running innocent pictures of real children he knew through an AI chatbot to make the images sexually explicit.
“Welcome to Video” operated on the so-called “dark web”, which can only be accessed with special software and is widely used to traffic illegal content and products. The number of child victims is up 2.3 times from a decade ago, while the number of cases detected by police has increased 1.8 times. Experts warn that many children unknowingly expose themselves to danger simply by sharing explicit pictures with a partner or friend. Victims often feel violated but struggle to share their experience because they fear no one will believe them. Perpetrators use psychological manipulation to weaken their victims, gradually pulling them from one stage to the next.
“Others described their occupation as accountant, architect, clerk, general manager, quality technician and self-employed,” the report said. PAPS officials said the group has received several requests concerning the online marketplace targeted by the latest police action. Sellers set the prices for the videos and other products they uploaded. Easy access to generative AI tools is likely to force the courts to grapple with the issue.
- Such sharing also, in turn, helps to minimise the crime and perpetuate the abuse by mutualising the experience of both the perpetrator and the victim involved.
- Officials did not provide an estimate for the number of victims affected but said the abusive material shared on the site exclusively depicted girls.
- Under-18s have used fake identification to set up accounts, and police say a 14-year-old used a grandmother’s passport.
- Most of the Category A material involved children penetrating themselves, or another child.
- She told her mum she had originally intended to post only pictures of her feet, after making money selling them on Snapchat.
Those numbers may be an undercount, however, as the images are so realistic that it is often difficult to tell whether they were AI-generated. Experts say more should have been done at the outset to prevent misuse before the technology became widely available. And steps companies are taking now to make it harder to abuse future versions of AI tools “will do little to prevent” offenders from running older versions of models on their computers “without detection,” a Justice Department prosecutor noted in recent court papers. According to Aichi prefectural police, online porn video marketplaces operated on servers abroad are difficult to regulate or investigate.