April 2025 / Reading Time: 2 minutes

STUDY F: Legal challenges in tackling AI-generated child sexual abuse material within the UK - REPORT

Introduction

This report critically reviews the UK regulatory framework governing accountability for child sexual abuse material (CSAM) created via generative artificial intelligence (gen-AI), at both the reserved and devolved levels.

The legislation relevant to cases of AI-generated CSAM focuses on two types of indecent images. The first is indecent pseudo-photographs: photographs, videos, and films produced by computer graphics that have a photo-realistic appearance, i.e., they look real. The second is prohibited pseudo-images that do not look realistic, such as drawings, cartoons, hentai, and manga.

We found that there is good coverage of a series of offences with regard to pseudo-photographs: the acts of making, taking, possessing, and disseminating pseudo-photographs are all criminalised, irrespective of the technology used, thus covering cases involving generative AI or deepfakes. This is evidenced by the first emerging case in England, where an offender who created CSAM using generative AI was arrested, charged, and sentenced to 18 years in prison. That said, the law is silent on whether the criminalisation of indecent pseudo-photographs of children extends to cases involving fictitious children.

This protection tends to be less effective with regard to non-realistic images, including cartoons, manga, and drawings. While possession and dissemination are criminalised for both real and fictitious children, the act of making such imagery is not criminalised in England and Wales. The largest gap in protection concerning this imagery exists in Scotland, where neither the making nor the possession of indecent non-realistic images is criminalised. The nation with the most robust legislative framework against all acts related to indecent non-realistic images appears to be Northern Ireland.

This report also discusses paedophile manuals, the Online Safety Act, relevant case law, compensation, and the AI tools used to create pseudo-CSAM.
 
