April 2025

STUDY F: Legal challenges in tackling AI-generated child sexual abuse material within the USA - REPORT

This report critically reviews the US regulatory landscape on accountability for child sexual abuse material (CSAM) created with generative Artificial Intelligence (gen-AI) at the federal and state levels.

Federal CSAM statutes, together with case law, criminalise several categories of harmful material, yet significant ambiguities persist. Federal law is relatively robust, but there is a gap with regard to the criminalisation of artificial CSAM that depicts purely fictitious children. Civil remedies, although significant, are limited in scope, and copyright and consumer protection laws offer only partial avenues for redress. Prosecutors typically require concrete evidence, such as incriminating communications or attempts to sell or trade material, and such evidence is often hard to obtain.

This challenge is compounded by more advanced AI models that can generate hyper-realistic CSAM without being trained on authentic abuse imagery. As a result, even if regulation targets how AI models are trained, models that can create realistic CSAM without such training data would evade it.

Under copyright law, platforms and developers can be held liable if they knowingly contribute to the sharing of harmful content. However, online platforms are broadly protected from civil liability for user-generated content, which complicates efforts to hold them accountable for hosting AI-generated CSAM. Despite efforts to change the law in this area, balancing platform liability with the protection of free speech remains a major challenge.

The legal landscape is even more fragmented at the state level, where several outdated pieces of legislation fail to address newer forms of technology-facilitated child sexual exploitation and abuse (TF-CSEA). State-level civil remedies are often inadequate, leaving gaps in accountability for users, developers, distributors and third-party beneficiaries.

