Child sexual abuse and exploitation is moving into new virtual environments, and, as our comprehensive legislative and policy review shows, there are gaps in protecting children in these spaces.
Extended Reality: The Implications for Legislation & Policies
Through a scoping review, a legislative review and in-depth qualitative interviews with key stakeholders with relevant expertise, this study examines the extent to which legislation and policies across the United Kingdom (UK) are prepared for the risks that emerging Extended Reality (XR) technologies bring with them in terms of technology-facilitated child sexual exploitation and abuse. A few notable areas require more focused attention:
- Definitions and the legal accommodation of CSEA in XR environments
- Haptics and the legal challenges of penalising CSEA conducted via haptic suits as a contact sexual offence
- Avatars and criminalisation of artificial or AI-generated child sexual abuse and exploitation material (CSAEM)
- Encryption and privacy rights concerns, and
- Child users’ age considerations and standards
Definitions: legal accommodation of CSEA in XR environments
Currently, no legal provision within UK legislation expressly and specifically defines XR within the context of CSEA offences or CSAEM. Existing legislation accommodates XR environments from two perspectives:
- Use of XR environments on online platforms that are subject to regulation, and
- Use of XR as a means to commit CSEA offences
"The risks are so much greater and children are always accessible online. They can be isolated in those worlds. And I think that's a real danger, definitely within this particular space." (CSEA Policy Adviser)
Haptics and penalising extended reality child sexual abuse
Haptic devices, or teledildonics, enable a more immersive experience in the perpetration of CSEA offences in simulated environments. The evolution of this technology has very real consequences for children: it allows abusive behaviour to be exercised on artificial representations of real or imaginary children, or even on real children themselves, from a distance.
“It will basically allow people to do everything they do in the real world, just via an interface, with the added risk that they can record it, store it and duplicate it multiple times.” (Futures and Emerging Technology Analyst)
Avatars and AI-generated child sexual abuse material
Another concerning use of XR is the application of CSEA behaviours to avatars, which can take the form of a child or display child-like characteristics. CSEA as an imagery offence includes the criminalisation of the possession, making and distribution of indecent pseudo-photographs of children, defined to include computer-generated photographs.
However, there appears to be a gap with respect to the criminalisation of the making, distribution or advertising of sexually explicit child avatars or generative-AI-created CSAEM that falls under the category of still or moving images. The Coroners and Justice Act 2009 criminalises only the possession of prohibited images, which are more extensively defined in that Act. The term ‘photographs’ can be limited in scope, and appears to exclude avatars and images that do not take the nature of what we might deem conventional ‘photographs’.
“I think potentially there is a risk that it will enable greater offending. Because, to all extents and purposes, something that is in a virtual world could look completely innocent.” (CSEA Prosecutor, giving the example of a child-like avatar)
Encryption: privacy rights concerns
The issue of encryption triggers conversations and debates about the right to privacy and freedom of expression of users of regulated platforms, and how to balance these with the safety of users and children. This is particularly applicable to section 121 of the Online Safety Act (OSA), which gives OFCOM the power to require a regulated service to employ accredited technology to identify and swiftly take down CSEA content communicated privately, and to prevent users from encountering such content.
Technology will be accredited by OFCOM (or a third party appointed by OFCOM) as meeting minimum standards of accuracy approved and published by the Secretary of State, following advice from OFCOM. OFCOM can only require the use of accredited technology under section 121 if it considers it necessary and proportionate to do so. The OSA lists a number of matters which OFCOM must consider when deciding whether it is necessary and proportionate in a particular case. OFCOM would also need to be satisfied that the use of accredited technology by the service is technically feasible.
“The privacy of the child that is in the imagery, that is a privacy violation as well. So, for someone to take non-consensual image[s], sexual imagery of anyone, of a child and redistribute that, that is a privacy breach every single time. But even then... you can stop that from happening without impinging on anyone's privacy.” (Chief Technology Officer at Child Protection Charity)
It is apparent that the OSA attempts to balance the regulation of illegal CSEA content on regulated services with users’ right to privacy. However, OFCOM needs guidance on how to apply the overarching OSA objective of providing a higher standard of protection for children than for adults when children’s safety interests and privacy rights clash with the privacy rights claims of adult users.
Child users’ age considerations
The lack of more sophisticated age verification mechanisms and processes creates risk for children accessing and using online environments. Arguably, where age verification strategies do exist, they are very easily bypassed. One implication is that children can access adult online spaces and virtual worlds which, while not necessarily illegal, may have sexual themes and thus pose harm to them.
Significant work is underway to develop international age verification standards with the Institute of Electrical and Electronics Engineers (IEEE) and the International Organization for Standardization (ISO). ISO has also accepted a proposal from the UK, supported by the Department for Digital, Culture, Media and Sport (DCMS), to define an ISO standard for age verification.
"So, if you can actually genuinely gatekeep those experiences to ensure that one is for only for adults, and one is applicable for children, that massively changes the safety measures that you require to make those safe spaces." (Child Protection Charity Chief Technology Officer)
This study is the first to comprehensively examine whether UK legislation is future-proofed to protect children as these new technologies emerge. Only by addressing these key areas can we ensure a safe, future-proofed environment for children growing up in the UK.
More information
Fry, D., Gaitis, K.K., Landrigan, M.P. and Vermeulen, I. Extended Reality: The Implications for Legislation & Policies. In Searchlight 2023 - Childlight's Annual Flagship Report. Childlight – Global Child Safety Institute: Edinburgh, 2023.
- Researchers: Prof Deborah Fry, Dr Konstantinos Kosmas Gaitis, Maranatha Praise Landrigan, Dr Inga Vermeulen, Sarah Guthrie, James Stevenson, Maria Lamond, Sham-Una Yakubu and Carleigh Slater.
- Ethics Approval: University of Edinburgh
- Registered protocol: https://osf.io/jhy95/