Glossary
This section sets out definitions of key terms used throughout the Into the Light 2025 Index. We draw on agreed terminology in the field, following the Second Edition of the Terminology Guidelines for the Protection of Children from Sexual Exploitation and Abuse (ECPAT International, 2025), as well as key definitions from the UN International Classification on Violence Against Children (UNICEF, 2023b). We also include definitions and recommendations for data enhancement in the CSEA field, drawing on work by the UK Statistics Authority's Inclusive Data Taskforce (UKSA, 2021) and on the independent Sullivan Review of data, statistics and research on sex and gender (Department for Science, Innovation & Technology, 2025). More details on all these definitions, including underpinning conceptual frameworks, can be found in our Index Technical Note.
Baseline CSAM
A term created by INTERPOL, the international policing body, to define what is considered internationally illegal child sexual abuse material (CSAM).
Child
A term referring to every human being below the age of eighteen years.
Child helpline tags
With Child Helpline International (CHI) data, we present child helpline tags: the categories (e.g., sexual violence) under which helpline staff document a contact (which can be a call, text or other form of communication) within their data system. A contact does not always translate into a case, as there may be multiple contacts (i.e., calls) from one child, or from someone contacting the helpline on behalf of a child, and these may be tagged under different categories. The tags are the number of times that helpline staff apply a category (one contact may be represented by multiple tags or categories, as the contact may report multiple issues). These data are reported directly by child helpline members to the umbrella organisation, CHI, via an annual survey.
Child sexual abuse material (CSAM)
Images, image collections, videos and stills that capture the sexual exploitation and abuse of children. This material represents the evidence of past sexual abuse, as well as ongoing harm to the children and survivors depicted in the material.
Child sexual abuse material/image-based sexual abuse (CSAM/IBSA)
For survey data, this combined term captures non-consensual image or video making, taking and/or sharing by an adult or another child. It refers to having sexual images taken when a child was unconscious, intoxicated, distracted, or unable to consent. This subtype also includes non-consensual sharing of images/videos of a child via mobile phone or internet. It could also include so-called deepfake images in which a child’s head or likeness was imposed on a sexual image of someone else, as well as AI-generated images. We decided to use the CSAM/IBSA term to avoid confusion across agencies in interpreting the findings and to indicate the possibility of increased use of both terms in future research.
Child sexual exploitation and abuse (CSEA)
At Childlight, we use CSEA as an umbrella term, because it recognises that abuse and exploitation can take different forms and require different approaches to prevention, safeguarding, and data collection. The term covers situations involving child sexual abuse in which a child is involved in sexual activity that they do not understand, cannot consent to, are not developmentally ready for, or where an imbalance of power, trust, or authority is exploited. It also includes sexual exploitation when a child is manipulated, coerced, or forced into sexual activity in exchange for something, such as money, gifts, protection, or promises, which can involve situations like sex trafficking or sexual extortion. In relation to CHI data, CSEA encompasses the categories sexual violence (offline); commercial sexual exploitation (offline); and technology-facilitated child sexual exploitation and abuse (TF-CSEA).
Commercial sexual exploitation (offline)
This category is used by CHI to categorise their data. Their definition is as follows: A child performing a sexual act in exchange for (a promise of) something of value (including, but not limited to, money, objects, shelter, food, drugs, etc.). The use, procuring or offering of a child for prostitution, for the production of child sexual abuse material or for sexual performances. It can involve the trafficking of children for commercial sexual exploitation. It can also take place in the context of travel and/or tourism. In these cases, the offence can be committed by either foreign or domestic tourists and travellers, and long-term visitors.
Confidence interval (CI)
A range of values within which the true prevalence is likely to fall. A narrower interval indicates greater precision and reliability of the estimate, while a wider interval suggests more uncertainty – often due to smaller sample sizes or variation across studies.
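As an illustration of how precision relates to sample size (a general textbook formula, not a statement of the exact method behind every estimate in this Index), a 95% confidence interval for a prevalence proportion \(\hat{p}\) estimated from a sample of size \(n\) can be written as:

\[
\hat{p} \pm 1.96\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}
\]

Because \(n\) sits in the denominator, larger samples (and pooled estimates drawing on many studies) yield narrower and therefore more precise intervals.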
End-to-end encryption (E2EE)
End-to-end encryption is a technology that makes messages, images, calls, and other communications accessible only to the sender and the intended recipient. From a CSEA perspective, this means that the content is completely hidden, even from the platform hosting the service, making it much harder for authorities, platforms, or safeguarding teams to detect, prevent, or investigate TF-CSEA.
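As a minimal, purely illustrative sketch (assuming the open-source PyNaCl library; real messaging services use their own protocols, and this is not a description of any particular platform), the following shows why only the two endpoints can read an end-to-end encrypted message: the intermediary server only ever handles ciphertext, and the private key needed to decrypt it never leaves the recipient's device.

```python
# Minimal end-to-end encryption sketch using PyNaCl (illustration only;
# real messaging apps implement their own, more elaborate protocols).
from nacl.public import PrivateKey, Box

# Each endpoint generates a key pair; private keys never leave the device.
sender_private = PrivateKey.generate()
recipient_private = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's PUBLIC key.
sending_box = Box(sender_private, recipient_private.public_key)
ciphertext = sending_box.encrypt(b"hello")  # all the hosting platform ever sees

# Only the recipient, who holds recipient_private, can decrypt.
receiving_box = Box(recipient_private, sender_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello"
```

Because the platform handles only the ciphertext, content-scanning detection tools cannot operate on the messages themselves in an E2EE service.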
Exact matches (cryptographic)
Images that have been previously allocated an alphanumeric hash value with software and, therefore, match the hash value in a hash list possessed by an organisation. In verb form, the term is ‘exact matching’.
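As a simplified sketch of the idea (the specific algorithms and hash lists used by individual organisations vary, and the file name and hash value below are hypothetical), exact matching amounts to computing a cryptographic digest of a file and checking whether it appears in a list of known hash values:

```python
# Simplified illustration of cryptographic 'exact matching' (a sketch, not any
# organisation's actual pipeline; the file name and hash value are hypothetical).
import hashlib

def file_hash(path: str) -> str:
    """Return the SHA-256 digest of a file as a hexadecimal string."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# A hash list is, in essence, a set of previously recorded digests.
known_hashes = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

# True only on an exact byte-for-byte match with a previously hashed file.
print(file_hash("reported_image.jpg") in known_hashes)
```

Because the digest changes completely if even a single byte of the file changes, exact matching misses edited copies, which is why perceptual ('visually similar') matching, described later in this glossary, is also used.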
Exposure to unwanted sexual content
A type of technology-facilitated child sexual victimisation that includes the unwanted exposure of a child to pornographic material (e.g., forcing a child to watch videos or pictures containing nudity or sending a child a link to a pornographic website). Unwanted exposure to sexual content often occurs while browsing the internet or scrolling through social media. This type of exposure may or may not be a precursor to a request for reciprocity. Including exposure to unwanted sexual content (including pornography) is important because, as a growing body of literature suggests, it plays a significant but often overlooked role in both the risk factors and developmental consequences of abuse. Including exposure to sexually explicit content in TF-CSA discussions ensures a more holistic understanding of how technology can harm children, not just through direct abuse, but through the gradual erosion of boundaries, consent, and safety. It also helps shape better prevention strategies, education programmes, and support systems for children and families.
Familial CSEA
Sexual abuse or exploitation of a child that occurs within the family environment, perpetrated by biological relatives (such as parents, siblings, grandparents, aunts or uncles) or individuals in a familial-like role (e.g., foster carers or a parent’s partner). Often also referred to as ‘intrafamilial CSEA’, we use the term familial CSEA for ease in communicating to a variety of audiences.
First sighted
The first known location of CSAM that has been reported. This does not mean that it is the first or only place it was uploaded.
Frontline data
Data collected by child protection system actors (e.g., police, health, education, social care, and justice systems) and civil society actors (e.g., some child helplines) while providing services to victims/survivors or bringing perpetrators to justice. Often referred to as ‘administrative data’, Childlight uses the more accessible term ‘frontline data’ to refer to data that are not collected for research purposes but gathered while providing services or fulfilling child protection duties.
Gender
Gender refers to the socially constructed norms, roles, behaviours, and relationships associated with being female, male, or another gender, which can vary across societies and change over time. Gender identity is distinct from biological sex and reflects an individual’s internal experience of gender, which may or may not correspond with their sex. Also see definitions for ‘Sex’, ‘Sex and/or gender’ and ‘Male, female, non-binary and unknown’ in this glossary.
Hash value
A unique alphanumeric code assigned to every individual instance of known CSAM. Some CSAM data collection organisations have their own hash lists against which reported CSAM is compared, but this is not the case for all of them. Whether these hash lists are shared with other key stakeholders depends on the organisation.
Helpline
A reporting and support service that is available to children, parents, caregivers and the public to report concerns pertaining to children needing direct assistance. Helplines often operate in partnership with key referral services such as hospitals, law enforcement agencies, judicial services, shelters and other child-related services.
Hotline
A reporting service that allows the public to anonymously share material they believe to be illegal or harmful to children online. These services often send removal notices to electronic service providers and/or share reported concerns with law enforcement agencies.
Internet Protocol (IP) address
A unique identifying number assigned to all devices that connect to the internet, including phones, laptops, tablets, modems and servers.
Lifetime prevalence
Experiences that occurred at any point during childhood (i.e., before age 18).
Male, female, non-binary and unknown
CHI uses the terms boy, girl, non-binary and unknown to classify the gender of those who contact their helplines. As the helpline data also cover individuals up to the age of 24, Childlight has chosen to use the terms male, female, non-binary and unknown to refer to gender and/or sex. In CHI’s glossary, their definition of non-binary is: “[t]he child or young person does not identify primarily as female or male, or identifies as non-binary” (CHI, 2025, p.8). CHI’s definition of unknown is: “[t]he gender of the child or young person could not be identified for various reasons” (CHI, 2025, p.8). Also see ‘Gender’ in this glossary.
Meta-analysis
A statistical technique used to combine the results of several different studies on the same topic. By pooling data from multiple studies, a meta-analysis can give a more accurate estimate of overall effects or patterns than any single study alone.
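As one common approach (a general illustration rather than a statement of the exact model used for each estimate in this Index), a fixed-effect inverse-variance meta-analysis pools \(k\) study estimates \(\hat{p}_i\) with variances \(v_i\) as:

\[
\hat{p}_{\text{pooled}} = \frac{\sum_{i=1}^{k} w_i \hat{p}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{v_i}
\]

Random-effects models extend this by adding a between-study variance component to each weight, which is typical when prevalence varies across countries, instruments and populations.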
Offline CSEA
Instances of CSEA that occur through direct, in-person interaction between the perpetrator and the child, without the involvement of technology-facilitated means. This includes acts such as rape, sexual assault and other forms of sexual abuse. While offline CSEA can include non-contact verbal sexual abuse and exhibitionism, the data presented in this report focuses specifically on in-person contact abuse involving rape or sexual assault.
Online sexual exploitation
Includes all acts of a sexually exploitative nature carried out against a child that have, at some stage, a connection to the digital environment. It includes any use of technology that results in sexual exploitation or causes a child to be sexually exploited, or that results in or causes images or other material documenting such sexual exploitation to be produced, bought, sold, possessed, distributed, or transmitted. The terms ‘ICT-facilitated’ and ‘cyber-enabled’ child sexual exploitation are sometimes used as alternatives to define these practices.
Online solicitation
A range of unwanted or pressured sexual interactions, which may include casual sexual inquiries via mobile phone or the internet, long-lasting sexual conversations that can lead to the exchange of sexual texts/pictures/videos or exposure of intimate body parts. All types of online solicitation may come from peers as well as adult perpetrators.
Past year prevalence
Experiences that occurred within the 12 months prior to when the survey was undertaken.
Prevalence estimate
The proportion of individuals in a population who have experienced CSEA. In this report, it represents the statistical outcome of a meta-analysis. Estimates are reported for specific recall periods (e.g., past year or lifetime before age 18) and by gender where possible.
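In its simplest form (the numbers here are hypothetical and purely illustrative), a prevalence estimate is a proportion:

\[
\hat{p} = \frac{\text{number of respondents reporting the experience}}{\text{number of respondents surveyed}}
\]

For example, if 150 of 1,000 respondents report an experience, \(\hat{p} = 150/1000 = 0.15\), i.e. 15%, or roughly 1 in 7.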
Rape
Vaginal, anal or oral penetration of a sexual nature of the body of a child with any bodily part or object, with or without the use of force and without consent, because the child is too young to consent or consent is not given.
‘Self-generated’ CSAM
A type of media showing individuals who have physical control of their recording device (e.g., selfies, self-recordings from their computers, etc.), which may have been shared directly or captured indirectly by other means. This is often created due to the grooming, deception or extortion of a child by an offender. Due to lack of agreement on preferred terminology, we have used single quotes throughout the document to note the limitations of this terminology.
Sex
A binary classification based on biological factors (male and female).
Sex and/or gender
The term sex and/or gender is used in this report because many data sources do not clearly distinguish between the two. In surveys, questions may ask about sex, gender, or attempt to capture both, leaving it unclear what is being measured or reported. Administrative data vary widely, with categories recorded either through standard questions or inferred by those collecting the data, creating inconsistency across organisations. Big data sources, such as those derived from images or videos, generally only record sex, which reflects the type of information these methods are designed to capture. Also see ‘Gender’ and ‘Male, female, non-binary and unknown’ in this glossary.
Sexual assault
Unwanted groping, fondling or other touching of the private parts of a child or making a child touch the private parts of someone else (excluding penetration), with or without the use of force and without consent, because the child is too young to consent or consent is not given.
Sexual extortion
A form of blackmail that involves threatening to share an individual’s intimate image or video online unless they comply with certain demands, such as for money, gift cards, other items of monetary worth, additional pictures or other sexual acts. The term also includes sexual acts on webcam coerced by a perpetrator.
Sexual violence (offline)
This is a category used by CHI to categorise their data. It is defined as: “[f]orcing or coercing a child to engage in sexual activity, whether they are aware of what is happening or not, or if they are able to articulate what is unwanted or not” (CHI, 2025, p.39).
South Asia
Based on UNICEF’s regional classification, South Asia refers to eight countries: Afghanistan, Bangladesh, Bhutan, India, Sri Lanka, Maldives, Nepal and Pakistan (UNICEF, 2023b).
Systematic review
A research method used to find, assess and summarise all relevant studies on a specific topic or question. It follows a clear and structured process to reduce bias and ensure that the findings are reliable.
Technology-facilitated CSEA (TF-CSEA)
A range of sexually harmful behaviours that occur online or through the use of other digital technologies and include online solicitation, non-consensual image taking and sharing, forced exposure to pornography/unwanted sexual content, and livestreaming of child sexual abuse, sexual exploitation or sexual extortion.
This is also a category used by CHI to categorise their data. Their definition is: “[c]hild sexual abuse becomes technology-facilitated child sexual abuse when it has occurred on social media or other online channels, or has a direct link to the online environment [...]. Technology-facilitated child sexual exploitation includes all acts of a sexually exploitative nature carried out against a child that is at some stage connected to the online environment. This can be distinguished from technology-facilitated sexual abuse [sic] by an underlying notion of exchange, for example, money, food, accommodation, drugs, affection, gifts, etc” (CHI, 2025, p.36).
Victimisation
CSEA represents forms of victimisation whereby the child is the victim of the exploitation/abuse. Victimisation tends to refer to a process more than to a single act. The term is used as a category by which to group indicators primarily from representative surveys in the Into the Light (ITL) Index 2025.
Victim/survivor
A combined term referring to children/adults who have experienced or are experiencing sexual violence, to reflect both the terminology used in legally binding instruments and an individual’s choice to identify themselves as they wish to be identified.
Violence
When we use the term violence in this report, we are specifically referring to a category from Child Helpline International that is defined as “the maltreatment (improper and/or harmful treatment) of a child. Violence can take a number of forms, including emotional, physical, and sexual. Isolation and exclusion are also a form of violence. Violence can occur in many settings, including, but not limited to, at home, at school, in the neighbourhood, and online. The perpetrators can be members of the family, peers, other adults known to the child, or strangers. The present category also involves the presence of violence in the child’s environment” (CHI, 2025). The category of CSEA (which encompasses the categories sexual violence [offline], commercial sexual exploitation [offline], and TF-CSEA) falls under the violence category.
Visually similar
Images which, to the eye, appear to be the same image, but may in reality have differences in aspects such as size, colours or layout (e.g., mirrored). Due to these differences, the images may not be matched by cryptographic ‘exact’ hashing, despite being the same in content. They can be matched using software such as PhotoDNA by Microsoft, which enables a more accurate count of image content. It is important to track visually similar images in order to gain a more accurate understanding of how many images are genuinely new and of the extent to which certain images are being reuploaded, shared or otherwise disseminated.
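As an open-source analogue (PhotoDNA itself is proprietary and works differently in detail; the `imagehash` library and the file names here are assumptions for illustration only), perceptual hashing scores how similar two images look rather than whether their bytes are identical:

```python
# Illustrative perceptual-hash comparison using the open-source 'imagehash'
# library (an analogue for explanation only, not PhotoDNA). File names are
# hypothetical.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("original.jpg"))
hash_b = imagehash.phash(Image.open("resized_copy.jpg"))

# Subtracting two perceptual hashes gives a Hamming distance: small distances
# indicate visually similar images even when the files differ byte-for-byte.
if hash_a - hash_b <= 5:  # the threshold is an illustrative choice
    print("visually similar")
```

Unlike the cryptographic exact matching described earlier in this glossary, small edits (resizing, recolouring, mirroring) only nudge the perceptual hash, so near-duplicates can still be linked together.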
Western Europe
Based on UNICEF’s regional classification, Western Europe refers to 33 countries: Andorra, Austria, Belgium, Cyprus, Czechia, Denmark, Estonia, Finland, France, Germany, Greece, Holy See, Hungary, Iceland, Ireland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, Monaco, Netherlands (Kingdom of the), Norway, Poland, Portugal, San Marino, Slovakia, Slovenia, Spain, Sweden, Switzerland and the United Kingdom.
Youth
This term generally defines a 15–24-year-old age group. In this report, the term is used primarily with child helpline data for which some services are provided across a large age spectrum and in relation to some CSAM data when precise ages are hard to determine from images and videos.
Youth-produced images
Images that appear to be produced by children and/or youth due to the perspective or framing of the image. The term does not consider the process by which or context in which the image was created (e.g., coercion by an adult or peer, created for a romantic partner or taken under duress).