December 10, 2025 / Reading Time: 3 minutes

Childlight endorses world’s first minimum age law for social media use

Childlight, the global child safety institute, has welcomed Wednesday’s introduction of the world’s first minimum age law for social media use as ground-breaking legislation takes effect in Australia.

The institute, hosted by the University of Edinburgh and University of New South Wales in Sydney, said the Australian Government’s age verification requirement is an unfortunate but necessary step to protect children from escalating levels of online sexual abuse and exploitation.

The move means social media platforms will have to take reasonable steps to prevent Australians under the age of 16 from creating or keeping an account. It comes after Childlight research indicated that over 300 million children experience online sexual abuse each year, with most of this occurring on social media platforms.

For over fifteen years, social media companies have promoted their sites to children and parents as fun, informative and, above all, safe. But Childlight says the evidence tells a different story.

Reports of online child sexual abuse have increased every year for two decades, reflecting both the scale of harm and the persistent failure of technology companies to implement basic safety measures.

Despite repeated warnings from governments, experts and survivor advocates, social media companies have consistently prioritised growth and engagement over child protection.

Essential safeguards – such as age assurance, proactive detection of grooming and child sexual abuse material, and meaningful privacy protections – have not been reliably implemented or are entirely absent. This has left children exposed to industrial-scale exploitation and placed an unacceptable burden on families, educators and frontline responders.

Childlight supports regulation that puts children’s rights and safety first. Age restrictions are not a silver bullet, nor are they a substitute for comprehensive regulation and proactive platform responsibility. But they are a necessary circuit-breaker in a sector where voluntary industry action has demonstrably failed.

We urge the government to accompany the age verification with robust investment in:

  • Strong regulatory frameworks that oblige platforms to detect, prevent and transparently report child sexual exploitation,
  • An enforceable “duty of care” for online services that prioritises the needs and rights of children,
  • Comprehensive health, welfare and safety support services for child victims and adult survivors of technology-facilitated child sexual abuse,
  • The targeted removal of child sexual abuse material of Australian children that continues to proliferate online,
  • Outreach and support services for children whose social connection or wellbeing may be affected by the ban, and
  • Ongoing evaluation to ensure the policy is effective and does not create unintended harms.

Children have a right to participate in online life. For too long, that right has been compromised by social media companies unwilling to meet their basic child protection obligations. Wednesday’s measure is a necessary step toward resetting that balance.

Childlight CEO Paul Stanfield said: “Children are growing up in an online world that has become increasingly hostile, and the level of sexual abuse leaves governments with little choice but to act. I support this move in principle as a step to upholding children’s rights to safe digital environments, and keeping it under review to ensure it truly protects children.”

Professor Debi Fry, Childlight global director of data and Professor of International Child Protection Research at the University of Edinburgh, said: “This is an important preventative, public-health intervention when exposure to unsafe digital environments has been linked to significant lifelong impacts in terms of physical and mental health. It is a necessary measure to ensure children’s rights and wellbeing are prioritised over commercial incentives while more comprehensive, systemic protections are put in place.”

“Australia’s move recognises that voluntary industry approaches have consistently fallen short of what the empirical evidence demands. It represents a precautionary step while the structural conditions enabling harm are addressed.”

For media enquiries, contact childlight.comms@ed.ac.uk

Notes to editors


Age-restricted social media platforms will be required to take reasonable steps to prevent Australians under 16 years old from having accounts on their platforms.

A court can order civil penalties for platforms that fail to take reasonable steps to prevent underage users from having accounts on their platforms. These include court-imposed fines currently totalling up to A$49.5 million.


If you have been affected by exploitation or abuse and need support, please visit