
Reddit hit with £14m fine for failing to protect children’s data

The UK Information Commissioner’s Office fined Reddit over £14 million after finding that the platform lacked robust age checks, exposing children to inappropriate content and unlawful data processing.

The UK’s Information Commissioner’s Office (ICO) has hit Reddit with a hefty penalty after finding the platform did not do enough to safeguard the personal information of young users.

The regulator concluded that Reddit’s age checks were weak, leaving children — including those under 13 — exposed to privacy harms and inappropriate content. The fine topped £14 million and came with an order to tighten age-verification procedures.

What the ICO found

Investigators discovered that Reddit relied largely on self-declared ages, a method the ICO judged easily bypassed and unsuitable where children are likely to be present. Because the platform couldn’t demonstrate a reliable way to confirm users’ ages, the regulator said there was no lawful basis to process the data of children under 13.

The ICO also found Reddit did not take sufficient steps to prevent underage users from encountering material that could harm them.

Why this matters for business and ESG

This ruling reaches beyond one company. For boards and compliance teams, it’s a reminder that data governance and user safety are now core elements of corporate responsibility. Poor age assurance carries direct financial risk — fines, remediation costs — and indirect damage: loss of user trust and reputational fallout. Companies that treat age verification as an afterthought may find themselves on the wrong side of both regulators and public opinion.

Practical steps platforms should take

Effective age assurance is layered, not single-solution. Practical measures include:
– Stronger age-gating and conservative defaults when a user’s age is uncertain.
– Parental consent workflows or optional parental verification for younger accounts.
– Privacy-by-default settings for suspected minors, reducing data collection and limiting profiling.
– Technical options such as credential checks, document verification where lawful, and behaviour-based risk scoring.
– Design choices that avoid promoting content known to be harmful to children and that limit algorithmic amplification of inappropriate material.
– Regular auditing, monitoring and transparent documentation of tests, risk assessments and remediation.

Where verification is impractical, platforms should err on the side of restricting access and data processing. Treat age assurance as an ongoing programme: pilot controls, evaluate results, and iterate.
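For engineering teams, the layered, default-restrictive approach described above can be sketched as a simple decision function. This is an illustrative assumption, not a scheme mandated by the ICO: the signal names (`declared_age`, `verified`, `risk_score`), the thresholds, and the access tiers are all hypothetical, chosen only to show how self-declared age, stronger verification, and behaviour-based risk scoring might combine, with restriction as the fallback when age is uncertain.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Access(Enum):
    FULL = "full"              # verified adult, low risk signals
    RESTRICTED = "restricted"  # privacy-by-default: limited data collection, no profiling
    BLOCKED = "blocked"        # self-declared under-13: no lawful basis to process

@dataclass
class AgeSignals:
    declared_age: Optional[int]  # self-declared; easily bypassed on its own
    verified: bool               # e.g. credential or document check passed (where lawful)
    risk_score: float            # hypothetical behaviour-based minor-likelihood score, 0..1

def decide_access(s: AgeSignals, minor_risk_threshold: float = 0.5) -> Access:
    """Layered decision: grant full access only when multiple signals agree;
    otherwise default conservatively to the restricted tier."""
    if s.declared_age is not None and s.declared_age < 13:
        return Access.BLOCKED
    if s.verified and s.declared_age is not None and s.declared_age >= 18 \
            and s.risk_score < minor_risk_threshold:
        return Access.FULL
    # Unverified age, missing declaration, or elevated risk: restrict by default
    return Access.RESTRICTED
```

A self-declared 25-year-old with no further verification would land in the restricted tier here, which is the point: no single signal, least of all a self-declared age, is trusted on its own.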

Regulatory expectations and enforcement outlook

Regulators want more than policies on paper. The ICO expects demonstrable, auditable safeguards — evidence that checks are tested, updated and actually reduce risk. Enforcement will focus on repeated failures and on organisations that can’t show improvement. Firms that adopt continuous monitoring, third‑party validation or external audits are likely to reduce enforcement exposure.

Advice for parents and guardians

Parents can play a big role in reducing risk. Check privacy settings and age gates on apps your children use, turn off location sharing, and make profiles private where possible. Use parental controls and third‑party tools if available. Have open conversations about what’s safe to share online, set clear rules for account creation and screen time, and encourage children to report anything unsettling. If a platform appears to handle a child’s data poorly, report it to the service and, if needed, to the ICO.

What this means for the sector

The ICO’s decision sends a clear message: superficial age checks won’t pass regulatory muster. Companies operating community forums, social networks and other interactive services should embed child protection into product design and data practices. For those weighing ESG priorities, building verifiable, auditable age-assurance systems isn’t just compliance — it’s essential to maintaining trust and a licence to operate in digital markets.

Next steps

Operators should begin with a documented baseline assessment, adopt multi-layered controls, run pilot evaluations and publish results where appropriate. Regulators will continue to watch whether new controls work in practice; the ICO has signalled it will monitor day-to-day operations to ensure underage access and exposure to harmful content are actually reduced.

The Reddit ruling is a wake-up call. Protecting minors online demands sustained effort, clear evidence of effectiveness and a willingness to redesign services with children’s safety front of mind.

