Children must show ID to use social media under new rules to protect kids from harmful content

8 May 2024, 00:47

Ofcom has published its draft children's safety codes of practice. Picture: Alamy

By Jenny Medlicott

Children could have to show ID to use social media platforms under new plans to stop kids from accessing harmful content.

Social media platforms will be required to introduce robust age-checking measures, including the use of ID such as passports, to protect children from harmful content, under draft codes of practice published by the regulator Ofcom.

Platforms such as Facebook and Instagram, which require users to be at least 13 to join, must enforce those age limits with new, more rigorous checks.

Ofcom published its draft children's safety codes of practice on Wednesday, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.

The latest codes include more than 40 practical measures which Ofcom says will demand a step-change from tech firms by compelling safer design and operating practices from the biggest sites.

The new rules are scheduled to come into force early next year.

Platforms will also be told to “tame aggressive algorithms” and prevent children from accessing pornography and sensitive content that promotes suicide, self-harm and eating disorders.

If social media firms do not uphold the new rules, they face fines of up to £18 million or 10 per cent of global revenue, whichever is greater, and also risk the watchdog blocking their services and bringing criminal proceedings against senior managers.

Social media platforms will be required to enforce age checks on their sites. Picture: Alamy

Dame Melanie Dawes, Ofcom’s chief executive, said: “In line with new online safety laws, our proposed codes firmly place the responsibility for keeping children safer on tech firms. They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.

“Our measures, which go way beyond current industry standards, will deliver a step-change in online safety for children in the UK. Once they are in force we won’t hesitate to use our full range of enforcement powers to hold platforms accountable. That’s a promise we make to children and parents today.”

Under the proposals, platforms which can be accessed by children and have a higher risk of harmful content appearing must configure their algorithms to filter out the most harmful content from children's feeds, and reduce the visibility and prominence of other lower-risk, but still potentially harmful, material.

A 2022 report from the Center for Countering Digital Hate showed that the TikTok algorithm, which learns from users’ behaviour, sent teenagers suicide, self-harm and eating disorder content within minutes of joining.

The platform’s success has prompted other social media sites to adopt similar algorithmic feeds.

Ofcom has found that a third of children aged eight to 17 with a social media profile have an adult user age, having signed up with a false date of birth.

The father of Molly Russell welcomed the proposal but said more still needed to be done. Picture: PA/Handout

Sir Peter Wanless, chief executive of the children's charity the NSPCC, said the draft code was a "welcome step in the right direction" towards protecting children online.

"Building on the ambition in the Online Safety Act, the draft codes set appropriate, high standards and make it clear that all tech companies will have work to do to meet Ofcom's expectations for keeping children safe," he said.

"Tech companies will be legally required to make sure their platforms are fundamentally safe by design for children when the final code comes into effect, and we urge them to get ahead of the curve now and take immediate action to prevent inappropriate and harmful content from being shared with children and young people.

"Importantly, this draft code shows that both the Online Safety Act and effective regulation have pivotal roles to play in ensuring children can access and explore the online world safely.

"We look forward to engaging with Ofcom's consultation and will share our safeguarding and child safety expertise to ensure that the voices and experiences of children and young people are central to decision-making and the final version of the code."

Meanwhile, child online safety campaigner Ian Russell, the father of 14-year-old Molly Russell who took her own life in November 2017 after viewing harmful material on social media, said more still needed to be done.

Mr Russell said: "Ofcom's task was to seize the moment and propose bold and decisive measures that can protect children from widespread but inherently preventable harm.

"The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly's life.

"It's over six years since Molly's death, but the reality is that very little has yet changed. In some respects, the risks for teens have actually got worse.

"That's why it's hugely important that the next Prime Minister commits to finish the job to and strengthen the Online Safety Act to give children and families the protection they deserve."