Most adults back a children's charity's call for a regulator that would force social media firms to protect children from dangerous content
More than nine out of 10 Scottish adults (91%) back regulation of social networks to make tech firms legally responsible for protecting children, a new NSPCC survey has revealed.
More than six out of 10 adults in Scotland (63%) do not think social networks protect children from sexual grooming, and the same proportion do not think networks protect them from inappropriate content such as self-harm, violence or suicide.
Across Britain, nine out of 10 parents surveyed supported regulation of social media to keep children safer.
The figures emerged as the children’s charity released a detailed proposal setting out how a robust independent regulator should enforce a legal duty of care to children on social networks.
The NSPCC’s Taming The Wild West Web vision, drawn up with the assistance of international law firm Herbert Smith Freehills, proposes the introduction of a social media regulator to force social networks to protect children on their platforms.
The regulator would:
- have legal powers to investigate tech firms and demand information about their child safety measures;
- require social networks to meet a set of minimum child safeguarding standards (making their platforms safe by design) and to proactively tackle online harms including grooming;
- deploy tough sanctions against firms that fail to protect their young users – including steep fines of up to €20m, bans for boardroom directors, shaming tactics and a new criminal offence for platforms that grossly breach their duty of care (akin to corporate negligence and corporate manslaughter).
A huge majority of adults in the NSPCC’s survey also backed a call for social networks to be legally required to make children’s accounts safe, including the highest privacy settings by default, friend suggestions turned off, accounts not publicly searchable, and geolocation turned off.
Ruth Moss from Edinburgh, whose daughter Sophie took her own life at the age of 13 after looking at self-harm and suicide content on social media, is backing the NSPCC’s campaign for statutory regulation.
Ruth said: “Sophie’s death devastated me. No mother, or family, should have to go through that. It was so unnecessary; she had so much to live for. She was only 13.
“I found out that she had been looking at completely inappropriate things online. Some of the images were so graphic that even as an adult, I was shocked. She was also communicating with people in their 30s and pretending to be older than she was, under a made-up persona. Whilst the internet was heavily controlled at home and at school, Sophie had free Wi-Fi when she was out, making it very hard to police her internet use 24 hours a day.
"Social networks should have a duty of care to protect children and vulnerable people from damaging material and self-regulation is clearly not working. The protection of our children is too important to leave to the goodwill of large, profit-orientated organisations. Statutory regulation is needed and as a matter of urgency."
Peter Wanless, NSPCC chief executive, said: “The support for statutory regulation of social networks is now overwhelming.
“It is clear that society will no longer tolerate a free-for-all under which tech firms allow children to operate in a precarious online world with a myriad of preventable risks.
“Social media bosses should be made to take responsibility for the essential protection of children on their platforms and face tough consequences if they don’t. Over a decade of self-regulation has failed, and enough is enough.
“The Government’s Online Harms White Paper must impose a legal duty of care on social networks. Our proposal to tame the Wild West Web would make the UK a world leader in protecting children online. We urge the Government to be bold and introduce these measures without delay.”
Under the NSPCC plans, the regulator would have legal powers to demand that platforms disclose information, so it could better understand the extent of the risk of harm and abuse, and to investigate potential breaches.
Tech firms would have a duty to risk-assess their platforms and to promptly notify the regulator if children had come to harm or been put at risk on their sites.
Breaches of the duty of care would result in enforcement notices and requirements to publish information about the breach on the platform concerned. In the case of gross breaches, tech firms would be charged with a criminal offence and the directors overseeing the duty of care could face disqualification.