New laws proposed to tackle social media companies over the streaming of child abuse, extremism, terrorist attacks and cyberbullying have been welcomed by senior police and children’s charities.
Launched on Monday, the Online Harms white paper outlines what the government says are tough new laws for internet companies and the ability to enforce them.
The white paper, which was first revealed in the Guardian last week, will legislate for a new statutory duty of care by social media firms and the appointment of an independent regulator, which is likely to be funded through a levy on the companies.
The “harms” that companies could be penalised for include failing to take down child abuse material, terrorist content and revenge pornography, as well as behaviours such as cyberbullying, spreading disinformation and encouraging self-harm. Senior social media executives could be held personally liable for failure to remove such content from their platforms.
Reports of child abuse online have risen sharply over the last 15 years, from 110,000 globally in 2004 to 18.4m last year.
Rob Jones, director of the National Crime Agency, said: “Industry does some great work but it has lots more to do and the technology already exists to design-out a lot of preventable offending. Industry must block abuse images upon detection and prevent online grooming; it must work with us to stop live-streaming of child abuse; it must be more open and share best practice. And abuse sites must no longer be supported by advertising.”
Javed Khan, the chief executive of Barnardo’s, said two-thirds of vulnerable children supported through the charity’s child exploitation services were groomed online before meeting their abuser in person.
“Children in the UK are facing growing risks online – from cyber-bullying to sexual grooming to gaming addiction,” he said.
“Barnardo’s has long called for new laws to protect children online, just as we do offline, so they can learn, play and communicate safely. The government’s announcement is a very important step in the right direction.”
In a joint foreword to the white paper, the home secretary, Sajid Javid, and the secretary of state for digital, culture, media and sport, Jeremy Wright, said it was time to move beyond self-regulation.
“Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough. Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action,” said Wright.
The death of 14-year-old Molly Russell in 2017 has had a strong influence on the white paper. Her father launched a passionate campaign earlier this year to highlight how widely self-harm and suicide content was promoted on Instagram, material he believed contributed to her taking her own life.
The Christchurch shootings in March also added pressure on politicians to act. The attacker used Facebook Live to stream the killings in progress, with thousands watching the attack as it occurred and millions more seeing the video as it was uploaded across the internet over the following day.
The white paper comes after the Australian government passed tough new legislation this month to tackle the streaming of violent images on social media.
The new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, including social media platforms, file hosting sites, public discussion forums, messaging services and search engines.
- In the UK, Samaritans can be contacted on 116 123 or [email protected] In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org