Draft doc struggles to describe how the law's theoretically encryption-busting powers might be used
The UK government has set out how it plans to use the new law it created to police online platforms and social media - with one telling exception.
The Draft Statement of Strategic Priorities for online safety emphasizes platform providers preventing online harms in the first place, and collaborating with regulator Ofcom on how the new law - the Online Safety Act - will be implemented. But it offers little detail about how the government will use the legislation's more controversial provisions.
The statement spells out which activities it expects platform providers to tackle, requiring them "to take proactive steps to reduce the risks their services are used to carry out the most harmful illegal activity."
That list includes:

- terrorism
- child sexual abuse and exploitation
- illegal suicide and self-harm content
- illegal activity that disproportionately affects women and girls
- illegal disinformation
- hate that incites violence towards specific individuals or groups
- UK-linked content designed to encourage or facilitate organized immigration crime by criminal groups
- illegal sales of weapons and drugs
- illegal foreign interference, such as state-sponsored disinformation
- fraud
- "other priority offences"
In a statement, technology secretary Peter Kyle said: "Keeping children safe online is a priority for this government. While the Online Safety Act sets the foundation of creating better experiences online, we must keep pace with technology as it evolves to create a safer internet, especially for children."
The draft statement says the government aims to prevent harm from occurring in the first place, wherever possible. "While this is clearly a material challenge, Ofcom has significant powers at its disposal - including information gathering, audit, enforcement and penalty powers - to ensure providers comply with their statutory duties to protect users online," it says.
"The government wants to see a culture of candour created through Ofcom's transparency reporting regime, where the regulator and platforms work together to expose practices that create the greatest risks to users and address the systemic issues they uncover," it says.
The draft document also talks about increasing transparency and accountability of online platforms and "supporting continued innovation in safety technologies".
But it falls silent on the most controversial aspect of the Act, Section 122, which says platform providers should use "accredited technology" to access online content when required to by law enforcement or regulation.
Targeting terrorism and child sexual exploitation and abuse content, the section gives Ofcom the power to "give a notice relating to a regulated user-to-user service or a regulated search service to the provider of the service," requiring it to remove such content and/or prevent users from seeing it.
While the government has said it has no intention of weakening the encryption technology platforms use, concerns remain about the authorities' ability to access private communications.
A critic of the Act, Signal president Meredith Whittaker, said her stance was unchanged after the legislation was given royal assent.
"Signal will never undermine our privacy promises and the encryption they rely on," said Whittaker. "Our position remains firm: we will continue to do whatever we can to ensure people in the UK can use Signal. But if the choice came down to being forced to build a backdoor, or leaving, we'd leave."
The Register has asked the Department for Science, Innovation and Technology for more details about how it plans to implement Section 122.
Along with the draft plans, the government has commissioned a research project to explore the impact of social media on young people's well-being and mental health. ®