Tech firms cite risk to end-to-end encryption as Online Safety Bill gets royal assent
The government’s controversial Online Safety Bill has become law amid continued concerns from tech companies that it could damage the privacy of encrypted communications.
The Online Safety Act, which aims to make the internet safer for children, received royal assent in Parliament on 26 October 2023.
The act places legal duties on technology companies to prevent and swiftly remove illegal content, such as terrorist material and revenge pornography.
It also requires technology companies to protect children from seeing legal but harmful material, including content promoting self-harm, bullying, pornography and eating disorders.
The communications regulator, Ofcom, will have new powers to fine technology companies that fail to comply with the act up to £18m or 10% of their turnover, whichever is greater, meaning the biggest tech companies could be fined billions.
The government has estimated that 100,000 online services will come under the Online Safety Act, with the most stringent obligations reserved for “Category 1” services that have the highest reach and pose the highest risk.
Technology secretary Michelle Donelan said the Online Safety Act would ensure online safety for decades to come. “The bill protects free speech, empowers adults and will ensure that platforms remove illegal content,” she said.
End-to-end encryption
But the Online Safety Act, which has taken four years to reach the statute books, continues to raise concerns for technology companies over provisions that could undermine encrypted communications.
Encrypted messaging and email services, including WhatsApp, Signal and Element, have threatened to pull out of the UK if Ofcom requires them to install “accredited technology” to monitor encrypted communications for illegal content.
Section 122 of the act gives Ofcom powers to require technology companies to install systems that, they argue, would undermine the security and privacy of encrypted services by scanning the content of every message and email to check whether they contain child sexual abuse material (CSAM).
‘Catastrophic impact’ on privacy
Matthew Hodgson, CEO of Element, a secure communications provider that supplies comms services to the Ministry of Defence, the US Navy, Ukraine and Nato, said its customers have been demanding guarantees that the company would not implement message scanning if required to do so under the Online Safety Act.
“Some of our larger customers are contractually requiring us to commit to not putting any scanning technology into our apps because it would undermine their privacy, and we are talking about big reputable technology companies here. We are also seeing international companies doubting whether they can trust us as a UK-based tech supplier anymore,” he said.
Speaking on BBC Radio 4, Hodgson said the intentions of the bill were clearly good, and that social media companies such as Instagram and Pinterest should be filtering posts for child abuse material.
However, giving Ofcom the power to require blanket surveillance in private messaging apps would “catastrophically reduce safety and privacy for everyone”, he said.
Hodgson said enforcement of Section 122 of the Online Safety Act against technology companies would introduce new vulnerabilities and weaknesses into encrypted communications systems that could be exploited by attackers.
“It is like asking every restaurant owner in the country to bug their restaurant tables – in case criminals eat at the restaurants – and then holding the restaurant owners responsible and liable for monitoring those bugs,” he said.
The CEO of encrypted email service Proton, Andy Yen, said that without safeguards to protect end-to-end encryption, the Online Safety Act poses a real threat to privacy.
“The bill gives the government the power to access, collect and read anyone’s private conversations any time they want. No one would tolerate this in the physical world, so why do we in the digital world?” he said.
Writing in a blog post published today (27 October 2023), Yen said that while he was fairly confident Ofcom would not use its powers to require Proton to monitor the contents of its customers’ emails, he was concerned that the act had been passed with a clause giving the British government powers to access, collect and read anyone’s private communications.
“The Online Safety Act empowers Ofcom to order encrypted services to use ‘accredited technology’ to search for and take down illegal content. Unfortunately, no such technology currently exists that also protects people’s privacy through encryption. Companies would therefore have to break their own encryption, destroying the security of their own services,” he wrote.
“The criminals would seek out alternative methods to share illegal materials, while the vast majority of law-abiding citizens would suffer the consequences of an internet without privacy and personal data vulnerable to hackers,” he added.
Meredith Whittaker, president of encrypted messaging service Signal, reposted the organisation’s position on X, formerly known as Twitter, that it would withdraw from the UK if it were forced to compromise its encryption.
“Signal will never undermine our privacy promises and the encryption they rely on. Our position remains firm: we will continue to do whatever we can to ensure people in the UK can use Signal. But if the choice came down to being forced to build a backdoor, or leaving, we’d leave,” she wrote.
Zero-tolerance approach
The Online Safety Act takes what the government describes as a “zero-tolerance approach” to protecting children.
It includes measures requiring tech companies to introduce age-checking on platforms where content harmful to children is published, and to publish risk assessments of the dangers their sites pose to children.
Tech companies will also be required to give children and parents clear ways to report problems, and to offer users options to filter out content they do not want to see.
Ofcom plans phased introduction
The communications regulator plans to introduce the legislation in phases, beginning with a consultation process on tackling illegal content from 9 November 2023.
Phase two will address child safety, pornography, and the protection of women and girls, with Ofcom due to publish draft guidance on age verification in December 2023. Draft guidelines on protecting children will follow in spring 2024, with draft guidelines on protecting women and girls following in spring 2025.
Phase three will deal with categorised online services that will be required to meet additional requirements, including producing transparency reports, providing tools for users to control the content they see, and preventing fraudulent advertising. Ofcom aims to produce draft guidance in early 2024.
Ofcom’s chief executive, Melanie Dawes, said the regulator would not act as a censor, but would tackle the root causes of online harm. “We will set new standards online, making sure sites and apps are safer by design,” she added.
Advice to tech firms
Lawyer Hayley Brady, partner at UK law firm Herbert Smith Freehills, said technology companies should engage with Ofcom to shape the codes of practice and guidance.
“Companies will have the choice to follow Ofcom’s Codes of Practice or decide upon their own ways of dealing with content. Unless a company has rigorous controls in place, the safe option will be to adhere to Ofcom’s advice,” she said.
Ria Moody, managing associate at law firm Linklaters, said the Online Safety Act tackles the same underlying issues as the European Union’s Digital Services Act (DSA), but in a very different way.
“Many online services are now thinking about how to adapt their DSA compliance processes to meet the requirements of the OSA,” she said.
John Brunning, a partner at law firm Fieldfisher, said the broad scope of the act meant many more services would be caught by its provisions than people expected.
“Expect plenty of questions when it comes to trying to implement solutions in practice,” he said.
These include how likely a service is to be accessed by children, whether companies will need to start geo-blocking to prevent people accessing sites that are not targeted at the UK, and where technology companies should draw the line on harmful content.
Franke Everitt, director at Fieldfisher, said online platforms and services would not need to take steps to comply immediately. “This is just the beginning of a long process. Government and regulators will need to fill in the detail of what is just a roughly sketched outline of legislation,” she said.