The UK’s controversial Online Safety Bill has passed through Parliament and will soon become law. The wide-ranging legislation, which is likely to affect every internet user in the UK and any service they access, has taken years to get to this point, but its potential impacts are still unclear and some of the new regulations are technologically impossible to comply with.
A key sticking point is what the legislation means for end-to-end encryption, a security technique used by services like WhatsApp that mathematically guarantees that no one, not even the service provider, can read messages sent between two users. The new law gives regulator Ofcom the power to intercept and check this encrypted data for illegal or harmful content.
Using this power would require service providers to create a backdoor in their software, allowing Ofcom to bypass the mathematically secure encryption. But this same backdoor could be abused by hackers, and anyone with the technical ability could create their own encryption software with no backdoor.
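The core property at stake can be shown with a toy sketch: if only the two endpoints hold the key, the service relaying the message sees nothing readable. The cipher below is a deliberately simplified, hypothetical illustration (a SHA-256-derived XOR keystream), not a real protocol like the ones WhatsApp or Signal use.

```python
import hashlib

def keystream(shared_key: bytes, length: int) -> bytes:
    """Derive a toy keystream from the shared key.
    Illustrative only -- real messengers use vetted protocols."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(shared_key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(shared_key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(shared_key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

# Alice and Bob hold the key; the server relaying the message does not.
key = b"secret shared only by the two endpoints"
ciphertext = encrypt(key, b"meet at noon")

# The relay sees only ciphertext; only a key-holder can recover the text.
assert decrypt(key, ciphertext) == b"meet at noon"
```

A "backdoor" in this picture means giving a third party its own way to recover the plaintext, and any extra recovery path the regulator can use is one an attacker can also hunt for.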
“The request to locate a backdoor through encrypted messages causes a constant security headache and this is likely to push users, including criminals, to other more underground messaging platforms,” says Jake Moore at cybersecurity firm ESET.
An alternative approach is to install software on every device that would allow Ofcom to inspect messages before they are encrypted or after they are decrypted. This wouldn't be simple to implement, nor is it popular with privacy advocates such as Jessica Ní Mhainín at campaign group Index on Censorship.
“The Home Office’s long-standing war on encryption is misguided and opens us up to new threats,” she says. “Encryption keeps our private messages safe. It enables public watchdogs – including journalists, human rights defenders, whistle-blowers, academics and others – to communicate securely with their sources.”
The difficulty of creating such a tool to scan content while also maintaining privacy was made clear when Apple launched a tool to scour images on users’ phones for evidence of child sexual abuse. The company claimed it would generate false positives in less than 1 in every trillion uses. But independent researchers were able to create benign images that triggered the tool almost immediately, prompting criticism that led Apple to quietly shelve the project.
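Why false positives are hard to eliminate comes down to how such scanners work: they reduce an image to a compact fingerprint, and many different images can share one fingerprint. The toy "average hash" below is a hypothetical stand-in for far more sophisticated systems like Apple's NeuralHash, but it has the same many-to-one property.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean. Illustrative only -- real scanners
    are more complex, but still map many images to each fingerprint."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two clearly different 2x2 "images" (flattened pixel intensities)...
image_a = [10, 10, 200, 200]
image_b = [90, 90, 110, 110]

# ...that the hash cannot tell apart: a built-in false positive.
print(average_hash(image_a) == average_hash(image_b))  # True
```

Researchers attacking Apple's system did essentially this in reverse: they engineered benign images whose fingerprints matched flagged ones.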
The UK government, even as it was pushing the Online Safety Bill through Parliament, conceded that there is no technical way to do what it demands, and that those parts of the law won't be enforced until such tools have been developed. In the meantime, it continues to pressure big technology platforms that don't yet encrypt all user data but are planning to do so, such as Facebook.
Beyond encryption, the bill also brings in mandatory age checks on pornography websites and requires that websites have policies in place to protect people from "harmful" or illegal content. Exactly what counts as illegal, and which websites will fall within the scope of the bill, remain unclear, however.
For instance, Michelle Donelan, then secretary of state for digital, culture, media and sport, said in January that “posting videos of people crossing the [English] channel which show that activity in a positive light” could be seen as aiding and abetting illegal immigration, and therefore be an offence under the bill.
When asked how it will enforce the bill, Ofcom pointed to a website last updated in June that promised that guidelines would be published following three public consultations.
Neil Brown at law firm decoded.legal says Ofcom still has a “huge amount of work” to do. The new law could plausibly affect any company that allows comments on its website, publishes user-generated content, transmits encrypted data or hosts anything that the government deems may be harmful to children, says Brown.
“What I’m fearful of is that there are going to be an awful lot of people, small organisations – not these big tech giants – who are going to face pretty chunky legal bills trying to work out if they are in scope and, if so, what they need to do,” he says.