Plenty of other ideas have also been tacked onto the bill. The current text includes age checks for porn sites and measures against scam ads and nonconsensual sharing of nude images.
As the bill nears passage into law, the most contentious, and in the short term most consequential, dispute over its content is not about which content should be illegal online, but about the privacy implications of the government's proposals. The current draft says that platforms such as messaging apps will need to use "accredited technology" to scan messages for child sexual abuse material (CSAM). That, tech companies and cybersecurity experts say, is a de facto ban on full end-to-end encryption of messages. Under end-to-end encryption, only the sender and recipient of a message can read its contents.
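That guarantee is cryptographic, not a matter of policy. A minimal sketch in Python using the PyNaCl library shows the shape of it (the names and message here are illustrative; real messaging apps layer protocols such as Signal's double ratchet on top of primitives like these):

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only: real apps add key verification, forward secrecy, etc.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# Only Bob can decrypt, using his private key and Alice's public key.
# The server relaying the ciphertext holds no key that opens it.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

A relay server only ever handles the ciphertext, which is why a scanning mandate cannot be satisfied on the server side without breaking the scheme.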
The UK government says it’s up to tech companies to figure out a technical solution to that conflict. “They’re rather disingenuously saying, ‘We’re not going to touch end-to-end encryption, you don’t have to decrypt anything,’” says Alan Woodward, a visiting professor in cybersecurity at the University of Surrey. “The bottom line is, the rules of mathematics don’t allow you to do that. And they just basically come back and say, ‘Nerd harder.’”
One possible approach is client-side scanning, where a phone or other device would scan the content of a message before it’s encrypted and flag or block violating material. But security experts say that creates many new problems. “You just cannot do that and maintain privacy,” Woodward says. “The Online Safety Bill basically reintroduces mass surveillance and says, ‘We have to search every phone, every device, just in case we find one of these images.’”
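In outline, such a scanner sits between the user's content and the encryption step. A heavily simplified sketch, assuming a device-resident set of hashes of known abuse imagery (the hash set and function names are hypothetical, and real proposals use perceptual hashes such as PhotoDNA or Apple's NeuralHash rather than the exact-match SHA-256 shown here):

```python
# Simplified client-side scanning sketch. Hypothetical throughout:
# real systems use perceptual hashing to catch re-encoded or resized
# copies, which exact cryptographic hashes like SHA-256 cannot do.
import hashlib

# A vetted database of flagged hashes would be shipped to every device.
KNOWN_FLAGGED_HASHES: set[str] = {
    "0" * 64,  # placeholder entry, not a real hash
}

def passes_scan(attachment: bytes) -> bool:
    """Return True if the attachment matches no flagged hash."""
    return hashlib.sha256(attachment).hexdigest() not in KNOWN_FLAGGED_HASHES

def send_attachment(attachment: bytes) -> None:
    # The check runs on the device, before encryption, so every file is
    # inspected in plaintext; this is the mass-surveillance objection.
    if not passes_scan(attachment):
        raise PermissionError("attachment matched a flagged hash")
    # ...encrypt and transmit the attachment as usual...
```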
Apple had been working on a tool for scanning images on its iCloud storage service to identify CSAM, which it hoped could prevent the proliferation of images of abuse without threatening users’ privacy. But in December it shelved the project, and in a recent response to criticism from organizations that campaign against child abuse, Apple said that it didn’t want to risk opening up a backdoor for broader surveillance. The company’s argument, echoed by privacy campaigners and other tech companies, is that if there’s a way to scan users’ files for one purpose, it’ll end up being used for another—either by criminals or by intrusive governments. Meredith Whittaker, president of the secure messaging app Signal, called the decision a “death knell” for the idea that it’s possible to securely scan content on encrypted platforms.
Signal has vocally opposed the UK bill and said it may pull out of the country if it’s passed in its current form. Meta has said the same for WhatsApp. Smaller companies, like Element, which provides secure messaging to governments—including the UK government—and militaries, say they may also have to leave. Forcing companies to scan everything passing through a messaging app “would be a catastrophe, because it fundamentally undermines the privacy guarantees of an encrypted communication system,” says Matthew Hodgson, Element’s CEO.
A legal analysis of the bill commissioned by the free-expression organization Index on Censorship found that it would grant the British telecoms regulator, Ofcom, greater surveillance powers than the security services, with dangerously weak checks and balances on how they were used. Civil society organizations and online privacy advocates point out that these powers are being put in place by a government that has cracked down on the right to protest and given itself far-reaching powers to surveil internet users under its 2016 Investigatory Powers Act. In July, Apple protested against proposed changes to that law, which it says would have required tech companies to inform the UK government each time they patched security flaws in their products.