Cathcart says WhatsApp would not comply with any efforts to undermine the company’s encryption. “We’ve recently been blocked in Iran,” he says. “We’ve never seen a liberal democracy do that, and I hope it doesn’t come to that. But the reality is, our users all around the world want security.”
The bill does not explicitly call for the weakening of encryption, but Cathcart and others who oppose it say it creates legal gray areas and could be used to undermine privacy down the line.
“It is a first step,” says Jan Jonsson, CEO of Swedish VPN company Mullvad, which counts the UK as one of its biggest markets. “And I think the general idea is to go after encryption in the long run.”
“Nobody’s defending CSAM,” says Barbora Bukovská, senior director for law and policy at Article 19, a digital rights group. “But the bill has the chance to violate privacy and legislate wild surveillance of private communication. How can that be conducive to democracy?”
The UK Home Office, the government department overseeing the bill's development, did not provide an on-the-record response to a request for comment.
Children’s charities in the UK say that it’s disingenuous to portray the debate around the bill’s CSAM provisions as a black-and-white choice between privacy and safety. The technical challenges posed by the bill are not insurmountable, they say, and forcing the world’s biggest tech companies to invest in solutions makes it more likely the problems will be solved.
“Experts have demonstrated that it’s possible to tackle child abuse material and grooming in end-to-end encrypted environments,” says Richard Collard, associate head of child safety online policy at the British children’s charity NSPCC, pointing to a July paper published by two senior technical directors at GCHQ, the UK’s cyber intelligence agency, as an example.
Companies have started selling off-the-shelf products that claim the same. In February, London-based SafeToNet launched SafeToWatch, a product it says can identify and block child abuse material before it is ever uploaded to messengers like WhatsApp. “It sits at device level, so it’s not affected by encryption,” says the company’s chief operating officer, Tom Farrell, who compares it to the autofocus feature in a phone camera. “Autofocus doesn’t allow you to take your image until it’s in focus. This wouldn’t allow you to take it before it proved that it was safe.”
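The mechanism Farrell describes is often called client-side scanning: content is checked on the user's device before it ever reaches the messenger's end-to-end encryption layer, so the encryption itself is untouched. The sketch below illustrates that general idea only; it is not SafeToNet's or WhatsApp's actual code, and every function name in it is hypothetical.

```python
# Minimal sketch of device-level ("client-side") screening, assuming a
# hypothetical on-device classifier. Not SafeToNet's or WhatsApp's real API.
# The point: the check happens before the encrypted send, so the messenger's
# end-to-end encryption is never weakened or bypassed.

from dataclasses import dataclass


@dataclass
class ScanResult:
    is_safe: bool
    reason: str = ""


def on_device_scan(image_bytes: bytes) -> ScanResult:
    """Stand-in for a local classifier like the one SafeToWatch describes.

    A real product would run a trained model on the device; here empty input
    is simply treated as unsafe so the example remains runnable.
    """
    if not image_bytes:
        return ScanResult(is_safe=False, reason="no image data")
    return ScanResult(is_safe=True)


def send_image(image_bytes: bytes, encrypt_and_send) -> bool:
    """Hand the image to the encrypted messaging layer only if it passes."""
    result = on_device_scan(image_bytes)
    if not result.is_safe:
        # Analogous to autofocus refusing to fire: the upload never happens.
        print(f"Blocked before upload: {result.reason}")
        return False
    encrypt_and_send(image_bytes)
    return True


if __name__ == "__main__":
    # Placeholder for the messenger's own end-to-end encrypted send path.
    send_image(b"\x89PNG...", encrypt_and_send=lambda data: print("sent", len(data), "bytes"))
```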