WhatsApp, Signal, and other messaging services have penned an open letter to the British government urging it to urgently rethink the Online Safety Bill (OSB), legislation that would allow regulators to require the platforms to monitor users in order to identify child abuse images.
Under the bill, the government could force chat services to apply content moderation measures such as client-side scanning that cannot be implemented without circumventing end-to-end encryption, which ensures that only the sender and the intended recipient can read or listen to what is sent.
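For readers unfamiliar with the objection, the sketch below is a purely illustrative outline, in Python, of why client-side scanning is said to circumvent end-to-end encryption: the scan inspects the plaintext on the sender's device before the message is encrypted for the recipient. The hash list, the SHA-256 stand-in for a real perceptual hash, and all function names are hypothetical, not any messaging service's actual code.

```python
# Purely illustrative sketch, not any messaging service's actual implementation.
# It shows where client-side scanning sits relative to end-to-end encryption:
# on the device, before encryption. The blocklist, the SHA-256 stand-in for a
# perceptual hash, and every name here are hypothetical.
import hashlib

HYPOTHETICAL_BLOCKLIST = {hashlib.sha256(b"known flagged image").hexdigest()}

def client_side_scan(plaintext: bytes) -> bool:
    """Runs on the sender's device, against the unencrypted content."""
    return hashlib.sha256(plaintext).hexdigest() in HYPOTHETICAL_BLOCKLIST

def send_message(plaintext: bytes, encrypt_for_recipient, transmit):
    # The scan happens before the message is sealed for the recipient, which is
    # why the letter's signatories argue that mandating it would undermine
    # end-to-end encryption even though the ciphertext itself is untouched.
    if client_side_scan(plaintext):
        pass  # a mandated system would flag or report the content at this point
    transmit(encrypt_for_recipient(plaintext))
```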
"Around the world, businesses, individuals and governments face persistent threats from online fraud, scams and data theft," reads the letter. "Malicious actors and hostile states routinely challenge the security of our critical infrastructure. End-to-end encryption is one of the strongest possible defenses against these threats, and as vital institutions become ever more dependent on internet technologies to conduct core operations, the stakes have never been higher.
"As currently drafted, the Bill could break end-to-end encryption, opening the door to routine, general and indiscriminate surveillance of personal messages of friends, family members, employees, executives, journalists, human rights activists and even politicians themselves, which would fundamentally undermine everyone's ability to communicate securely.
"The Bill provides no explicit protection for encryption, and if implemented as written, could empower OFCOM to try to force the proactive scanning of private messages on end-to-end encrypted communication services - nullifying the purpose of end-to-end encryption as a result and compromising the privacy of all users.
"In short, the Bill poses an unprecedented threat to the privacy, safety and security of every U.K. citizen and the people with whom they communicate around the world, while emboldening hostile governments who may seek to draft copy-cat laws."
The open letter is signed by Element chief executive Matthew Hodgson, Oxen Privacy Tech Foundation and Session director Alex Linton, Signal president Meredith Whittaker, Threema chief executive Martin Blatter, Viber chief executive Ofir Eyal, head of WhatsApp Will Cathcart, and Wire chief technical officer Alan Duric.
Last year, Apple abandoned similar controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos. Apple planned to report iCloud accounts with known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.
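As a rough illustration of the hash-matching approach Apple described, the sketch below uses a plain hash-set lookup and a hypothetical reporting threshold. It is not Apple's system: the actual design relied on a perceptual hash (NeuralHash) and private set intersection before any human review, neither of which is reproduced here.

```python
# Simplified illustration of hash-list matching, not Apple's actual system.
# The empty hash set and the threshold are placeholders; Apple's design used
# NeuralHash with private set intersection rather than plain SHA-256 lookups.
import hashlib

PROVIDED_HASH_DATABASE: set[str] = set()  # would be supplied, e.g. by NCMEC
REVIEW_THRESHOLD = 30                     # hypothetical flagging threshold

def count_matches(images: list[bytes]) -> int:
    """Counts library images whose hashes appear in the provided database."""
    return sum(
        1 for image in images
        if hashlib.sha256(image).hexdigest() in PROVIDED_HASH_DATABASE
    )

def should_flag_account(images: list[bytes]) -> bool:
    # Only accounts exceeding the threshold would be flagged for review.
    return count_matches(images) >= REVIEW_THRESHOLD
```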
The plans were criticized by a wide range of individuals and organizations, and Apple ultimately dropped the proposal. "Children can be protected without companies combing through personal data," said Apple at the time. "We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all."
Under the U.K. bill, if a messaging service refused to apply the content moderation policies, it could face fines of up to 10 percent of its annual global turnover. WhatsApp, Signal, and Proton have already stated that they would halt their encrypted services in the U.K. and pull out of the market if the bill required them to scan user content.
The U.K. government's Online Safety Bill is expected to return to parliament this summer.