Democracies need to accept the mainstream adoption of end-to-end encryption in popular messaging and communications applications like WhatsApp, Signal, iMessage and, in the future according to Meta, Facebook Messenger.
End-to-end encryption, where only the sender and recipient can see the contents of a message, has been around for a long time for people who wish to keep their communications private. It gained particular traction after Edward Snowden’s revelations of mass government surveillance in 2013. Its value has since been consolidated globally as a necessary technical response to the perceived surveillance excesses of technology companies and governments alike, and it helps protect people’s safety and their right to express themselves freely around the world.
But some liberal democratic governments are pushing back against its adoption by mainstream instant-messaging platforms, going so far as to launch an emotive advertising campaign against it recently in the UK.
Broadly, the Australian government argues that law enforcement and intelligence agencies are finding it harder to obtain usable content directly from the platforms because the platforms cannot decrypt and see the encrypted communications that transit their systems.
While Australia and the other Five Eyes countries, along with India and Japan, have stated their support for strong encryption, in the same joint statement they also call for platforms to work with governments on technically feasible solutions to enable law enforcement (with appropriate legal authority) to have access to content in a readable and usable format. These two things are not mutually exclusive, except when end-to-end encryption is being used. End-to-end encryption is specifically designed to prevent third parties from viewing the content of a communication, thus providing privacy to users.
There are other encryption architectures available for messaging systems, such as client-to-server encryption, but that model is weak in terms of privacy because a third party with access to the server can read all communications. This is the architecture used in services such as WeChat, and previously in Zoom.
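The difference between the two architectures comes down to who holds the decryption key. A toy sketch (using XOR in place of real authenticated encryption; keys, messages and names are invented for illustration, and this is not usable cryptography):

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # Toy one-time-pad XOR, standing in for real authenticated encryption.
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"

# End-to-end: sender and recipient share a key the server never sees.
e2e_key = secrets.token_bytes(len(message))
ciphertext = xor(message, e2e_key)           # all the server ever relays
assert xor(ciphertext, e2e_key) == message   # only the recipient can decrypt
# Without e2e_key, the server sees only unreadable bytes.

# Client-to-server: the *server* holds the key, so it can decrypt in transit.
server_key = secrets.token_bytes(len(message))
to_server = xor(message, server_key)
plaintext_on_server = xor(to_server, server_key)  # server reads the plaintext
assert plaintext_on_server == message
```

The point of the sketch: in the client-to-server model, anyone with access to the server (the platform, or a government that compels it) can read `plaintext_on_server`; in the end-to-end model, the server holds only ciphertext.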
The seven countries have also argued that the move to implement end-to-end encryption would end the current practice, on platforms without it, of proactively scanning photos and videos for known child sexual abuse material and reporting it to authorities.
The problem is that, whether it’s liberal democratic governments monitoring communications for child sexual abuse material or authoritarian governments monitoring for what they deem politically sensitive content, both require decrypting communications (either on the device or on the server) to analyse them for ‘harmful’ or ‘illegal’ content, resulting in a third party (and unintended recipient) accessing the content.
Unfortunately, as we’ve seen with other restrictions on freedom of expression online by liberal democracies, authoritarian countries will use Western examples when enacting their own policies to police content online. Western countries will be hard pressed to criticise those that censor political content if they themselves don’t support an environment in which strong encryption is the norm.
Fundamentally, any access by law enforcement to communications is an intrusion of privacy. Any liberal democratic government with that power needs to have and maintain the trust of its citizens. Even if the current government is trusted, future governments may not be. And given that there are cases in Australia alone of authorities using capabilities, powers and data in unexpected ways, there is cause for concern.
Technology companies also have a role to play. They have responsibilities to protect vulnerable users of their platforms, prevent their platforms from being used to facilitate crimes and harmful activities, and help law enforcement (with lawful requests). Together with governments, they should seek solutions that do not demonise or affect the integrity of end-to-end encryption.
Law enforcement doesn’t necessarily need unencrypted content to prosecute a crime. Metadata, which provides information about the communication, can be used to generate leads. But access to content makes gathering enough evidence even to obtain a warrant significantly easier.
As Tom Uren writes in his recent ASPI report, The future of assistance to law enforcement in an end-to-end encrypted world, there are options available to protect children using social media from harm. Platforms can use metadata and behavioural analysis to sound the alert when, for example, an adult is contacting multiple children that they haven’t had prior contact with. They can educate and encourage children and parents to report suspicious content, which could provide unencrypted content for the platforms to analyse. Perhaps messages to and from children might not use end-to-end encryption by default.
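The metadata-and-behavioural-analysis idea can be sketched without any access to message content. A minimal illustration, assuming the platform knows account ages and contact history (all names, records and the threshold below are invented for illustration):

```python
from collections import defaultdict

# Hypothetical metadata a platform already holds: who messaged whom,
# which accounts belong to minors, and which pairs have a prior history.
is_minor = {"kid1": True, "kid2": True, "kid3": True, "adult1": False}
prior_contacts = {("adult1", "kid1")}  # pairs with an existing relationship
messages = [("adult1", "kid1"), ("adult1", "kid2"), ("adult1", "kid3")]

THRESHOLD = 2  # illustrative: flag adults newly contacting this many minors

def flag_suspicious(messages, is_minor, prior_contacts, threshold=THRESHOLD):
    # Count, per adult sender, the distinct minors they contact
    # without any prior relationship.
    new_minor_contacts = defaultdict(set)
    for sender, recipient in messages:
        if (not is_minor.get(sender, False)
                and is_minor.get(recipient, False)
                and (sender, recipient) not in prior_contacts):
            new_minor_contacts[sender].add(recipient)
    return [s for s, kids in new_minor_contacts.items() if len(kids) >= threshold]

print(flag_suspicious(messages, is_minor, prior_contacts))  # ['adult1']
```

Here `adult1` is flagged for two new contacts with minors (the prior contact with `kid1` is excluded), and nothing in the check required decrypting a single message. A real system would use far richer signals, but the principle is the same.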
Law enforcement agencies seeking access to evidence for a crime still have a wide array of tools in their belts. Police can access metadata from social media companies about users’ accounts or from service providers in Australia (such as ISPs) that are required to store certain metadata for two years. Australian law enforcement can take over social media accounts or apply for computer access warrants to obtain evidence from devices on which data is usually unencrypted. These investigative techniques are more challenging and can’t necessarily be done at scale.
End-to-end encryption, especially in combination with anonymising software and the growing complexity of modern internet networks, which includes a move towards a decentralised Web 3.0, does make the job of online policing more difficult.
We need to find a middle ground. There is a nuanced debate to be had about how users of online services can be protected both by law enforcement and by platforms, while at the same time governments ensure that the right to privacy, enforced with strong encryption, is not dismissed easily.