Every day we carry our lives on digital devices tucked in our pockets. But public trust in those devices has reached an all-time low thanks to scandals ranging from election interference by Russian hackers to the weaponisation of social media by governments and extremists. Last month, the Australian government proposed legislation that could make things worse.
Imagine if a law enforcement official could secretly force Apple to hack into your phone to access your encrypted data. Or compel Google to trick you into installing spyware on your phone by sending you a fake software update. Or require Facebook to covertly rewrite Messenger or WhatsApp so authorities can access your encrypted conversations.
The Australian government’s draft Telecommunications and Other Legislation Amendment (Assistance and Access) Bill opens the door to just that, and more. The bill, which was introduced into parliament by Home Affairs Minister Peter Dutton on 20 September (just 10 days after submissions closed), would allow Australian law enforcement and security agencies to order technology companies and even individuals to do vaguely described ‘acts or things’ to facilitate access to your encrypted data and devices through newly created ‘technical assistance’ and ‘technical capability’ notices. Although officials would still need a warrant to obtain private communications and data, the bill requires no prior judicial authorisation before the attorney-general could compel your phone maker or app provider to undermine their security features.
The bill states that Australian courts will retain their powers of judicial review to ensure officials are acting lawfully. However, the proposal doesn’t provide sufficient transparency, oversight or accountability mechanisms to ensure its broad powers aren’t abused. Agencies would issue notices in secret, and the bill makes it an offence for companies to tell the targeted person that a notice exists. While secrecy may often be necessary in an investigation, the bill doesn’t allow disclosure even when it would no longer threaten security or jeopardise an investigation. It is also difficult to see how an individual could seek judicial review if they never learn that their device was deliberately compromised.
In all, the proposed law leaves too much discretion to officials to decide whether an order is justified as necessary and proportionate, and doesn’t impose sufficient safeguards to prevent abuse.
The proposal does forbid the creation of ‘systemic’ weaknesses or vulnerabilities in technology. But the broadly drawn bill doesn’t define ‘systemic’ or other key terms, and gives agencies too much room to determine their contours. The result is that many of the actions companies might be forced to take could introduce vulnerabilities that cause widespread harm to cybersecurity and human rights, despite the bill’s stated intent.
Agencies could, for example, require a company to use its software update system to trick users into installing government code or spyware, a move that would undermine trust in routine software update channels. If users fear that updates may be compromised, they may be more reluctant to install them. Phones and other devices would then be less secure over time because they wouldn’t have necessary software fixes, which would undermine cybersecurity for users beyond the targets of an investigation.
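To see why this matters, it helps to recall how devices decide whether an update is genuine: they check a digital signature against the vendor’s key and install only what verifies. The sketch below is illustrative only, using the Python cryptography library with hypothetical keys and update contents rather than any vendor’s actual process.

```python
# Illustrative sketch: how signed software updates establish trust.
# Hypothetical keys and update contents; not any vendor's actual pipeline.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the update package is signed with the vendor's private key.
vendor_key = Ed25519PrivateKey.generate()
update_package = b"app-update-v2.1 contents"
signature = vendor_key.sign(update_package)

# Device side: the update installs only if the signature verifies against
# the vendor's public key shipped with the device.
public_key = vendor_key.public_key()
try:
    public_key.verify(signature, update_package)
    print("Signature valid: update accepted.")
except InvalidSignature:
    print("Signature invalid: update rejected.")
```

The point of the sketch is that a device trusts anything its vendor signs. An order compelling a vendor to sign and ship government code rides on exactly that trust, which is why users who suspect abuse may start refusing updates altogether.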
Because of the ambiguities in the bill, some of the capabilities it may compel could be interpreted by security experts (including those working for service providers) as creating security ‘backdoors’ or as preventing the use of strong, end-to-end encryption.
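End-to-end encryption means that only the communicating devices hold the keys needed to read a message; the service that relays it cannot. The following minimal sketch uses the PyNaCl library with hypothetical keys and a hypothetical message, not any provider’s actual protocol.

```python
# Minimal sketch of end-to-end encryption with PyNaCl (illustrative only).
from nacl.public import Box, PrivateKey

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob using her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

# The relaying service sees only ciphertext; only Bob (or Alice) can decrypt.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```

A ‘backdoor’ in this sense would be any compelled change that lets a third party read the plaintext despite this arrangement.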
Australia’s proposal emulates the approach in the UK’s Investigatory Powers Act. It also follows a joint statement from the Five Eyes countries (Australia, Canada, New Zealand, the UK and the US, which cooperate on signals intelligence) demanding that technology companies ‘voluntarily’ provide greater access to encrypted data or face new laws or other ‘technological’ measures. In the US, the government is already trying to compel Facebook to circumvent security features in the Messenger app, much as it tried to do with Apple in 2016.
If adopted, the Australian bill would pose considerable threats to cybersecurity and human rights. And its effects wouldn’t be limited to Australia. Once Apple, Facebook or Google has to disclose the source code behind its products or trick you into installing spyware disguised as a software update for Australia, other governments will demand the same. And once a company rewrites code to access information held on your device, it could be forced to use that compromised code again and again, by Australian or other authorities. Such an outcome creates additional risks that the compromised code could be breached, stolen and disseminated, affecting users around the world.
On 10 September, Human Rights Watch submitted comments to the Department of Home Affairs urging the withdrawal of the draft bill and the crafting of an approach that meets the needs of law enforcement while also protecting cybersecurity and human rights. For example, any legislation creating new surveillance capabilities should require agencies to use the least intrusive measure to access private communications to ensure that any limit on privacy and security is proportionate. It should specifically affirm that it doesn’t prevent companies from employing end-to-end encryption. And it should require prior authorisation from a judicial authority that is independent of the agency seeking to compel action by a company, while also creating meaningful avenues to challenge overreaching orders.
Given the extraordinarily intrusive nature of the actions agencies could compel, any proposed law requires far more robust oversight and accountability mechanisms than the bill currently provides to check executive power and ensure people’s rights are preserved.
The technology companies we rely on to keep our data safe already face an escalating arms race to protect us from cybercriminals and other security threats. Encryption is a key part of their arsenal, and so is their ability to fix security problems through regular software updates. Ordinary users should be able to trust that their technology hasn’t been deliberately compromised by their own government. Australia, the US and the other Five Eyes governments should be promoting strong cybersecurity, not turning our own devices against us.