Not dark yet—strong encryption and security (part 2)
Posted By Andrew Davies on June 27, 2017 @ 06:00
In the previous part [1] of my exploration of the impact of strong encryption on our security agencies, I described the unsophisticated days of intercepting telephony in the 1970s. With voice communications, it’s largely a case of ‘grab it or it’s gone’. Most of the history of signals intelligence is about eavesdropping on moving data. But the advent of internet communications introduced a new angle, as data ‘at rest’ in a computer or smartphone at either end of the communications channel became a potential source of intelligence.
The Apple handset at the centre of a 2016 court case [2] in the US provides an intriguing case study. Even after being presented with a court order to give the FBI access to the handset, Apple declined on the grounds [3] that it would have to create an access channel that could be used to render vulnerable any iPhone running the same software. It wasn’t a case of encryption being the sticking point—the problem was getting past the phone’s passcode. It’s a more complicated story [4] than is sometimes appreciated, but it brought the tension between customer privacy, information security across the wider economy, and the requirements of law enforcement and intelligence agencies very much into public view.
There’s something a little puzzling about the pushback in the iPhone case. As I pointed out last time, we all lived happily enough in the post-1979 world of legislatively guaranteed warranted access to our telecommunications. Philosophically at least, it seems reasonable for governments to want that level of access to be preserved (or, perhaps more accurately, reinstated). In principle I’m inclined to agree, with the proviso that there’s robust and effective oversight [5], including the stipulation of warranted collection.
It must be said that some governments haven’t helped themselves in that respect. The public is more tolerant of focused investigations of suspicious behaviour and individuals than it is of wider ‘fishing expeditions’ into big data pools. In 1979, it was hard to do much of the latter, but more recently the US National Security Agency was caught out hoovering up large quantities of metadata [6] under its PRISM program without sufficient oversight [7]. A UK system called Tempora went well beyond metadata [8] and was indiscriminate in its targeting. And the Australian government did a horrible job [9] of explaining its own ambitions for metadata collection.
And in practice, I don’t think we can get there from here. Encryption isn’t just a tool used by bad people to plan bad things: it’s now a critical part of the rapidly growing online economy. Banking and e-commerce couldn’t function effectively without it. As we saw in part 1, the US government rolled out strong encryption for exactly that reason in the 1970s (and continues to support it today). And individuals have perfectly valid reasons to implement security mechanisms such as virtual private networks [10]—any traveller doing internet banking over someone else’s Wi-Fi network has good reason to want the additional protection. In fact, given how poor network security can be, it makes good sense for users to implement protective measures over sensitive data.
Perhaps most important are end-to-end encryption systems, used by applications like WhatsApp, Signal, iMessage and Facebook Messenger. Only the two communicating users hold the keys needed to decrypt a message. Companies such as Apple and Facebook, whose platforms carry the messages, have access to neither the plaintext nor the encryption keys.
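To make that concrete, here’s a minimal sketch of the underlying idea using PyNaCl, a Python binding to the libsodium cryptography library. The names and the message are illustrative only, and real messaging apps add considerably more machinery on top of this basic public-key exchange, but the essential property is the same: the service in the middle relays ciphertext it cannot read.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only public keys are exchanged (in real apps, via the provider's servers).
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"meet at 6pm")

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_public)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at 6pm"

# The server relaying `ciphertext` holds neither private key,
# so it cannot recover the plaintext.
```

That property is precisely why a warrant served on the provider can yield metadata and ciphertext, but not message content.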
There have been calls to outlaw strong encryption [11] so that law enforcement and intelligence agencies can crack communications between targets of interest. That raises many questions. Who decides how strong is ‘too strong’? Does ASIO or the AFP need to be able to access data in an hour, a day, or a week? Moore’s Law [12] tells us that what the NSA can do today, others will be doing in the not-too-distant future. So how can we ensure the protection of innocent but sensitive communications? Or is the government going to decree that some privacy measures won’t be available to the public at large?
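A back-of-the-envelope illustration of that Moore’s Law point: suppose, purely as assumptions for the sake of argument, that compute per dollar doubles every two years and that a top-tier agency can bring 1,000 times more computing power to bear than a well-resourced private actor. Under those assumptions, whatever cracking capability the agency has today becomes affordable to the private actor in roughly 20 years:

```python
import math

# Illustrative assumptions only: a doubling period for compute per dollar,
# and the agency's head start expressed as a capability ratio.
doubling_period_years = 2.0
capability_gap = 1_000  # agency has ~1,000x the compute of a private actor

# Years until the private actor can afford what the agency can apply today.
years_to_parity = doubling_period_years * math.log2(capability_gap)
print(f"~{years_to_parity:.0f} years")  # ~20 years under these assumptions
```

The precise numbers don’t matter; the point is that any access scheme premised on a temporary computing advantage has a limited shelf life.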
Finally, even if we managed to tie up all of the loose ends in the Australian telecommunications marketplace, how do we quarantine local users from apps and hardware that are compatible with Australian networks and readily available from offshore vendors? Australia, the UK and even the US can’t legislate for the totality of the messaging-app universe, and any lawful-intercept legislation would quickly push serious threats onto other platforms that could be even worse for law enforcement—or for wider society. High-profile companies like Apple, Google and Facebook tend to help when it’s clearly a public duty to do so (they work with authorities to identify and eliminate child pornography, for example). But smaller firms, especially those in other countries, might feel no such obligation. And any vulnerabilities engineered into products will be available to be exploited by entities other than our own security agencies.
I think it’s an intractable problem. The horse has bolted, and the access to data through lawful intercept that our security agencies once enjoyed will never be possible again. As Bob Dylan might put it, it’s not dark yet, but it’s getting there [13].
Note: I had a lot of useful feedback from my ASPI colleagues on these two posts. I thank them, but don’t blame them for anything here.
URLs in this post:
[1] previous part: /going-dark-strong-encryption-security-part-1/
[2] Apple handset that became the centre of a 2016 court case: https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute
[3] on the grounds: https://techcrunch.com/2016/02/17/tim-cook-apple-wont-create-backdoor-to-unlock-san-bernardino-attackers-iphone/
[4] complicated story: https://www.wired.com/2016/02/apples-fbi-battle-is-complicated-heres-whats-really-going-on/
[5] robust and effective oversight: /government-surveillance-and-australias-multiple-watchdogs/
[6] hoovering up large quantities of metadata: /looking-through-the-prism/
[7] without sufficient oversight: /edward-snowden-the-media-and-the-pulitzer/
[8] well beyond metadata: http://www.wired.co.uk/article/gchq-tempora-101
[9] horrible job: http://www.smh.com.au/digital-life/digital-life-news/george-brandis-in-car-crash-interview-over-controversial-data-retention-regime-20140806-101849.html
[10] virtual private networks: https://en.wikipedia.org/wiki/Virtual_private_network
[11] outlaw strong encryption: https://www.businessinsider.com.au/john-mccain-calls-anti-encryption-legislation-paris-attacks-isis-back-doors-2015-11?r=UK&IR=T
[12] Moore’s Law: /graph-of-the-week-moores-law/
[13] not dark yet, but it’s getting there: https://www.vevo.com/watch/bob-dylan/not-dark-yet/USSM20100447