Telegram is an encrypted internet messaging app developed in Russia to avoid censorship by the Putin regime. It’s so effective that it’s proving problematic for law enforcement and counterterrorism agencies around the world, including in Australia.
Russia’s Pavel Durov founded the service after he came under Kremlin pressure to relinquish control of his social media platform, VKontakte (VK), which was so popular that it earned Durov the title ‘the Zuckerberg of Russia’. When he launched Telegram in 2013, a Kremlin-linked firm attempted to wrest control of it away from him, but the case went through the American legal system because Telegram was registered in the US. In 2014, Durov won ownership of the app.
Durov’s willingness to stand up to an authoritarian regime paid off: Telegram is the 8th most used messaging service worldwide. Its 200 million users come mainly from illiberal countries such as Russia and Iran, where it’s mostly used as a political platform—just as VK was—to evade state censorship. This is likely to be the real reason behind Russia’s and Iran’s recent bans of Telegram. Telegram’s refusal to capitulate to the Kremlin earned it Amnesty International’s support.
So why is Telegram a problem? As a company, Telegram appears hostile to law enforcement: the head of Europol said that the force was getting some cooperation from Telegram ‘but nowhere near what we are getting from Facebook, Twitter and some of the others’. Durov himself has said he’s willing to live with terrorist attacks because privacy must come first. The company has been generally unwilling to comply with requests from law enforcement bodies, and the Russian Telegram ban started when Telegram refused to hand over encryption keys.
Telegram is also structured to resist government requests and subpoenas. It’s incorporated in Dubai, but its servers’ locations and employees’ names are secret, thanks to a complex web of transnational shell companies scattered worldwide. Access to user data requires not only international cooperation, but knowledge of where the data is located—a nearly impossible task without cooperation from the messaging service.
Telegram’s technology has also made it the ‘app of choice’ for the Islamic State terrorist group. Telegram offers ‘channels’, in which one user broadcasts a content feed to be read or watched by unlimited subscribers. When channels are taken down, IS quickly uploads thousands of megabytes of old data to new and backup channels, using bots to disseminate information broadly across multiple channels. IS also responds quickly to attempts to put out fake media in its name, reminding users to trust only its official accounts. The Combating Terrorism Center at the US Military Academy, West Point, has extensively analysed IS’s file-sharing methods on Telegram.
Telegram also offers two types of communications: server–client encrypted and end-to-end encrypted (‘secret chats’). The former aren’t secure—and indeed are the default setting, which has been criticised for tricking unwary users. But IS uses them for massive group chats (WhatsApp allows a maximum of 256 people) that provide a virtual community for followers to engage with violent jihad and be targeted for recruitment. A few IS administrators monitor dozens of chatrooms, removing suspected infiltrators. The chatrooms are taken down by Telegram fairly rapidly, but IS has developed methods to retain followers and transfer them to new ones.
The ‘secret chat’ function uses end-to-end encryption. Data is stored locally—if the phone is lost, the data is gone, and not even Telegram can access the content of chats. Only message participants have the encryption key needed to read secret chat messages. The Telegram app doesn’t permit screenshots, and participants can set self-destruct timers that delete messages from histories and phone logs. Without those messages, law enforcement is unable to access what may be important evidence for prosecutions.
Of course, Telegram’s encryption could be broken—skilled cryptographers have doubts about its encryption algorithms. And contradictory claims abound about whether secret services worldwide have broken the algorithms; just last week a Chinese-language media source claimed that the PLA’s cyber unit had broken into Telegram. But, so far, no confirmed major problems have been found. In the meantime, Telegram is believed to have been used to help plan terrorist attacks in Russia and France and to disseminate extremist propaganda.
The problems will be more acute if Durov realises his ambition to build a closed ecosystem of integrated applications onto Telegram’s platform, including cryptocurrency payments for digital and physical goods and services. If successful, the system will function like China’s WeChat, which is used by a billion people for everything from paying utility bills and hailing taxis to messaging friends, all without ever leaving the app.
But, unlike WeChat, which operates under the benevolence of the Chinese Communist Party, Durov doesn’t want Telegram to go the way of his earlier project, VK. He wants it to be an ecosystem separated from state regulation. Like a tax haven, it would allow users to avoid taxes and state scrutiny of financial movements. If Durov succeeds in this endeavour, states will have a harder time tracking the financing, locations and interactions of terrorists.
So between the company’s unwillingness to cooperate with law enforcement—seen both in the Russian case and in Europol’s frustration—and IS’s preference for the app’s technology, Telegram is a major problem.
In part 2 of this post, I’ll look at why legislative fixes—such as the one proposed by the Australian government—will be insufficient to address these issues.