A senior analyst with ASPI’s International Cyber Policy Centre, Fergus Ryan took part in today’s hearing of the Senate Select Committee on Foreign Interference through Social Media. This is his contribution to ASPI’s submission to the inquiry, in which he focused on his concerns about TikTok. The full submission is available for download on the Parliament of Australia website.
There are three main national security risks with the PRC-owned video-sharing app, TikTok, that Australians should be concerned about. Two of them—data and content manipulation—are applicable to most other major social media apps regardless of their country of origin. The third risk, that a single political party, the Chinese Communist Party (CCP), has decisive leverage over TikTok, exacerbates the other two risks and is unique to TikTok as a major mainstream social media app.
The first and most discussed risk is about data. Following years of scrutiny, TikTok has been forced to be more forthcoming about the fact that TikTok user data is accessible, and has been accessed, from the PRC. Close observers of TikTok’s statements from as early as 2020 know that the company’s goal has only ever been to minimise China-based employees’ access to user data, not to cut off that access completely.
Furthermore, the app relies on this access to function. As stated in a September 2020 sworn affidavit by the company’s then chief information security officer, ‘TikTok relies on China-based ByteDance personnel for certain engineering functions that require them to access encrypted TikTok user data.’
In 2023, this still has not changed. Even as the company puts into place its US$1.5 billion plan dubbed ‘Project Texas’ to move all data attached to American users to the United States, and to institute various governance, compliance and auditing systems to mitigate national security concerns, TikTok vice president Michael Beckerman maintains that engineers based in China ‘might need access to data for engineering functions that are specifically tied to their roles.’ At a Senate hearing about social media and national security in September 2022, Vanessa Pappas, TikTok’s chief operating officer, declined to commit to cutting employees in China off from the app’s user data.
As long as PRC-based engineers are able to access TikTok user data, that data is at risk of being accessed and used by PRC intelligence services. TikTok’s constant refrain that user data is stored in Singapore and the US and that it would never hand over the data to the Chinese government even if it were asked is beside the point. The location in which any data is stored is immaterial if it can be readily accessed from China.
Moreover, TikTok’s parent company, ByteDance, couldn’t realistically refuse a request from the Chinese government for TikTok user data because a suite of national security laws effectively compels individuals and companies to participate in Chinese ‘intelligence work’. If the authorities requested TikTok user data, the company would be required by law to assist the government and then would be legally prevented from speaking publicly about the matter.
Unfortunately, even if TikTok’s parent company, ByteDance, were able to sever access to the app’s user data from the PRC, Beijing’s intelligence services could still readily access sensitive data on virtually anyone in Australia via the commercial data broker market.
Second, in what has unfortunately been an under-discussed risk, TikTok could continue to skew its video recommendations in line with the geopolitical goals of the CCP. This threat continues to worsen as more and more people get their news and information from online platforms such as TikTok, whose content the Chinese party-state can control, curate and censor.
There’s ample evidence that TikTok has done this in the past. Leaked content moderation documents have previously revealed that TikTok has instructed ‘its moderators to censor videos that mention Tiananmen Square, Tibetan independence, or the banned religious group Falun Gong’, among other censorship rules. TikTok insists that those documents don’t reflect its current policy and that it has since embraced a localised content moderation strategy tailored to each region.
In ASPI’s 2020 report into TikTok and WeChat, we found that they suppressed LGBTQ+ content in at least eight languages. After British MPs questioned TikTok executives about our findings, the executives publicly apologised. Our report also included a deep dive into TikTok’s Xinjiang hashtag and found a feed flooded with glossy propaganda videos; only 5.6% of those videos were critical of the crackdown on the Uyghurs.
In 2022, TikTok blocked an estimated 95% of content previously available to Russians, according to Tracking Exposed, a nonprofit organisation in Europe that analyses algorithms on social media. In addition to this mass restriction of content, the organisation also uncovered a network of coordinated accounts that were using a loophole to post pro-war propaganda in Russia on the platform. In other words, at the outset of Putin’s invasion of Ukraine, TikTok was effectively turned into a 24/7 propaganda channel for the Kremlin.
After years of intense scrutiny, it is unlikely that TikTok will, in any overt way, become a conduit for pro-CCP propaganda. In a welcome sign, in recent months the company has even begun to label ‘China state-affiliated’ accounts on the platform. It is unclear, however, whether these labels also reduce the reach of the labelled content, as similar labels currently do on other platforms such as Twitter.
To further build confidence, TikTok should, as other social media platforms have, regularly investigate and disclose information operations being conducted on the platform by state and non-state actors.
Any manipulation of the public political discourse on TikTok is likely to be subtle. Unfortunately, because each user’s TikTok feed is different, any influence the CCP has over the app will be very difficult to track. It would be trivially easy for the app to promote or demote certain political speech in line with the CCP’s preferences. The app could, for example, tip the scales in favour of speech attacking a political candidate who is critical of the CCP.
TikTok certainly has the ability to detect political speech on the app: it monitors keywords in posts for content related to elections so that it can attach links to its in-app elections centre. Experiments conducted by the nonprofit group Accelerate Change found that including certain election-related words in TikTok videos decreased their distribution by 66%. They also found that TikTok consistently suppresses videos when it can detect that they’re about voting.
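To make concrete how little engineering such an intervention would require, the following is a minimal, purely hypothetical sketch in Python. Nothing in it comes from TikTok’s actual systems; the keyword list, the ranked_feed function and the 0.34 demotion factor are invented for illustration, with the factor loosely mirroring the roughly 66% distribution drop that Accelerate Change measured.

```python
# Purely illustrative, not TikTok's actual code: a toy ranking function that
# silently down-weights videos whose captions match a flagged keyword list.
ELECTION_KEYWORDS = {"vote", "voting", "ballot", "election"}  # hypothetical list

def engagement_score(video):
    # Stand-in for a real engagement model, which would use far richer signals.
    return video["likes"] * 0.5 + video["shares"] * 2.0

def ranked_feed(videos, demotion_factor=0.34):
    """Sort videos by score, multiplying the score of any keyword-matching
    video by demotion_factor (0.34 loosely mirrors the ~66% distribution
    drop reported by Accelerate Change)."""
    def score(video):
        s = engagement_score(video)
        words = set(video["caption"].lower().split())
        if words & ELECTION_KEYWORDS:
            s *= demotion_factor  # invisible to both the creator and the viewer
        return s
    return sorted(videos, key=score, reverse=True)

feed = ranked_feed([
    {"caption": "my cat doing a backflip", "likes": 900, "shares": 40},
    {"caption": "why your vote matters this election", "likes": 900, "shares": 40},
])
```

In this toy example the two videos have identical engagement, yet the election-related clip is quietly pushed down the feed, with no signal to the creator or the viewer that anything has changed.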
In 2020, US TikTok executives noticed views for videos from certain creators about the US presidential election were mysteriously dropping 30% to 40%, according to people familiar with the episode and cited by the Wall Street Journal. The executives found that a team in China had changed the algorithm to play down political conversations about the election.
Algorithmic manipulation of content is not limited to TikTok. To take one example, in February 2023 Twitter chief executive Elon Musk rallied a team of roughly 80 engineers to reconfigure the platform’s algorithm so that his tweets would be more widely viewed. There is clearly a need for all social media companies to be more transparent about how changes to their algorithms affect the content users receive.
The third risk, rightly identified by Cybersecurity Minister Clare O’Neil as a ‘relatively new problem’, is that apps like TikTok are, as the minister put it, ‘based in countries with a more authoritarian approach to the private sector’.
For TikTok’s parent company, ByteDance, this authoritarian approach has included compelling company founder Zhang Yiming to make an abject apology in a public letter for failing to respect the Chinese Communist Party’s ‘socialist core values’ and for ‘deviating from public opinion guidance’—one of the CCP’s terms for censorship and propaganda.
The enormous leverage the CCP has over the company drove ByteDance to boost its army of censors by an extra 4,000 people (candidates with party loyalty were preferred) and it’s what continues to motivate ByteDance to conduct ‘party-building’ exercises inside the company.
In April 2021, Beijing quietly formalised a greater role in overseeing ByteDance when state investors controlled by the China Internet Investment Fund (which is in turn controlled by the internet regulator, the Cyberspace Administration of China, or CAC) and China Media Group (controlled by the CCP’s propaganda department) took a 1% stake in ByteDance’s Chinese entity, Beijing ByteDance Technology, giving the state veto rights over the company’s decisions. At the time, one of the other two seats on the company’s board was held by Zhang Fuping (张辅评), who was secretary of the company’s Party committee.
More recently, the CAC named a director from its bureau overseeing data security and algorithmic governance to the board of ByteDance’s main Chinese entity. According to the Wall Street Journal, this director replaced another CAC official who was formerly part of the regulator’s online opinion bureau.
The PRC party-state is, in other words, completely intertwined with ByteDance to the extent that the company, like many other major Chinese tech companies, can scarcely be considered a purely private company that is only geared towards commercial ends. These companies are neither state-owned nor private, but hybrid entities that are effectively state-controlled.
Too much of the public discussion about the risks of TikTok has been narrowly focused on data security. Even if TikTok were to completely sever access to its user data from China (which it does not plan to do), China’s intelligence services could still buy similar user data from data brokers.
It would therefore be to Australia’s benefit if more rigorous data privacy and data protection legislation were introduced that applies to all firms operating here, regardless of ownership. If protecting national security and guarding against foreign interference are our goals, such a broad approach is necessary.
But a complete overhaul of regulation around data will still not address the risk that the CCP could leverage its overwhelming influence over TikTok and its parent company ByteDance to manipulate Australia’s political discourse in a way that would be unlikely to be detected.
There’s no technical fix to a problem driven by ideology. The CCP regards the country’s lack of soft power, or ‘international discourse power’ (国际话语权), as a ‘discourse deficit’ (话语赤字) relative to the strength of Western media and governments, one that seriously hampers China’s international ambitions. The party is open about its view that homegrown social media apps like TikTok present an opportunity to leapfrog the West and begin to meaningfully close that gap. In the past, the party attempted to conduct its influence operations on Western social media apps, a process referred to as ‘borrowing a boat out to sea’ (借船出海). With TikTok, it owns the boat.
(This piece has also appeared on the author’s online newsletter, Red Packet, which covers China, censorship, surveillance and propaganda.)