The UK House of Commons Digital, Culture, Media and Sport Committee recently published the interim report of its inquiry into disinformation and ‘fake news’. The report makes a lot of good recommendations, but one in particular stands out: dropping the term ‘fake news’ in favour of more specific terms like ‘disinformation’ and ‘misinformation’.
Disinformation (deliberately false information) isn’t the same as misinformation (unintentionally false information). Misinformation can be problematic, as when ‘misspeaking’ is twisted to mean saying the opposite of what was actually said, but it’s mostly non-threatening.
But disinformation and disinformation campaigns are neither new nor benign. Part of the so-called dark arts of psychological warfare, they are a tried and tested tactic for fostering distrust in government.
During World War II, a journalist named Sefton Delmer was recruited to Britain’s Political Warfare Executive to wage a disinformation campaign against the German population. His team invented rumours suggesting corruption among the elite, jokes poking fun at the authorities, and anti-authority symbols. He ran a radio station fronted by a supposed German ‘insider’ who spread conspiracies about the German government.
As Delmer put it:
The objective … is subversive. We want to spread disruptive and disturbing news among the Germans which will induce them to distrust their government and disobey it, not so much from high-minded political motives as from ordinary human weakness … We are making no attempt to build up a political following in Germany.
The Russian equivalent—memes, conspiracy theories, fake ‘articles’—was on full display during the 2016 US election. Today, just as in the 1940s, the effect is hard to measure. Back then, German newspapers observed that when a falsified news story was slipped in among three genuine ones, it was difficult to tell them apart. In 2016, even a marginal effect could have been decisive: only 80,000 votes in three states made the difference between victory and defeat for Hillary Clinton.
But Russia’s disinformation campaign differs from Delmer’s in two crucial ways. First, Russia has technology that Delmer could only have dreamed of. Delmer couldn’t force radio stations and people to circulate his propaganda; Russia can, by gaming the algorithms of hosting platforms. Second, Russia uses disinformation campaigns in peacetime. According to the authors of Cyber Strategy, Russia is unique in pursuing the undermining of democracies as a strategic goal of its cyber operations.
But rather than calling disinformation operations what they are and responding to them accordingly, we have adopted, somewhat ironically, one of US President Donald Trump’s favourite phrases: ‘fake news’.
Though the term wasn’t coined by Trump, it was popularised by him. He uses it to describe negative coverage as if it were deliberately false information, conflating two very different things: unflattering portrayals fit within the standard of truth we expect from media organisations, while disinformation undermines the capacity of individuals to make informed choices. Disinformation is only one part of Russia’s interference operations, but in many ways it’s more insidious than using trolls to inflame political divisions, because it undermines the basic concept of truth. Democracies can’t fight the erosion of the informed voter when false information is passed off as a mere difference of opinion.
So how do we combat these dark arts? For starters, media outlets, analysts, politicians and others need to stop using the term ‘fake news’. It’s too politically loaded, and using it as a catch-all blurs the distinction between opinion and fact and hinders the development of an appropriate response.
Instead, we need to revisit the vocabulary of subversive propaganda (adding ‘cyber’ if need be). Terms like ‘disinformation’ are firmly grounded in historical precedent, even with the Russians (the Soviets, for example, spread the rumour that the US military invented AIDS), and have a basis in international law (subversive propaganda has been regulated since World War I). These terms provide a stronger conceptual basis from which to criticise and oppose Russian actions. The methods are different, but the strategic purpose of Russia’s cyber-interference operations is fundamentally no different from that of bygone eras.
Next, democracies need to actively defend themselves against disinformation. It’s not enough to pin responsibility on social media companies. Certainly, platforms like Twitter and Facebook must bear some of the blame—they are neither neutral nor powerless. Taking down fake accounts should become a routine operation, rather than a ‘spare time’ task. Moderating false information should be done carefully, with a view to preserving free speech—a group that flags blatantly false articles, similar to Wikipedia’s taskforce, might be a good model.
Ultimately, though, the real defence must begin with citizens. It’s not disinformation itself, but attention to disinformation that gives it oxygen. A conspiratorially minded fringe is okay—sites like 4chan and InfoWars churned out conspiracy theories long before the Russians weaponised them—but it’s problematic that we appear peculiarly willing to suspend disbelief and give attention to disinformation that appeals to our political points of view. Worryingly, Facebook’s attempts to reduce disinformation often result in users flagging articles they don’t like as ‘false’.
Digital literacy and critical-analysis skills need to be prioritised so that people can better identify disinformation and distinguish between opinion and fact. In the meantime, journalists must continue to produce quality, fact-based news, and politicians need to lead in openly supporting journalism as the main weapon against disinformation. They need to refrain from using ‘fake news’, especially to describe negative coverage, and deny disinformation airtime. Politicians can also learn from international experience: the failure of Russian interference in France’s 2017 election is particularly instructive.
However democracies choose to defend themselves, the first step is recognising that we’re under attack. In the 1940s, Delmer noted that the trick of spreading disinformation is disguising it so people don’t see it for what it is. Today, the term ‘fake news’ makes that easy.