
Protecting democracy from the reversibility of online information

Posted on March 10, 2022 at 14:00

Cybersecurity and the fight against disinformation share one key feature that, if better understood, could point the way to a more durable defence for democracies.

Malware on the internet and the meaning of content online are reversible in ways that challenge the orderly processing of information needed for stable democracies.

In the cyber domain, order is the ability for businesses, governments and economies to function without data breaches, disruptions and the theft of valuable data.

Order, in the case of online content, means the public’s ability to understand and trust the information they receive.

The weapons of malware on the internet are themselves information that, with re-engineering, can be repurposed and used against their creators.

The Shadow Brokers hacking group exposed tools used by US intelligence agencies. Once the tools were stolen and released in 2016, they were incorporated into ransomware used against US and Western targets.

Researchers Karlis Podins and Kenneth Geers write [1]:

Malware is a weapon unlike old-fashioned tanks and planes … [F]ully-functioning cyber weapons can be found every day, by a careful observer, within network traffic and even on most email servers. And just as with Aladdin’s magic lamp, these tools can be quickly repurposed for new operations, entirely distinct from what the malware was originally intended to do.

Even discerning the motives behind hacking is deeply dependent on perspective.

I once heard a cybersecurity professional explain that a poorly written antivirus tool was hard to distinguish from a piece of advanced malware.

The ambiguity about code found on networks extends to online content in the form of words, videos and images. Even if the facts are agreed (a bomb exploded, a politician made a statement), meaning inevitably diverges. Online, though, meaning is easily, wilfully inverted to the opposite of its original intention.

At the start of the Covid-19 pandemic, Anthony Fauci, now the White House’s chief medical adviser, urged people not to use face masks. Months into the outbreak, Fauci changed his advice about the utility of masks in preventing Covid’s spread. Today, millions of people read the change in advice as evidence of a government cover-up.

The further the intended audience is from the action, the easier it is to present facts to them with the meaning reversed: think of the successful effort by Russia and Syrian proxies to convince the world that the White Helmets rescue crews were terrorists and crisis actors.

The easy reversibility of meaning is a challenge for a democratic system that relies on a baseline of public understanding of events shared among differing communities.

Reversed meaning is distinct from opinions differing over the same sets of facts. If it were just about opinions, the Associated Press would not now produce a story [2] called ‘NOT REAL NEWS: A look at what didn’t happen this week’, including items like ‘Photos of London Olympics don’t show prior knowledge of pandemic’.

Instead, it’s a matter of misunderstandings that unspool easily, or that are encouraged to unspool, across information found online.

What is routinely called ‘disinformation’ flourishes in this environment.

The problem for democracy is that the understanding of broad notions that bind a society shouldn’t be so easily reversible: questions like who is the legitimate authority and who is the recognised winner of an election.

Differing views are the lifeblood of democracy.

Before the internet, information was just as contested as it is today. However, the economics of communication didn’t allow for the inversion of meaning at the scale the internet now permits.

‘Poe’s law’ holds that it’s impossible to create a parody on the internet without attracting someone who takes the joke as being real.

The same logic applies to non-parody.

On 6 January last year, an organised riot and coup attempt took place at the US Capitol. A year later, pro–Donald Trump demonstrators argued [3] the riot was akin to the Tiananmen Square protests of 1989 in China.

A ‘solidarity’ movement has emerged in Australia and Canada (replete with signage inspired by the 1980s Polish trade-union movement). If members see ‘communism’ everywhere, they can co-opt images of the 1980s anti-communist movement to galvanise people over networks, irrespective of historical fact.

Shouting verified facts onto social media only creates more fuel for misunderstanding and confusion. What’s required for democracy to thrive is to make some core ideas and information less easily reversible. But how?

The answer is to develop strategies around information that make certain democratic meanings more ‘sticky’. The evolution in ransomware defence offers a clue. Cyber defenders have learned [4] they can no longer simply out-engineer would-be ransom-seekers; they must take action to change the behaviour of the humans operating on the other side of the network.

So we see publicised arrests, warnings delivered through diplomatic channels, legal reforms to speed the flow of information needed to recover stolen funds, even efforts to instil doubt within the ransomware gangs themselves. All of these are human-to-human operations conducted over the internet.

In the realm of communication, perhaps core meanings central to democracy (‘insurrection is not a legitimate option’, for example) can be made stickier in the minds of the public if the locus of communication relies less on the internet itself and more on the human-to-human cross-promotion of the narrative.

Overpowering the reversibility of information online requires information offline that orients, or makes sense of, the same information.

When Russia annexed Crimea, when Russian-backed militants shot down MH17 and when Russia intervened in Syria, the Kremlin and its proxies were able to spread information that could be easily, wilfully misconstrued online. Russian soldiers were ‘little green men’, and White Helmets were ‘terrorists’.

Differences in fact were recast as mere differences of opinion.

In the current crisis over Ukraine, Russia’s efforts to paint the West as the aggressor have been stymied in part by the US State Department and the British Foreign Office releasing information that, once ingested into the news cycle, makes it more difficult to claim that the US is fomenting war with Russia, or that the US–UK–Australia position on Ukraine in 2022 has parallels to the same governments’ views of Iraq in the lead-up to the 2003 invasion.

A broader example is US President Joe Biden’s Summit for Democracy in December: an online event of sufficient complexity and breadth that no single fact could be wilfully misconstrued to invert its overall meaning, which was that democracies with shared interests stand together in the world.

Could this strategy be replicated on smaller issues like those that are debated in domestic politics?

The goal in making the ideas that are core to democracy less reversible is not to make those ideas irreversible, which would evince the rigid thinking of illiberal regimes.

Nonetheless, we have to recognise that truths once seen as self-evident are no longer self-evident online.

With the emergence of the metaverse, virtual reality, non-fungible tokens and cryptocurrencies, what’s not real will continue to compete for our attention with what is real—like democracies and their citizens.

If we recognise that the world of digital information is, at heart, like the world of digital code where everything can be re-engineered and reversed easily, then we can build new strategies around information. One of those strategies should be to ensure the meaning of core issues is anchored in a way that supports democratic outcomes.



URLs in this post:

[1] write: https://ccdcoe.org/uploads/2018/10/Art-10-Aladdins-Lamp.-The-Theft-and-Re-weaponization-of-Malicious-Code.pdf

[2] story: https://apnews.com/article/coronavirus-pandemic-kyle-rittenhouse-ghislaine-maxwell-sports-entertainment-7ffe13c23d14043f69beb77f88309809

[3] argued: https://www.washingtonpost.com/dc-md-va/2022/01/06/dc-vigils-january-6-capitol/

[4] have learned: https://www.zdnet.com/article/ransomware-is-the-party-almost-over-for-the-cyber-crooks/
