
The new struggle for truth in the era of deepfakes

Posted on November 12, 2020 @ 15:15

A billboard firm cleverly lures clients with a simple slogan on an otherwise blank canvas—unsee this [1].

Its genius rests on a human trait: we can’t unsee, unhear or unsmell anything. Our senses are primordial devices programmed to extract millions of data points [2] every second, most of them, at some level, novel. Yet the brain can sort and analyse only around 50 points per second in order to assess a possible response.

Research suggests [3] we make more than 200 decisions about food alone every day. Because the brain also chews through calories, as animals we favour simple responses that conserve our energy reserves. At some level, we hunger to be told what to do and what to believe, because it’s much less tiring than thinking things through.

Propaganda, whether purveyed by Joseph Goebbels, Procter & Gamble or Mills & Boon, is calibrated to short-circuit complex decision-making. Seeing is believing, a picture is worth a thousand words, and repetition and sloganeering make it stick.

Similarly, conspiracy theories satisfy our human biological platform, feeding our desire for simplicity rather than complexity. They are sticky, too. One of the stickiest is the anti-Semitic screed The Protocols of the Elders of Zion, fabricated in Czarist Russia around 1903 and still a hot favourite [4].

In the recent confessional Netflix documentary The Social Dilemma [5], some progenitors of the social media revolution explain that it was calculated to tease and slake our thirst for the novel, the addictive and the gossipy. Clickbait is exactly what its name describes: small pieces of tempting information that attract our clicks, hooking us and generating advertising revenue for Google and Facebook.

Moreover, research [6] from the Massachusetts Institute of Technology suggests that fake news spreads six times faster than the truth. A drunk Nancy Pelosi makes for compelling ‘news’ that returns far more advertising revenue than a sober House speaker grinding through the dull routine of law-making and politicking. One wins plenty of clicks; the other doesn’t.

Pelosi was not drunk in that viral video of May 2019 [7]. The quick discovery that a malicious garden-shed-conservative propagandist had slowed the video to slur her speech made no difference. Slander sticks, gossip is viral, and repetition of another ‘drunk’ Pelosi video a year later reinforced the original for those who believed that she was, regardless of what was likely, let alone true.

As an instinctive huckster, President Donald Trump was well suited to the era of fake news, with his mercurial temperament, lurid sense of proportion, and sledgehammer approach to bedrock political traditions that once seemed perpetual. As his grip on the political bullhorn dwindles, the world is left assessing the damage of four years of distorted reality that have shaken the foundations of convention.

But before liberal states have fully grasped the corrosive effects of fake news—or legislated to rein in the social media behemoths that trade in it—we are on the cusp of the ‘deepfake’ era, which will make the past four years seem as quaint as the cinematic effects of Woody Allen’s 1983 mockumentary Zelig [8], in which the chameleon-like Jewish imposter Zelig supports Adolf Hitler at a Nazi rally and peers from a Vatican balcony behind Pope Pius XI.

Artificial intelligence has supercharged the ability of amateurs to acquire the voice and image of anybody who has been recorded, and to recompose them into entirely fake video sequences. It’s an emanation of the so-called fourth industrial revolution that is embedding hyper-technology into every particle of our lives. It promises a gamed-up world in which the boundaries of reality are blurred by a propaganda Pandora’s box in the palm of every hand. Welcome to the ‘infocalypse’.

At this point, the efforts remain reasonably juvenile, and detectable with complex software. But the author of a recent book [9] on deepfakes reckons that within a year, anybody with a mobile phone will be able to recreate and improve on the de-ageing effects applied to Robert De Niro and Al Pacino in Martin Scorsese’s 2019 movie The Irishman [10]. In the film, that result took hundreds of technicians, millions of dollars and a year of work.

For an introduction to deepfakes, or ‘synthetic media’, the makers of the satirical cartoon South Park have conjured Sassy Justice [11], a new series that premiered just last month. It’s a radically updated, technologically turbocharged version of Zelig. Deeply funny, this deepfake show is also a portent of disaster: a time when we can no longer believe our eyes as willingly as we do today.

In the video, Donald Trump has been transformed into a satin-bewigged effeminate reporter from a news station in Cheyenne, Wyoming. Mark Zuckerberg, Julie Andrews, Jared Kushner, Ivanka Trump and Michael Caine have similarly been AI-shanghaied (presumably without their permission). What the show illustrates is the potential of this seminal technological revolution.

The entertainment possibilities are boundless. Paul Robeson can be brought to life as the new James Bond. Jim Morrison will join a mashed-up K-pop tour. There’ll be remixes of Charlie Chaplin supporting #MeToo and #BLM, Amelia Earhart spruiking space flights for Elon Musk, and Maria Callas singing duets with Taylor Swift. In fact, there’ll be no need for actors, and with a few swipes on our mobile devices each of us will be able to star in Titanic—or Gas Light [12].

Then there’s the bad stuff. If you think that social media stokes teenage anxiety, there is worse to come. AI has already been deployed, in what the creators of one app [13] claimed was harmless fun, to strip the clothing from any photographic image of a female figure. As if that’s not bad enough, in this new dystopia a photo can feasibly be lifted from your child’s Instagram and transformed into a fully fledged pornographic video. Try unseeing that.

When writing my own book [14] about the war in Sri Lanka, I relied on forensic reconstruction of open-source audio-visual evidence whose electronic fingerprints left no doubt as to its provenance. Yet the weak link in any investigation is always doubt. In theory, AI can simply retool itself to avoid forensic detection. The implications for judicial and investigative processes are convulsive.
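To illustrate what an ‘electronic fingerprint’ amounts to in practice, the sketch below computes a cryptographic hash of a media file and wraps it in a timestamped record; any later alteration, however small, produces a different digest. This is a minimal illustration in Python, not the forensic toolchain used for the book, and the filename is hypothetical.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: str, chunk_size: int = 1 << 20) -> dict:
    """Compute a SHA-256 digest of a media file and wrap it in a
    timestamped provenance record. Changing even one byte of the
    file later would yield a completely different digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return {
        "file": Path(path).name,
        "sha256": digest.hexdigest(),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

if __name__ == "__main__":
    # 'witness_video.mp4' is a hypothetical filename used for illustration.
    record = fingerprint("witness_video.mp4")
    print(json.dumps(record, indent=2))
```

The limitation is the one the article points to: a fingerprint proves a file hasn’t changed since it was recorded; it says nothing about whether the recording was authentic in the first place, which is precisely the gap deepfakes exploit.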

Identity theft is already a multibillion-dollar industry that financial institutions spend billions fighting. An American widow was recently duped out of almost $300,000 in the course of a romance conducted entirely over Skype by an imposter [15] posing as an American admiral. Your Nigerian scammer need no longer use their own voice, but can instead target the elderly with reconfigured voice recordings of their children, lifted from Instagram grabs and reworked to script: ‘Mother, can you wire me $10,000?’

Now contemplate the videos, photos and voice recordings that constitute the evening news, and the extent to which they drive political and social discourse and spark royal commissions, resignations of ministers and revolutions.

To conjure just one random unsettling example, an academic recently floated [16] the disbanding of Australia’s special forces due to allegations of criminal misconduct in Afghanistan, sourced from video evidence. The rules-based order [17] has been a central tenet of Australia’s foreign policy, and we like to think that we take our obligations under international law seriously.

Imagine, momentarily, a new deepfake body-cam video sequence showing Australian troops beheading Afghan civilians and desecrating the Koran. Grainy footage would do. Its release might ruin trade with the Middle East, lead to the killing of Australians and shatter agreements with nations such as Indonesia, as well as swing public pressure to disband the special forces.

The Russians [18] are still the best at using information wedges. After a brief pause in the 1990s, Russian disinformation changed tack. Armed with the internet, captive social media audiences and the smartphone, Russia exchanged parsimonious Cold War operations for ‘flooding the room’ (a favourite play of Trump’s, deployed in his first debate [19] against Joe Biden). The value proposition was proven in the spoliation of the 2016 presidential election result. Simmering US culture wars did the rest.

Pluralistic societies are being shaken by information-revolution developments eroding our resilience [20]. There is nothing new about political gossip, or the use of new technology for pornography or fraud, or companies making money, or adversary countries seeking an edge. What is new is the speed, scale, force multiplication and challenge to our singular human psychology. To quote Trump, a.k.a. Fred Sassy, ‘As human beings, we all rely on our eyes to determine reality.’

So what can we do when our eyes are no longer a measure of perception? When our senses are drowned in a flood of dubious images? How will we make truth more resilient in order to maintain stable governance, trust in institutions and faith in the evening news? Here are four suggested solutions.

1. Strengthen the gatekeepers. When the internet arrived, it seemed that everybody could be a journalist. But like it or not, there are hierarchies of competence [21]. Iconic media organisations governed by public values are a vital element of liberal democracy. Public broadcasters should be boosted [22] and their reach expanded, not defunded, at a time when competitor nations like Russia and China are extending theirs.

2. Legislatively decouple Facebook and Google from their clickbait-driven profit bases, because whatever the cost to shareholders, it cannot compare with the social, political and economic losses to society at large. Clickbait algorithms destroy the advertising revenue streams [23] that fertilise the small-town journalism that is the bedrock of the media’s oversight and investigative role.

3. Legislate to protect the role of the media. According to the Alliance for Journalists’ Freedom, Australia is the weakest [24] of the Five Eyes alliance countries when it comes to protecting the media (a weakness that accounts for the raids on the ABC by the Australian Federal Police that made headlines around the world). These protections ought to include media freedom laws, a public interest defence in defamation, the protection of journalists’ data, protection for whistleblowers, and a public-interest test in matters of national security.

4. Build a farm-to-table system for news. All news and information must be traceable to its sources, so that hierarchies of competence can be established and rated on a scale of indicators. Information needs an evidentiary chain, or genealogy, that lets consumers establish ‘truth’ to their satisfaction. Securing such a system would be complex and expensive, but it is possible with blockchain technology, multilateral R&D and shared purpose between like-minded nations.
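One way to picture such an evidentiary chain is as a sequence of hash-linked provenance records, in the spirit of a blockchain ledger. The sketch below is a toy illustration under assumed names (ProvenanceEntry, extend_chain and verify_chain are invented for this example, not an existing standard); a real system would also need digital signatures, distributed storage and agreed governance.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceEntry:
    """One link in the evidentiary chain for a news item: who handled it,
    what they assert, and a hash binding this entry to the previous one."""
    source: str        # the outlet or journalist adding this link
    assertion: str     # what this link certifies (capture, edit, publication)
    content_hash: str  # fingerprint of the item at this point in the chain
    prev_hash: str     # hash of the previous entry ("" for the first link)
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    def entry_hash(self) -> str:
        payload = json.dumps(self.__dict__, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

def extend_chain(chain: list, source: str, assertion: str, content_hash: str) -> list:
    """Append a new link that commits to the hash of the previous link."""
    prev = chain[-1].entry_hash() if chain else ""
    return chain + [ProvenanceEntry(source, assertion, content_hash, prev)]

def verify_chain(chain: list) -> bool:
    """A consumer replays the chain: each link must reference the hash of the one before it."""
    for prev, current in zip(chain, chain[1:]):
        if current.prev_hash != prev.entry_hash():
            return False
    return True

if __name__ == "__main__":
    # Hypothetical walk-through: a clip is captured, edited and published.
    chain = []
    chain = extend_chain(chain, "field reporter", "original capture", "sha256-of-raw-clip")
    chain = extend_chain(chain, "news desk", "edited for broadcast", "sha256-of-edited-clip")
    chain = extend_chain(chain, "broadcaster", "published to evening news", "sha256-of-published-clip")
    print("chain intact:", verify_chain(chain))  # True unless a link has been tampered with
```

The design choice that matters is the linking: because each record commits to the hash of the one before it, a consumer can replay the chain from capture to publication and detect any link that has been altered or inserted after the fact.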




URLs in this post:

[1] unsee this: https://buythisspace.com.au/

[2] millions of data points: https://www.britannica.com/science/information-theory/Physiology

[3] Research suggests: https://www.researchgate.net/publication/227344004_Mindless_Eating_The_200_Daily_Food_Decisions_We_Overlook

[4] hot favourite: https://www.theatlantic.com/politics/archive/2020/08/conspiracy-theory-rule-them-all/615550/

[5] The Social Dilemma: https://www.youtube.com/watch?v=uaaC57tcci0

[6] research: https://news.mit.edu/2018/study-twitter-false-news-travels-faster-true-stories-0308

[7] viral video of May 2019: https://www.washingtonpost.com/technology/2020/08/03/nancy-pelosi-fake-video-facebook/

[8] Zelig: https://www.youtube.com/watch?v=OhpYWVvUPRU

[9] recent book: https://www.amazon.com.au/Deepfakes-Coming-Infocalypse-Nina-Schick/dp/1538754304

[10] The Irishman: https://www.vulture.com/2020/01/how-the-irishman-used-cgi-and-special-effects-on-actors.html

[11] Sassy Justice: https://www.youtube.com/watch?v=9WfZuNceFDM

[12] Gas Light: https://en.wikipedia.org/wiki/Gaslighting

[13] one app: https://www.theverge.com/2019/6/27/18761496/deepnude-shuts-down-deepfake-nude-ai-app-women

[14] book: https://www.amazon.com.au/Cage-Fight-Lanka-Tamil-Tigers/dp/1934137545

[15] imposter: https://www.thebritishjournal.com/world/romance-scammer-used-deepfakes-to-impersonate-a-navy-admiral-and-bilk-widow-out-of-nearly-300000-thebritishjournal-reports-182705-2020/

[16] recently floated: https://theconversation.com/the-reputation-of-australias-special-forces-is-beyond-repair-its-time-for-them-to-be-disbanded-148795

[17] rules-based order: https://interactives.lowyinstitute.org/features/rules-based-order/

[18] The Russians: https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/csi-studies/studies/vol-64-no-1/active-measures-and-information-wars.html

[19] first debate: https://www.youtube.com/watch?v=CweqW7Pzxz8

[20] eroding our resilience: https://apo.org.au/node/309148

[21] hierarchies of competence: https://www.nytimes.com/2020/11/01/business/media/ben-smith-election.html

[22] should be boosted: https://podcasts.apple.com/au/podcast/ideas/id151485663?i=1000497388239

[23] destroy advertising revenue streams: https://dankennedy.net/2020/05/11/how-google-destroyed-the-value-of-digital-advertising/

[24] the weakest: https://www.gtlaw.com.au/insights/press-freedom-australia-white-paper
