{"id":64026,"date":"2021-04-22T15:16:22","date_gmt":"2021-04-22T05:16:22","guid":{"rendered":"https:\/\/www.aspistrategist.ru\/?p=64026"},"modified":"2021-04-22T15:44:29","modified_gmt":"2021-04-22T05:44:29","slug":"defending-democracies-from-disinformation-and-cyber-enabled-foreign-interference","status":"publish","type":"post","link":"https:\/\/www.aspistrategist.ru\/defending-democracies-from-disinformation-and-cyber-enabled-foreign-interference\/","title":{"rendered":"Defending democracies from disinformation and cyber-enabled foreign interference"},"content":{"rendered":"

The Covid-19 pandemic has caused unique societal stress as governments worldwide and their citizens have struggled to work together to contain the virus and mitigate its economic impact. This has been a trying time for democracies, testing the capacity of democratic governance to mobilise state and citizenry. It has also tested the integrity of open information environments and their ability to deal with the overlapping challenges of disinformation, misinformation, election interference and cyber-enabled foreign interference.

Covid-19 has spurred the world into a new era of disinformation where we can see the daily erosion of credible information. Individuals, organisations and governments are increasingly fighting for the value of facts. But the global information environment was home to bad-faith actors long before the pandemic hit, from states interfering overseas or misleading their own populations with targeted disinformation to conspiracy groups like QAnon and alt-right extremist groups. Some of these groups have leveraged legitimate public concerns related to the pandemic, vaccine rollouts and issues like data privacy to build new conspiracy theories, and Covid-19 has provided them with a bigger platform to do so.

Relationships between governments and social media platforms are increasingly strained. Divisions are deepening over how best to balance free expression against the public harms caused by mis- and disinformation and by speech that incites violence or hatred, and over how to tackle rapidly emerging issues such as the proliferation of manipulated content and the risks posed by increasingly sophisticated deep-fake technologies that could mislead audiences and erode trust in institutions.

Policymakers are also increasingly frustrated at seeing authoritarian states, particularly China and Russia, leverage US social media networks and search engines to project propaganda and disinformation to a global audience. This is particularly perplexing, as the Chinese state, for instance, bans these same platforms at home and both limits access to and censors foreign embassy accounts on Chinese social media platforms.

The online ecosystem allows a range of state and non-state actors that are already manipulating the information environment for strategic gain to shift from exploiting the pandemic to disrupting and undermining trust in Covid-19 vaccine rollouts. This shift will only further tear at the already tense relationship between democratic governments and the major technology platforms. Anti-vaccination, conspiracy-infused misinformation disseminated across social media has mobilised public opposition to vaccine programs in several countries. January's Capitol Hill riot in Washington demonstrated the potential for online mobilisation to transfer offline and manifest as violent civil unrest.

This is not the public square we want or need, especially as the world seeks to return to some version of normality in 2021.

From elections and state actors to 24/7 cyber-enabled interference

Governments have had more success at raising public awareness about online election interference than about other forms of such intrusion. Since Russia's meddling in the 2016 US election, democracies have been concerned that the legitimacy of their mandates could be tarnished through election interference, although this preoccupation has distracted from authoritarian states' broader efforts to shape the information environment.

ASPI's research on cyber-enabled interference targeting electoral events has identified two predominant vectors: cyber operations, such as denial-of-service and phishing attacks to disrupt voting infrastructure and/or to target electronic voting; and online information operations that attempt to exploit the digital presence of election campaigns, voters, politicians and journalists. Together, these two attack vectors have been used in attempts to influence voters and their turnout at elections, manipulate the information environment and diminish public trust in democratic processes.

ASPI's research identified 41 elections and seven referendums between January 2010 and October 2020 where cyber-enabled election interference was reported. There has been a significant uptick in this activity since 2017, with Russia the most prolific state actor engaging in online interference, followed by China (whose cyber-enabled election interference activity has increased significantly since 2019), Iran and North Korea.

Following several high-profile incidents of election interference, there is now a proliferation of multi-stakeholder forums designed to coalesce public and policy attention around malign online activity leading up to and during elections. But an exclusive focus on the interference that surrounds elections is problematic. Information operations and disinformation campaigns, which exploit the openness of the information space in democratic societies, are happening every day, across all social media platforms.

In the Philippines, researchers and investigative journalists have repeatedly shown how the Duterte administration has continued to rely on the influence-for-hire market of bloggers, trolls, inauthentic accounts and online influencers to create and promote pro-government content and help distract citizens from issues such as the government's handling of Covid-19. Social media platforms are showing a growing willingness to publicly attribute such activity. In September, Facebook removed a network of domestically focused fake profiles, consisting of 20 Instagram accounts, 57 Facebook accounts and 31 Facebook pages, which it linked to the Philippines' military and police.

And it's not just states that spread disinformation. News outlets, fringe media and conspiracy sites, some with significant global reach, are also guilty of deliberately misleading their audiences. For example, in December 2019, Facebook took down more than 800 accounts, pages and groups linked to the conservative, Falun Gong–affiliated Epoch Times for misrepresentation and coordinated inauthentic behaviour.

Governments that shift their attention to these issues only in the lead-up to and during an election miss the bigger strategic picture, as malign actors consistently target the fissures of social cohesion in democracies. Some strategic actors have aspirations that are far more global than influencing an individual country's election outcome. While governments have spent the last few years (re)building their capabilities to counter foreign interference, they are struggling to handle the different set of complicated challenges posed by cyber-enabled foreign interference outside of election time, from online attribution and enforcement to protecting citizens from harassment and threats by foreign actors. One issue is that, unlike with traditional foreign interference, the responsibility for action is distributed across the platforms and government agencies. In many countries, unless there's an election to focus on, government leadership has largely fallen into the cracks between intelligence, policing and policy agencies.

The Chinese state's flourishing interference and disinformation efforts

Given authoritarian regimes' limited capacity to absorb social unrest peacefully, including in cyberspace, the pandemic has threatened stability. The emergence of Covid-19 from Wuhan created the risk of domestic political instability for the Chinese Communist Party. The party-state's international standing was endangered by the spread of the virus and the resulting global economic disruption.

So how did the CCP respond to this challenge as Covid-19 spread? It threw itself into a battle of information and narratives, much of which played out online and continues to evolve today. At home, it suppressed and censored information about the virus. Open-source researchers and citizen journalists in China who had been collecting and archiving online material at risk from censorship were detained and had their projects shuttered.

China's censors also sent thousands of confidential directives to media outlets and propaganda workers, curated and amended trending-topics pages, and activated legions of inauthentic online commentators to flood social sites with distracting commentary. One directive from the Cyberspace Administration said the agency should control messages within China and seek to 'actively influence international opinion'.

The effort to influence international opinion, which remains ongoing, relied on a very different toolkit to the one wielded at home. US social media networks were central, providing the ideal platform for China's 'wolf warrior' diplomats, state media outlets, pro-CCP trolls and influencers, and inauthentic accounts to boost and project the CCP's narratives, including disinformation about where the virus originated. They also provided the perfect space for this collective of online actors to try to undermine critical reporting from Western media, research institutes and NGOs, and to smear and harass researchers and journalists whose work provided facts and analysis in stark contrast to the propaganda being disseminated globally by the CCP.

The Chinese state's large-scale pivot to conducting information and disinformation operations on US platforms occurred across 2019 and 2020. As the pandemic spread, the Chinese state found itself ideally positioned to experiment with cross-platform and multi-language information activity that targeted overseas audiences and largely amplified Covid-19-related and political narratives.

But the efforts of the Chinese state lack the sophistication of others that engage in this online behaviour, such as Russia. For example, the Chinese state makes little effort to cultivate and invest in rich, detailed online personas, and it lacks the linguistic and cultural nuance needed to build credible fake influence networks. Despite this, the Chinese state's information efforts are persistent. While it may have focused on quantity over quality thus far, given the enormous resourcing the CCP can bring to developing this capability, quick improvement can be expected. The Chinese state also brings an alignment in tactics and coordination that no other state can match, spanning diplomats, state media outlets, co-opted foreign fringe media organisations, and pro-CCP trolls and influencers.

Stronger defence and models for collaboration

Covid-19 and the CCP's efforts to control and shape international information flows about the pandemic through online propaganda and disinformation have made clear just how easy it is for malign actors to manipulate open information environments.

Harder choices will have to be made about how to better protect our information ecosystems and how to deter and impose costs on the many malign actors seeking to exploit them. This will require governments to work more closely with the platforms and civil society groups to 'defend forward' and counter and disrupt malicious information activity. There is also a lucrative market of influence-for-hire service providers, to which state actors can outsource propaganda distribution and influence campaigns to obfuscate their activities. These commercial actors are increasingly part of the fabric of political campaigning in many countries. However, the lack of transparency around these activities risks corrupting the quality of democracy in the environments in which they operate.

Globalisation and the openness of democracies make these acute challenges: that very openness has left democratic states vulnerable to interference and subversion. Much of the thinking around cyber-enabled foreign interference has been framed by Russian meddling in the 2016 US election, yet other strategic actors are able to deploy disinformation campaigns and information operations in powerful combination with other levers of state power. China, for instance, has interwoven disinformation with its diplomatic and economic coercion of Australia in retaliation for the Australian government's call for an independent international inquiry into the origins of the Covid-19 pandemic.

Given the cross-cutting nature of this challenge, diplomacy and policy are fundamental to pulling together like-minded countries to engage with and contest cyber-enabled foreign interference and the actors, state and non-state, that spread disinformation for strategic gain. Social media platforms have been a front line on this battlefield, and it is often the platforms that must detect and enforce against state-linked information operations and disinformation campaigns that exploit their users. Yet the platforms are driven by different motivations from those of national governments.

Multilateral and multi-stakeholder approaches must be encouraged to facilitate the defence of democracy as a system of governance and values. This is particularly important in the arc of states from Japan down through Southeast Asia to India, many of which have fast-growing economies but fragile democracies, and where the Chinese state's power projection has the potential to influence a long-term drift away from democratic governance.

There are models for collaboration between states in pushing back against interference. The European Centre of Excellence for Countering Hybrid Threats draws together expertise from across the EU and NATO to facilitate strategic dialogue on responding to hybrid threats, developing best practice, building capacity through training and professional development, and conducting joint exercises. NATO Stratcom is another centre of excellence that combines both strategic and tactical expertise from across the alliance in collective defence against disinformation and information operations.

These models could be replicated through the Quad grouping of Australia, India, Japan and the US. The alignment of interests among these countries could provide an important vehicle for building structures like those that have been trialled elsewhere and offer resilience against cyber-enabled foreign interference. This should include multi-stakeholder 1.5-track engagement that brings together governments, civil society and industry; guards against the splintering of economic and national security interests; and drives greater investment in civil society capacity building around detection, strategic communications and digital diplomacy. Social media networks and search engines must do a better job of deterring and punishing actors that actively spread disinformation on their platforms, and should audit what they categorise and promote as 'news'.

Finally, there is strength in democratic collectives. Governments themselves can take steps to mitigate the risks of cyber-enabled foreign interference, but democracies can increase their power by banding together to attribute, raise costs and deter interference by other states. States targeted individually may be reluctant to escalate in response to grey-zone aggression. However, where there's a collective response, adversaries are likely to recalibrate their behaviour in the face of joint actions like diplomatic measures and economic sanctions.

This is an abridged version of Danielle Cave and Jake Wallis's essay for the Observer Research Foundation's 2021 Raisina Dialogue. For the full paper, including detailed policy recommendations, click here.
