The 2024 independent review of Australia’s national intelligence community has kicked off. It will focus on the 10 agencies that make up the NIC and comes at a time of increasing complexity and uncertainty in Australia’s strategic environment.
Among the review’s terms of reference is a direction to consider the NIC’s ‘preparedness in the event of regional crisis and conflict’ and whether the NIC is positioned effectively to respond to the evolving security environment.
In our view, the review should devote special attention to one particularly complex problem: global catastrophic risk.
Global catastrophic risks, and their crueller cousins, existential risks, are threats that could cause harm on a horrific scale: nuclear winter, engineered pandemics, extreme climate change, space weather. Millions, even billions, could be killed. Recently, artificial intelligence experts have called out the extinction risk from AI.
In its Global Trends 2040 report, published in 2021, the US intelligence community warned the country’s new president, Joe Biden, and his administration of these concerns, made increasingly pressing by technological acceleration:
Technological advances may increase the number of existential threats; threats that could damage life on a global scale challenge our ability to imagine and comprehend their potential scope and scale, and they require the development of resilient strategies to survive. Technology plays a role in both generating these existential risks and in mitigating them. Anthropogenic risks include runaway AI, engineered pandemics, nanotechnology weapons, or nuclear war.
It’s time for Australia’s intelligence community to be equally proactive.
The pathways to global catastrophe might seem unlikely. But that is precisely the role of intelligence: to identify, analyse and warn about threats to global and national security, however improbable they may seem. Should the NIC assess global catastrophic risk, it will see that the risk is uncomfortably high.
Take nuclear. A full-scale nuclear war between, say, the US and Russia could lead to the deaths of about five billion people within two years. An aggregate of expert estimates puts the annual probability of a nuclear war at around 1%, and a recent quantitative risk assessment produced the same figure. That’s roughly a coin toss’s chance of at least one nuclear war out to 2100.
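The coin-toss framing follows from simple compounding. As a rough sketch, assuming a constant, independent 1% probability in each of the roughly 76 years from 2024 to 2100:

$$P(\text{at least one nuclear war by 2100}) = 1 - (1 - 0.01)^{76} \approx 0.53$$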
Or take extreme climate change. Surprisingly little work has been done to assess catastrophic climate-change scenarios. The three most relevant studies put the chance of a catastrophic climate-change event at between 5% and 20%. The environmental, economic and security consequences in a world where warming is significantly higher than the 1.5°C target remain extremely uncertain.
Or take engineered pathogens. Advances in biological engineering could increasingly empower less skilled actors to synthesise novel diseases that are both highly lethal and highly infectious. In 2022, a world-leading bioengineering expert assessed that ‘within a decade, tens of thousands of skilled individuals will be able to access the information required for them to single-handedly cause new pandemics’. Biorisk experts forecast an 8% chance that a genetically engineered pathogen could kill more than a hundred million people by 2050. AI might accelerate those timelines.
This scale of risk is falling through the gaps. Australia, unlike many of its peers, doesn’t have a robust national risk assessment, and it’s unclear whether any NIC agency is giving these risks the attention they deserve. The clock is ticking on all of these threats. In many cases, policy interventions or well-researched response plans could be relatively straightforward and highly effective. The necessary first step is assessment.
The intelligence review is the perfect opportunity to put global catastrophic risk on the radar and to correct course.
The 2018 legislation that governs the Office of National Intelligence is silent on whether these critical considerations are within its scope. One option is amending the act to include global catastrophic risk explicitly as a focus area. A simple addition to section 7, which sets out the duties of ONI, along the lines of ‘and global catastrophic risk that would threaten Australia’s survival, security and prosperity’ would ensure that the NIC doesn’t short-change these risks by focusing only on short-term requirements. Equally, a finding or recommendation that these risks are in scope for the NIC could catalyse action.
Going further, an extreme global threats mission within ONI could work across the NIC to collect data on, analyse and monitor these risks. A recurring ONI-led national assessment of global catastrophic risk would track the trends and strategic dynamics that could lead to global catastrophe. ONI could also lead a Five Eyes working group and collaborate with the US Office of the Director of National Intelligence, which is beginning to prepare its next Global Trends report.
Regardless of the mechanism, NIC agencies must focus on pathways to, scenarios for and contributors to global catastrophic risk. Immediate priorities should be assessments of the impact of AI on nuclear stability, of AI-enabled cyber weapons and advanced autonomous weapons, of the effects of geomagnetic storms on critical infrastructure, and of engineered pathogens. The Australian Security Intelligence Organisation should track and assess the catastrophic risk emanating from domestic non-state actors and the potential for increasingly available AI and biotechnology tools to boost terrorist capability.
Intelligence is critical for helping policymakers navigate a future at increasing risk of global catastrophe. The NIC must be Australia’s eyes and ears.