As humans develop increasingly smarter machines, we may be building in a dangerous bias, to society’s detriment. The technology sector, in areas such as cybersecurity development and STEM (science, technology, engineering and maths) in general, is overwhelmingly male. That gender imbalance inevitably limits diversity of thinking and means that cyber technology, in both its design and its analytical output, is likely to mirror the biases that prevail in the world in which we live. And this brings consequences that will become more profound as time passes, particularly in the national security realm.
In March, Australian Signals Directorate chief Mike Burgess gave an address at ASPI outlining some of the work his agency has done to address threats to Australia’s security. He noted the importance of the work done at ASD by women. Though women make up just over a third of the agency’s workforce, they fill 54% of management positions.
Yet the number of women involved in the cyber field overall remains incredibly low. It’s estimated that the global shortage of cybersecurity workers is likely to reach 1.8 million by 2022, and governments and the private sector are missing out on a significant source of potential talent and diversity in thinking if they don’t engage substantively in efforts to recruit women.
But hiring more women is not enough. There’s another dimension that doesn’t get mentioned nearly enough in the context of cybersecurity—and that is the role of gender perspectives. It’s not enough to just have more women participating; there’s also a need to consider the roles of men and women (and non-binary-identifying individuals) operating in the cyber domain.
There’s considerable evidence that technology has a significant role in perpetuating inequality across society. Women are far more likely to be victims of cyber bullying and shaming through online platforms than men. A few weeks ago, ‘a female diplomat faced a barrage of anti-abortion text messages from an advocacy group, disrupting a major UN summit on women’s rights’. Incidents like this highlight the low threshold for gender-focused harassment. What’s most problematic is that decision-makers can be harassed online by anyone with a computer, often anonymously.
The fact that women, according to a 2015 UN report, are 27 times more likely than men to be harassed online is simply a reflection of a larger societal problem, now enabled by technology. Metadata and the information gathered for geolocation services can enable abusers to track partners or harass individuals through multiple channels, meaning that technology acts as a less visible enabler of domestic abuse. Practical resources, tools and government responses need to take account of these threats to women’s security.
What’s more, technology can be misused by the state as a tool to suppress women’s rights. Saudi Arabia’s mobile app Absher is an example of this. The app was developed by the Saudi government as a portal for citizens to access government services. However, alongside the ability to lodge licence applications and view documents, the app also allows male guardians to approve or withdraw permissions for women to travel internationally. While some have argued that this may have positive effects because lenient guardians can grant women more autonomy, it also demonstrates how technology can act as a tool to perpetuate violations of women’s rights. It also raises significant questions about the ethics of platforms like Google and Apple that facilitate apps that can be used to oppress women.
In addition to misuse and abuse, there’s also potential for technology to perpetuate existing biases—particularly in the domain of artificial intelligence. Male and female patterns of behaviour are often different, so more diverse thinking is required to prevent societal inequalities from being built into a machine’s ‘abilities’ from the start.
Already there are concerns that AI and the devices that form part of the internet of things default to women’s voices in service roles (like Apple’s Siri and Amazon’s Alexa). The machine-learning processes that many AI tools rely on may result in more discriminatory practices if data and processes don’t address some inherent biases that already exist in society, based on perceptions of what constitute traditionally feminine and masculine roles. Having more diverse participation in the cyber workforce is one way to reduce these risks, but it also requires more coherent policies that proactively address inequality in the platforms, from their earliest stages of development.
Much of what people do is based on assumptions, many of them unconscious. Female networks often function and communicate differently from male networks. Yet assuming that offline gender norms and perceptions carry over to online platforms can present a wider security risk when it comes to issues such as violent extremism and terrorism, because people’s offline behaviour may differ from their behaviour in online communities. The different ways that men and women use online platforms need to be factored more comprehensively into analysis on issues related to terrorism, for instance.
The availability of information on various social media platforms also means there’s a gendered element in the way information and individuals are manipulated. ‘Honeypots’ are not a new phenomenon in the world of intelligence. However, the use of such approaches through tactics such as ‘catfishing’ (pretending to be someone you’re not on social media) presents a complex security risk for governments, particularly when social media platforms, devices and apps leave crumbs of information about high-profile individuals for enemies to exploit. During a recent military exercise, NATO’s Strategic Communications Centre of Excellence targeted troops through fake social media pages and was able to obtain intelligence on military operations, demonstrating how easy it can be for a hostile force to get information (for the price of US$60).
Of course, the challenges presented by cyberspace and technology aren’t new. As technology historian Melvin Kranzberg noted more than three decades ago, ‘Technology is neither good nor bad; nor is it neutral.’ It is a mirror that reflects the society in which it is used. That means that, like the rest of our discussions about defence and national security, the conversation about women in cyber also needs to include consideration of their diverse perspectives in the development and use of technology. Otherwise, we are missing half the picture.
This article is part of a series on women, peace and security that The Strategist is publishing in recognition of International Women’s Day 2019.