“I think there are elements of the election administration function that should absolutely be considered critical infrastructure, and that is the administration element,” he said. “That’s the systems, the machines, the counting process, the protocols around it – I think it’s, at least in the US, a step too far to call the political parties themselves part of the infrastructure, but they certainly do have a contribution and a piece of involvement.”
The Parliamentary Joint Committee on Intelligence and Security (PJCIS) is currently examining Australia’s Security Legislation Amendment (Critical Infrastructure) Bill 2020, which, among other things, seeks to bring more sectors under the definition of “critical infrastructure”: communications; financial services and markets; data storage and processing, including cloud providers; defence industry; higher education and research; energy; food and grocery; healthcare and medical; space technology; transport; and water and sewerage.
Krebs said that, following Russian interference in the 2016 US election, the focus for the 2020 election was on thwarting technical attacks and disruptions to election systems, such as ransomware attacks against voter registration databases, and hacks of media outlets, both websites and television, that could change the results shown on a live tally.
“But as we got closer to the election, what we actually realised were the most likely [threats] were perception hacks, or disinformation campaigns, claiming to have been able to access the system, claiming to have been able to change an outcome, or that somebody else was doing it,” Krebs explained.
“And ultimately, that’s what we saw with some of the claims of Hugo Chávez from the grave hacking into the vote counting systems, and those persist to this very day, with the so-called audit in Arizona, and elsewhere.
“Those are the more pervasive, much harder to debunk, because there’s an asymmetry of the adversary. Even if it’s domestic, it’s still an adversary, in this case, [a] domestic actor that is trying to undermine confidence in the process for their own outcomes.”
He said what is at stake is defending democracy.
See also: Researchers found three flaws in ACT e-voting system that could affect election outcomes
Adding to Krebs’ remarks, the director of the Australian Strategic Policy Institute’s international cyber policy centre, Fergus Hanson, said he considers political parties themselves a key vulnerability, given the scale to which their operations need to grow come election time.
“Trying to provide a solid cybersecurity basis for that is very difficult for a very small organisation that’s undergoing massive and rapid scaling. I think providing government support for all political parties to be more resilient to interference, I think, would be really important,” he told the PJCIS.
“And we’ve seen in lots of countries where political parties have been breached [or] there’s been hacks or leaks – operations that have potentially swayed people’s views on parties during the heat of a campaign.”
Turning to misinformation and disinformation, Krebs said it is understood that cybersecurity and the critical infrastructure community have been underinvested in, but there has been “virtually no investment on countering disinformation”.
“Nowhere is that more important right now than in the deployment of COVID-19 vaccinations, where we are seeing an active threat environment from Russia and China for vaccine diplomacy, and we’re also seeing it from conspiracy theorists and just anti-vaxxers in general – there’s a much longer tail on the disinformation,” he said. “But I will say that I’ve been impressed with the Australian government’s efforts over the last several years to take disinformation and threats to democracy in particular very, very seriously. In fact, they’re well ahead of where I would say the United States is.”
RELATED COVERAGE
Countering foreign interference and social media misinformation in Australia
DFAT, the Attorney-General’s Department, and the AEC have all highlighted what measures are in place to curb trolls from spreading misinformation across social media.
ASPI wants statutory authority to prevent foreign interference through social media
It said the authority would be granted explicit insight into how content is filtered, blocked, amplified, or suppressed, from both a moderation and an algorithmic amplification point of view.
Facebook, Google, Microsoft, TikTok, and Twitter adopt Aussie misinformation code
The code will not apply to government content, political advertising, satirical work, or other journalistic pieces that are governed by an existing Australian law.