At the next federal election, Australia will be using outdated, 20th-century rules to try to manage a very 21st-century digital threat.
We know social media can be used to reach millions of Australians to sow confusion, division, and fear, yet we are largely blind to its true extent.
We've already seen attempts to undermine Australia's COVID-19 efforts. Somewhere out there Australia's most popular piece of pandemic disinformation is being circulated, viewed by thousands. We don't know what it is, who is pushing it, or who is being targeted.
Neither does anyone in government.
A single meme or video won't create a conspiracy believer, radicalise a voter, or destroy faith in public institutions. But the cumulative drip of conspiratorial, divisive and extreme information aimed at susceptible audiences is eroding our democracy.
A recent report by the Senate Select Committee on Foreign Interference through Social Media warned of the threat Australia's antiquated digital regulation poses to our democracy, and called for greater transparency on the part of social media companies.
It correctly demanded the release of the Australian Communications and Media Authority's report into the functioning of the Australian Code of Practice on Disinformation and Misinformation.
The code, developed and launched by industry lobby group DIGI in 2021, is voluntary and opt-in, with no enforcement or penalties. Facebook, for example, opted into every commitment, yet its 50-page transparency report reveals little about features that have substantive impacts on our digital public square. We are merely asked to trust it.
Facebook is good at appearing transparent. It told the Senate committee it had taken down 18 million pieces of harmful misinformation since the pandemic began, including, it claims, 110,000 pieces of COVID-19 misinformation in Australia in 2020.
This sounds impressive, but it has no meaningful context. How many people were served that content by Facebook's algorithms before it was removed? How many engaged with it, shared it? How many received and engaged with the corrected information? How does Facebook determine what harmful misinformation even is?
This lack of transparency makes it near impossible to design appropriate regulatory responses. Ultimately we are left beholden to offshore digital behemoths to decide how they'll manage the threat.
The Department of Home Affairs' social media insights team, working to counter violent extremism, has spotted online disinformation campaigns. But its main recourse has simply been to alert the platforms and hope for co-operation.
In no other industry do we accept such self-regulation. If I open a pub, I can't tell the health inspector: "You're not coming in, just trust me, my kitchen is clean." Why do we allow this for online spaces where millions of Australians gather?
Social media companies claim secrecy is pivotal to their continuation, but that's the same self-interested nonsense every growing industry says about looming regulation. We need not destroy social media to regulate it, but we do need to lift the hood so we can better understand the problem.
Reset Australia, part of a global initiative working to counter digital threats to democracy, has been calling for a "live list" of the most shared posts during contentious times such as pandemics or elections.
A live list would show us what content is gaining traction online, help us understand why, and help us counter harmful misinformation.
We know social media algorithms are designed to maximise engagement. The more extreme, conspiratorial, and emotive the content, the more users engage with it. The algorithm then responds by serving up even more.
If we better understood these algorithms, we could demand limits on what they amplify. Australians could then decide what dominates their information ecosystem, not trillion-dollar international companies.
In the past, social media engagement was seen as an issue of individual choice: if you don't like what's online, don't view it.
This no longer applies in 2022, when we are all affected by the lack of online accountability - whether we have a social media account or not.
- Chris Cooper is executive director of Reset Australia.