YouTube’s decision to ban anti-vax content will do little to tackle the systemic problem of misinformation, says Reset Australia as it calls for public oversight and algorithmic audits to uncover how Big Tech profits from amplifying false and misleading content.
“Content moderation is a giant game of whack-a-mole – ultimately it’s futile because there will always be new content popping up where you’re not looking,” said Chris Cooper, Executive Director of Reset Australia, the Australian arm of the global initiative working to counter digital threats to society.
“If YouTube is serious about tackling misinformation it needs to be transparent about how its algorithms are amplifying this content to viewers.”
The group claims that YouTube has consistently been opaque about the extent to which its recommendation algorithms lead people to misinformation, and about the true scope of anti-vaccine conspiracies on the platform.
“Big Tech’s timid attempts at self-regulation, like labelling posts as fake or de-platforming individual spreaders, have as much impact as an oil company planting a thousand trees to counter climate change. The problem with all these downstream interventions is that they don’t tackle the core systemic issue of unchecked algorithms,” Cooper said.
According to Reset Australia, recommendation algorithms are used across technology platforms to keep users online for longer so they can be served more advertisements. The algorithms prioritise the most engaging content, but research increasingly shows that this is often also the most emotive, conspiratorial, and enraging content.
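The dynamic Reset Australia describes can be illustrated in a few lines. The sketch below is purely hypothetical: the field names, signals (watch time, click-through rate), and weighting are illustrative assumptions, since real platform recommenders are proprietary and vastly more complex. It simply shows how an accuracy signal can sit unused while engagement alone decides what gets surfaced.

```python
# Illustrative sketch of engagement-driven ranking. All fields and weights
# are hypothetical assumptions, not YouTube's actual system.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    expected_watch_minutes: float  # predicted time-on-site if recommended
    click_through_rate: float     # predicted probability of a click
    accuracy_score: float         # fact-check signal; never consulted below

def engagement_score(v: Video) -> float:
    # The objective optimises attention (watch time x clicks).
    # Note that v.accuracy_score never enters the calculation.
    return v.expected_watch_minutes * v.click_through_rate

def recommend(videos: list[Video], k: int = 3) -> list[Video]:
    # Emotive or conspiratorial content that drives long watch sessions
    # rises to the top purely on predicted engagement.
    return sorted(videos, key=engagement_score, reverse=True)[:k]
```

Under this kind of objective, a low-accuracy video with high predicted watch time outranks an accurate one every time, which is the "unchecked algorithm" problem the group says content moderation alone cannot fix.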
“YouTube’s algorithms prioritise content for its engagement value, rather than its accuracy,” Cooper stated.
“So while social media didn’t invent conspiracy theories, its unchecked algorithms have supercharged them into global movements.
“Even in a country like Australia, with a 95% childhood immunisation rate, Covid-19 vaccine misinformation has led to a high degree of hesitancy.”
He said much of this anti-vax content can still be found on YouTube, despite the platform’s policy stating that it does not allow Covid-19 content that contradicts health authorities.
“Many viewers have likely had conspiracies recommended to them by an algorithm, rather than seeking them out themselves,” he asserted.