In a welcome injection of sanity into the debate about whether social media companies should remove content that challenges the scientific establishment’s orthodoxy about climate change, lockdowns and the vaccines, a report from the Royal Society has concluded that the risks of censorship outweigh the benefits. The Financial Times has more.
Calls for social media sites to remove misleading content – for example about vaccines, climate change and 5G technology – should be rejected, according to the U.K.’s senior scientific academy.
After investigating the sources and impact of online misinformation, the Royal Society concluded that removing false claims and offending accounts would do little to limit their harmful effects. Instead, bans could drive misinformation “to harder-to-address corners of the internet and exacerbate feelings of distrust in authorities,” its report says.
In the U.K. there have been calls from across the political spectrum for Twitter, Facebook and other platforms to remove anti-vax posts. However, “clamping down on claims outside the consensus may seem desirable but it can hamper the scientific process and force genuinely malicious content underground”, said Frank Kelly, a mathematics professor at the University of Cambridge, who chaired the Royal Society inquiry.
He added that removing content and driving users away from mainstream platforms makes it harder for scientists to engage with people such as anti-vaxxers. “A more nuanced, sustainable and focused approach is needed,” he said.
While illegal content that incites violence, racism or child sex abuse must be removed, legal material that runs counter to the scientific consensus should not be banned, the report said. Instead there should be wide-ranging action to “build collective resilience” so that people can detect harmful misinformation and react against it.
“We need new strategies to ensure high quality information can compete in the online attention economy,” said Gina Neff, Professor of Technology and Society at the University of Oxford, and a co-author of the report. “This means investing in lifelong information literacy programmes, provenance-enhancing technologies and mechanisms for data sharing between platforms and researchers.”
The well-informed majority can act as a “collective intelligence” guarding against misinformation and calling out inaccuracies when they come across them, said Sir Nigel Shadbolt, Executive Chair of the U.K.’s Open Data Institute and another co-author. “Many eyes can provide powerful scrutiny of content, as we see in Wikipedia,” he added.
Some fears about the amplification of misinformation on the internet – such as the existence of “echo chambers” and “filter bubbles”, which lead people only to encounter information that reinforces their own beliefs – have been exaggerated, the report found.
Stop Press: Dr. Vinay Prasad, an Associate Professor of Epidemiology and Biostatistics at the University of California, San Francisco, has made a similar case in UnHerd, arguing that Joe Rogan should not be censored by Spotify for inviting Dr. Peter McCullough and Dr. Robert Malone on to his show. Dr. Prasad sets out what he thinks they got right and what he thinks they got wrong, but concludes that any attempt to suppress dissenting voices would be contrary to the principles of open scientific inquiry and would undermine public trust in science. Excellent piece. Worth reading in full.