Labour and SNP politicians on the Committee currently scrutinising the Online Safety Bill laid an amendment to include “health-related misinformation and disinformation” as a recognised form of lawful but “harmful” speech, ratcheting up the censorship yet further. Mark Johnson has written about the illiberal move for UnHerd.
Civil libertarians often talk about a phenomenon known as the ‘ratcheting effect’. This is the idea that when it comes to the erosion of our liberties, the trajectory tends to head in one direction: in favour of state power at the expense of our rights and freedoms.
It is the reason why we draw red lines that should not be crossed. If you breach the principle of non-interference in people’s rights with a relatively minor incursion, what is to stop that minor incursion from escalating to something more significant in the future?
Yet with the Online Safety Bill, a censor’s charter which has been so long in the making that the ratcheting effect is happening in real time. Last week, SNP and Labour politicians on the Committee currently scrutinising the Bill laid an amendment to include “health-related misinformation and disinformation” as a recognised form of lawful but “harmful” speech. This threatens to open a Pandora’s box of censorship.
The terms ‘misinformation’ and ‘disinformation’ have become part of the political lexicon in recent years. Plainer descriptions such as ‘incorrect’ or ‘misleading’ have been left behind in favour of these alternatives, which carry loaded connotations. They are also malleable terms, often deployed to discredit or silence another individual’s argument in the course of public debate.
Stoked by these fears, we have seen Big Tech increasingly taking on the role of online speech police in recent years. During the coronavirus era, this reached new extremes. At the beginning of the pandemic, Facebook took the step of removing content which promoted face masks as a tool to combat the spread of COVID-19.
Yet within a short space of time, the medical consensus on masks changed. Rather than acknowledge that it had been wrong, Facebook flipped its position and censored in the other direction. A high-profile example saw Facebook label, discredit and suppress an article in the Spectator, written by the Oxford academic Carl Heneghan, disputing the efficacy of masks. What grounds or competency Silicon Valley’s fact-checkers had to overrule reasoned arguments by a Professor of Evidence-Based Medicine remains unclear.
This approach is a direct threat to the epistemic process, so central to the free and open development of knowledge and ideas in liberal democracies. The fact that not even academics can escape this kind of truth arbitration speaks volumes.
Worth reading in full.