The European Union’s Digital Services Act (DSA) comes into force today, obliging “very large online platforms” to swiftly take down whatever unelected European Commission bureaucrats define as ‘disinformation’.
As Laurie Wastell points out in the European Conservative, from today the EC has at its disposal an aggressive enforcement regime: if Big Tech companies fail to abide by the EU’s ‘Strengthened Code of Practice on Disinformation’, which requires swift censorship of mis- and disinformation, they can be fined up to 6% of their annual global revenue, investigated by the Commission, and potentially even prevented from operating in the EU altogether.
So, who is to say if something is disinformation? In the case of social media platforms operating within the EU, the EC is the arbiter: the DSA invests the EU’s executive body with the exclusive power to assess compliance with the Code, to decide whether platforms like X and Facebook are doing enough to combat disinformation, and to apply penalties if a platform is found wanting.
And what kind of speech is the DSA expected to police? The Code defines disinformation as “false or misleading content that is spread with an intention to deceive or secure economic or political gain and which may cause public harm”. That sounds innocent and apolitical enough. Yet the European Digital Media Observatory (EDMO), which was launched by the EC in June 2020 and aims to “identify disinformation, uproot its sources or dilute its impact”, appears to adopt a much broader, deeply politicised understanding of the term “misleading content”.
Consider, for instance, some of the key “disinformation trends” listed in the EDMO’s recent 2023 briefing on disinformation in Ireland. They include “nativist narratives” that “oppose migration”, “gender and sexuality narratives” that touch on drag queens and trans issues as “part of a wider ‘anti-woke’ narrative that mocks social justice campaigns”, and “environment narratives” that criticise climate-change policies and Greta Thunberg.
Clearly, what is common to such narratives is not that they constitute disinformation in the sense outlined in the Code, that is, false or misleading content “spread with an intention to deceive”. Rather, they represent opposition by members of the public to unpopular policies favoured by European elites, in this case, mass migration, transgender ideology and Net Zero.
In the words of EC President Ursula von der Leyen, it is vital that companies censor disinformation of this kind to “ensure that the online environment remains a safe space”. Safe for whom, one wonders — politicians or citizens?
Well worth reading in full.
Dr. Frederick Attenborough is the Communications Officer of the Free Speech Union.