Matthew Lesh, Head of Public Policy at the Institute of Economic Affairs, has written a great piece for Spiked pointing out that the social science research on the role of social media in driving political polarisation and extremism is decidedly mixed, and certainly insufficient to demonstrate a need for further restrictions on free speech. In other words, the case for the Online Safety Bill and other attempts to censor social media is unproven.
For example, some research suggests that social media ‘reinforces the expressers’ partisan thought process and hardens their pre-existing political preferences’. But other studies have found that partisanship has grown more among groups who are less likely to use social media, that it has grown more in the US despite social media expanding across the planet, and that Facebook’s news feature may even reduce polarisation by exposing people to more viewpoints.
On the question of filter bubbles and echo chambers for news, the evidence is just as mixed. One study found that Facebook’s algorithm fails to supply people with news that challenges their attitudes. But other studies suggest that most users (and conservatives especially) subscribe to a variety of media outlets, that social media actually drives exposure to a more diverse array of news sources, and that it might even help decrease support for right-wing populist candidates and parties.
Then there’s the issue of foreign disinformation allegedly warping elections. While there is evidence that material from Russia’s Internet Research Agency has reached tens of millions of people in the West in recent years, it is less clear that this has had a strong impact. Russian trolls have largely interacted with individuals who are already highly polarised. Furthermore, studies indicate that just 0.1% of people share 80% of the ‘fake news’ that is in circulation.
To the extent that serious issues with social media have been identified, many studies indicate that they are not widespread. For example, one study found that just 5% of users are in a news echo chamber. On the question of YouTube rabbit holes, another study found that extremist videos are largely watched by people who already hold extremist views and that other people are not driven to that content by recommendations.
This all points to the possibility that partisanship, extreme views and political dysfunction are driven by deeper social and cultural factors. To the extent that you come across more extreme views on social media, it is because individuals who are more partisan are more likely to use those platforms. But it is easier to blame social media for, say, the election of Donald Trump than it is to address the disenchantment that drove his victory in 2016.
Worth reading in full.