There’s never been a better time for social media than during lockdowns, when Covid ushered us into our homes and left us to find our own entertainment for what would turn out to be months. TV shows dried up, so where did people head? Social media. It had been well established in people’s lives before Covid, certainly, but it was only when a pandemic spread across the globe that the reality of social media, and its consequences, set in.


Anyone can be hooked by false feeds

Conspiracy theories. Everyone can name one. There are endless lists of Hollywood figures who are supposedly reptiles; claims that we faked the moon landing, despite the technology to do so not existing at the time; the belief that the Earth is flat; and the idea that vaccines cause autism. Conspiracy theories in general were floating around long before Covid. But anti-vax movements in particular have seen a surge of support lately, with scepticism surrounding vaccine rollouts and raised eyebrows as to the agenda of the medical community (or Bill Gates, apparently).

The reaction to hearing these theories is probably laughter. Maybe some derision, a comment along the lines of ‘how could you be so stupid?’ Indeed, it is nice to sit atop a superiority complex and say we would never be fooled by such things. Says who? Any implication that most of us are somehow immune to manipulation and disinformation is utterly false. We’ve just been manipulated in other ways.

Social media creates rabbit holes that deepen the longer you dig. It’s like smoking: it’s easiest never to smoke, and easier to give up after only one cigarette, but quitting becomes a real challenge once months and years have turned it into a habit. Twitter timelines and Facebook feeds are curated for us on the basis of prior activity. The more you interact, the more an echo chamber forms around your social media. We are bombarded with content from people who think the same way we do. Who can blame us for assuming that this is the way everyone thinks?

On the basis of information curated for us, we think flat earthers and anti-vaxxers are growing movements of, simply put, idiots. Yet the people in these groups did not simply wake up one morning and choose to deny established science and reality. These conspiracy groups come to their conclusions on the basis of information curated for them.

Misleading headlines are not the problem

It certainly doesn’t help when established media offers headlines that give further fuel to these groups. The Daily Express, for instance, ran a story about a vaccine investigation after a person died ‘within hours of a jab’. Of course, the article goes on to cite a doctor who says the death was likely unrelated, as allergic reactions typically occur within 15-30 minutes of vaccination. But the headline alone makes a dangerous implication. Shortly afterwards, the Express ran another story about a Covid sufferer dying soon after a second vaccine dose; in fact, the person died because Covid can still be contracted after the first dose. The Daily Express acknowledges this in the article, but evidently didn’t think it important enough for the headline.

However, the blame cannot be dumped on the media. Sensationalist headlines exist because there is an audience waiting for them, and they are just one of the numerous sources fed into an individual’s curated feedback loop. The Daily Express might be to blame for sensationalism, but it cannot be blamed for the delivered-to-your-door nature of those headlines.

Casual disinformers

So who is to blame, then? Blame tech algorithms and their machine learning. These algorithms exist to keep users engaged, and they are to blame for the feedback loop of disinformation. When the algorithms detect that a user is likely to show interest in conspiracy themes, they simply serve up more of the same. Facebook gets criticism for housing concentrated pockets of extremism and fake news, but YouTube’s recommendation system is cut from a similar cloth. Algorithms cannot detect truth; all they know is that a user has clicked on a type of video, so that user needs to be shown more videos of that type. This is fine when the genre is essays about films, or skits, or video game commentary. A consumer of harmless entertainment is fed more harmless entertainment.

However, this changes when a user is introduced to extremist content, often of a political flavour. The algorithms recommend more of it, creating what is known as a ‘pipeline’. The same goes for conspiracy theorists: they’re drowned in content that affirms their narratives, so why would they have any reason to believe otherwise? Tech algorithms inherently lack morality, or any incentive to steer people away from disinformation. Algorithms that see users spiralling down rabbit holes are not malfunctioning or crossing a line; they’re simply working as expected.
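To make the mechanism concrete, here is a minimal sketch in Python of the kind of engagement-driven feedback loop described above. It is a toy illustration, not the actual code of any platform: the catalogue, topic names and scoring rule are all invented for the example. Notice that the ‘recommender’ has no concept of truth; all it knows is what got clicked, and every click makes more of the same appear.

```python
import random

# A toy catalogue: each video is tagged with a single topic.
# These topics and titles are invented purely for illustration.
CATALOGUE = {
    "film_essays": ["essay_1", "essay_2", "essay_3"],
    "skits": ["skit_1", "skit_2", "skit_3"],
    "conspiracy": ["theory_1", "theory_2", "theory_3"],
}

def recommend(interest_scores, n=3):
    """Pick n videos, weighting topics by the user's past clicks."""
    topics = list(interest_scores)
    weights = [interest_scores[t] for t in topics]
    picks = random.choices(topics, weights=weights, k=n)
    return [(topic, random.choice(CATALOGUE[topic])) for topic in picks]

def simulate(clicks_on="conspiracy", rounds=10):
    # Every topic starts with equal weight; there is no notion of truth anywhere.
    scores = {topic: 1.0 for topic in CATALOGUE}
    for _ in range(rounds):
        feed = recommend(scores)
        for topic, _video in feed:
            # The simulated user only ever clicks one kind of video...
            if topic == clicks_on:
                # ...and each click raises that topic's weight in future feeds.
                scores[topic] += 1.0
    total = sum(scores.values())
    return {t: round(s / total, 2) for t, s in scores.items()}

# After a handful of rounds, the clicked topic dominates the feed,
# e.g. something like {'film_essays': 0.06, 'skits': 0.06, 'conspiracy': 0.88}.
print(simulate())
```

Run it and the feed converges on whichever topic the simulated user clicks, whether that happens to be film essays or conspiracy theories. Nothing in the loop can tell the difference; that indifference is the point.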

Perhaps it’s time we stopped sneering at the ‘idiots’ across the aisle who’ve fallen victim to disinformation rabbit holes and instead started asking who, or what, is pushing them down those holes in the first place.

DISCLAIMER: The articles on our website are not endorsed by, or the opinions of Shout Out UK (SOUK), but exclusively the views of the author.