Instagram has a censorship problem. On July 21, it rolled out its new ‘Sensitive Content Control’, which lets users decide how much ‘sensitive’ content they’d like to see. Instagram defines sensitive content as ‘posts that don’t necessarily break our rules, but could potentially be upsetting to some people’. The control offers three settings: Allow, Limit and Limit Even More. All accounts now default to ‘Limit’, which means that posts featuring ‘sensitive content’ will no longer appear on your Explore page. To see these posts, you will need to change your setting to ‘Allow’. Ostensibly, this feature gives people the chance to control how much nudity, violence and weaponry they see on Instagram. In reality, it’s a failed attempt at self-regulation that strangles activism.


‘Sensitive’ or just ‘Undesirable’ content?

Ever since the worldwide Black Lives Matter protests last summer, Instagram has proven to be an unlikely hotbed for activism and grassroots journalism. Pictures of sunsets and smashed avocado paninis have been discarded for discussions of critical race theory, petitions and educational resources. As a result, grassroots journalist and activist accounts have sprung up on Instagram like trees around an oasis. Some of these accounts report on socialist housing squats in Berlin, whilst others expose police brutality in Baltimore. Marginalised stories that don’t make the mainstream news can now be shared and gain followings, raising awareness along the way. However, the new default setting means that these pages, which often share explicit videos of violence, will not appear on the Explore page. This makes it harder for new users to find them, effectively stopping the pages from gaining new followers.

Why has Instagram introduced the new feature? It comes at a tricky time for the companies involved. Big Tech is facing a resounding message from governments across the world: sort it out, or we will! What’s the ‘it’? The content allowed on their platforms.

Russia has recently demanded that Google, Twitter and Facebook (which owns Instagram) restore pro-Kremlin content that was removed from the platforms. In April, India ordered the tech companies to take down posts criticising its handling of the pandemic. These countries seek to destroy dissent. On the other hand, countries such as the US seek to regulate the rampant misinformation that helped propel Trump to power and led to the Capitol Hill riot. As a result, tech companies are frantically trying to regulate themselves, whether through ‘Sensitive Content Control’ or Facebook’s new Oversight Board, before governments step in. What both governments and companies miss is that the problem isn’t content control; it’s the nature of the platforms themselves.

Big Tech = Big Profits

Instagram is, at its core, an advertising business: it collects data and sells ads targeted to each user based on the data collected. It exists to make a profit, like the rest of the boys in the Big Tech band. It’s well known, thanks to books and films such as The Social Dilemma, that these profits come from deliberately encouraging addiction to the site. The more time you spend on the platform, the more ads you are shown, and therefore the more money the platform makes. In order to glue you to the platform, it needs to know what kind of thing you like. So, Instagram and other platforms use an insidious recommendation algorithm that shows you only the content you engage with. That’s why your feeds are never balanced debates: they only show the side you agree with. Because of this engagement-driven setup, controversial content that keeps being posted and reshared is valuable. Content that’s incendiary, harmful and provocative naturally rises to the top in this gamed system. It doesn’t matter to the platform whether it’s true or not; what matters is whether it makes a profit.

That’s why there’s been such a rise in fake news. Misinformation has been around for as long as information has; it’s just never been so weaponised, widespread and targeted before. And that’s because there’s a lot of money to be made from it. At the end of the day, any self-regulation can be expected to fail if it curtails potential profits. Just look at the new Sensitive Content Control: all it does is strangle the kind of thinking that might threaten the tech companies’ status quo. A more fundamental change is needed.

Social media, despite its detrimental effects on mental health, is here to stay; we need to learn how best to use it. That means encouraging grassroots journalism, promoting activism and sharing marginalised experiences. Solutions are possible: make the algorithm transparent; force tech companies to register as public-benefit corporations; rethink what we want online platforms to achieve. At the moment, it’s ‘free speech’ when it makes the platform a profit and censorship when it doesn’t. That’s not good enough.
