
Filter bubbles and echo chambers

  • Published at 12:02 am December 3rd, 2019
Photo: Social media advertising | Bigstock

How confirmation bias is weaponized by political parties online

In a recent speech, actor Sacha Baron Cohen slammed Facebook for not fact-checking its political advertising. He went as far as to say that if Facebook had been around in the 1930s, it would most likely have hosted ads for Adolf Hitler’s infamous “final solution.”

While it was a scathing indictment of Facebook and other social media platforms, Cohen’s argument is one that has been circulating in communications academia for a long time. But to really understand why these unchecked ads are harmful, one needs to understand “filter bubbles.”

Filter bubbles are essentially pockets of space where the same ideas are disseminated and absorbed by a group of people over and over again. It’s almost like an echo chamber: it reinforces the ideas people already believe in and sometimes injects extremism into those ideas in small, increasing doses.

These ideas are constantly reinforced in these spaces, and the people inside them eventually become polarized. As a result, they often become dogmatic about those ideas, or very stubborn at the very least. With the advent of social media, these spaces have become virtual and invisible to anyone who doesn’t belong to that particular filter bubble. Whether or not these bubbles have the potential to be dangerous depends on the sort of content being discussed.
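To make that reinforcement dynamic concrete, here is a minimal toy simulation, a sketch built on invented assumptions rather than a description of any real platform’s algorithm: a feed keeps serving posts slightly more extreme than the reader’s current view, and the view drifts toward the extreme in small steps.

```python
# Toy model of opinion reinforcement inside a filter bubble.
# The -1 to 1 opinion scale, the update rule, and all numbers are
# illustrative assumptions, not any real platform's recommendation logic.

def reinforce(opinion, post, rate=0.1):
    """Nudge an opinion a small step toward a post it already agrees with."""
    return opinion + rate * (post - opinion)

def simulate_bubble(opinion, rounds=20):
    """Serve posts slightly more extreme than the reader's current view."""
    for _ in range(rounds):
        if opinion >= 0:
            post = min(1.0, opinion * 1.2 + 0.05)
        else:
            post = max(-1.0, opinion * 1.2 - 0.05)
        opinion = reinforce(opinion, post)
    return opinion

if __name__ == "__main__":
    start = 0.2  # a mildly held view
    print(f"opinion drifted from {start:.2f} to {simulate_bubble(start):.2f}")
```

Even in this crude model, a mildly held view ends up close to the extreme after a few dozen exposures, which is the “small increasing doses” effect in miniature.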

If the topics are things like which brand makes the best movies or which soccer team is the best in Europe right now, it isn’t harmful at all. Polarization over those sorts of ideas doesn’t wreak havoc. But when the topics involve politics, policy, religion, and race, the results can be devastating.

The other thing one needs to be aware of is the amount of data these social media platforms hold on their users. They know almost everything about them. This means that, on top of the filter bubbles users have voluntarily created, these platforms can make fairly accurate guesses about a user’s ideology and classify them accordingly.

And since Facebook doesn’t fact-check its political advertising, it can then point the people making those ads in the direction where their false, polarizing message will have the greatest impact. So you could put literally any damaging message under the guise of political advertising and push it into a filter bubble of your choice, where it will resonate.
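As a rough illustration of that targeting mechanism, the sketch below groups users by a hypothetical inferred-ideology label and delivers an ad only to the bubble where it will resonate. The user names, labels, and matching rule are invented for this example; this is not Facebook’s actual ad-delivery system.

```python
# Rough sketch of targeted ad delivery into a filter bubble.
# User names, ideology labels, and the matching rule are invented
# for illustration; this is not Facebook's actual ad-delivery logic.
from collections import defaultdict

# Hypothetical ideology labels a platform might infer from user data.
inferred_ideology = {
    "user_a": "bubble_x",
    "user_b": "bubble_x",
    "user_c": "bubble_y",
}

def build_bubbles(inferred):
    """Group users into their inferred filter bubbles."""
    bubbles = defaultdict(list)
    for user, bubble in inferred.items():
        bubbles[bubble].append(user)
    return bubbles

def target_ad(message, target_bubble, bubbles):
    """Serve an (unchecked) ad only to the bubble where it will resonate."""
    return [f"show '{message}' to {user}" for user in bubbles.get(target_bubble, [])]

if __name__ == "__main__":
    bubbles = build_bubbles(inferred_ideology)
    for line in target_ad("polarizing claim", "bubble_x", bubbles):
        print(line)
```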

In Bangladesh, political advertising through social media is not that common yet. Be that as it may, it is my strong suspicion that political parties will soon start using these social media ads, and the results will be absolutely devastating. Even before such ads became prominent around the world, social media was already being used in Bangladesh for smear campaigns, inciting religious violence, and spreading misinformation; the attack on the Buddhist temples in Ramu is a prime example.

The scary thing is that, if they really wanted to, political parties could simply bypass Facebook’s ad mechanism in Bangladesh, because to anybody using Facebook here, the types of users and where they hang out on the site are painfully obvious. This is not an indictment of users who don’t necessarily have much formal education, but it is often those users who are most vulnerable to damaging messaging, which is not entirely their fault either.

They have spent so much time in these filter bubbles, with a tendency to believe anything and everything they see online that just “happened” to align with what they already believe, that they don’t know any better.

The idea that filter bubbles are only created in right-wing spaces is absurd. They can be created in any part of the political or socio-economic spectrum, and they are equally harmful. They rob people of an accurate picture of what is going on in the world and of the ability to engage in meaningful discourse, and, at the end of the day, that affects everyone in society. Experts are looking for ways to defuse this phenomenon in a way that lets people enjoy social media without suffering the dire consequences that come with it. As of now, the progress is negligible, but let’s all hope for a future where that is a reality.

Nibir Khan is a freelance journalist.