
Are you in an Echo Chamber?

When you’re online, does it feel like everyone shares your understanding on a wide range of topics, and that the crazy, fringe ideas couldn’t possibly appeal to a substantial portion of people? Do your ads, news articles and timelines often mirror your political opinions? If so, you could be in a digital echo chamber.

A digital echo chamber is a phenomenon where similar ideas are floated back and forth within our digital spaces, reinforcing preexisting beliefs. It has other names such as “filter bubbles” or “epistemic bubbles,” but the concept is the same.

Digital echo chambers are made possible by companies like Facebook, Google, Twitter, Microsoft and many other tech firms tracking your online behavior. Eli Pariser, an entrepreneur, activist and author who literally wrote the book on filter bubbles, spoke on the subject nine years ago. The algorithms he described, designed to tailor each user’s online experience to their political leanings and beliefs, have only become more important nearly a decade later. A company might use as many as 57 signals to tailor your feed to you, including your device type, choice of browser and your location.

Some question whether digital echo chambers exist at all.

Dr. Grant Blank, a survey research fellow at the Oxford Internet Institute, states that “one of the characteristics of the internet is that it has created a very large, complex media environment that includes not just social media but also print media, television, radio, and online media of various types, including online copies of print media – as well as specialised online media.”

He proposes that if you account for the entirety of this complex, multimedia environment that we live in (instead of purely social media), you will see that people are consuming a lot of media, interacting with others across the political spectrum and are checking information that they find on social media.

There are also people who agree there is an echo chamber but don’t necessarily blame the companies for it. In an article in Wired, Kartik Hosanagar claims that we choose to remain within these echo chambers. According to him, three factors determine whether we encounter content that diverges from our political and social beliefs:

  1. Who we choose to follow or add as a friend, and which news stories they share.
  2. Which of those stories the algorithm displays.
  3. Which of the displayed posts we actually interact with.

Out of the three, only the second one is out of our control.

Most people are more likely to interact with and click on posts that closely align with their beliefs, which causes the algorithm to display similar posts to hold their attention for longer – creating a cycle in which confirmation bias is nurtured. Tech companies are motivated to use these tactics to keep viewers’ attention so they can generate revenue. Companies like Facebook make their profits through targeted advertising, such as political ads. Therefore, it is important to check the source of your social media ads whenever possible and see who paid for the advertisement. These tactics may prove useful in making a profit, but they can cause us to lose sight of reality. Living in an echo chamber may comfort us by strengthening our confirmation bias, “but ultimately they lock us into perpetual tribalism, and do tangible damage to our understanding.”
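The feedback loop described above can be sketched in a few lines of code. This is purely a toy model under assumed numbers, not any real platform’s algorithm: the feed surfaces whichever topic the user is most likely to click, and every click nudges the estimated preference further in that direction.

```python
import random

random.seed(0)

# Toy model of the engagement feedback loop (illustrative only):
# start with a user who has no particular leaning.
topics = ["left", "right", "neutral"]
preference = {t: 1 / 3 for t in topics}

def top_topic(pref):
    """The topic the ranking would surface first."""
    return max(pref, key=pref.get)

for _ in range(100):
    shown = top_topic(preference)
    if random.random() < preference[shown]:   # user clicks what aligns with them
        preference[shown] += 0.02             # the click reinforces the ranking
        total = sum(preference.values())
        preference = {t: p / total for t, p in preference.items()}

# After many rounds the reinforced topic has pulled well ahead of the others:
# the feed now shows mostly one viewpoint.
print({t: round(p, 2) for t, p in preference.items()})
```

Nothing in this sketch is malicious; the skew emerges simply from ranking by predicted engagement, which is the core of the cycle the paragraph describes.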

Social media platforms are incentivized to build algorithms that place people in echo chambers, and many people who post content on these platforms take advantage of those algorithms. In Psychology Today, Professor Phil Reed of Swansea University made an interesting point: people who post more polarizing opinions tend to spread information in support of their views far more frequently than people with more nuanced views, and therefore gain greater influence in online discourse and a larger following.

Echo chambers may feel inescapable; they can form in Facebook groups, subreddits, various corners of TikTok (looking at you, “conservative TikTok” and “leftist TikTok”), Twitter feeds and the ads that pop up in timelines and videos. But you can still break out of them. The University of Alberta has created a great video on how to do so and how to talk to someone who may have found themselves in one.

HOW ARE ECHO CHAMBERS CREATED?

Many companies use cookies to track your browsing activity and send you targeted information, so make sure to routinely clear cookies from your computer. It has become such an issue that the European Union has enacted privacy legislation requiring websites to get consent from visitors before placing cookies on their computers. I use CCleaner to clear out unwanted files and cookies from time to time; it’s basic maintenance that speeds up your computer and removes those pesky cookies as well.

On the surface, echo chamber algorithms may seem like a fairly innocent way for companies to be profitable, but they can allow foundational problems like rampant tribalism, unfounded conspiracy theories and many other toxic behaviors to fester and grow. It is your duty as an internet user to think critically about everything you see online. What is the source? Is it credible? Are there any inherent biases? Are a variety of other sources saying something similar?

Also, make sure to question your own biases before taking something as fact: what are my biases, and can I recognize them? Many cognitive biases have been identified and studied in psychology and behavioral economics. A few that can readily be seen within echo chambers include:

  • Confirmation bias: The tendency to search for and focus on information that confirms one's preconceptions.
  • Salience bias: The tendency to focus on information that is more emotionally striking and to ignore less evocative information.
  • Anchoring bias: The tendency to rely too heavily on one piece of information.
  • Groupthink: The desire for conformity and harmony in the group, causing group members to minimize conflict and reach a consensus by actively suppressing dissenting viewpoints.
  • Reactance: The urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice

In today’s world, where we are drowning in a sea of information, it is important that we learn to fish out the good, substantial information and filter away the bad and unsubstantiated.

It is only through being aware of ourselves, our immediate habits and the environment we create that we can fully take advantage of the wealth of information we have at our disposal to create a nuanced understanding of the world and the people around us.