Introduction to Information and Power

Grade Received: A
To see my process story and reflection on this work, please click here.

This assignment is based on cultural theorist Raymond Williams’s keywords concept. For this assignment, I chose and analyzed a keyword to answer questions such as how the keyword is used in scholarly, policy, and public debates; how it opens up information-related issues in society; and which scholars or policy events are connected to it.

Keyword: Filter Bubbles

The filter bubble, a concept coined by author and Internet activist Eli Pariser, describes how Internet algorithms monitor users’ interests to predict what else they would like (Pariser 10). Consequently, filter bubbles “create a unique universe of information” for each user, thus changing the way users find and exchange information on the Internet. Pariser claims that filter bubbles have three dynamics: users are alone in them; they are invisible; and users do not choose to enter the bubble, as the bubble simply comes to them (10-11).

How Filter Bubbles Can – or Cannot – Tailor Worlds

Filter bubble algorithms allow websites to determine which articles users should read or which people they should follow, supposedly isolating them in their own views (“How”). Hence, the information presented to users becomes a “perfect reflection of [their] interests and desires” (Pariser 11-12). According to Coquelin and Ruhmannseder, since users in filter bubbles are constantly fed information that confirms their current beliefs, they may become less open to other points of view (28).[1] A sort of tribalism thus occurs: users antagonize those who counter their views while befriending those who share them. Napoli adds to this claim, saying that personalization and algorithms work together to deflect “content that do not correspond to the user’s established content preferences and political orientation” (77).
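To make this mechanism concrete, below is a minimal, purely hypothetical sketch (in Python) of the kind of feedback loop Pariser and Napoli describe: a toy recommender that ranks articles by their overlap with a user’s click history. Real platforms’ ranking systems are proprietary and vastly more complex; every title, tag, and number here is invented for illustration only.

```python
from collections import Counter

# Toy illustration of a personalization feedback loop.
# All article titles and tags below are invented for this example.
ARTICLES = {
    "Party A wins regional election": {"politics", "party_a"},
    "Party B unveils new platform": {"politics", "party_b"},
    "Local team takes championship": {"sports"},
    "Party A accused of corruption": {"politics", "party_a", "scandal"},
}

def score(tags, profile):
    """Sum how strongly an article's tags overlap the user's profile."""
    return sum(profile[tag] for tag in tags)  # Counter returns 0 for unseen tags

def recommend(click_history, top_n=2):
    """Build an interest profile from past clicks, then rank all articles by it."""
    profile = Counter(tag for title in click_history for tag in ARTICLES[title])
    ranked = sorted(ARTICLES, key=lambda title: score(ARTICLES[title], profile),
                    reverse=True)
    return ranked[:top_n]

# A user who has only ever clicked on Party A stories...
history = ["Party A wins regional election", "Party A accused of corruption"]
print(recommend(history))
# -> ['Party A accused of corruption', 'Party A wins regional election']
# ...is shown still more Party A content; the sports story and Party B's
# platform quietly fall out of view, and the "bubble" reinforces itself.
```

The point of the sketch is the loop itself: what the user clicks shapes the profile, and the profile in turn decides what the user sees next, which is exactly the self-reinforcing dynamic that critics of the metaphor dispute below.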

However, some studies and experts say otherwise. In 2015, a Facebook-funded study at the University of Michigan, which worked with about ten million users, revealed that users’ newsfeeds are only six percent less likely to show posts that conflict with their political views (diFranzo 4-5; Meineck).[2] Although news that does not align with users’ views still appears, it is the users themselves who choose not to click on it. The researchers attributed the study’s results not to newsfeed algorithms, but rather to the users’ personal biases. Adding to these results is the idea that it is human nature to prefer information that matches one’s current beliefs while avoiding information that does not (Borgesius et al. 6).

There are also recent studies that contradict Pariser’s filter bubbles. In 2018, Krafft et al. found that, among their more than 1,500 research participants, participants searching for German political parties and figures on Google encountered the same search results in about a quarter to half of all cases (3, 30; Bruns 4).[3] Furthermore, about five to ten percent of those results appeared in the same order (Krafft et al. 30). These outcomes contradict the premise of filter bubbles, as they show that personalization has little to no effect on content diversity (Bruns 4).

There is even evidence that social media may be connecting people to other views instead of isolating them. In 2011, a Pew Research survey suggested that social media may actually be increasing people’s interactions with those from other backgrounds and/or views, thus bursting filter bubbles (diFranzo 4-5). Echoing this survey’s results is a 2013 research study at the University of Oxford, in which researchers analyzed 50,000 American users and found that readers are especially confronted with news sources from alternative political views on social media (Meineck). Moeller and Helberger support these outcomes, saying that, depending on the input and algorithm design, algorithmic recommendations can potentially provide users with diverse results (4).

Filter Bubbles, Fake News, and Conspiracy Theories

Several scholars say that alleged fake news and conspiracy theories can easily emerge and spread in filter bubbles because they are more likely to be accepted there (Coquelin and Ruhmannseder 26; Rieger and Schneider 13-14).[4] The Internet’s design also allows users to quickly spread and absorb information they like while avoiding information they dislike (Rieger and Schneider 13-14). Adding to that is Butter, who says that conspiracy theories rapidly disseminate on platforms like YouTube, where conspiracy theorists easily spread their ideas in filter bubbles whose inhabitants are less likely to consider other ideas (10-11).[5]

Instead of censoring or deleting false information, the counterspeech doctrine rooted in the United States’ First Amendment aims to hinder the spread of false information through more speech (counterspeech) that proves the false information wrong (Napoli 57, 60-61). Nevertheless, Napoli says that on the Internet, people are becoming better equipped to disseminate false information by targeting certain users and their respective filter bubbles (62, 75).

However, diFranzo claims that fake news is not likely to leave the filter bubble where it was cultivated (7). In diFranzo’s view, then, although fake news and/or conspiracy theories intensify and worsen in their own filter bubbles, they do not disseminate enough to reach other filter bubbles. Napoli thinks otherwise, saying that because of the way partisanship and falsity are connected, false information is more likely to enter other filter bubbles (78). When that happens, the inhabitants of those filter bubbles are less likely to accept real information that contradicts the false information.

Whether or not false information is likely to leave its original filter bubble, diFranzo and Napoli seem to agree that false information can potentially intensify within its filter bubble, causing severe effects. A 2010 study from the British think tank Demos revealed that conspiracy theories can trigger extremist groups to become more radical (Butter 10). One conspiracy theory that turned extreme emerged during the 2016 United States presidential elections, when the pizzeria Comet Ping Pong was accused of hosting a child trafficking ring, an accusation that led to a shooting at the pizzeria (Siddiqui and Svrluga). Another case involves Anders Breivik, a Norwegian far-right terrorist and mass murderer who posted a 1,500-page document on the Internet before his attacks in 2011, which included conspiracy theories about Europe’s alleged Islamization (Butter 9; “Factbox”). Even the 2018 Pittsburgh synagogue massacre stemmed from anti-Semitism and conspiracy theories about Jewish people (Butter 9; Chavez et al.).

The Growing Number of Critiques Against Filter Bubbles

Despite the filter bubble rhetoric’s popularity, several scholars have found the concept problematic. Bruns, finding it hard to justify filter bubbles, acknowledges that there is a “growing chorus of scholars” who have become critical of this concept (8-9). Numerous scholars support that claim, agreeing that there is insufficient empirical evidence to prove the existence of personalized communication and/or filter bubbles (Borgesius et al. 10; Moeller and Helberger 24-25; Bruns 2; Meineck). Moeller and Helberger argue that there are too many methodological challenges in researching the existence of filter bubbles, and that this is especially true for Europe, as “most existing studies have concentrated on the US” (25).

Other critics say that the filter bubble metaphor only hinders people from considering real societal problems. According to Bruns, filter bubbles merely turn technology into a scapegoat for a larger societal problem: society’s increasing social and political polarization (2). Meineck’s viewpoint aligns with Bruns’. Meineck acknowledges that although radical users are successful in spreading hate and mistrust on the Internet, the problem is not due to filter bubbles, whose existence Meineck denies; instead, the problem is a societal one. Meineck claims that radical users simply do not believe information that does not fit their views, regardless of whether it comes from an alleged filter bubble or not. Taking filter bubbles to the extreme, writes Bruns, can be problematic, as that can lead to a moral panic that vilifies new technologies while distracting discourse from real societal problems (9).

Conclusion

The filter bubble metaphor, especially in light of the many tragedies apparently motivated by fake news and conspiracy theories, has grown in popularity and relevance in recent years. The way it supposedly tailors individual experiences for each user on the Internet can, according to Pariser, isolate users in their own beliefs. Although the filter bubble metaphor’s apparent extreme effects can be observed in real life, it is also important to consider the growing number of scholars speaking out against filter bubbles, many of whom even question filter bubbles’ existence. Nevertheless, in an age of misinformation and excess information, reflecting on filter bubbles and the discourse surrounding them is useful for further understanding not only how people access information online, but also other topics that arise from analyzing filter bubbles, such as fake news, conspiracy theories, free speech, and content moderation.


[1] Unless otherwise stated, this citation and any other subsequent citations from Coquelin and Ruhmannseder are my own personal translations from the original German language.

[2] Unless otherwise stated, this citation and any other subsequent citations from Meineck are my own personal translations from the original German language.

[3] Unless otherwise stated, this citation and any other subsequent citations from Krafft et al. are my own personal translations from the original German language.

[4] Unless otherwise stated, this citation and any other subsequent citations from Rieger and Schneider are my own personal translations from the original German language.

[5] Unless otherwise stated, this citation and any other subsequent citations from Butter are my own personal translations from the original German language.

To see this work’s bibliography, please click here.

To see further readings pertaining to this work, please click here.