In a world full of information, it might be strange to imagine entire communities being unaware, uninformed, or even misinformed about facts and news, yet that is exactly what is happening today. Rumors and fabricated or exaggerated information are touted as news, and some organizations and malicious actors use that misinformation to manipulate entire communities. This is possible because those communities are stuck within their own circles. We are naturally drawn to ideas and communities that appeal to us, but shutting ourselves into an isolated circle of thought and news only reaffirms our own version of the truth, which can be dangerous. Filter bubbles are isolating us from others who hold different beliefs, and by cutting us off from other perspectives, they have shaped the way we view reality by changing the content we see on social media sites like Facebook and Twitter. Filter bubbles produce a society of isolation in which new ideas are shunned and silenced by the will of a group's members rather than by the censorship of a tyrant. No organization restricts what we see; there are simply people who voluntarily ignore information they don't like.
The term “filter bubble” was coined by activist Eli Pariser in 2011. Pariser described a filter bubble as “your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out” (Pariser, 2011). It has also been described as a state of “intellectual isolation” that can result from personalized searches, when a website algorithm selectively guesses what information a user would like to see based on information about that user, such as location, past click behavior, and search history (Techopedia). Pariser first noticed his own filter bubble while scrolling through his Facebook feed: posts that had once appeared there had disappeared. He stated, “Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them [his conservative friends] out. They disappeared.” The pictures below represent what Pariser saw on his Facebook feed. Pariser also noted that personalization on search engines like Google takes effect even before you log in. He states, “There are 57 different variables that Google tracks, according to one engineer: everything from your IP address to where you're physically located to what kind of computer it is and what kind of software it has to how big your fonts are. Then you add in the gigabytes of E-mail, the years of Web search history, and you start to have a lot of data to go on” (Pariser, 2011). This is not to say our own choices have no effect. In fact, filter bubbles are as much a product of the individuals who use these platforms as they are of the algorithms behind them.
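To make that mechanism concrete, here is a minimal sketch in Python of the kind of signal gathering Pariser describes. It is an illustration under my own assumptions only: the field names, signals, and behavior are invented, not Google's or Facebook's actual system, which is far more complex.

```python
# A hypothetical sketch of pre-login personalization signals; none of
# these fields or behaviors reflect any platform's real implementation.
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Signals a personalization engine might collect before login."""
    ip_address: str
    location: str
    device_type: str        # e.g. "laptop" or "phone"
    font_size: int          # even cosmetic details can serve as signals
    click_history: list = field(default_factory=list)

def record_click(profile: UserProfile, url: str) -> None:
    """Every click becomes another input to future ranking decisions."""
    profile.click_history.append(url)

# Two people issuing the same query can carry very different profiles,
# and therefore receive different results, without ever opting in.
alice = UserProfile("203.0.113.5", "New York", "laptop", 14)
bob = UserProfile("198.51.100.7", "Cairo", "phone", 18)
record_click(alice, "https://example.com/liberal-blog")
record_click(bob, "https://example.com/egypt-tourism")
```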
Even though algorithms modify what we see on social media, we also determine what we see through the links we click. People are in control of their own choices on the internet, and with this control comes power: the power to ignore or dismiss someone else's opinion or belief. Blocking or unfriending on social media is an example of the kind of power people have on Facebook, and it is exactly what triggers the algorithms to take effect. People like having the power to control who is allowed in their “bubble,” or society. I'm not saying you shouldn't be allowed to block or unfriend someone, but blocking someone simply because they don't share your beliefs is practically censorship. This is common among extreme feminists, who block anyone who doesn't share their beliefs, and even anyone who follows another user they dislike.
This also works in the opposite direction: liking a post is another example of the effect we have on the algorithms. Philip Howard of the Oxford Internet Institute stated, "Every time you tailor your feed to get rid of people whose opinions you don't like, you add more boundaries to your reality." In other words, your context is shaped by whom you friend and unfriend. Eli Pariser tested this idea on two friends, Daniel and Scott. He had both of them type “Egypt” into Google and compared their search results side by side, and the differences were immediately visible. Pariser stated, “Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this [the protests in Egypt] was the big story of the day at that time. That's how different these results are becoming." These results indicate that the way we interact with these algorithms can make our search results drastically different from anyone else's. The picture below shows the results of Daniel's and Scott's Google searches.
Pariser stated, “Your computer monitor is a kind of one-way mirror, reflecting your own interests while algorithmic observers watch what you click.” One of the major problems here is the way we interact with the algorithms on sites like Google: once you click on a link, the system begins to blur what is real with what is merely popular. According to Pariser, “On Facebook, you see a similar dynamic, where what's true on Facebook or what gets propagated and shared on Facebook are things that people click the 'Like' button on.” This is essentially a psychological effect: we remember whatever is continuously put in front of us. Advertising uses the same strategy, and making popular information more visible effectively promotes a piece of information on the basis of its popularity rather than its integrity.
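A toy model can show how this popularity loop feeds on itself. Everything below, from the scoring rule to the numbers, is an assumption made for illustration; no platform publishes its actual ranking formula.

```python
# A toy popularity feedback loop: posts that get engagement are ranked
# higher, so they are seen more, so they get still more engagement.
# The scoring rule and numbers are invented for illustration.

def rank_feed(posts: list) -> list:
    """Order posts by engagement, not by accuracy or integrity."""
    return sorted(posts, key=lambda p: p["likes"] + p["clicks"], reverse=True)

def simulate_sessions(posts: list, rounds: int = 5) -> None:
    for _ in range(rounds):
        feed = rank_feed(posts)
        # Users mostly interact with what is shown first, so the top
        # post's lead over everything else compounds every round.
        feed[0]["clicks"] += 10
        feed[0]["likes"] += 3

posts = [
    {"title": "Viral rumor", "likes": 50, "clicks": 40},
    {"title": "Careful report", "likes": 48, "clicks": 39},
]
simulate_sessions(posts)
print(rank_feed(posts))  # the early leader wins on popularity alone
```

Notice that nothing in the ranking rule asks whether a post is true; a rumor with a small head start in engagement stays on top indefinitely.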
Filter bubbles isolate not only social groups but also individuals within those groups. If an individual voices an opinion even slightly different from their social group's, they may be forced to stay quiet or be cast out. In other words, if they don't share the same viewpoints as their friends, their opinion will be ignored or even treated as the product of a negative influence that should be stamped out. When people are silenced for holding a different viewpoint, it creates censorship within the bubble alongside the filtering of information from outside. Sometimes we are reluctant to share our own viewpoints or opinions on social media; we feel that no one else will agree with us, so we censor ourselves to avoid arguments and negativity. Carnegie Mellon University Ph.D. student Sauvik Das found that, over a 17-day period, 71 percent of Facebook users censored a post or comment at the last minute; a total of 3.9 million Facebook users were included in the study. The study defined self-censorship as “the act of preventing oneself from speaking.” In the same study, Das and Facebook data scientist Adam Kramer concluded that there were two primary reasons for self-censorship: the audience is vague (as with a status update) or the audience is extremely narrow (as with group posts). In other words, you're not sure who the audience is, or you're concerned that what you wrote won't resonate with your group.
I experienced something similar when I posted my opinion on Facebook about how a music video by the singer Fergie over-sexualized women who breastfeed. Before posting, I toned my comment down because I didn't want to offend anyone or argue with people who thought differently than I did. This phenomenon is known as the “spiral of silence.” According to the Pew Research Center, it is the tendency of people not to speak up about policy issues in public, or among their family, friends, and work colleagues, when they believe their own point of view is not widely shared. To examine this concept, Pew surveyed 1,801 American adults about their reactions to Edward Snowden's 2013 revelations of widespread US government surveillance. The survey asked for people's opinions about the Snowden leaks and their willingness to discuss the revelations in various in-person and online settings.
The study found that 86% of Americans were willing to have an in-person conversation about the surveillance program, but just 42% of Facebook and Twitter users were willing to post about it on those platforms. Facebook and Twitter users were also less likely to share their opinions in many face-to-face settings, especially when they felt their Facebook friends or Twitter followers did not agree with their point of view. In both personal and online settings, people were more willing to share their views if they thought their audience agreed with them. This by itself leads to fewer ideas being shared inside filter bubbles, and to an even greater scarcity of thought.
People are forming beliefs and opinions from limited, one-sided information without seeing other points of view. This is not to say that filters are inherently evil or wrong. Filters need to exist for us to get the information we need, but when people abuse them, or when a mechanism like Facebook, Twitter, or Google affects what information reaches us, a filter bubble forms. According to a study conducted at the University of Michigan in 2015, over 60% of Facebook users are unaware of the algorithmic curation of their news feed. Mostafa M. El-Bermawy explains the effect of personalized feeds on our content: “Our Facebook feeds are personalized based on past clicks and likes behavior, so we mostly consume political content that are similar to our views. Without realizing it, we develop tunnel vision. Rarely will our Facebook comfort zones expose us to opposing views, and as a result, we eventually become victims to our own biases” (El-Bermawy, Wired.com). The algorithms are designed to show more “relevant” information, further restricting what we see by eliminating whatever they consider unlikely to interest us. This leads to a feedback loop of self-affirmation. For example, many of my friends refuse to even entertain the idea of studying or reading about subjects such as socialism, while continuously berating it and praising capitalism as the only way forward. I believe it is best to form an opinion after seeing all sides of an argument instead of one; it lets you form a more coherent opinion, but this ideal seems out of reach with the way social media currently works.
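The tunnel vision El-Bermawy describes can be simulated in a few lines. The topics and the one-click-per-day behavior below are assumptions made for illustration, but they show how a relevance filter that only re-learns from what it already shows can never widen a feed again.

```python
# A rough simulation of tunnel vision: the feed only shows topics the
# model already believes the user likes, and the model only re-learns
# from clicks on that feed, so exposure can shrink but never grow.
# Topics and click behavior are invented for illustration.

def filtered_feed(candidates: list, interests: set) -> list:
    """Drop anything judged 'not likely' to interest the user."""
    return [topic for topic in candidates if topic in interests]

candidates = ["capitalism", "socialism", "climate", "sports", "local news"]
interests = {"capitalism", "climate", "sports"}  # inferred from past clicks

for day in range(3):
    feed = filtered_feed(candidates, interests)
    print(f"day {day}: {feed}")
    clicked = feed[:1]        # the user engages with the top item only
    interests = set(clicked)  # the model re-learns from the new clicks

# The feed narrows from three topics to one; "socialism" is never
# shown, so the user never even gets the chance to click on it.
```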
Social media bubbles provide the perfect breeding ground for fake news. The fact that these networks are largely unsupervised makes tracking false stories and holding perpetrators responsible almost impossible. Facebook and Google have already restricted advertising on fake news sites, but this doesn't stop users from sharing false information, and bubbles are made of more than just falsified news reports. According to Pew Research, 61 percent of millennials use Facebook as their primary source for news about politics and government, yet Facebook refuses to acknowledge its identity as a news source.
The same trend appears on Twitter. A study by Philip Howard and his colleagues at the Oxford Internet Institute scoured 19.4 million tweets posted between November 1 and 9, 2016, in the run-up to the US election. They found that 20 to 25 percent of election-related traffic came from accounts that were probably bots. The top 20 accounts averaged over 1,300 tweets a day, generating more than 234,000 tweets over the period (20 accounts × 1,300 tweets per day × 9 days = 234,000). These bots largely worked to spread pro-Trump messages; by election day, pro-Trump bots were tweeting five times as often as pro-Clinton bots. The screenshots below show examples of pro-Trump bots.
Fake news on the internet has even led to murder in some cases. One Facebook post claimed that Islam was a global threat to Buddhism (a screenshot is posted above), and a false story about the rape of a Buddhist woman by a Muslim man was shared widely on Facebook. According to former military officials and researchers, these posts came from Myanmar military personnel who turned the social network into a tool for ethnic cleansing. Similar incidents have occurred in India. According to the New York Times, false rumors about child kidnappers have gone viral on WhatsApp there, prompting fearful mobs to kill two dozen innocent people since April. WhatsApp's design makes it easy to spread false information: many messages are shared in groups, and when they are forwarded, there is no indication of their origin. The photo below this paragraph shows an example of one of the false rumors in India: on the left side of the screenshot, a video claiming to show the kidnapping of a child; on the right, an image claiming to show the bodies of children laid out in rows. Filter bubbles and fake news are distinct concepts, but they are closely connected: filter bubbles allow fake news to flourish. This is only one of the many problems with filter bubbles.
Since people limit themselves to a narrow range of information, new ideas struggle to take hold. Creative ideas are thrown away because people stick only to what they already believe, or to what they believe to be right. We assume that everyone thinks like us, and we forget that other perspectives exist. One of the great problems with filters is our human tendency to think that what we see is all there is, without realizing that what we see is being filtered (Farnam Street).
In an age where information is more accessible than ever before, it should be possible for everyone to connect, inform themselves, and find thorough analysis and verified information, but the opposite seems to be happening. Eli Pariser explains the ideal purpose of the internet: “We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one” (Pariser, TED Talk). The internet should be a place where we can share ideas and learn new ideologies from others, and where we have the power to express our opinions. Instead, algorithms reinforce people's existing opinions rather than letting them gain new insights from different perspectives.
Filter bubbles are shaping the way we view ourselves and our reality. Imagine someone raised with a single set of beliefs their entire life; those beliefs would be the only thing they know, and they would eventually shape how that person views themselves and their reality. Likewise, on the internet, people stick to their existing beliefs. According to Farnam Street, “Your social circle is a filter bubble; so is your neighborhood. If you're living in a gated community, for example, you might think that reality is only BMWs, Teslas, and Mercedes” (fs.blog). People inside these bubbles can't see how their actions might harm others outside the bubble, and because they can't see the impact of their own actions, they will never hear the beliefs of those who aren't in their bubble.
People are allowing algorithms to define who they are and what they like. Aurora Hotta explains the mechanism: “From the device that you are using to your age, income, and gender. Algorithms can presume your stance on vaccinations and other significant personal and health-related issues. Likewise, your Facebook newsfeed looks completely different during election time depending on whether the algorithmic logic considers you as a liberal or conservative and feeds you with according news” (Hotta, medium.com). Algorithms on websites like Google assume what kind of person they “think” you are based on what you like and the sites you visit. This categorization and stereotyping lead people to assume that this is what they “are” like, because of the data being presented back to them.
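To see how blunt this categorization can be, consider the deliberately crude sketch below. The signals, thresholds, and site names are hypothetical inventions of mine; real systems use far more data, but the principle of reducing a person to a label, and then feeding them accordingly, is the same.

```python
# A deliberately crude, hypothetical stance classifier: coarse proxy
# signals get mapped to a political label that then decides what news
# the user is shown. Features, weights, and labels are all invented.

def infer_stance(profile: dict) -> str:
    """Guess 'liberal' or 'conservative' from coarse proxy signals."""
    score = 0
    if "liberal-blog.example.com" in profile["visited_sites"]:
        score += 1
    if "conservative-news.example.com" in profile["visited_sites"]:
        score -= 1
    if profile["age"] < 30:
        score += 1  # a crude demographic stereotype, which is the problem
    return "liberal" if score > 0 else "conservative"

profile = {"age": 24, "visited_sites": ["liberal-blog.example.com"]}
print(infer_stance(profile))  # -> "liberal"
# From here on, the feed is tailored to the label, not to the person.
```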
Filter bubbles are leading us into a society of isolation, where new ideas and opinions are shunned and silenced by those who hold different beliefs. Not only are they isolating us from one another; they are defining who we are, shaping how we view the world and ourselves, and allowing fabricated news to spread across the internet. If this polarization continues, we may see more biased actions that seek nothing but to harm those outside a group. If we are to progress and solve the issues in our society as a whole, we need to expand our understanding of subjects we may not feel comfortable with. We should be able to read and hear opposing views without belittling or attacking the people who hold them. We must move beyond tribal mentality and work as a community. Filter bubbles are a real danger to progress and peace, and to achieve either we need to counteract them. The only way we are going to overcome this crisis is to use ad-blocking browser extensions, read news sites and blogs that offer a wide range of perspectives, browse in Incognito mode, and delete our search histories. Filter bubbles need to be popped once and for all.