On Filter Bubbles

One of my favorite books about Internet culture is The Filter Bubble by Eli Pariser, published in 2011. In it, Pariser discusses the algorithms that search engines like Google, social networking sites like Facebook, and other websites use to personalize each user’s experience. Google, for example, guesses which search results will be most relevant to you based on your search history, your browsing history, your location, and other idiosyncratic factors, meaning that the results you see might differ vastly from what a user with a different background would receive.
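As a toy illustration (this is my own sketch, not Google’s actual algorithm), you can think of personalization as boosting each result’s base relevance score by how often its topic shows up in that user’s click history:

```python
# Toy sketch of personalized ranking -- illustrative only, not any real
# search engine's algorithm. Each result has a base relevance score,
# which gets boosted by how often the user has clicked on that topic.
from collections import Counter

def personalized_rank(results, history):
    """results: list of (title, topic, base_score); history: topics the user clicked."""
    clicks = Counter(history)
    def score(result):
        title, topic, base = result
        return base + clicks[topic]  # boost topics this user clicks often
    return sorted(results, key=score, reverse=True)

results = [
    ("Cable news recap", "news", 2.0),
    ("Song lyrics page", "lyrics", 1.5),
    ("Encyclopedia entry", "reference", 2.5),
]

# Two users issue the same query but have different histories,
# so they see different orderings -- the seed of a filter bubble.
pop_fan = personalized_rank(results, ["lyrics", "lyrics", "lyrics"])
news_junkie = personalized_rank(results, ["news", "news", "news"])
```

The pop fan’s lyrics page jumps above the encyclopedia entry, while the news junkie’s feed leads with cable news: same query, two different realities.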

This is, ostensibly, pretty great–after all, who wants to sift through a bunch of irrelevant search results before finding the website they’re actually looking for? Who wants to look at boring photos on their Facebook feed when all they really care about are Buzzfeed quizzes and news articles? These algorithms help tailor our experience of these websites in a way that makes them more useful to us.

However, these filters also enable Internet companies to collect information about your browsing habits in order to target advertising. They see that you frequently like the Upworthy videos that show up on your Facebook timeline, so they make sure those videos appear more often…and also display ads for charitable donations and “socially conscious” companies. They notice that you tend to Google the lyrics to Top 40 pop songs, so they make sure that lyrics sites (as opposed to Wikipedia articles or music news sites) float to the top of your search results…and also display ads for local concerts and newly released albums.

However, even more insidious than the advertising is the construction of “filter bubbles”: users get less exposure to conflicting viewpoints and end up being ideologically isolated within their own information bubble. A liberal and a conservative Googling the exact same event end up getting vastly different search results (say, searching for a specific news story and getting either MSNBC or Fox News depending on your past browsing history), which only serves to reinforce previously held convictions and removes opportunities for civic discourse. Or, as Pariser put it:

A world constructed from the familiar is a world in which there’s nothing to learn … (since there is) invisible autopropaganda, indoctrinating us with our own ideas.

Research has demonstrated that individuals who associate only with people who agree with them become more extreme in their beliefs and values than they would be if exposed to a diverse array of ideologies–and if you look at research on the increasing political polarization of US society, that definitely seems to hold true: over the past 10 years, the number of people on either extreme end of the political spectrum has vastly outstripped the number of moderates. To put it differently, Republicans are becoming increasingly conservative while Democrats are becoming ever more liberal.

The filtered nature of our virtual reality is profoundly shaping the political structure of our society, and I’m not so sure that’s a good thing.

3 thoughts on “On Filter Bubbles”

  1. abwrubel

    Thinking of the self as a completely marketable thing is a scary thought. I feel like your post has a lot to do with pleasure, and the way in which the internet may, subjectively speaking, over-pleasure the individual. Not only are products being sold and marketed to us, pleasing our capitalist tendency, but also our ideologies are constantly being confirmed, which acts as another form of pleasure. Is over-pleasure bad? Could it make individuals more complacent?

  2. fmanto

    I remember discussing this entire concept in another class last year. First of all, I think it’s so interesting how data is being used now in terms of marketing to the specific user. The internet contains incredible amounts of information, which is why catering to a user may be a good way of personalizing each user’s experience. BUT at the same time, when you personalize too much, it begins to become a negative. With everything, there needs to be a balance. I truly believe that balance has yet to be determined for this. I remember when FB said they would track people who watched videos more often on their newsfeeds and provide more videos for them to watch. I love videos, so this sounded kind of cool to me. But now I have 5 or even 6 videos at the top of my newsfeed, which tends to take away from the original experience.

  3. snmarquez

    This is so important to realize. Many people take for granted that the companies behind a product are creating these things not for the good of all, but rather to make a profit. As you said, while these filters help users get in touch with what they are interested in, they do so at the cost of diversity, which may be the most important thing of all.
