One of my favorite books about Internet culture is The Filter Bubble by Eli Pariser, which was published in 2011. In the book, Pariser discusses the algorithms that search engines like Google, social networking sites like Facebook, and other websites use to personalize an individual user’s experience. Google, for example, guesses which search results would be most relevant to you based on your search history, your browsing history, your location, and other idiosyncratic factors, meaning that the results you get might be vastly different from what a user with a different background would receive.
This is, ostensibly, pretty great–after all, who wants to sift through a bunch of irrelevant search results before finding the website they’re actually looking for? Who wants to look at boring photos on their Facebook feed when all they really care about are Buzzfeed quizzes and news articles? The use of these algorithms helps to tailor our experience of these websites in a way that makes them more useful to us.
However, the application of these filters enables Internet companies to collect information about your browsing habits in order to target advertising. They see that you frequently like Upworthy videos that show up on your Facebook timeline, so they make sure those videos appear more often…and also display ads about charitable donations and “socially conscious” companies. They notice that you tend to Google the lyrics to Top 40 pop songs, so they make sure that lyrics sites (as opposed to Wikipedia articles or music news sites) float to the top of your search results…and also display ads about local concerts and newly released albums.
However, even more insidious than the advertising is the construction of “filter bubbles”: users get less exposure to conflicting viewpoints and end up being ideologically isolated within their own information bubble. A liberal and a conservative Googling the exact same event end up getting vastly different search results (say, searching for a specific news story and getting either MSNBC or Fox News depending on your past browsing history), which only serves to reinforce previously held convictions and removes opportunities for civic discourse. Or, as Pariser put it:
A world constructed from the familiar is a world in which there’s nothing to learn … (since there is) invisible autopropaganda, indoctrinating us with our own ideas.
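The feedback loop Pariser describes can be illustrated with a toy ranker (this is a hypothetical scoring scheme for illustration, not any real search engine’s algorithm): results from sources the user has clicked before get boosted, so two users issuing the identical query see different orderings, and each click deepens the divergence.

```python
# Toy sketch of history-based personalization (hypothetical scoring,
# NOT any real search engine's ranking algorithm).

def personalized_rank(results, click_history):
    """Order results so that sources the user clicked before rank higher."""
    def score(result):
        source = result["source"]
        # Each past click on the same source adds a boost to its relevance.
        return result["base_relevance"] + click_history.count(source)
    return sorted(results, key=score, reverse=True)

# Two users, identical query, identical candidate results.
results = [
    {"source": "msnbc.com", "base_relevance": 1.0},
    {"source": "foxnews.com", "base_relevance": 1.0},
]

liberal_history = ["msnbc.com", "msnbc.com", "msnbc.com"]
conservative_history = ["foxnews.com", "foxnews.com"]

print(personalized_rank(results, liberal_history)[0]["source"])       # msnbc.com
print(personalized_rank(results, conservative_history)[0]["source"])  # foxnews.com
```

The key point of the sketch is that the candidate results and the query are identical for both users; only the accumulated history differs, yet each user’s top result confirms what they already read.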
Research has demonstrated that individuals who only associate with people who agree with them become more extreme in their beliefs and values than they would be if they were exposed to a diverse array of ideologies–and if you look at research on the increased political polarization of US society, that definitely seems to hold true: over the past 10 years, the number of people at either extreme end of the political spectrum has grown far faster than the number of moderates. To put it differently, Republicans are becoming increasingly more conservative while Democrats are becoming ever more liberal.
The filtered nature of our virtual reality is profoundly shaping the political structure of our society, and I’m not so sure that that’s a good thing.