Classifications of Personal Involvement


An ontology is a shared understanding of a given domain and its subdivisions; in Netflix's case, it is a set of microgenres combined to produce the ultimate personalized watch list. The article "How Netflix Reverse Engineered Hollywood," written by Alexis C. Madrigal, focuses on the engineering of the categories, subgenres, and features that have emerged as products of this complicated algorithm. Netflix is able to break films down to the nitty-gritty, collect the information into a vat of cinema knowledge, and then apply it productively. Shows like House of Cards and Orange Is the New Black were strategically created based on data pulled from viewers' preferences; Netflix built shows around what its users already liked. Personally, I think this is genius. What they have done makes total sense, but there is so much more complication within a system like this; honestly, my brain cannot fully wrap around the amount of work that went into developing this program. The company has created a system that uses local knowledge from its user community to create personalized genres, which helps avid video streamers like myself develop a cinema ontology for personal experience.

Classifications are the basis of the Netflix organization, but they are also involved in multiple constructions in everyday life. We categorize our lives, from our stores, animals, food, and jobs to many other standard practices; classifications exist as product variations within our local knowledge. Just like our minds, the Internet uses information like cookies to create a personal surfing experience. For example, Facebook uses the information you post to create your ad preferences. It partners with organizations like the DAA (Digital Advertising Alliance), which provides ads customized to users based on information like age, location, liked pages, and other shared data. This information actually leaves me slightly unsettled. Facebook is tracking my Internet presence as a way of gaining resources. I did know that they were doing this, and it's nice that they are attempting to please users and give them an intimate Facebook experience, but I do not like being stalked by Facebook. Netflix's approach seems to be more for the user, but that is because there is a membership fee, whereas Facebook gets its money from ads so we can use the site for free. It just leaves me with an unsettling feeling: how much do they really know?

Ontologies and Individuals

Reading "Local-Global: Reconciling Mismatched Ontologies in Development Information Systems" by Jessica Seddon Wallack and Ramesh Srinivasan reminded me of people's struggle to reconcile their identities within a system of classifications. After the resource day at UCLA, I realized that there were too many organizations I could join. I picked three organizations that best represented my interests and identity. However, I ended up devoting myself to only one organization; my other two interests had to be discarded for the time being, lost to the organization that I chose. If my understanding of the reading is correct, an organization representing a single demographic or interest is a mismatch to what defines an individual. Wallack and Srinivasan note that "While any group's ontology is unlikely to match that of every individual within the group, the extent of mismatch tends to increase with the scale of the group and the differences between the purpose of individual and group ontologies." Ideally, an individual should not be broken up among three interests but should have one organization that addresses his or her interests in their entirety. Instead of choosing an organization that fit one criterion and left out the rest of my interests, I chose the organization that was the most diverse, in an attempt to keep my interests broad.

I searched the web for a visual example of an ontology related to the reading and found a simple visualization beginning with a lion and an antelope. The diagram of the two animals resembles the way classifications are divided and the way they relate; the oversimplified diagram only leads to further classifications. The problem addressed by Wallack and Srinivasan is that when information is not "inclusive" of or "collaborative" with the community, a mismatch of information takes place. For instance, to further develop this animal ontology, one could create a way for people to add more information about lions and antelopes. The classification process does not really tell us much about what a lion or an antelope actually is. This visual ontology is supposed to represent lions and antelopes, but because of the classification, information that defines a lion and an antelope is lost. The ontology, therefore, is not always reality. Each organization has its own ontology that best represents that organization's goals. However, since most organizations specialize to serve the interests of a specific demographic, an individual with a multitude of interests will struggle to reconcile his or her conflicting interests.

 

Sources:

Wallack, Jessica Seddon, and Ramesh Srinivasan. "Local-Global: Reconciling Mismatched Ontologies in Development Information Systems." 42nd Hawaii International Conference on System Sciences, 2009. http://rameshsrinivasan.org/wordpress/wp-content/uploads/2013/03/18-WallackSrinivasanHICSS.pdf. Web. 20 Oct. 2014.

 

Image: http://www.scientific-computing.com/features/feature.php?feature_id=37

Week Three: Netflix and Facebook

What struck me most about the article "How Netflix Reverse Engineered Hollywood" was how many comments lamented the fact that, despite the prevalence of ultra-specific altgenres, many users are only given the same suggested movies over and over. Because the function of these altgenres is to intimately personalize film selections for a highly specific viewer, viewers are only given a select number of options by Netflix's algorithm, limiting the immediate scope of their film watching. One user commented, "[This] explains why Netflix has steadily made its search function harder and harder to use. It really does not want to empower end-users, it wants to effectively program content for you… Some must be more profitable than others; hence those are the ones you are spammed with… The missing element is how profitable each and every stream might be." While I am not sure about the factual accuracy of this comment, it does remind me of a similar site that attempts to create personalized content to enhance revenue: Facebook.

The similarities between Netflix and Facebook lie in the "design of the software system that supports them. How that software functions is the result of decisions made by programmers and leaders within the company behind the website" (Grosser). Netflix is designed to suggest films that you would want to watch based on your previous watch history. This leads to personalized streams and, most likely, increased ad revenue for Netflix. Facebook is structured in a similarly personalized way, but while many "personalities" can use one Netflix account, Facebook's interface forces users to portray themselves online the same way they would in real life. It requires the use of a real name, location information, schools and jobs held, and the music and movies one likes. According to Grosser:

This ideological position of singular identity permeates the technological design of Facebook, and is partially enforced by the culture of transparency the site promotes. The more one’s personal details are shared with the world, the harder it is to retrieve or change them without others noticing—and thus being drawn to the contradictions such changes might create. This is further enforced by the larger software ecosystem Facebook exists within, such as search engines, that index, store, and retain those personal details in perpetuity (Blanchette et al., 2002).

The personal details Facebook collects lead to a data-mining trove, allowing Facebook to use this information to target the user with personalized ads, much in the same way Netflix uses previously watched films to recommend movies a viewer will most likely watch. Both of these websites' software is what allows them to be so successful in marketing to specific interests, but it also limits the variety of "interests" displayed, thereby regurgitating the same limited types of objects, be they movies or ads.

Grosser, Ben. "How the Technological Design of Facebook Homogenizes Identity and Limits Personal Representation." Hz-Journal. Hz-Journal, 2014. Web. 20 Oct. 2014.

Netflix and Society

Before reading Alexis Madrigal's article "How Netflix Reverse Engineered Hollywood," I wasn't aware that tens of thousands of microgenres even existed. Moreover, I was skeptical of the claim that Netflix recommended movies based on all the different films and TV shows we had previously watched. I just thought that Netflix recommended the same movies to almost everybody and only claimed to tailor recommendations as some sort of marketing scheme. I didn't think there would be a group of people who sat down, analyzed, and tagged all the different films to put such a vast project together. After reading Madrigal's article, therefore, my appreciation and respect for Netflix grew tenfold. It reminded me of the years and years of hard work that Pandora employees invested to create the Music Genome Project, which is similar to Netflix in that it endeavored to analyze and tag every song, artist, and album in order to recommend music to individuals. Another example is Yelp, which endeavors to compile a list of restaurants and other service-sector establishments and tags each one in order to recommend, for example, places to eat or where one should take a car for maintenance or repair. In this post, however, I want to focus on the system of classification and how recommendations and reviews on Netflix lead the masses to watch similar movies and shows.


Although Netflix has 76,897 ways to describe a movie through genre tagging, only a very small fraction of those genres appear on one's personalized Netflix home page. The genres you see are most likely the most popular ones. Moreover, one huge factor determining the selection of movies shown on the front page is the review, or star, process. The movies and shows we see on the front page are usually ones with great reviews or the most stars. This is one reason I see a vast majority of similar recommended movies and films. For example, my Netflix home page and my friend's share many of the same movies. In fact, sometimes our home pages look almost identical. Of course, personalization comes into play and we do see some differences in the recommendations, but for the most part it seems as if Netflix showcases movies that are highly rated and popular. Since people only want to watch the best, top-quality movies, as a society we end up watching movies from a pool that is essentially not that vast at all. This makes me think about the future: if this trend of classifying and recommending continues as it has in the movie industry with Netflix and in the music industry with Pandora, Spotify, and the like, then we as individuals in one nation might prove to be strikingly similar to one another.

Week 3: Plateau Peoples’ Web Portal & Christopher Columbus

The Plateau Peoples' Web Portal is a brilliant and elaborate site, a "gateway to Plateau peoples' cultural materials" held in multiple historical preservation establishments. Tribal administrators (working with their tribal governments) provided information and their own materials to expand the archives. This website is crucial to celebrating diversity within Indigenous peoples' cultures rather than grouping them together singularly as "Native American." The website provides cultural materials digitally, respective to each tribe, along with a map so that one can see exactly where each tribe's land used to be.

The six tribes featured are only six of the 560 federally recognized tribes that exist in the United States alone. Most of the cultural materials provided offer insight into the devastating imperialism and cultural genocide inflicted on these tribes. Fifty million people had been living, thriving, existing in America before the voyage of 1492. The Plateau Peoples' Web Portal is a terribly real reminder that the land we live on today is occupied illegally and that Indigenous Peoples' Day should be federally recognized instead of Columbus Day. Replacing Columbus Day with Indigenous Peoples' Day would be a huge feat in recognizing that Christopher Columbus did not discover anything; one cannot discover land that is already inhabited by millions of people. On top of that, Columbus was the notorious catalyst for the genocide of millions of North American inhabitants and the cultural annihilation of hundreds of diverse cultures. The Plateau Peoples' Web Portal is an homage to these tragically destroyed cultures, classifying them individually and giving them the recognition they are due.

 

Indigenous Peoples' Day is not a new phenomenon, just incredibly unrecognized. This April, Minneapolis passed a resolution renaming the second Monday of October Indigenous Peoples' Day. Hopefully this is just the start of a revolution in historical observance. Going through this website and viewing the pictures available is a huge reality check.

(Note: The Indigenous People of the Plateau include some parts of Canada, not just the U.S.)

Sources:

Plateau Portal

Indian Country Today

City Pages Blog

Week 3: Screwing with Netflix and Facebook Suggestions


http://www.theatlantic.com/technology/archive/2014/01/how-netflix-reverse-engineered-hollywood/282679/

http://www.wired.com/2014/08/i-liked-everything-i-saw-on-facebook-for-two-days-heres-what-it-did-to-me/

A few years ago, I remember scrolling through Netflix and finding only a few movies that looked slightly interesting. Now, I have about 5-6 movies on my "to-watch" list at all times. Netflix, the online movie and TV subscription service, currently has over 50 million subscribers globally; in 2006 it had 10 million. It's clear that Netflix has grown to be one of the top web apps to date. The question is how they did it: did they improve their movie selection, or did they improve their movie suggestions?

We can assume that Netflix did both. However, the more important development was the suggestions. Alexis Madrigal spent several weeks with a group of coworkers pulling apart Netflix's magic. According to the results presented in the article "How Netflix Reverse Engineered Hollywood," Netflix has 76,897 unique ways to describe movies. It all boils down to tags: Netflix has analyzed and tagged every movie and TV show in every possible way imaginable. "When these tags are combined with millions of users viewing habits, they become Netflix's competitive advantage. The company's main goal as a business is to gain and retain subscribers. And the genres that it displays to people are a key part of that strategy."
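To make the tag idea concrete, here is a minimal sketch of how tag-based suggestions might work. The catalog, tag names, and scoring are all invented for illustration; this is not Netflix's actual method, just the general shape of it: score unwatched titles by how much their tags overlap with what the user has already watched.

```python
from collections import Counter

# Hypothetical catalog: each title mapped to a set of descriptive tags.
CATALOG = {
    "Title A": {"cerebral", "thriller", "1970s"},
    "Title B": {"romantic", "comedy", "witty"},
    "Title C": {"cerebral", "drama", "political"},
    "Title D": {"witty", "comedy", "1970s"},
}

def recommend(watched, catalog, top_n=2):
    """Rank unwatched titles by how many tags they share with the
    user's viewing history, highest overlap first."""
    # Pool every tag from the titles the user has watched.
    profile = Counter(tag for title in watched for tag in catalog[title])
    scores = {
        title: sum(profile[tag] for tag in tags)
        for title, tags in catalog.items()
        if title not in watched
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

With this toy data, a user who watched "Title A" would be steered toward "Title C" (shared tag: "cerebral") and "Title D" (shared tag: "1970s") before "Title B", which shares nothing. The real system presumably weights tags and combines many users' habits, but the overlap principle is the same.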

There is only one other similar intelligence approach in existence: "Netflix has built a system that really only has one analog in the tech world: Facebook's NewsFeed." Both Netflix and Facebook's NewsFeed operate on users' viewing and liking habits; depending on how you interact with Netflix or Facebook, their algorithms spit out other movies or links you might be interested in. But what happens if you mess with this algorithm and like everything?

Mat Honan did. In "I Liked Everything I Saw on Facebook for Two Days. Here's What It Did to Me," featured on Wired, Honan explains that after his Like binge, his Facebook NewsFeed became extremely liberal and extremely conservative. None of his friends popped up anymore; it was all advertisements and articles. "As I went to bed that first night and scrolled through my NewsFeed, the updates I saw were (in order): Huffington Post, Upworthy, Huffington Post, Upworthy, a Levi's ad, Space.com, Huffington Post, Upworthy, The Verge, Huffington Post, Space.com, Upworthy, Space.com."

So I did the same thing, but with Netflix and only for 20 minutes. I went to "Personalize," 5-starred every movie that popped up, and clicked "interested" on each one. On Netflix's splash page I did the same thing. The difference between Honan's results and mine was that Netflix recycled the movies I had already 5-starred. It had no other movies to suggest, no other reserves to pull from. Facebook, on the other hand, has the entire Internet to suggest to you, which ends with a clogged NewsFeed full of stuff that, in the end, Honan didn't like anyway.

Algorithms do an amazing job of feeding consumers stuff that they’d be interested in. But they can be screwed with, and that’s because in the end, they’re robots trying to cater to humans.

Netflix & Pandora

After reading the article "How Netflix Reverse Engineered Hollywood," I couldn't be more intrigued. It was always nice that Netflix would recommend movies to me, but I never really put too much thought into it. I would say nine times out of ten, there is a recommendation waiting when I finish my series or movie that I'll actually watch. For example (and please don't judge me for my taste in television here), I had just finished watching One Tree Hill when Netflix recommended that I watch Desperate Housewives. These series are completely different in terms of what they're about, the age demographic, the setting, etc., but Netflix knew, based on who else watched them and everything else I had watched, that I would love it too! And now, thanks to Alexis Madrigal, we know that this is because of the absurd amount of categorization, metadata, and refined vocabulary that Netflix uses.

Personally, I found it interesting that the categorizations were so specific. It's hard to fathom that there were people who went through all of the movies available on Netflix to tag them with enormous amounts of metadata, and some of the categories are so oddly specific. This got me thinking about Pandora. Similar to Netflix, Pandora uses what you like and have listened to in order to recommend new music. Although there are differences (Pandora is music, Netflix is film; Pandora uses "thumbs up" while Netflix uses stars for ratings), I have to imagine that Pandora must have a similarly specific metadata process in order to produce music that people will enjoy. Like Madrigal said, "The better Netflix shows that it knows you, the likelier you are to stick around." And this is true for Pandora as well. If I put on a station that gives me five songs in a row that I don't really care for, I'm going to use a different site to listen to music. However, the main difference between these two sites is probably that Pandora is free while Netflix costs about eight dollars per month. Because of this, it would be more detrimental for Netflix to lose its customers. Of course Pandora wants to be a successful site (which it already is), but it doesn't suffer a tangible loss when people stop listening.

 

Madrigal, Alexis C. “How Netflix Reverse Engineered Hollywood.” The Atlantic. Atlantic Media Company, 02 Jan. 2014. Web. 20 Oct. 2014.

Week 3: Movie and Music Genre Generators

This was a very comprehensive article that reminded me of a couple of other generators I encountered after reading it. Near the beginning of his piece, Madrigal makes an interesting point about tracking the URL and the incremental values at the end of the web address. I've also been able to navigate between pages using that same line of logic. It strikes me that some databases reveal a lot to the public, while others are much more privatized, depending on how the system was created in the first place. Once you discover the thought process behind the way things are categorized, you can easily find what you're looking for in a general sense.
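That URL trick is simple to sketch in code. The template below is a made-up example, not Netflix's real address scheme: the point is only that when a site keys its pages to a sequential numeric ID, you can enumerate candidate pages by incrementing that ID.

```python
# Hypothetical URL template with a sequential numeric ID at the end;
# any real site's scheme will differ.
TEMPLATE = "http://example.com/genre?id={}"

def genre_urls(start, count):
    """Yield candidate genre-page URLs by incrementing the
    numeric ID at the end of the address."""
    for page_id in range(start, start + count):
        yield TEMPLATE.format(page_id)
```

For example, `list(genre_urls(1, 3))` produces the pages for IDs 1, 2, and 3. In practice one would still have to fetch each URL (politely, respecting the site's terms) and check whether the page actually exists.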

After playing with the generator for a few minutes, I started to wonder if any director will rise up to the challenge and actually create a movie based on the results of this generator. Maybe that way, we’ll be able to see more original movies. I’ve always believed in the notion that people can do great things once they’re given some limitations. It’s an intriguing thought that something original can be made based on unoriginal words, descriptions, and genres produced by algorithms. This truly showcases the wide range of possibilities that any given combination can produce.

I also love this quote from the article: "It's where the human intelligence of the taggers gets combined with the machine intelligence of the algorithms. There's something in the Netflix personalized genres that I think we can tell is not fully human, but is revealing in a way that humans alone might not be." In a very digital humanities sense, this project was able to produce many eye-opening graphs that give the public an inside look at what types of things human beings prefer, just by analyzing and presenting the data a different way. Instead of being recommended different genres and searching through them, we are now able to generate our own, and the results that pop up say a lot about our diverse preferences and the creativity of the past directors who have shaped the movie industry.

Because of how many times Spotify was mentioned in class, the first thing I did after reading the article was Google "Spotify based music genre generator." What I found was a playlist generator site linked to Spotify that lets you search for playlists made by other users and categorized by either mood or genre. It's unfortunate that you can't search for both simultaneously, but when creating a playlist, you can tag descriptions under both types of categories. Another site I encountered was a more minimalist music genre generator that operated on a similar idea to the Netflix one, in that it combined a couple of descriptive words from a database to create new music genres. Lastly, I found a site that lets you generate your own generators. Even though there was a user-created movie genre generator, it only allowed you to mash together two random genres, and I'm willing to bet that the database it was pulling genres from is a lot less detailed than the one Yellin created at Netflix.
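All of these generators share one mechanism: pick one entry from each slot of a vocabulary and glue the picks together into a phrase. A toy version might look like this; the word lists are invented for illustration and are far smaller and cruder than Netflix's real vocabulary.

```python
import random

# Invented word lists, one per slot of the genre phrase.
ADJECTIVES = ["Gritty", "Feel-good", "Cerebral", "Visually-striking"]
REGIONS    = ["Scandinavian", "Latin American", "British"]
GENRES     = ["Crime Dramas", "Road Movies", "Documentaries"]
TOPICS     = ["Royalty", "Food", "Outer Space"]

def generate_genre(rng=random):
    """Combine one random pick from each slot into an
    altgenre-style phrase."""
    return " ".join([
        rng.choice(ADJECTIVES),
        rng.choice(REGIONS),
        rng.choice(GENRES),
        "About",
        rng.choice(TOPICS),
    ])
```

A call might return something like "Gritty British Road Movies About Food." Even these four tiny lists yield 4 × 3 × 3 × 3 = 108 combinations, which hints at how a few thousand tags can multiply into tens of thousands of altgenres.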

Sites Used in Order of Appearance:

http://www.theatlantic.com/technology/archive/2014/01/how-netflix-reverse-engineered-hollywood/282679/6/

http://playlists.net/

http://jbdowse.com/genres

http://www.generatorland.com/

Phrenology and Classification

Bowker and Star continue to expand upon a topic we began discussing last week: classification. They describe classification as a "spatial, temporal, or spatio-temporal segmentation of the world." In lecture we went over many classification and organizational systems currently in use, from library classifications to those used in social media. These exist to make data retrieval easier: they expedite retrieval by categorizing information into relevant subgroups. We have been discussing so many useful and innovative systems for classifying observations and information that I want to bring in a very specific, and now obsolete, classification system based on how crania appear: phrenology.

Phrenology hails back to late 18th-century Germany, springing from the observations of the physician Franz Joseph Gall. Now considered a pseudoscience, phrenology was invented to classify specific physiological features of the skull as belonging to specific characteristics or faculties. Gall identified 27 faculties, ranging from reproductive instincts and the love of one's offspring to murderous instincts and metaphysics. His classification of specific traits within this range of physical contours of the cranium is an example of how the fact that something can be classified, or has already been classified, does not necessarily lend truthfulness to its subject. With that said, phrenological classification can be seen as a precursor to psychology and its physiological correlations.

[Image: phrenology illustration from an 1895 dictionary]

In the modern era, we think of phrenology as a funny side note in medical and scientific history. It seems laughable now that something as arbitrary and even changeable as the lumps and indentations on a head could denote specific personalities or behavioral tendencies. Had there been a better reception of phrenology, or perhaps more scientific evidence, this system could have become a standardized method of evaluating personalities. Its shortcomings on a scientific scope are tied to its shortcomings as a classification system as Bowker and Star describe one. It is interesting to me that even though Bowker and Star lay down three qualities for a classification system (consistent unique classificatory principles, mutually exclusive categories, and completeness), these exist only in an ideal theoretical setting. Phrenological studies identify different faculties, categories of behavior and personality, but they are difficult to quantify on their own; it becomes difficult to say with certainty that a specific node on the head is of a size to denote an aptitude for education. This inability to definitively categorize which size of bump or indentation in the skull corresponds to which personality traits, combined with pseudoscientific reasoning (the biggest reason), ultimately makes phrenology an idea of the past.

 

You can learn more about phrenology from books that UCLA currently has in its collection (and you get to see the Library of Congress classification system in action!). The Biomed History and Special Collections Cage has a few really old books on the subject! See here and here

Week 3: Is Technology Dividing Cultures?

There is a seemingly infinite number of ways to categorize information. In most cases, classification is biased toward an individual or cultural preference, giving priority to the social norms of each respective community. However, in our digitized world, cultures lacking modern technology are deemed incapable of making these classification decisions, and this notion incorrectly contributes to the ever-growing digital divide. This idea unethically promotes a division between cultures that have access to new technology and those that don't. We must understand that in some cases, modern digital technologies aren't relevant or useful to certain cultures' practices.

Modern technology is thought to provide a more efficient and powerful way to accomplish tasks, but as we investigate cultures and their practices, we find that new digital and technological advances aren't always the most practical. Last winter, I took a class taught by Professor Ramesh Srinivasan in the Information Studies Department. We thoroughly discussed the concept of the digital divide and how a culture's adoption of new technologies must be a slow and adaptive process, as different communities stress different needs. One article in particular struck me, as it fought the urge to close the gap the digital divide is apparently creating. A sociotechnical experiment in India in the early 2000s documented the results of a technology influx: several computers were set up in a rural farming village in Southern India with the expectation that residents would become educated in various modern technologies, in hopes of bringing the community up to speed on the digital age. Instead of incorporating the new technology into their lives for the benefit of their community and economic infrastructure, the children were seen playing video games and creating an unnecessarily competitive environment that took away from their studies and daily chores. This agricultural village had no apparent need for the technology, nor did the residents understand how it could benefit their community, as they were very comfortable in their way of doing things.


What is important to note is that the individuals who provided the technology were only present for the setup and removal of the devices, and didn't make themselves available to facilitate the usage of the computers. This is ethical in terms of respecting how a culture adopts technology, as it allowed the community to learn for themselves how to use the devices, as exemplified in the success of the Plateau Peoples' Web Portal. However, introducing such a foreign object into a community requires more mentoring and technical assistance than was provided. With assistance from computer experts, this community could have benefited by learning how to track weather patterns and the prices of goods to help improve its agricultural economy. There is a balance between introducing knowledge to other cultures and enforcing it upon them, and with the abundance of new technologies, we must be careful about the ways these devices are presented in order to maintain the unique practices and the integrity of cultures around the world.