Modeling the Past

It is amazing to track the progress that has been made across the field of digital humanities over the past few decades. Constant advances in technology have paved the way for exciting new visualization tools to be used in digital humanities projects. For example, 3D modeling tools have become very popular and have given professionals as well as curious students the opportunity to create and experience complete 3D digital models. Diane Favro's account of digital technologies in "Meaning in Motion: A Personal Walk Through Historical Simulation Modeling at UCLA" gives a brief history of the evolution of 3D modeling at UCLA, specifically discussing its role in a Roman architecture project. The ability to create a detailed virtual simulation with 3D modeling tools gives complete creative control to the creator of the model and allows for an endless range of final products. Being able to recreate intricate Roman architecture virtually, and to situate it at a specific moment in history for others to explore, is a truly remarkable capability.

 

Diane Favro's account of the Roman 3D modeling project reminded me of another 3D model my ancient Egyptian history professor covered in class. This model of Karnak (link) allows users to explore the famous Egyptian temple complex virtually through the years and see its progression over time. 3D models like this one let students learn about ancient times and places through visual exploration rather than simply reading endless amounts of text. Being able to connect ideas and history with a visual location adds an extra element to one's understanding of a topic. 3D modeling has expanded the capabilities of those in the digital humanities field and offers perspectives that other forms of media cannot provide. I am very interested to see where this technology goes in the future, as I believe there are still many facets of 3D modeling that have yet to be explored. The concept of 3D modeling has advanced and become a reality so quickly that it is exciting to think about how it will progress in the coming years. This technology will allow historical sites to be preserved digitally and give students, as well as any other interested parties, the chance to thoroughly explore any destination they desire. The possibilities are endless.

 

Sources:

  1. Digital Karnak Project: http://dlib.etc.ucla.edu/projects/Karnak
  2. Diane Favro, "Meaning in Motion: A Personal Walk Through Historical Simulation Modeling at UCLA," in Visualizing Statues in the Late Antique Forum.

Pondering User Perception

Everything in this world has some type of design or format affiliated with it. In Jesse James Garrett's presentation titled "Elements of User Experience," he keys in on the idea of user experience design as "the design of everything with human experience as an outcome and human engagement as a specific goal." This definition applies to almost every aspect of our day-to-day lives, as we are constantly interacting with our surroundings. User experience design has been transformed by the constant technological advancements in our society. Companies plan out their web design and layout in order to project an image and mood that parallels their values. This layout, which in Garrett's model consists of surface, skeleton, structure, and scope resting on an underlying strategy, is seen by the user and interpreted to form an overall perception. Deliberate choices are made when laying out the skeleton of a website in order to enhance the user's experience and present information in a logical way.

 

[Image: screenshot of a Gmail inbox]

 

For example, the Gmail inbox shown above is much more complex than it appears to be. The surface of the site needs to be simple as well as visually appealing to ensure the user is not confused during their initial visits. The skeleton of the site in this instance must be laid out very carefully so the user can quickly find new mail and sort through the different sections and additional applications. Certain customization features are available to the user, but the basic skeleton and structure must be carefully organized by Google to ensure a positive user experience. As Garrett states in his presentation, there is a fine line separating the two views of a product: as information and as technology. In this example, Google first focuses on the information side of the spectrum, presenting the content itself to the user. The technology side of the spectrum shows up in the individual components of the site that make that content available. Different people will form different perceptions of every website, which illustrates the importance of a site's surface, skeleton, and structure in providing a solid foundation for the user to interpret. Different content requires its own structure, but the overall feeling of the website will ultimately be determined by the user.

 

Sources:

1. Jesse James Garrett, Elements of User Experience, http://www.slideshare.net/openjournalism/elements-of-user-experience-by-jesse-james-garrett

Disaster Relief Through Google

In Michael F. Goodchild's piece titled "What Does Google Earth Mean for the Social Sciences?" he begins by mentioning that Google Earth "presents a subject for social research in its own right" as well as the need to "address some of the issues identified in the earlier social critiques of cartography." Google Earth has given every person the ability to explore the world's geography at the touch of a few buttons. It has found ways around the daunting problems associated with the mass amounts of data required to display the millions of elements that make up the picture of the earth's surface. Data can be stored locally, which is made possible by requiring users to download the program. The ability to view the earth at different levels of detail allows a given location to be analyzed from many different angles. All of these features have made Google Earth a great tool for research at both the scientific and exploratory levels. As we continue to make technological advancements, the exploration of our world, and what lies beyond it, becomes increasingly tangible with these kinds of geographic information systems.
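Goodchild's point about local storage can be pictured with a toy sketch: a client-side cache that downloads an imagery tile only the first time it is requested and reuses it afterwards. This is not Google Earth's actual implementation, just a minimal illustration of the idea; the fetch_tile_from_server function and tile numbering are hypothetical stand-ins.

```python
# Minimal illustration of client-side tile caching (not Google Earth's real code).
# A tile is identified by (zoom, x, y); it is downloaded once and reused afterwards.

tile_cache = {}

def fetch_tile_from_server(zoom, x, y):
    # Hypothetical stand-in for a network request to an imagery server.
    return f"imagery bytes for tile z{zoom}/{x}/{y}"

def get_tile(zoom, x, y):
    """Return a tile, hitting the network only on the first request."""
    key = (zoom, x, y)
    if key not in tile_cache:          # cache miss: download and store locally
        tile_cache[key] = fetch_tile_from_server(zoom, x, y)
    return tile_cache[key]             # cache hit: served from local storage

# Repeated views of the same area reuse the locally stored tile.
get_tile(10, 163, 395)
get_tile(10, 163, 395)   # second call does not touch the network
```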

Google Earth allows unique layers to be integrated within the program to access different kinds of information. For example, updates on current earthquakes can be pulled in through a live feed, as sketched below. This capability relates directly to the idea behind Google Crisis Map. This tool, built on top of Google Maps, puts "critical disaster-related geographic data in context" by using the map to highlight affected areas in real time. It can also integrate links to fundraising sites or help hotlines during major crises around the world. This allows users to stay up to date on current disasters and on all of the ways they can ensure their own safety as well as the safety of others. The ability to interact with this data, and to both contribute and download it instantly, shows the potential these geographic information systems have to help the world. This level of disaster-related broadcasting and communication would never have been possible without such a system. Its ability to efficiently organize spatial and geographic data has aided social-science research enormously and will continue to keep users up to date on the major events of any given day.
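To make the "live layer" idea concrete, here is a minimal sketch that writes a few earthquake records into a KML file, the layer format Google Earth reads. The sample events are invented, and a real layer would be fed by an actual data service rather than a hard-coded list.

```python
# Sketch: write a handful of earthquake records as KML placemarks that
# Google Earth can open as a layer. The events below are invented sample data.
quakes = [
    {"lat": 34.05, "lon": -118.25, "mag": 4.2},
    {"lat": 36.20, "lon": -120.80, "mag": 3.1},
]

placemarks = "\n".join(
    "  <Placemark>\n"
    f"    <name>M {q['mag']}</name>\n"
    f"    <Point><coordinates>{q['lon']},{q['lat']},0</coordinates></Point>\n"
    "  </Placemark>"
    for q in quakes
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
    "  <Document>\n"
    f"{placemarks}\n"
    "  </Document>\n"
    "</kml>\n"
)

with open("earthquakes.kml", "w") as f:
    f.write(kml)  # open this file in Google Earth to see the layer
```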

 

Sources:

  1. Google Crisis Map: http://www.google.org/crisismap/weather_and_events
  2. Michael F. Goodchild, "What Does Google Earth Mean for the Social Sciences?"

Intertwining Tweets

Every single thing in this world is connected in one way or another. Some connections may exist only on an extremely basic level, but they always exist. In Scott Weingart's blog post titled "Demystifying Networks," he discusses how many of these networks are created each and every day. More specifically, he touches on how many assumptions are made when networks are established, and on the risks associated with them. The analogy "when you're given your first hammer, everything looks like a nail" captures the temptation to turn everything into a network, which can obscure an idea or object's true purpose. Each of the countless unique networks can center on a certain topic or specific event, making it difficult to associate any one of them with a single emotion or purpose.

[Image: map of Twitter hashtag connections following the Boston Marathon bombing]

Twitter is a type of social network that incorporates millions of these sub-networks. The image above shows the most prevalent hashtags used on Twitter following the Boston Marathon bombing. Every time a user posted a tweet using one of these hashtags, it immediately connected their post and their profile to a huge network of other Twitter users discussing the same event. Users established networks around individual hashtags such as "#prayforboston" and "#bostonstrong," which then combined to form the more comprehensive network shown in the picture, covering the Boston Marathon tragedy as a whole. This example shows how many extensive networks can branch off from a single event or subject, and how individuals can become instantly connected to a network through just one hashtag they post to Twitter.
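As a rough sketch of how such a hashtag network might be assembled from raw tweets, the snippet below links users who posted the same hashtag, using the networkx library purely for illustration; the tweets themselves are invented examples.

```python
import itertools
import networkx as nx

# Invented sample tweets: (user, set of hashtags used)
tweets = [
    ("alice", {"#prayforboston"}),
    ("bob",   {"#prayforboston", "#bostonstrong"}),
    ("carol", {"#bostonstrong"}),
]

G = nx.Graph()
for tag in {"#prayforboston", "#bostonstrong"}:
    users = [u for u, tags in tweets if tag in tags]
    # Connect every pair of users who posted the same hashtag.
    for u, v in itertools.combinations(users, 2):
        G.add_edge(u, v, hashtag=tag)

# The two hashtag "sub-networks" merge into one component around the event.
print(G.edges(data=True))
```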

Social media sites in general have exponentially increased the number of networks created, and have made the process of creating a new network nearly effortless. Following Weingart's own caution on the subject, we need to make sure we do not lose sight of the "theoretical and logical caveats" attached to all of these networks. Naturally, everyone will have differing feelings and outlooks toward the specific networks that are created, and we must recognize those differences and understand their implications. Although a certain hashtag on Twitter may connect a string of users' posts and ideas in a network, that by no means makes the unique feelings and ideas embedded within those posts the same. It is difficult to decipher the level of connectedness across different networks, which is why we must use caution when analyzing any data drawn from them.

 

Sources:

  1. Scott Weingart, "Demystifying Networks," http://www.scottbot.net/HIAL/?p=6279
  2. Image: http://www.washington.edu/news/files/2014/03/Twitter-map.jpg

Turning a Blind Eye to Racial Categorization

[Image: a "none of the above" checkbox]

The categorization of race has always been a controversial topic throughout history. The website titled "The Real Face of White Australia" highlights one instance of this controversy: the policies put in place in the early twentieth century that essentially excluded those who were not white from Australia. This included people who had lived in Australia their entire lives, excluded solely because of their families' origins and the color of their skin. They "found themselves at odds with the nation's claim to be white" and were confronted with discriminatory laws preventing them from leading the Australian lives they had always known. For those confined by this "White Australia" who had never lived anywhere else, this seemed rash and unjust, as they saw themselves as just as much a part of the country as the next person. This specific issue parallels the discrepancy between perceived and personal racial identification, both in the past and today.

The article I recently read (link) outlines the history of collecting census data. For nearly 200 years, before mail-in surveys were used, the government collected data by sending a representative to evaluate households across various categories, including race. The article states that these government workers, or "census enumerators," did not let people characterize themselves. Instead, race was determined by appearance, as judged by the enumerator. This reduced race to a matter of appearance rather than identity, and it raises the same question seen in "The Real Face of White Australia": what does the idea of race actually mean at its core? When mail-in surveys became the method of collecting census data, the number of people identifying as certain races changed drastically. This highlights the gap between personal identification and imposed misclassification that has existed around the world for ages.

This error of classifying someone's race solely on the basis of his or her appearance can also be a problem for any large database management system. A database cannot take personal identity into account when analyzing someone's face or body unless it is given that additional information. When sorting solely through images, it may resort to placing people into different buckets based on physical features or attributes that it finds to be common or similar. While this may be appropriate in some settings, a person's own outlook and self-identification are aspects that can easily be overlooked. In "Humanities Approaches to Graphical Display," Johanna Drucker touches on the concept of "humanistic interpretation" in the expression of digital information and the need for "a co-dependent relation between observer and experience." I believe this applies directly to the idea of racial categorization, and that there is an evident need to consider all relevant factors before attempting to classify a human being.

 

Sources:

  1. Census data collection: http://www.psmag.com/culture/census-data-collection-changed-race-in-america-57221/
  2. Tim Sherratt, The Real Face of White Australia: http://invisibleaustralians.org/
  3. Johanna Drucker, "Humanities Approaches to Graphical Display," Digital Humanities Quarterly 5, no. 1 (2011): http://digitalhumanities.org/dhq/vol/5/1/000091/000091.html
  4. Image: http://1.bp.blogspot.com/-7zTnJIf_wow/UzxE9H5Nv5I/AAAAAAAAo_Y/pLzHrIEd5Tw/s1600/none-of-the-above-428×181%5B1%5D.jpg

Disaster Prevention Through Database Utilization

The first sentence of David Kroenke's Database Concepts gives the reader a simple yet fitting definition of what databases do: they "help people keep track of things." Database management systems lie at the core of all databases and are responsible for keeping the wheels turning. These self-describing "collections of related records" hold any number of tables of like data that can be drawn on later for analysis or simple reference. The situations in which databases can be useful are endless, and in today's digital era they are becoming more crucial than ever.
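Kroenke's "keep track of things" definition can be made concrete with a few lines of SQL. Below is a minimal sketch using Python's built-in sqlite3 module; the table and rows are invented examples of like data stored together for later reference.

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # a throwaway in-memory database
cur = conn.cursor()

# One table of "like data": every row is a record with the same fields.
cur.execute("CREATE TABLE cases (region TEXT, reported TEXT, num INTEGER)")
cur.executemany(
    "INSERT INTO cases VALUES (?, ?, ?)",
    [("Region A", "2014-10-01", 12), ("Region B", "2014-10-01", 3)],
)

# Later analysis or simple reference: query the stored records.
for row in cur.execute("SELECT region, num FROM cases ORDER BY num DESC"):
    print(row)

conn.close()
```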

An article I came across (link) illustrates the increasing importance of databases as technology advances. The recent outbreak of Ebola throughout West Africa has kept the world on its toes, with no one sure where the next positive diagnosis might occur. Health specialists are developing methods to track the geographic movement of the Ebola virus in order to prepare the areas in potential danger. The article stresses the key role that the collection and interpretation of data and metadata can play in the attempt to thwart the spread of the deadly disease. It specifically references Harvard's HealthMap service, which gathers and analyzes millions of social media posts to track where potential "global disease outbreaks" are occurring in real time. The service uses a massive database to store information collected from around the world and processes all of it to geographically locate where key words associated with various diseases are appearing. Relief organizations aim to use similar technology to anticipate where diseases such as Ebola are heading, in order to be better prepared to provide immediate aid and support.
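Stripped down to a sketch, the basic idea behind a HealthMap-style service is to scan location-tagged posts for disease keywords and tally where they cluster. The posts and keyword list below are invented, and this is in no way HealthMap's actual pipeline.

```python
from collections import Counter

# Invented, location-tagged posts; a real service would ingest millions of these.
posts = [
    {"text": "fever and ebola symptoms reported nearby", "location": "Monrovia"},
    {"text": "clinic overwhelmed, suspected ebola case",  "location": "Monrovia"},
    {"text": "great weather today",                       "location": "Accra"},
]

KEYWORDS = {"ebola", "fever", "outbreak"}

mentions = Counter(
    post["location"]
    for post in posts
    if KEYWORDS & set(post["text"].lower().split())   # any keyword present?
)

# Locations with the most keyword mentions flag possible outbreak areas.
print(mentions.most_common())
```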

The collection of mass amounts of cell phone data, including both calls made and text messages sent, has been one method of disease-related data collection with promising results. Using the databases of cell providers, relief organizations have been able to see patterns connected to disease outbreaks. Collating and analyzing this data lets experts see where high concentrations of emergency calls are being made, helping to pinpoint problem towns and regions. The article highlights how vital the development of large-scale data collection will be in the near future, so that data can be gathered and sorted in a more timely manner. If we can use databases, and the data and metadata they store, to their full potential, we will be able to track world disasters more efficiently and ensure that everyone involved in helping the cause knows what is occurring at all times.

 

Sources:

  1. The Ebola Crisis and Big Data: http://recode.net/2014/10/24/the-ebola-crisis-and-where-big-data-can-help/
  2. David M. Kroenke and David J. Auer, Database Concepts (Upper Saddle River, N.J.: Pearson Prentice Hall, 2008), chapter one (link).

 

The Tiers of Categorization

[Image: food chain / trophic levels diagram]

All animals and living organisms are classified within a specific tier of the food chain. These classifications have been established and refined for centuries, and they help define the general flow of survival. In this week's reading, Sorting Things Out, the two concepts of classification and standards are broken down into concrete definitions. In my opinion, standards serve as the foundation that allows various forms of classification to occur. Without a specific set of standards or guidelines to apply to animals or objects, classification is essentially meaningless. The reading states that "a standard spans more than one community of practice… it has temporal reach as well in that it persists over time." The scope of the standards linked to the food chain has shifted over the years, altered to allow new classifications as groundbreaking discoveries continue to be made about new species. All communities have accepted the set of standards tied to the multitude of unique food chains.

 

The food chain has become an accepted way of ranking superiority in our world. While the sun is often seen as the main cog that turns this wheel of life, different forms of the food chain can be broken down and applied to more focused groups. This kind of classification within pre-conceived categories helps to further specify and define the different levels of consumers and producers in our ecosystems. Without intricate categorization that accounts for every species involved in a food chain, it is hard to say where the public's general knowledge of other species and organisms would stand. Different communities may view the categorization of some species differently, but as Sorting Things Out notes, for classification and standards to work in practice, objects must be "able to both travel across borders and maintain some sort of constant identity."

The overall layout of the general food chain and its subcategories has become embedded in today's society. The standards developed over the years will continue to change with unpredictable discoveries and shifts in worldview. Certain classifications may seem set in stone and beyond argument, but there is always the potential that the standards will be altered with time. Categorization within the boundaries set by standards is absolutely necessary to compartmentalize society and analyze its specifics, but the way people think and process information will never stop changing, and it will always have a direct effect on the categorization process.

 

Sources:

  1. Selections from Bowker and Star, Sorting Things Out (Cambridge, MA: MIT Press, 1999).
  2. Image: http://education-portal.com/cimages/multimages/16/Trophiclevels.jpg
  3. "Network thinking in ecology and evolution": http://eeb19.biosci.arizona.edu/Faculty/Dornhaus/courses/materials/papers/Proulx%20Promislow%20Phillips%20networks%20ecol%20evol.pdf

Week 2 Blog – Security in Cyberspace

[Image: cyber security graphic]

Metadata's core principles have been shaped alongside recent advancements in technology. As Anne Gilliland states in "Setting the Stage," the increase in the amount of information being created and shared digitally has drastically expanded the importance and widespread use of metadata. What used to be a term mentioned and understood only by cataloging professionals has become a concept known and practiced by all. Metadata is crucial for individuals trying to locate and access the information they need, as well as for companies and organizations archiving important information. Attention to detail in creating metadata has also become vital in this day and age, tied to this expansion of digitally accessible information.
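To make the idea of descriptive metadata concrete, here is a small sketch of a record using a few Dublin Core-style fields (title, creator, date, subject, identifier). The values are invented, and Gilliland's chapter covers far more metadata types and functions than this.

```python
# A toy descriptive-metadata record using a few Dublin Core-style elements.
# The values are invented; a real record would follow a full metadata standard.
record = {
    "title":      "Letter to the editor, 12 March 1924",
    "creator":    "Unknown",
    "date":       "1924-03-12",
    "subject":    ["correspondence", "local history"],
    "identifier": "archive-item-000123",
}

# Metadata like this is what lets users locate and access the item later,
# for example by filtering a collection on subject keywords.
matches = [record] if "local history" in record["subject"] else []
print(matches)
```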

This digital shift has also caused controversy. The rights surrounding access to certain metadata have been heavily contested, and in one specific case involving Yahoo, government threats came to light (Link to Article). The article covers Yahoo bringing to light a battle with the US government in 2008 over the constitutionality of the government's request for Yahoo's data. The government threatened Yahoo with heavy fines if it failed to comply with its "surveillance efforts." Yahoo released the documents from the legal dispute to the general public to show the extensive efforts it made to keep personal data tied to user information safe. Yahoo ultimately lost the legal fight and was forced to cooperate with the National Security Agency, granting it access to that information.

This controversial incident extends Gilliland's article by bringing security into the conversation about metadata's expanding role. It raises the question of how much regulation should apply to metadata that may include personal information. This is a delicate issue as technology expands, because the amount of personal information stored and managed online is only going to grow. With the recent security breaches around the web, it is safe to say that no personal information put online is absolutely secure. With this in mind, Gilliland's question of "how much [metadata] is too much" can be looked at from a different perspective. It raises questions about how much regulation should be allowed as well as how much personal information should be stored online in the first place. Metadata has been great for recording information and making it available to the general public. However, with the ever-present possibility of personal information being transferred between organizations, and with recent security breaches in mind, it will be interesting to see whether metadata is managed differently in the coming years.

 

Sources:

  1. "Yahoo was threatened with heavy fines by US government over metadata," Washington Post, September 11, 2014: http://www.washingtonpost.com/business/technology/us-threatened-massive-fine-to-force-yahoo-to-release-data/2014/09/11/38a7f69e-39e8-11e4-9c9f-ebb47272e40e_story.html
  2. Anne Gilliland, "Setting the Stage," in Murtha Baca, ed., Introduction to Metadata (Los Angeles: Getty, 2008): http://www.getty.edu/research/publications/electronic_publications/intrometadata/setting.html
  3. Image: http://www.ccmostwanted.com/wp-content/uploads/2013/08/Cyber-Security.jpg