3D Archaeology

http://blog.archaeology.institute/in-2014-practice-photogrammetry-and-3d-edition-software-applied-in-archaeology/

  • Lisa Snyder and Scott Friedman. “Software Interface for Real-Time Exploration and Educational Use of Three-Dimensional Computer Models of Historic Urban Environments.” National Endowment for the Humanities, September 16, 2013.

I apologize – I’m about to totally geek out on you!! In reading Snyder and Friedman’s piece on 3D modeling, I was struck by the variety of applications this type of software has in archaeology – and the very best part is that it can be super simple, and therefore easy to apply in the field! This technology is helping to upgrade the excavation process, which still relies on a lot of handwritten notes and drawings, as well as advancing cultural preservation. There are incredibly exciting applications for this type of software in cases of rescue archaeology, or at sites which are being destroyed by natural processes. At Apollonia-Arsuf in Israel, for example, they are trying to implement a program using this type of tech to digitally reconstruct the Crusader castle on site. It is a UNESCO World Heritage Site, and the castle (which was located strategically on the edge of a cliff overlooking the Mediterranean) is now crumbling piece by piece into the sea as the limestone cliff loses its structural integrity. It is important to note, however, the incredible amount of research and scholarship that goes into these types of reconstruction projects. I myself spent the better part of several months researching extant, contemporary examples of Crusader fortification architecture in the Near East and compiled a database of these forms. This information was to be used to help fill in the gaps where pieces of the Arsuf castle were so badly damaged or missing that they needed to be fabricated, rather than photographed or scanned.

Another incredibly cool application for this tech is underwater archaeology! Water has very different preservation effects on different materials and artifacts, and it can often be incredibly detrimental and destructive to remove an object from the water it has been sitting in for centuries (probably the most well-known example is wooden shipwrecks, which can be incredibly well preserved in salt or brackish water, but can essentially disintegrate very rapidly when exposed to air). At a field school I attended on the island of Menorca in Spain, they are using underwater cameras (even at just over 5 megapixels) to take photos of submerged artifacts. These images are reconstructed in programs like Agisoft PhotoScan, which use photogrammetry to stitch the photos together into a 3D mesh that can be manipulated, exported to programs like AutoCAD or ArcGIS, etc.! The photo above is an example from Menorca.
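For anyone curious what that pipeline looks like as a script, here’s a minimal sketch assuming Agisoft’s Python scripting interface (the method names below follow recent Metashape releases and vary between versions, and the file paths are placeholders, so treat it as illustrative rather than exact):

```python
# Illustrative photogrammetry pipeline, assuming the Agisoft Metashape Python API.
# File paths and output names are placeholders.
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Load the underwater photos of the artifact
chunk.addPhotos(glob.glob("menorca_dive_photos/*.jpg"))

# Find matching features across photos and estimate camera positions
chunk.matchPhotos()
chunk.alignCameras()

# Build depth maps and a 3D mesh from the aligned photos
chunk.buildDepthMaps()
chunk.buildModel()

# Export the mesh for use in CAD or GIS software
chunk.exportModel("amphora_mesh.obj")
```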

Consistency in Hieroglyphs

Shneiderman’s “Eight Golden Rules of Interface Design”

After reading Shneiderman’s eight rules about what makes a good or effective user interface, I am drawn to the rule on consistency and how it makes users’ lives so much simpler. If your primary goal in creating your interface is to have it be easy to use and to have it get used by many people, then consistency is key. Actions are faster, recurring visits and loyalty to your site are higher, and the learning curve for your site or program is much lower. This concept also links up well with issues of standard vocabulary and metadata in terms of consistency. Craigslist is a great example of a less-than-wonderful interface with consistency issues and a shocking lack of standardized vocabulary. People posting on Craigslist have full autonomy over the title of their item, its location, and the format of its price. This leads to unfruitful searches, wherein a search for “dresser” will not turn up items listed as “chest” or “bureau,” and the price is often misrepresented as very low when in fact it is a starting point for negotiation. This lack of consistency spills over into a not-always-user-friendly interface, as the post screens are often incredibly disparate in form as well.
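Just as a toy illustration of why a standardized vocabulary matters for search, here’s a tiny sketch (the listings and the synonym mapping are invented for the example):

```python
# Toy example: literal search vs. search with a controlled vocabulary.
# The listings and the synonym mapping are invented for illustration.
listings = [
    "Antique oak chest, $40",
    "Mid-century bureau, best offer",
    "IKEA dresser, like new",
]

# A controlled vocabulary maps variant terms to one preferred term
controlled_vocab = {"chest": "dresser", "bureau": "dresser", "dresser": "dresser"}

def search(query, listings, vocab=None):
    results = []
    for listing in listings:
        words = [w.strip(",.").lower() for w in listing.split()]
        if vocab:
            words = [vocab.get(w, w) for w in words]
            query = vocab.get(query, query)
        if query in words:
            results.append(listing)
    return results

print(search("dresser", listings))                    # finds only the literal match
print(search("dresser", listings, controlled_vocab))  # finds all three listings
```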

The Kirschenbaum piece, especially towards the end when he discusses the future of user interfaces and even of screens themselves, made me think of the Digital Karnak project. He believes that the future of screens lies outside of the 13-inch box of today’s average laptop screen, and that this fact alone will drastically affect the way interfaces are designed and interacted with by their users. In much the same way, the Digital Karnak project has great potential for operation outside of the laptop screen. At Brown University, there is a small visualization lab (basically a tiny IMAX dome) where, for instance, planetary geology students were taken on a Google Earth-esque trip to Mars and Olympus Mons. This kind of technological resource would be incredible in both educational and museum settings to provide people with a phenomenological, literal walk-through of archaeological sites such as Karnak. In this situation, not only is the interface an entire space rather than a screen, but it is an interface with physical human experience as well.

Evolution of Maps

http://www.archaeology.org/travel/interactivemap-texas/

Both the Intro to Web Mapping and Anatomy of a Web Map were excellent tools for understanding the development of web maps over time. This evolution began with the incredibly simple static map, which was essentially a paper map that had been digitized, and is still very commonly used in research or visual representations. The first great evolutionary leap was to dynamic or distributed maps, which reflect changing data by loading and presenting a new, current data set each time. These dynamic maps are ideal for things we need up-to-the-minute information on, although they are also applicable to data sets which change more slowly. The next steps were animated maps and “real-time” maps, which are essentially dynamic maps automated and linked to sensors that provide the real-time data. Finally along came Google Maps (interactive maps), with the ability to toggle layers on and off, link map features to external websites, etc., followed by the analytic map, which uses these interactive features for more in-depth analysis of the data. The final stage, collaborative maps, is connected with distributed maps in that such maps have multiple sources – in collaborative mapping, anyone can contribute. A good example of an interactive, potentially analytic map is the one linked above about the different types of archaeological sites in Texas.
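To make the “interactive” stage concrete, here’s a minimal sketch of a toggleable web map using the folium Python library (the coordinates, layer name, and popup text are placeholders, not data from the Texas map):

```python
# Minimal interactive web map with a toggleable layer and clickable popups.
# Coordinates and labels are placeholders for illustration.
import folium

# Base map roughly centered on Texas
m = folium.Map(location=[31.0, -99.0], zoom_start=6)

# A layer of archaeological sites that the user can toggle on and off
sites = folium.FeatureGroup(name="Archaeological sites")
folium.Marker(
    location=[30.27, -97.74],
    popup="Example site: a click-through description would go here",
).add_to(sites)
sites.add_to(m)

# Adds the layer-toggle control, like the label layer on a paper site plan
folium.LayerControl().add_to(m)

m.save("texas_sites_map.html")
```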

However, I wonder why no one has mentioned that the use of maps for analysis, and even interactive maps (though not in the digital sense), existed long before the web, when maps were only on paper. Detwiler only details this evolution within web or digital maps themselves, even though humanist scholars have been using maps for analysis for decades. In much the same way that the Texas map above allows you to click on a site and read about it, paper maps had sites or buildings within sites marked in order to refer you to a discussion somewhere else in the text. For instance, archaeological site reports often have a map with locations or buildings labeled (toggle ON your label layer), which you can then reference in the text (not quite a pop-up explanation box, but the same concept). Also, in trying to understand the development of the state, archaeologists often plotted things like trade routes of least energy consumption (the ‘shortest route’ option rather than the ‘lightest traffic’ option).

Finding Paul Revere

Kieran Healy’s article on finding Paul Revere with metadata was very interesting. Aside from the clever point of view of a Royal Security Administration analyst, Healy had some very interesting points to make about metadata. Basically, using only information tracing individual membership in multiple “terrorist” groups, Healy is able to draw some really interesting and useful insights from deeper in the data. Initially, the author converts the table from People vs. Groups to a People vs. People table – Healy does this by multiplying the matrix (table 1) by its flipped self (the transpose of table 1). This simple equation allows the author to quickly manipulate the data and start drawing relations between different people that it might otherwise have taken quite a bit of time to discover manually: Healy is left with links between people based on how many of the same rebel groups they both belong to. The same trick works with the order of the matrix multiplication flipped; in that case, the author is left with a table of Groups vs. Groups, elucidating how many members each pair of groups shares in common. In either case, some quick and easy data visualizations make it obvious quite quickly which groups and/or people were at the heart of the rebel colonial cause.
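Here’s a minimal sketch of that matrix trick in Python with NumPy; the people and groups are names from Healy’s piece, but the 0/1 membership entries below are a made-up miniature, not his actual table:

```python
# People-vs-groups membership matrix: rows are people, columns are groups,
# 1 means the person belongs to that group. Toy values for illustration.
import numpy as np

people = ["Revere", "Adams", "Warren"]
groups = ["StAndrewsLodge", "NorthCaucus", "LongRoomClub"]

M = np.array([
    [1, 1, 0],   # Revere
    [0, 1, 1],   # Adams
    [1, 1, 1],   # Warren
])

# People x People: entry (i, j) counts the groups persons i and j share
person_person = M @ M.T

# Groups x Groups: entry (i, j) counts the members groups i and j share
group_group = M.T @ M

print(person_person)
print(group_group)
```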

I believe this case study has some interesting applications in situations where your metadata collection is limited for some reason. In this case, Healy had only the information on group membership to work with, and was able to tease out some very useful relationships (which were there all along, but which one would probably not have picked up on without the manipulation and data visualizations). For instance, in archaeology our data sets are often limited to information like what types of objects we find and where we find them. Since we study the distant past, it is very unusual to have more information, for instance the name of the artisan who made the item, or who it belonged to. However, Healy’s methods seem to have good applicability in these cases. If, for instance, we could put together a spreadsheet of pottery types vs. their find locations across a large region, nation-state, or even an area like the Eastern Mediterranean, then perhaps we could begin to tease out some central nodes in the data. These nodes may then correspond with production centers, and could help us to understand trade or redistribution patterns in pottery.
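The same co-occurrence trick carries straight over to the archaeological case. A rough sketch, with an invented pottery-type-by-site matrix standing in for real survey data:

```python
# Pottery types (rows) vs. find sites (columns); 1 = type found at site.
# All values are invented for illustration.
import numpy as np

types = ["FineWareA", "CookingWareB", "AmphoraC"]
sites = ["Site1", "Site2", "Site3", "Site4"]

P = np.array([
    [1, 1, 1, 0],   # FineWareA
    [0, 1, 1, 1],   # CookingWareB
    [1, 1, 0, 1],   # AmphoraC
])

# Sites x Sites: how many pottery types each pair of sites shares
site_site = P.T @ P

# A site that shares many types with many other sites is a candidate
# "central node" - perhaps a production or redistribution center.
shared_with_others = site_site.sum(axis=1) - np.diag(site_site)
central_site = sites[int(np.argmax(shared_with_others))]
print(dict(zip(sites, shared_with_others)), "->", central_site)
```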

Data as Capta: A Post Processual Approach

Johanna Drucker, “Humanities Approaches to Graphical Display,” Digital Humanities Quarterly 5, no. 1 (2011)

New Stone Age TSA

http://structuralarchaeology.blogspot.com/2011/11/archaeo-toons-secrets-of-stonehenge.html


Johanna Drucker’s article focuses on the concept of data as capta. She argues that because all data is taken through observation and then interpreted, none of it is “given as a natural representation of pre-existing fact.” In other words, knowledge is by default constructed, simply based on our interactions with it. She argues effectively that traditional data visualization is a tricky and often murky issue. The traditional charts and graphs which present information so clearly and succinctly are often taken at face value as knowledge, when in fact so many decisions and assumptions are baked into the visualization. She gives the example of the numbers of men and women in certain countries at a certain time. The resultant bar chart is clear, and one can easily make the snap judgement that these are the final statistics pertaining to this question. However, as Drucker delves deeper into the prior assumptions and decisions made by the visualizer, the picture becomes much less clear. She begins with a discussion of the non-binary nature of gender, as well as how socio-cultural norms can affect these statistics – such as when a woman is only socially considered to be (and therefore recorded statistically as) a woman once she is of reproductive age. She goes on to consider how the interpreters have dealt with (or not dealt with) populations crossing national boundaries, skewing the entity of the “nation” represented on the graph, or transient populations, which could skew the temporal component. She notes that while the traditional graphs are extremely useful, especially in the case of determining the source of a cholera outbreak, we have to be careful with the information we assume to be knowledge. It may be more useful to humanists to create more complex, messy visualizations that put our prior assumptions and interpretations of the data up front.

A very similar debate can be found in the field of archaeological theory. In the 1960s, the processual school rose to dominance. This type of theory stresses scientific hypothesis testing to create general, systems-based explanations for important cross-cultural themes such as the emergence of the state. These incredibly systematic solutions were meant to be diagnostic regardless of the context, and generally removed any focus on the specific culture or on human agency. In the late ’70s and ’80s, a reactionary school called postprocessualism arose which focused much more acutely on individual agency and was incredibly context-specific in its analysis. These scholars, in much the same spirit as “all data is capta,” believed that the material record could not be treated outside of its specific context and social interpretation.

Databases and the Study of Stuff

http://www.abebooks.com/Maadi-Vol-Predynastic-Cemeteries-Wadi-Digla/9410760608/bd

  • Stephen Ramsay, “Databases,” in Companion to Digital Humanities, edited by Susan Schreibman, Ray Siemens, and John Unsworth (Oxford: Blackwell Publishing Professional, 2004)

Archaeology is essentially the study of stuff – material culture remains, or artifacts, are studied in various ways to extrapolate information about a wider extinct society. Certain case studies in archaeology are incredibly well suited to being organized and then further examined with the use of a database.

Stephen Ramsay defines the purpose of a database (especially in the relational sense): “to store information about a particular domain, and to allow one to ask questions about the state of that domain.” He emphasizes that the particular usefulness of the Relational Model in database design is in the language; instead of simply storing large amounts of data, the Relational Model allows interaction between the individual data points. The example he uses is a database of American novels, and he demonstrates how primary and foreign keys can provide links between the data points. Almost like calling a square in a game of bingo, the primary and foreign keys allow one to relate information across tables: each author record has a primary key, and each work record carries that key as a foreign key, so following the key from a work back to its author tells us that Mark Twain wrote Tom Sawyer.
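Here’s a minimal sketch of that author/work relationship as a relational database, using Python’s built-in sqlite3 module (the schema and the rows are my own toy reconstruction of the idea, not Ramsay’s actual tables):

```python
# Toy relational database: works point to authors via a foreign key.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The primary key of authors is referenced by works.author_id (a foreign key)
cur.execute("CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("""CREATE TABLE works (
    id INTEGER PRIMARY KEY,
    title TEXT,
    author_id INTEGER REFERENCES authors(id)
)""")

cur.execute("INSERT INTO authors VALUES (1, 'Mark Twain')")
cur.execute("INSERT INTO works VALUES (1, 'Tom Sawyer', 1)")

# Asking a question about the state of the domain: who wrote Tom Sawyer?
cur.execute("""
    SELECT authors.name
    FROM works JOIN authors ON works.author_id = authors.id
    WHERE works.title = 'Tom Sawyer'
""")
print(cur.fetchone()[0])   # -> Mark Twain
```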

This type of relational interaction of data can be extremely useful in the study of settlement layout and function, or even for mortuary archaeology. Imagine you have uncovered a graveyard with over 100 individual burials (which is actually a very modest data set). Within each burial you have specific data points such as the sex of the deceased, approximate age, health, location of the grave, and contents (did the person have burial goods? If so, what, how many of each type, etc.?). By inputting all of this information into a Relational Model database, the investigator can begin to draw comparisons between relative wealth or status (quantity/quality of burial goods) and the age or sex of the individual. A pattern in these types of correlations can begin to elucidate the mechanisms of social hierarchy and status within a society, whether status is achieved or inherited (finding infant graves with a lot of wealth is a great example of inherited status), how the society works in terms of gender roles, etc.
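Continuing in the same vein, here’s a hedged example of the kind of comparison query I have in mind (the table and the values are entirely invented):

```python
# Toy mortuary database: compare average number of grave goods by age class.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE burials (
    id INTEGER PRIMARY KEY,
    sex TEXT,
    age_class TEXT,        -- e.g. 'infant', 'adult'
    grave_goods INTEGER    -- count of objects in the grave
)""")
cur.executemany(
    "INSERT INTO burials (sex, age_class, grave_goods) VALUES (?, ?, ?)",
    [("F", "adult", 12), ("M", "adult", 2), (None, "infant", 9), ("M", "infant", 0)],
)

# Wealthy infant graves would hint at inherited rather than achieved status
cur.execute("""
    SELECT age_class, AVG(grave_goods)
    FROM burials
    GROUP BY age_class
""")
print(cur.fetchall())
```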

These databases can also produce a picture of the larger society by relating the location of artifact finds in a settlement site to their function. For instance, if a database search demonstrates that there was a high occurrence of food waste materials in a certain location, it may have been a cooking area. This can then be cross-referenced with the location of any ovens or firepits at the site to further the argument.

What Doesn’t Belong?


http://www.demotivers.com/5412/Who-Doesnt-Belong-Here

I was struck when reading Madrigal’s article by the phenomenon at the end which he dubbed the “Perry Mason effect.” It instantly made me think of those humor posters asking which of these things doesn’t belong. It was incredible that in a categorization system with literally tens of thousands of genres, such a strange little hiccup could occur in what one would consider a relatively important category: most popular actors. Plus, this weird occurrence was not linked to recommendations made to Netflix customers, nor did it indicate that tons of people were watching Perry Mason episodes or movies featuring Raymond Burr. In fact, it was just something that happened during the process of using human preferences, fed into a computer, to create these altgenres. There is really no explanation for the Perry Mason effect. Yet when extrapolating this to wider fields in Digital Humanities, I think this occurrence of computational serendipity may be one of the reasons that humanists are so drawn to analyzing their data with machines. The strange feedback loop from research, to computational model or analysis, and back to human presentation elucidates incredibly interesting “Perry Mason effects” which the researcher alone would not have seen. However, unlike Madrigal, I believe that in some cases of research the explanatory reasoning behind the “something in the code and data” can be traced and found incredibly useful by the researcher.

For instance, archaeologists have been feeding information, spatial and quantitative data about artifacts, into databases and mapping programs to show distribution patterns over a whole site or region. Often, nothing strange happens in the translation of the data back to human presentation (the final map for instance), and it shows generally what it was expected to. But in some instances, new spatial relationships, groupings, etc. come to light during this final stage which were not readily apparent, either in the field or straight out of the field notes. Because these computer systems/programs are mechanical, they help the human researcher to investigate the data without our inherent biases and expectations (though those might still be present in the data itself), and let us see things that we would not have otherwise. Usually in these cases, once the “Perry Mason” effect has been identified, it is possible for the archaeologist to retrace how/why/where this might have happened, and to outline something about the site or culture that may otherwise have gone unnoticed.

Week 2 – Metadata

Field Notes (2)


This is an example of the process followed and the field notes taken on a dig by the Museum of London.

Renfrew, Colin, and Paul Bahn, 2000. “Excavation,” in Archaeology: Theories, Methods, and Practice, 106-116.

National Information Standards Organization, “What is Metadata?” (Bethesda, MD: NISO Press, 2004)

The similarity between metadata for digital objects and the cataloging data and field records given to archaeological artifacts is astounding. Because archaeology is a destructive science, precise documentation and careful records of almost every aspect of the excavation are crucial. These pre-set pieces of information are standardized across the field of archaeology, and include data such as the precise location and depth of the object when it was uncovered, its relationship to any stratigraphic elements (stratigraphy is just the study of the successive layering of different soils over time, and, to over-simplify, allows one to assume that deeper layers are older, while layers nearer the surface are more recent), and even the consistency and color of the soil it was found in. These types of data are standard across the field – even to the point that there is a standardized scale of soil colors (the same concept as the Dublin Core Metadata Element Set, TEI, METS, EAD, or MODS) – so that archaeologists have the interoperability that the Open Archives Initiative is striving for. In the same way, this interoperability allows archaeologists who were not in the field to look into the field records and draw their own conclusions or new questions/projects from the universally understandable data.
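To make the analogy concrete, here’s a rough sketch of what one artifact’s field record might look like when mapped onto Dublin Core-style elements (the artifact, the values, and the exact element choices are invented for illustration):

```python
# Hypothetical artifact field record expressed with Dublin Core-style elements.
# Values are invented; a real project would follow its own documented schema.
artifact_record = {
    "dc:identifier": "APM-2013-0457",                       # catalog / bag number
    "dc:type": "ceramic vessel fragment",
    "dc:coverage": "Area B, locus 12, elevation 14.32 m",   # spatial context
    "dc:description": "Rim sherd from stratigraphic layer III, dark brown silty soil",
    "dc:date": "2013-07-08",                                # date excavated
    "dc:creator": "Trench supervisor, Area B",              # who recorded it
}

for element, value in artifact_record.items():
    print(f"{element}: {value}")
```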

These archaeological records also include the idea of a metadata schema. For instance, the location of the artifact is noted using a predetermined code, which refers to the general area of the excavation down to the actual square-meter grid in which it was found. In the United States, archaeological sites themselves have standard coded titles, such as 47-DR153, which refers first to the state (by its numerical position in an alphabetized list), then to the abbreviation for the county, and finally to the site number.
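Because the code has a fixed structure, it can be unpacked mechanically. A small sketch (the regular expression is mine, written against the 47-DR153 example rather than any official specification):

```python
# Split a site code like "47-DR153" into its state number, county
# abbreviation, and site number. The pattern is illustrative only.
import re

def parse_site_code(code):
    match = re.fullmatch(r"(\d+)-?([A-Z]+)(\d+)", code)
    if not match:
        raise ValueError(f"Unrecognized site code: {code}")
    state_number, county, site_number = match.groups()
    return {"state_number": int(state_number),
            "county_abbreviation": county,
            "site_number": int(site_number)}

print(parse_site_code("47-DR153"))
# -> {'state_number': 47, 'county_abbreviation': 'DR', 'site_number': 153}
```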

These records are also crucial for future identification and preservation, both of which are also main goals of metadata. One concern about these digital projects is that the objects themselves will be lost, outdated, or inaccessible in the future, a concern which metadata can alleviate. In much the same way, if an artifact ever becomes lost in transport or in a messy lab, archaeologists can still study the piece through its detailed field records, even well into the future (for example, we still have access to field notes from the late 1800s, even though some of the objects excavated went missing during WWII).

In keeping with this class, a new program is being developed here at UCLA’s Cotsen Institute of Archaeology where archaeologists can use iPads in the field to embed all of this information into a QR code. The code goes on the bag the object(s) are kept in, and all the pertinent data can be accessed immediately with a quick scan!
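For a rough sense of how that might work, here’s a minimal sketch using the third-party qrcode library for Python (the record fields, the filename, and the idea of encoding the data as JSON are my assumptions, not details of the Cotsen program):

```python
# Encode a small artifact record as JSON inside a QR code for a finds bag.
# Field names and values are hypothetical; requires the 'qrcode' package
# (pip install qrcode[pil]).
import json
import qrcode

record = {
    "site": "47-DR153",
    "locus": "Area B, locus 12",
    "depth_m": 14.32,
    "contents": "ceramic vessel fragments",
    "excavated": "2013-07-08",
}

img = qrcode.make(json.dumps(record))   # build the QR image
img.save("bag_label.png")               # print and attach to the bag
```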