Electronic Literature- A Slave to the A Priori of Grand Narration?

[Image: screenshot from “Ah”]

I enjoyed Drucker’s article on the Poetics of Electronic Textuality- it effectively summarized many of the works I was exposed to in a class on Digital Literature. One of the collections of works we analysed can be found here:

http://collection.eliterature.org/2/

I found it interesting that many of the works illustrate the notions of Speculative Computing and Temporal Modelling. There are also examples of all three forms of creating electronic literature- hypertext, dynamic/kinetic manipulation and display, and programmable texts.

Highlighting the “contrast between the supposed linearity of print forms and the multi-linear hyper-textual forms of digital materials”, Drucker also restates McGann’s point that works of imagination “are not information structures”, “yet, to make them functional within digital formats, they are often treated as if they were”. However, this rhetoric seems trapped within the same structure of a priori knowledge it seeks to refute. Instead, we might find comfort in understanding that a digital production need not fit into a grand narrative, but can be a standalone, autonomous reference, or even an extension of the work of art itself. Like any work that is adapted, translated, or simply a derivative of the “original”, it is inspired by a precedent. Yet the demand that it preserve or reproduce its original should be dropped- it won’t be the same, but that is because it has a different purpose to serve: an alternative space to fit into and a new way of relating to others.

http://collection.eliterature.org/2/works/michel_ah.html

[Image: second screenshot from “Ah”]

To illustrate my point, take the work “Ah”. “Ah” deconstructs the notion of run-on lines: the words scroll from right to left across the screen, much like stock-market updates at the bottom of television screens airing broadcast news. It appears to focus on an internal reconstruction, or a manipulation within the discipline of word and writing itself. In doing so, it comes to embody movement and sound to create story space. Words and letters run over, fall over, and overlap one another in what seems like an uncontrollable stream of consciousness or thought, consistent only in its unpredictability. Even the letters composing an individual word seem to break off and diffuse themselves into other words, making for a literally intratextual experience.

Further, shape and form appear to be completely abandoned, as we can no longer rely on natural language structure to determine meaning. The slow, steady velocity of word movement mimics a constant stream of water, while the overlaps produce a stream-of-consciousness-like inconstancy that permits neither anticipation nor revisiting. In this way, the structure of the work reinforces the very images of sound and water that it refers to. Yet it is, ironically, incapable of communicating meaning, as it forgoes grammatical coherence to appeal to these physical entities. This central paradox highlights the very way in which we engage with the world- our attempts to understand and psychologically translate our phenomenological experience. We draw sense from one part of a sentence, and use our understanding of the structure of the previous moment to build understanding of the sentence arrangement of the next. “Ah” is therefore a constant experiment in building our own mental language with which to confront the world.
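For readers curious about the mechanics of such kinetic text, here is a minimal sketch of a right-to-left ticker in Python- purely illustrative, and not the actual implementation of “Ah”:

```python
import sys
import time

def ticker(text: str, width: int = 40, delay: float = 0.1) -> None:
    """Scroll text right to left across a fixed-width window,
    news-ticker style, by sliding a window over padded text."""
    padded = " " * width + text + " " * width
    for start in range(len(padded) - width):
        frame = padded[start:start + width]
        # '\r' returns to the start of the line, so each frame
        # overwrites the previous one in place.
        sys.stdout.write("\r" + frame)
        sys.stdout.flush()
        time.sleep(delay)
    print()

if __name__ == "__main__":
    ticker("ah ah ah ah ah")
```

Several such streams layered at different speeds would begin to approximate the overlapping, colliding text the piece actually displays; a single ticker is only the smallest unit of the effect.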

Rather than be upset at the work for bastardizing the meaning of run-on lines, or for neglecting to represent the essence of any precedent well, we must take into account what the work itself achieves and its “untapped potential for critical and creative investigation”. Removed from what Oscar Wilde would call the “shackles of verisimilitude”, we can then enjoy these works and interpret them without the sanction of Derrida’s critical distance or the counter-productive circularity of a priori knowledge that grand narratives seek to perpetuate.

User Experience in the World of Entrepreneurship- Evernote and Kickstarter

[Image: Evernote web UI]

http://www.engadget.com/2014/10/02/evernote-web-client/

http://blog.evernote.com/blog/2014/10/02/new-beautiful-evernote-web/

Garrett defines user experience as “the design of anything with human experience as an explicit outcome and human engagement as an explicit goal”. This was an interesting shift for me, as most of the class has covered aspects of machine learning and the limitations of communicating unstructured data to computers that lack the emotion, sensitivity and empathy of humans.

User experience, on the other hand, brings the viewer to the forefront. I enjoyed his breakdown of user experience into its different layers- surface, skeleton, structure, scope and strategy, with the surface being the most concrete and strategy the most abstract. Expounding on the qualities necessary for good design, he also urges designers to understand their product’s “emotional and psychological context of use” to inspire passionate connections.

This has become a pertinent concern in computer science and app development, and most developing companies seek UI/UX designers who can translate the functions of their companies to technological devices and web applications. A recent example is Evernote’s newly redesigned web interface, which streamlines and expedites note-taking on a tech device. As opposed to a desktop app, the “interface fades away to showcase your thoughts” and “re-emerges” when you need it. The app also has the ability to guide you “to the content you’re looking for”. Rather than an appendix or secondary attachment, it has integrated itself to become a “destination for the creative mind” that simulates the interconnected space that probably sits between our ears.

While it often appears that a product’s needs govern its design, Evernote seems to have demonstrated that good design alone is capable of rethinking and even dictating a product’s function, given the capabilities of technology. This inversion has produced interesting results, with several startups now focused on simple ideas executed thoughtfully and resourcefully. The idea of note-taking is neither new nor revolutionary, but the design’s ability to engineer choices around underlying user psychology means that UI/UX is no longer a slave to a product that merely requires a strategy to define its story- it embodies, rather than merely represents, the very personality of the product.

This makes me curious about skeletons that lend themselves to easy use and manipulation- take Kickstarter, for example. As a company whose product is its service platform, Kickstarter relies heavily on developing an interface that is user-friendly, yet flexible enough to allow different creators to market their products. Given users’ competing needs to market their unique products in different ways, I am curious what its web designers were thinking, given that their product was such a complex intangible- a platform that must accommodate as many kinds of products as possible.

Constructing Memory as Space: 9/11 Memorial as a “Cultural Elaboration of Landscapes”

[Image: 9/11 memorial]

[Photo taken 2014-09-11]

This week’s Dunn reading examined the construction of space as artefact in the historical sense, with reference to how the “digital dimension accelerates and simulated the fundamentally creative process of historical reconstruction”. While this is discussed in the more historically rooted disciplines of archaeology and classics, the notion of constructed space reminded me of my visit to MoMA this summer, where an exhibition on “Conceptions of Space” resonated with my experience of the 9/11 memorial. Some of the spaces mentioned in the exhibition were:

Envelope Space- a result of height restrictions and other building regulations in modern cities. The building’s outer limit, rather than the function of its interior, is the guiding principle of design; the envelope organizes the creation of architectural space and becomes a theme in its own right.

Fictional Space- architects conceive space out of imagination and use stories to help their ideas unfold

Space on Steroids- pioneering contemporary designs often combine references to historic buildings with groundbreaking spatial experiments. Architects often seek to create new and vital experiences of space unrelated to a building’s functions. Even if diverse practical needs must be met, space and its interaction with architecture’s existing repertoire of forms and spatial possibilities can be the main focus of the design process

Spaces of Assemblage- artists adopt a creative strategy called assemblage, i.e. grouping found or unrelated objects. They juxtapose different forms, volumes and other spatial elements, and repurpose preexisting construction components.

Performative Space- design that crosses boundaries between architecture, installation art and props for performance

In the various ways explored, architects are able to rely not only on precedents but also on present needs and future projections to think about a building, and about how its design would shape the human interaction and experience that contribute to a productive “cultural elaboration of landscapes”.

While I did not take any pictures of the architectural exhibits, the exhibition made me think about the 9/11 memorial and the thought put into engineering it. For a historical event so close to the hearts of many Americans, the herculean task of building a structurally meaningful and symbolic memorial- one that represents history while remaining aware of the practical limitations of a crowded city like New York- must have been daunting. While not necessarily performative, the two resultant beams of light were purposive, serving as remembrance and respectful recognition. They had a palpable spatiality, balancing a modest unobtrusiveness (the lights could be switched off) without neglecting to remind us of past constructions of space, as opposed to merely “representing and describing them”. It was an endeavor that balanced a past occurrence with the need to address the current climate and the future.

The quote Dunn places at the beginning of the article therefore becomes exceptionally poignant- “That was how I saw it then, and how I continue to see it; along with the five senses. A child of my background had a sixth sense in those days, the geographic sense. The sharp sense of where he lived and who and what surrounded him”. The relation and interactions of people in a given space give it value, and the sixth, geographic sense empowers one to situate oneself as a constituent of one’s surroundings. As such, the notion of space becomes important in constructing identity- the way it is presented and permits us to interact with it not only helps us understand the things around us better, but offers us a more robust understanding of ourselves.

How Is the “Duality of Persons and Groups” Accounted for in Death- or, What Do You Do with a Person’s Facebook Profile After They Die?

[Image: network diagram]

Aristotle is famously quoted as saying, “Man is by nature a social animal…society is something that precedes the individual.” Interestingly, this quote remains significant in light of Kieran Healy’s article “Using Metadata to Find Paul Revere”, which offers an interesting vantage point from which to rethink the idea of man as a socially connected creature.

Healy mentions Breiger’s paper “The Duality of Persons and Groups”, which he recounts as discussing a “basic way to represent information about links between people and some other kind of thing, like attendance at various events, or membership in various groups.” This was the foundation of a “new science of social network analysis”, in which you could gather information about, and understand, a person’s interests and social life solely from metadata, “without much reference to the actual content of what they say.”
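Breiger’s duality has a neat algebraic expression that Healy’s analysis turns on: from a binary person-by-group membership matrix A, the product A·Aᵀ gives person-to-person ties (shared memberships), and Aᵀ·A gives the dual group-to-group ties (shared members). A minimal sketch in Python, with invented names and groups rather than Healy’s actual data:

```python
import numpy as np

# Rows are people, columns are groups; 1 = membership.
people = ["Alice", "Bob", "Carol"]          # hypothetical names
groups = ["Tea Club", "Lodge", "Caucus"]    # hypothetical groups
A = np.array([
    [1, 1, 0],   # Alice: Tea Club, Lodge
    [1, 0, 1],   # Bob:   Tea Club, Caucus
    [0, 1, 1],   # Carol: Lodge, Caucus
])

# A @ A.T is a person-by-person matrix: entry (i, j) counts the
# groups persons i and j share; the diagonal counts each person's
# own memberships.
person_ties = A @ A.T
print(person_ties)
# [[2 1 1]
#  [1 2 1]
#  [1 1 2]]

# A.T @ A is the dual group-by-group matrix: shared members.
group_ties = A.T @ A
```

Nothing here touches the content of anyone’s conversations- the ties emerge purely from co-membership metadata, which is exactly Healy’s point.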

The point reminded me of a class on social network analysis I took my freshman year- it exposed me to tools like Wolfram Alpha, with which we attempted to create networks based on the relationships formed between over 100 characters in a Scandinavian mythological tale.

If we take Aristotle’s claim that “society precedes the individual” to be true, society traditionally absorbs the emotional cost of a death through rituals, prayers and the like. There seems to be a need to create a non-physical presence in honor of remembering someone- this is why we commemorate death anniversaries, visit tombstones and erect memorials for those who died in war.

However, when this is translated into metadata, it seems that death cannot be properly accounted for by a computer or a social network analysis tool. What are we expected to do with someone’s Facebook profile when they die? Does the deceased retain the same privileges as someone actively operating his Facebook profile- unique in its individuality, but also a combination of universally established metadata standards?

While not an immediately pressing concern in day-to-day human interaction, a death in the Scandinavian tale (there were many deaths, constantly changing the dynamic and direction of the story) heavily impacted narrative progress and how characters interacted with one another. When this was translated onto a social network tool, neither removing the character’s profile completely nor leaving it as is seemed to work: if the connection remained, we had to account for the fact that he could no longer introduce mutual friends; yet removing him would cause other characters to lose their connection with one another.
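The dilemma is easy to reproduce with any graph library. Here is a sketch in Python using networkx- the character names are stand-ins borrowed from the Völsunga saga tradition, not our actual class data:

```python
import networkx as nx

# A small stand-in network: Sigurd is the broker between two clusters.
G = nx.Graph()
G.add_edges_from([
    ("Sigurd", "Brynhild"), ("Sigurd", "Gunnar"),
    ("Sigurd", "Gudrun"), ("Gunnar", "Gudrun"),
    ("Brynhild", "Atli"),
])
print(nx.is_connected(G))     # True: everyone is reachable

# Option 1: delete the dead character outright.
H = G.copy()
H.remove_node("Sigurd")
print(nx.is_connected(H))     # False: Brynhild/Atli split off

# Option 2: keep the node but flag it as deceased, preserving old
# paths while marking that it can no longer broker new ties.
G.nodes["Sigurd"]["deceased"] = True
```

Neither option is faithful: deletion severs paths the living owe to the dead, while a mere flag still lets the node behave like a live participant in most analyses.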

Speaking from a human point of view, the deceased remains connected to his networks insofar as everyone remembers him. However, these connections, when manifested in social network analysis, differ from those between two people who are still alive- effectively raising the question of how the duality of persons and groups is accounted for in light of death. Does one relinquish membership in a group through physical absence, or does others’ remembrance and retention of their emotional ties suffice as presence? How, then, will this affect metadata standards and the accurate representation of one’s identity through social networks?

Drucker vs Otaku: Japan’s Database Animals

[Image: “expanded stereotypes”- a chart classifying female anime/manga character types]

original file: https://s-media-cache-ec0.pinimg.com/736x/96/ce/19/96ce193cba270dbad17940fd7c84a235.jpg

This week’s reading, “Humanities Approaches to Graphical Display” by Drucker, really struck a chord with me. Coming from a humanities background (I am a Philosophy major also looking to create my own major), taking this class has been a challenge as I adapt my arguably more hermeneutical approach to processing information to the framework of the class.


This challenge also manifested itself as I was picking a topic for my final project on subcultures. I was interested in the psychology behind Otaku culture, which describes an obsessive consumerist culture dominated by manga and anime fans- but how was I to translate this information into data form? I was unsure how numbers, statistics and figures could possibly represent the depth of thought and complexity portrayed in interpretations of Otaku culture.


Reading Drucker, however, attuned me to the idea of approaching information as “capta” rather than “data”. Noting the “etymological roots of the terms data and capta” could, in her view, “make the distinction between constructivist and realist approaches clear”. The idea of “capta” would appeal to the need for a humanistic, rather than robotic approach to handling and classifying information. This, in turn, would acknowledge the “partial, and constitutive character of knowledge production, and recognize that knowledge is constructed, rather than given as a natural representation of pre-existing fact”.


However, upon reading reviews of the Japanese philosopher Azuma Hiroki’s book Otaku: Japan’s Database Animals, I was struck by the observation that otaku funnel the works they worship into a kind of character-based slavery, as opposed to the narrative-based freedom we expect forms of entertainment and escapism to offer. In other words, Azuma’s approach seems to refute the need for “capta” and interpretive autonomy in light of consumer- and desire-driven markets. Just as Drucker acknowledges that “realist approaches depend above all upon an idea that phenomena are observer-independent and can be characterized as data”, Azuma concedes that “it is only the surface outer layer of otaku culture that is covered in simulacra”, and that underlying all of it is a database, or factory, for creation. He further argues that scrutiny of this database shows that, beneath the “chaotic inundation of simulacra”, anime and manga character constructions become “ordered” and “understandable”. In the picture above, we see how different female characters in anime and manga may be classified, effectively breaking down a character’s story and reducing it to a set of interchangeable characteristics. What is perhaps more unfortunate is that these stereotypes double as rules and guidelines for “simulacra to be successful”.
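Read mechanically, Azuma’s “database” claim implies that new characters can be generated simply by sampling from trait tables. A deliberately crude sketch in Python- the trait lists are my own invention for illustration, not Azuma’s actual taxonomy:

```python
import itertools

# A hypothetical "database" of interchangeable character elements.
hair = ["twin-tails", "short bob", "long straight"]
temperament = ["tsundere", "stoic", "cheerful"]
accessory = ["glasses", "ribbon", "cat ears"]

# Every character is just a point in the product of the trait tables.
characters = list(itertools.product(hair, temperament, accessory))
print(len(characters))   # 27 "distinct" characters from 9 elements
print(characters[0])     # ('twin-tails', 'tsundere', 'glasses')
```

The unsettling implication is that the combinatorics, not any narrative, does the creative work.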


The prospect of anime culture standing behind a veil of originality while being, at its simplest, no more than a convoluted system of mixing and matching features from a fixed and limited database is heartbreaking. While I am hesitant to accept his view too readily, I am also excited to read Azuma’s primary text further, and I look forward to seeing how his analyses will inform the outcome of my final project.

Secondary source: http://eyeforaneyepiece.wordpress.com/2013/07/10/notes-on-otaku-japans-database-animals-part-4-moe-elements/

Primary source: Hiroki Azuma, Otaku: Japan’s Database Animals

Law and the Human Condition- How to Represent and Extrapolate Controversial Data?

http://demonstrations.wolfram.com/TheAppealsCourtParadox/

http://demonstrations.wolfram.com/ThePersuasionEffectATraditionalTwoStageJuryModel/

A comment raised in the Data + Design article that really stuck with me was the notion that “data is around us and always been” and that “only recently have we had the technology to efficiently surface these hidden numbers, leading to greater insight into our human condition.” Given that the human condition encompasses what is perceived to be an unalterable part of humanity- our tendency toward error and fallibility- it is interesting to imagine instances where we might conceivably quantify such intangible concepts, let alone draw insight from them. This standpoint is especially notable given the humanities’ general aversion to the need to quantify everything in the world and see it in black and white.

This reminded me of the computational knowledge engine Wolfram Alpha, which “takes the world’s facts and data” and computes them across a range of topics. I went to the Wolfram Demonstrations website (where people can showcase projects they have been working on) and found an interesting collection, including a fair number in the legal field, such as the two linked above. The Appeals Court Paradox, in particular, takes into account the probability that each judge votes correctly, and factors in whether the judges vote independently, to determine the likelihood of a “correct” ruling being delivered.
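The baseline arithmetic behind such models is a Condorcet-style calculation. Here is a minimal sketch in Python- my own simplification assuming fully independent judges, not the Wolfram demonstration’s actual model (which, as noted above, also factors in non-independent votes):

```python
from math import comb

def p_majority_correct(p: float, n: int = 3) -> float:
    """Probability that a majority of n independent judges,
    each correct with probability p, reaches the correct ruling."""
    k_min = n // 2 + 1  # smallest winning majority
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(k_min, n + 1))

print(p_majority_correct(0.7))      # ~0.784: a panel beats one judge
print(p_majority_correct(0.7, 9))   # larger panels amplify accuracy
```

Under independence, a panel of fallible judges outperforms any single member; the interesting (and paradoxical) behavior appears once that independence assumption is relaxed.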

The projects point to a more pressing, overarching issue in legal rulings and procedure: judges’ bias, however reprehensible, is difficult to identify and allege, and seems an inescapable part of the decision-making process. Especially after the Hobby Lobby case and other recent decisions split 5-4, we now understand rulings also as a product of judges’ personal ideology or political affiliation. This has produced a notable drop in public confidence in the objectivity the judicial system is expected to deliver, such that rulings seem more a result of chance than of law.

Setting aside the assumptions made in deriving the numbers for the initial calculation, the Wolfram demonstration therefore seems capable of reconciling statistics with the need for grey areas and in-between spaces (as opposed to black and white) by calculating probabilities.

Then again, there seems to be something unsettling about basing the present on the past: gathering data from past occurrences and extrapolating from it to predict the future. Problems also arise because the data set of choice is conceptually fuzzy- what is the “correct” decision in relation to the law? If the notion of correctness is tied to our personal beliefs, how might we represent that in an empirical data set?

At present, although data can be useful in representing non-contentious information, it remains to be seen whether it can assist us in illuminating controversial topics in the realm of ethics and law, both of which are underpinned by the human condition.

Classification, Standards and Aristotle’s 4 Causes- The Case of the Withering Rose

[Image: diagram of Aristotle’s four causes]

Bowker and Star’s discussion of classification and standards in “Sorting Things Out” led me to think about Aristotle’s 4 Causes, which he saw as an effective way of understanding objects in the world. This appears to be a more abstract but flexible way of thinking about objects in context, which would subsequently aid in their classification. The equation is as follows:

Material Cause + Formal Cause + Efficient Cause = Final Cause

This is in contrast to a classification system with categories that are mutually exclusive- the article mentions that “a rose is a rose, not a rose sometimes and a daisy at other times”. Intuitively, this makes sense as distinct categories enable us to better identify things by reference to their specific properties. However, this seems to me to pose a problem in an age where knowledge is in flux and we look to gather information about an object as it morphs over time. While this may not always be the case for historical and archiving purposes, it is interesting to examine how classification works for objects that are highly susceptible to the progression of time and across worlds.

For instance, a rose bud differs from a blossomed or withered rose in that each is at a different stage of development. Each stage of a rose’s development has, appended to it, a specific set of properties that are distinct. Yet a classification system cannot capture this progression; it merely classifies each under “rose”. The issue is exacerbated in the context of the 4 Causes, as each type of rose would have a different material, formal and efficient cause, leading to a different final cause.

Material Cause: what constitutes the object (a rose consists of a thorny stem and veiny petals made of plant matter)

Formal Cause: the ratio or general form an object takes (a rose is 90% stem and 10% flower)

Efficient Cause: the thing that motivates creation or change (pollination is the efficient cause of roses growing)

Final Cause: the aim or purpose the object serves (a rose is an organ for plant reproduction and an ornamental object)

This sort of ambiguity is caused by a definitive change in the properties of the object, as opposed to ambiguity over which category of a classification a rose should fall into. It demonstrates that classification, at its present stage, is capable only of registering static information. While Aristotle’s system seems to account for standards (as opposed to classifications) that can withstand the test of time, it fixes those standards relative to a specific object’s constituents, rather than treating the object’s constituents as changing over time.

Since a withered rose has a different material, formal and efficient cause from a rosebud or a blossoming rose, Aristotle’s system would still consider the blossoming rose and the withered rose to be two different things. If so, how would we prove and track (via either method of classification) that it was the same rose that blossomed and withered? In broader terms, how are our systems of classification working to address the problem that two objects cannot be proven to be one and the same when their properties and standards have morphed over time?

The funny thing, though, is that Aristotle’s 4 Causes continue to influence our notions of classification and standards today. This makes me curious about the possibility of developing “real time” classification systems that can grow and track changes in the data of the objects they describe.
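As a closing thought experiment, such a “real time” classification might keep a stable identity while versioning an object’s causes over time. A minimal sketch in Python- the class names, fields and rose stages are my own invention, not a proposal from the readings:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class CauseSnapshot:
    """One timestamped record of an object's Aristotelian causes."""
    observed: date
    material: str   # what it is made of
    formal: str     # the form or ratio it takes
    efficient: str  # what brought about this state

@dataclass
class TrackedObject:
    """A classified object with a stable ID and a history of states,
    so 'bud', 'blossom' and 'withered' remain one and the same rose."""
    object_id: str
    kind: str
    history: list = field(default_factory=list)

    def record(self, snapshot: CauseSnapshot) -> None:
        self.history.append(snapshot)

rose = TrackedObject("rose-0042", "rose")
rose.record(CauseSnapshot(date(2014, 5, 1), "green bud tissue",
                          "closed bud", "spring growth"))
rose.record(CauseSnapshot(date(2014, 6, 1), "open petals and stem",
                          "90% stem, 10% flower", "pollination"))
rose.record(CauseSnapshot(date(2014, 9, 1), "dry petals",
                          "wilted form", "senescence"))
# Identity persists across changing causes: same object_id, new states.
```

Here the same object_id persists across every snapshot, so the bud, the blossom and the withered flower remain provably the same rose even as their material, formal and efficient causes change.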