Previous: Vector | Next: Conclusion

Progressive Vector Diagram of wind patterns over the US maritime coast [Law, 2000]

Cultural production that tracks trends, indices, and metadata in order to draw larger inferences from large sets of data or interactions (another form of narrative) constitutes a set of practices for tracking social and informational “flows”. Extending the mathematical/logical method Deleuze applies to the vector, the flow is an acceleration or change in the speed or direction of vectors (usually large numbers of them) that reveals a trend or “shape”. As discussed at the end of the section on structure, once hypernarratives become indeterminate in authorship and reading, and are released to collective processes, the idea of trying to find closure appears almost silly. To derive the narrative of large sets of lexia or interactions at macro-scales, one has to create indices or maps to get a feel for the meta-structures of the overall terrain; then one can, if desired, zoom down into more specific areas of information or narrative. The index and the map are two prominent methods of charting flows of interaction, and they create an intuitive or affective relationship between the viewer and the data. In many ways Debord’s psychogeography applies here, except that we are now dealing with large terrains of information rather than a city or landscape, and the index or map orders the terrain so one can wander through it. In my examples, the artists track market trends, program flows in classic video games, and the terrain of teenage heartbreak. Each takes a very complex set of data and attempts to create a visually tangible schematic of it, one that is both intuitive and aesthetically compelling.

Martin Wattenberg has been creating cognitive maps that depict structural relationships in complex systems of data in order to reveal trends and correlations. His Idealine, the first new media artwork commissioned by the Whitney Museum of American Art in New York City, is a database-driven timeline of net art of the 1990s. He describes it as a “… timeline of net artworks, arranged in a fan of luminous threads. Each thread corresponds to a particular kind of artwork or type of technology. The brightness of each thread varies with the number of artworks that it contains in each year, so you can watch the ebb and flow of different lines of thought over time” [Wattenberg 2001]. The categories range from the conceptual, such as minimal, to genre, such as humor or activism, to the technical, as in a given programming language like C++. As the visitor scrolls across the lines of affinity, mapped by luminosity according to the frequency of their occurrence, the “fan” set or “concurrent timelines” (depending on the visualization method) pop open to reveal works in the database. Net artists were invited to submit their works online, and the submissions were then placed into the piece. Idealine remains one of the few attempts at mapping 1990s net art that exists online.
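The brightness mapping Wattenberg describes can be sketched in a few lines. The category names and counts below are invented for illustration; the actual Idealine data and rendering are far richer.

```python
# Sketch of Idealine's brightness rule: a thread's brightness in a
# given year tracks the number of artworks in that category that year.
# Categories and counts here are hypothetical, not Whitney data.
counts = {
    ("humor", 1995): 3, ("humor", 1996): 7,
    ("activism", 1995): 5, ("activism", 1996): 2,
}

peak = max(counts.values())
# Normalize each (category, year) count to a 0..1 brightness value.
brightness = {key: n / peak for key, n in counts.items()}
```

Scanning the resulting values year by year gives the "ebb and flow" effect: a category's thread brightens in years when more works fall into it.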

Wattenberg’s most widely known work is Map of the Market [Wattenberg et al., 1998-], created for SmartMoney.com. It uses the rectangular treemap scheme popularized by Ben Shneiderman, also used in programs like GrandPerspective for the Mac, where disk usage is shown by size and type of file. Map of the Market gives an intuitive feel for the activity of the market by mapping market capitalization to rectangle size and price change to a green-to-red color scale. By looking at various sectors of the market, such as commodities, utilities, and banking, and then varying my time snapshot, I was able to get a very quick idea of which of the stocks selected for the SmartMoney.com map were performing well in the short and long term, as well as a good overview of the “portfolio” in the map. Work of this sort is a good index of the overall market and the performance of sectors and particular stocks; dealing with them discretely at first glance would be far more of a challenge.
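The treemap principle behind Map of the Market can be sketched with the simplest variant, slice-and-dice: a rectangle is cut into strips whose areas are proportional to the weights (here, capitalization). The weights below are invented, and the actual piece uses a more sophisticated nested layout than this one-level sketch.

```python
def slice_and_dice(weights, x, y, w, h, vertical=False):
    """Partition rectangle (x, y, w, h) into strips whose areas are
    proportional to `weights` -- the simplest treemap layout."""
    total = sum(weights)
    rects = []
    offset = 0.0
    for wt in weights:
        frac = wt / total
        if vertical:
            rects.append((x, y + offset * h, w, frac * h))
        else:
            rects.append((x + offset * w, y, frac * w, h))
        offset += frac
    return rects

# Hypothetical sector capitalizations, laid out in a 100x100 canvas.
caps = [40, 25, 20, 15]
rects = slice_and_dice(caps, 0, 0, 100, 100)
```

In the actual piece this spatial layout is combined with a second channel, color, so that a glance conveys both a stock's weight (area) and its performance (green to red).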

Ben Fry’s best-known work is Valence [Fry, 2002], an algorithm that tracks occurrences of and associations among items in large sets of data; it was featured in the displays of the movie Minority Report and in the 2002 Whitney Biennial. Fry has used Valence to analyze the writing of Mark Twain, to visualize genome data, and to make comparative analyses of books. As the program reads through the data, it tracks several factors and creates intuitive associations between them. In the case of Twain, the more often a word occurs, the further it moves from the center for better visibility, and the more often two words appear in sequence, the closer together they are drawn. Valence gives form to a gestalt: a “feel” for the qualitative properties of a body of data, like Twain’s prose. As Fry states on his website:

The premise is that the best way to understand a large body of information, whether it’s a 200,000 word book, usage data from a web site, or financial transaction information between two multinational corporations, is to provide a feel for general trends and anomalies in the data, by providing a qualitative slice into how the information is structured. The most important information comes from providing context and setting up the interrelationships between elements of the data. [Fry 2002]

While the cultural acceptance of Valence attests to its compelling nature, Fry appears to wonder whether it ultimately has utilitarian purposes. To this end, he has applied the general algorithm to visualize web traffic by page hit and user information, but in a straight-line form rather than with the arcing method of the popularly known version. With the web version of Valence, frequent hits and the frequency of hits between adjacent pages are evident. Fry successfully takes a body of data and its interactions and renders another clear, legible set of associations.
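The bookkeeping underneath a Valence-style reading of a text, as described above, amounts to two tallies: how often each word occurs, and how often each pair of words appears in sequence. This is a minimal sketch of that counting step only, not Fry's actual code or his layout algorithm.

```python
from collections import Counter

def word_stats(text):
    """Tally word frequencies and adjacent-pair frequencies --
    the two quantities a Valence-style layout draws on."""
    words = text.lower().split()
    freq = Counter(words)                 # occurrence count per word
    pairs = Counter(zip(words, words[1:]))  # count per adjacent pair
    return freq, pairs

freq, pairs = word_stats("the cat sat on the mat the cat slept")
```

In the visualization described above, `freq` would drive how far a word sits from the center, and `pairs` how close two words are drawn to each other.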

Two mapping works that share a similar function but have different ends are Fry’s Distellamap: Pac-Man [2005], shown in MoMA’s Design and the Elastic Mind exhibition, and Wattenberg’s Shape of Song [2001], which was shown at Bitforms Gallery, NYC, and can be seen on the Turbulence.org site. Both map temporal structures: one the execution pattern of the assembly code of classic Atari 2600 games, the other the structures of musical compositions. In Shape of Song, Wattenberg loads his algorithm with MIDI digital music data. It then looks for deep structures in the music, like repeating themes and motifs, and draws sets of arcs between them to signify their similarity. For example, the folk tune “Oh Clementine” shows a clearly repetitive structure, while Madonna’s “Like a Virgin” shows a much more chaotic pattern. Shape of Song is visually seductive; but while it offers a compelling visual metaphor for the structural qualities of a song, one may wonder about the desire to believe a map because of its aesthetics, or because it was computationally generated.
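The core move in Shape of Song, as described above, is finding repeated passages and connecting their positions with arcs. A naive sketch of that idea, under the assumption that a passage is a fixed-length run of identical notes: this is an illustration, not Wattenberg's algorithm, which works on MIDI structure rather than a flat note list.

```python
from collections import defaultdict

def repeat_arcs(notes, window=4):
    """Return (earlier, later) index pairs wherever the length-`window`
    subsequence starting at one position recurs at another."""
    seen = defaultdict(list)  # subsequence -> earlier start positions
    arcs = []
    for i in range(len(notes) - window + 1):
        key = tuple(notes[i:i + window])
        for j in seen[key]:
            arcs.append((j, i))  # arc from each earlier occurrence
        seen[key].append(i)
    return arcs

# A maximally repetitive "tune": the same four notes three times over.
notes = [60, 62, 64, 65] * 3
arcs = repeat_arcs(notes, window=4)
```

A repetitive tune like the one above produces a dense, regular web of arcs, while a through-composed or chaotic piece produces few: exactly the visual contrast the essay notes between “Oh Clementine” and “Like a Virgin”.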

Fry’s Distellamap: Pac-Man offers a similar analysis of a temporal data structure, but here it is the execution of the 6507 assembly code of popular Atari 2600 video games. For the Design and the Elastic Mind exhibition, Fry displayed a disassembly of the Pac-Man cartridge. In the schematic, the “opcodes” (instructions) appear at the left, and the raw character/ship data is visualized as a simple bitmap at the end, reflecting the structure of the cartridge code. The trajectory of execution is then shown as a series of arcs jumping from block to block. The ghosts are clearly visible, as is Pac-Man himself. The significance of this body of work is that it provides a quick, intuitive glance at programming styles: compared with disassemblies of other cartridges, the Pac-Man code shows a convoluted map of the artificial intelligence within the game, revealing its unusual complexity. I would imagine that for vintage Atari cartridge hackers, Fry’s mapping schema might give insights into the structure of a cartridge’s game engine (kernel) and how to modify it. But even simply as a display of the code and its execution, it evokes a sense of wonder at the limitations vintage programmers had to contend with, as well as nostalgia for the first waves of video gaming. In a way, it is a map of 1970s technoculture.
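The arcs in Distellamap connect jump instructions to their targets. A crude sketch of that extraction, scanning a ROM image for the 6507's absolute JMP (0x4C) and JSR (0x20) opcodes: note that a naive linear scan like this will misread data bytes as opcodes, which is precisely why Fry's piece follows actual execution instead. The tiny ROM below is invented, not Pac-Man code.

```python
def jump_arcs(rom, base=0xF000):
    """Naively scan a 6507 ROM image for absolute JMP/JSR instructions
    and return (source_address, target_address) pairs for arc drawing."""
    arcs = []
    i = 0
    while i < len(rom) - 2:
        op = rom[i]
        if op in (0x4C, 0x20):  # JMP absolute, JSR absolute
            # 6507 operands are little-endian: low byte, then high byte.
            target = rom[i + 1] | (rom[i + 2] << 8)
            arcs.append((base + i, target))
            i += 3  # skip the two operand bytes
        else:
            i += 1  # caveat: may be stepping through data, not code
    return arcs

# Invented fragment: LDA #$01 followed by JMP $F034, then padding.
rom = bytes([0xA9, 0x01, 0x4C, 0x34, 0xF0, 0x00])
arcs = jump_arcs(rom)
```

Drawing each pair in `arcs` as a curve over the code listing gives the block-to-block trajectory the essay describes; the density and tangle of those curves is what makes the Pac-Man ghost AI read as unusually complex.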

My last example of work that engages flows in information sets is The Dumpster [Golan Levin, Kamal Nigam, Jonathan Feinberg, 2006]. The piece analyzes millions of blogs (by definition online) and parses their entries to cull the ones that have to do with teenage breakups. As Levin states, “visitors to the project can surf through tens of thousands of specific romantic relationships in which one person has ‘dumped’ another.” The 20,000 or so breakup entries gathered over 2005 are mapped by date, gender, and daily volume. When I interacted with the piece, I was led to general conclusions that also carried an emotional impact. The posts I read were predominantly female (a quick scan through the left-hand point cloud suggests this correlation), and they reveal teenage women kicking out their young men, comforting one another, triumphing, and lamenting. On one hand, The Dumpster provides an affective, sympathetic map of young romance and its patterns of demise. But as with many of our maps of flows, one asks what data are being analyzed, and under what criteria. Could The Dumpster be a tool for sociological research, or is it a social media index of a certain cross-section at a certain time that suggests certain results? Or are all of these flow-tracking works more important for their qualitative/affective aspects than for their quantitative use? I believe that, if nothing else, works that track trends and flows of interactions and processes give their viewers some inference about the meta-structures of huge amounts of data, from which they can make intuitive leaps. More importantly, these works by Wattenberg, Fry, and Levin reflect a culture that is awash in data and is developing new strategies to make sense of it.
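The culling step described above can be sketched as a keyword filter. The phrases below are invented examples, and the project's actual text classification (contributed by Kamal Nigam) was considerably more sophisticated than pattern matching; this sketch only illustrates the "parse and cull" idea.

```python
import re

# Hypothetical breakup-phrase filter in the spirit of The Dumpster's
# culling step. Phrases are invented for illustration.
BREAKUP = re.compile(
    r"\b(dumped (me|him|her)|broke up with|we broke up)\b", re.I
)

def cull_breakups(posts):
    """Keep only the posts that mention a breakup phrase."""
    return [p for p in posts if BREAKUP.search(p)]

posts = [
    "he dumped me over text :(",
    "great concert last night!",
    "we broke up again...",
]
breakups = cull_breakups(posts)
```

The point of the sketch is how much the map depends on this filter: change the phrase list (or the classifier) and the "terrain of teenage heartbreak" shifts with it, which is one concrete form of the what-data-under-what-criteria question raised above.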
