In an essay titled “What Can a Network Do?,” Alexander Galloway discusses the need for humans to adopt machine-like capabilities for reading networks. Instead of treating a network as a text—as humanities scholars would want—we should instead read it as a machine would, through a process of parsing. This procedure takes data and sorts it into categories of relevance in order to create a meaningful analysis. The parsing machine par excellence is the algorithm, and it dominates much of our digital lives. In recent years, algorithms have been telling us what music to listen to, whom we should date, what stocks we should buy, and even what we should eat. It comes as no surprise, then, that they should also tell us what art we should view. But what happens when the art we are looking at becomes the algorithm itself?
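The parsing Galloway describes—taking data and sorting it into categories of relevance—can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the record fields and category names are invented for the example), but it makes the essay's later point concrete: whatever fails to match a known category simply disappears from the result.

```python
# A minimal, hypothetical sketch of "parsing": raw records are sorted
# into predefined categories of relevance, and anything that fails to
# match a known category is silently discarded.

def parse(records, categories):
    """Sort records into known categories; unmatched records are dropped."""
    parsed = {category: [] for category in categories}
    for record in records:
        label = record.get("genre")
        if label in parsed:  # only "relevant" data survives
            parsed[label].append(record["title"])
    return parsed

records = [
    {"title": "Track A", "genre": "ambient"},
    {"title": "Track B", "genre": "techno"},
    {"title": "Track C", "genre": "field recording"},  # no category: lost
]

print(parse(records, ["ambient", "techno"]))
# → {'ambient': ['Track A'], 'techno': ['Track B']}
```

Note that "Track C" produces no error and leaves no trace; the parser's output looks complete even though something was excluded.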
When I was initially asked by Artsy to discuss the sale of algorithms at an upcoming art auction, I was immediately perplexed. So many questions came to mind: Are algorithms art? What happens to the intellectual property at the point of sale? What is actually acquired when one purchases an algorithm? Who would even buy an algorithm?
When I discussed some of these initial questions with Sebastian Chan from The Smithsonian’s Cooper Hewitt Museum, he said, “It’s interesting what happens to things that aren’t able to be contained as a physical thing. We should consider them in another way and we should try to look at what their role is.” When collecting the Planetary algorithm in 2011, he further remarked, the museum was “trying to collect the idea and method, and this was one instance of that method rather than merely [an] app.”
Planetary, 2011–12, courtesy of Cooper Hewitt, Smithsonian Design Museum
The notion of collecting and preserving an idea is not all that uncommon to the art world. In a round table conversation with Artsy engineers two weeks ago, the notion of collecting code or algorithms from an institutional standpoint was likened to collecting conceptual art and performance pieces. Code and conceptual art share a similar reliance on ephemera and documentation. However, Artsy engineer Daniel Doubrovkine commented that contemporary museums and institutions are still struggling to present code-based works in the same faithful fashion as conceptual art projects: “I think we need to put code in social context. For example, early programmers were mostly women, and creating exhibitions around women programmers and the art of their programming is a needed social context.”
As we spoke, it became clear that the social context for the collecting, distribution, selling, and preservation of algorithms is an initiative that is sorely needed. Chan noted, “I don’t think you can separate art from the context of history. Art is produced in a society, and it occurs in a society; you can’t separate it out.” Furthermore, he claims that because “Software and code is very much part of how we create culture,” new standards for the long-term sustainability of algorithms must become responsibilities of institutions like Cooper Hewitt.
With that longevity in mind, I asked what programmers could do to reinforce the context of their code within their work, bringing up the End User License Agreement (EULA) of nato.0+55+3d, an early live video mixing software, as an example. I remembered that a stipulation within the EULA, written by the anonymous author/entity Netochka Nezvanova, made users agree that the US military was perpetrating crimes against humanity by conducting airstrikes in Kosovo. Thus, the social context of using nato.0+55+3d was embedded within the software itself.
Installation view of Immersion Room. Photo: Matt Flynn © 2014 Cooper Hewitt, Smithsonian Design Museum.
While Chan and Doubrovkine reflected on the general opacity of all EULAs, Artsy engineer Orta Therox commented on how this social context can manifest within an open-source community. Therox suggested that the tracked changes and comments within a piece of software “are the real kind of social issues around building and collaborating on code.” By looking at the long-term development of an algorithm, an understanding of its social context can start to reveal itself. However, the transparency of the code alone does not provide all the social context a work needs. The need for institutions to provide a proper social context for code becomes even more pressing when the value of algorithms is given more importance than their context. In other words, the value we place on these procedures and calculations comes at a cost.
A structural problem with algorithms is that they render the underrepresented invisible. If such a process is applied to culture, anything that falls outside the scope of an algorithm is treated as an anomaly. As a result of the crunching and sorting of data, the process of culture becomes the product of an algorithm. Algorithms are “results-based,” designed objects—machines that use parsing in order to create significance, relevance, and meaning. Algorithms produce evidence to substantiate speculations of all types: financial, informational, social, ideological. What becomes truly troubling is not when statistical aberrations are left out of the mix, but when the results of algorithms create or substantiate a narrative of exclusivity.
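This structural exclusion is often nothing more exotic than a threshold parameter. The following hypothetical sketch (the item names and cutoff value are invented for illustration) shows how a simple popularity ranking can render low-count entries invisible: the "marginal" item is not flagged or rejected, it just never appears in the output.

```python
# Hypothetical sketch: a popularity ranking whose cutoff parameter
# silently excludes low-count ("underrepresented") entries, so the
# resulting narrative contains only already-dominant items.

def rank_visible(view_counts, cutoff=100):
    """Return item names sorted by views, omitting anything below the cutoff."""
    visible = {name: views for name, views in view_counts.items()
               if views >= cutoff}
    return sorted(visible, key=visible.get, reverse=True)

counts = {"blockbuster": 5000, "mid-tier": 300, "marginal": 12}
print(rank_visible(counts))
# "marginal" never appears in the result
```

The exclusion is a design decision encoded in a default value, not a visible editorial act; a reader of the ranking has no way to know what fell below the cutoff.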
Unfortunately, the narrative of contemporary algorithmic culture is one that is dominated by particular voices—mostly male, mostly white, and mostly from classes of some privilege. It is not that other voices within the development of code-based works don’t exist, but rather that these voices go unrecognized as a result of being filtered out through algorithmic processes. Although many initiatives are currently undoing and combating exclusion and under-representation, it becomes increasingly difficult to do so when the algorithms we use (and are impacted by) are built upon parameters that disavow the existence of populations that defy categorization or exist contrary to a privileged narrative.
In her germinal 1985 text, “A Cyborg Manifesto,” Donna Haraway identifies the emergent system of oppression within networked culture as an “informatics of domination.” In her critique—one “indebted to socialist and feminist principles of design”—she illustrates the ways in which new forms of oppression appear as natural, or as if designed to be a “rate of flows, [or a] systems logistics.” Thirty years later, informatics of domination have become further naturalized through algorithmic processes. Haraway suggests that one way of working against this is to create networks of affinities that deliberately work against the “translation of the world into a problem of coding.” Perhaps in the sale, acquisition, and open-source redistribution of algorithms, new opportunities to subvert their systematic neglect will become possible.
Nicholas O’Brien is a net-based artist, curator, and writer. His work has been exhibited in Mexico City, Berlin, London, Dublin, Italy, and Prague, as well as throughout the U.S. He has been the recipient of a Turbulence Commission funded by the NEA and has curated exhibitions at the Museum of Contemporary Art in Chicago, 319 Scholes, and Eyebeam Center for Art and Technology.