- Programmed Visions makes a few key arguments.
- It rethinks materiality in digital studies. In the first decade of the 2000s, post-Kittler, digital studies hardened its conception of materiality in an effort to fix its site of analysis. Chun critiques this 180-degree turn toward the materiality of, say, source code, arguing that much of software remains slippery, vaporous, and spectral. The book calls for a more rigorous and nuanced analysis of this spectrality alongside (not as a replacement for) the “software as a thing” discourse of then-prominent software studies.
- It locates the production of this spectrality in subsumed and dematerialized (also racialized and gendered) labor: both of humans and of machines themselves. It takes a lot of work to make the machine vaporous.
- It links the rise of software to a concomitant shift in neoliberal governmentality: software encourages the profusion of a rhetoric of “programmability” that Chun argues is key to the rise of neoliberalism.
- It contends that this rhetoric of programmability, which depends on a conflation of storage with memory, fixity with ephemerality, and space with time, produces omnipresent but false visions of the future (i.e., the book’s title).
- “programmed visions”: the dream of programmability to construe an extrapolable future from past data, “in which an all-knowing intelligence can comprehend the future by apprehending the past and present” (9).
- “This book addresses this concept of programmability through the surprising materialization of software as a thing in its own right. It argues that the hardening of programming into software and of memory into storage is key to understanding new media as a constantly inspiring yet disappointing medium of the future” (xii).
- Chun contends that new media scholars, in an effort to reduce new media to a single analyzable object, have used software as a common frame (1). Knowing software then becomes an enlightenment key to understanding new media as a whole. But this conceptual clarity should give us pause: why do we trust software as metaphor when we don’t completely understand the operations of the computer in aggregate at any given moment (2)? This paradox is in fact the appeal of software, what “makes it a powerful metaphor for everything we believe is invisible yet generates visible effects, from genetics to the invisible hand of the market, from ideology to culture” (2).
- Software is “barely a thing”: ephemeral, relational, slippery, prone to decay and obsolescence (3). We can’t talk about its time, so we talk about its space in layers (think Stack); the Kittlerian position holds that it doesn’t exist, that it’s all just voltage differences once the abstraction is scraped away. Working through the legal issues around software, Chun claims: “These changes, brought about by the ‘hardening’ of software as textual or machinic thing through memory, point toward a profound change in our understanding of what is internal and external, subject and object” (5). “Indeed, this book argues that the remarkable process by which software was transformed from a service in time to a product, the hardening of relations into a thing, the externalization of information from the self, coincides with and embodies larger changes within what Michel Foucault has called governmentality” (6).
- The rise of computing as key to neoliberal governmentality: computers facilitate the biopolitical projects of the neoliberal state; the definition of “productive individuals” (8).
- “This book, therefore, links computers to governmentality . . . at the level of their architecture and their instrumentality” (9).
Part I. Invisibly Visible, Visibly Invisible
- “This notion of the computer as rendering everything transparent, however, is remarkably at odds with the actual operations of computation, for computers—their hardware, software, and the voltage differences on which they rely—are anything but transparent. When the computer does let us ‘see’ what we cannot normally see, or even when it acts like a transparent medium through video chat, it does not simply relay what is on the other side: it computes” (17). This conflation of transparency and obscurity is central to the computer’s appeal as a dominant metaphor for our times: “The linking of rationality with mysticism, knowability with what is unknown, makes it a powerful fetish that offers its programmers and users alike a sense of empowerment, of sovereign subjectivity, that covers over—barely—a sense of profound ignorance” (18).
Chapter 1. On Sourcery and Source Codes
- “Software emerged as a thing—as an iterable textual program—through a process of commercialization and commodification that has made code logos: code as source, code as true representation of action, indeed, code as conflated with, and substituting for, action” (19).
- Argument: through the simultaneous conflation and separation of instruction and execution, programming takes on a thrill of power constantly undermined by a nagging doubt about whether we actually have power over the inhuman “thing” of software. The point is not to return to some imagined time of direct access to the machine, but rather “to make our computers more productively spectral by exploiting the unexpected possibilities of source code as fetish” (19–20).
- The Gallowayian argument that “code is the only language that is executable” is useful but also naturalizes code as logos, a process that Chun argues is in fact historically determined and non-neutral (22–23). Galloway erases the process of execution by arguing that uncompiled and compiled code are logically equivalent; Chun argues that these are technical rather than numerical relations, and as such require attention to processes of intermediation (24).
- Source code is only source after the fact, after its compilation and execution; as such it hovers in the space between “dead repetition [and] living speech”: information is “undead” (25).
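The gap Chun insists on between uncompiled and compiled code is easy to see concretely. A small illustration (mine, not Chun's): even in CPython, what executes is never the typed source but a compiled code object, and the compiler has already transformed the text before anything runs, e.g. folding the constant expression `1 + 2` into `3`.

```python
import dis

# The typed source and what the interpreter actually executes are not
# the same object; compilation intermediates between them.
source = "x = 1 + 2"
code = compile(source, "<reading-notes>", "exec")

dis.dis(code)          # prints the bytecode the interpreter runs
print(code.co_consts)  # the folded constant 3 appears; "1 + 2" is already gone
```

The bytecode listing varies across Python versions, which is itself a small demonstration that the "equivalence" of source and executable is a moving, technical relation rather than a logical identity.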
- The abstraction of structured programming emerges from the economic imperatives of the software industry (37); “Source code become ‘thing’—the erasure of execution—follows from the mechanization of these power relations, the reworking of subject-object relations through automation as both empowerment and enslavement and through repetition as both mastery and hell. Embedded within the notion of instruction as source and the drive to automate computing—relentlessly haunting them—is a constantly repeated narrative of liberation and empowerment, wizards and (ex-)slaves” (41).
- Source code as fetish (object): “A fetish allows one to visualize what is unknown—to substitute images for causes. Fetishes allow the human mind both too much and not enough control by establishing a ‘unified causal field’ that encompasses both personal actions and physical events. Fetishes enable a semblance of control over future events—a possibility of influence, if not an airtight programmability—that itself relies on distorting real social relations into material givens” (50).
- “Computers, like other media, are metaphor machines: they both depend on and perpetuate metaphors. More remarkably, though, they—through their status as ‘universal machines’—have become metaphors for metaphor itself” (55); this analysis depends to a surprising degree on the moves of Edwards’ Closed World.
Chapter 2. Daemonic Interfaces, Empowering Obfuscations
- Paradox: Interfaces liberated us from the torture of the command line. “This freedom, however, depends on a profound screening: an erasure of the computer’s machinations and of the history of interactive operating systems as supplementing—that is, supplanting—human intelligence” (59).
- “Rather than condemning interfaces as a form of deception, designed to induce false consciousness, this chapter investigates the extent to which this paradoxical combination of visibility and invisibility, of rational causality and profound ignorance, grounds the computer as an attractive model for the ‘real’ world. Interfaces have become functional analogs to ideology and its critique—from ideology as false consciousness to ideology as fetishistic logic, interfaces seem to concretize our relation to invisible (or barely visible) ‘sources’ and substructures” (59). Main examples: mapping, esp. Jamesonian cognitive mapping.
- Mapping instantiates a god’s-eye view that provides imagined mastery (62), instituting the “direct manipulation” of neoliberalism (citing Boltanski/Chiapello); follows Edwards’ closed world / green world formation again (I’m hoping that Tenen provides an antidote to this pseudo-lit-crit).
- “Interfaces offer us an imaginary relationship to our hardware: they do not represent transistors but rather desktops and recycling bins” (66).
- A critique of cognitive mapping, which would seem useful to pair with critiques of media archaeological visibility more generally: “It would seem thus that instead of a situation in which the production of cognitive maps is impossible, we are locked in a situation in which we produce them—or at the very least approximations of them—all the time, in which the founding gesture of ideology critique is simulated by something that also pleasurably mimics ideology” (71).
- “Interfaces in general, however, are hardly radical and the demand that we map—and thus understand—often seems like the simple following of the network and its paranoid logic rather than an insightful, clarifying act. Mapping often seduces us into exposing what is ‘secret’ or opaque, into drawing connections between visible effects and invisible causes, rather than actually reading what one sees. It can become an endless pursuit of things, aimed at robbing them of their thingliness, in order to create a closed world in which every connection is exposed, every object reduced to a code” (74). “The question then is: how can we have a form of cognitive mapping that does not engage in nostalgia for sovereign power, for the subject (now multiplied everywhere) who knows? Also: how necessary is cognitive mapping? And to what extent is the desire to map not contrary to capitalism but rather integral to its current form, especially since it is through our mappings that we ourselves are mapped? That is, to what extent is our historically novel position not our ignorance and powerlessness, but rather our determination and our drive to know?” (75).
Part II. Regenerating Archives (& Chapter 4)
- “Memory hardens information—turning it from a measure of possibility into a ‘thing,’ while also erasing the difference between instruction and data (computer memory treats them indistinguishably). It seems to make digital media an ever-increasing archive in which no piece of data is lost and thus central to progress” (97).
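Chun's parenthetical that computer memory "treats [instruction and data] indistinguishably" names the stored-program principle of the von Neumann architecture. A toy sketch (my illustration, with invented opcodes, not from the book) shows one flat memory whose cells count as "program" or "data" only by convention:

```python
# Toy stored-program machine: a single flat memory holds both instructions
# and data; nothing in the cells themselves marks which is which.
# Invented opcodes: 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr, 0 = HALT.

def run(mem):
    acc, pc = 0, 0
    while mem[pc] != 0:              # 0 = HALT
        op, arg = mem[pc], mem[pc + 1]
        if op == 1:                  # LOAD a cell into the accumulator
            acc = mem[arg]
        elif op == 2:                # ADD a cell to the accumulator
            acc += mem[arg]
        elif op == 3:                # STORE the accumulator into memory;
            mem[arg] = acc           # the target could just as well be an
        pc += 2                      # instruction cell (self-modifying code)
    return mem

# Cells 0–6 happen to be "program," cells 7–8 "data" — only by convention.
memory = [1, 7, 2, 8, 3, 7, 0, 40, 2]
result = run(memory)
print(result[7])  # → 42 (40 + 2, written back over a "data" cell)
```

Because STORE can target any address, the machine can overwrite its own instructions as easily as its data, which is exactly the erasure of the instruction/data difference Chun points to.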
- “The paradox: what does not change does not endure, yet change—progress (endless upgrades)—ensures that what endures will fade. Another paradox: digital media’s memory operates by annihilating memory” (137).
- “As this discussion makes clear, digital media’s promise is also its threat; the two cannot be neatly divided into the good and the bad. Digital media, if it ‘saves’ anything, does so by transforming storage into memory, by making what decays slowly decay more quickly, by proliferating what it reads. By animating the inanimate—crossing the boundary between the live and the dead—digital media poses new challenges and opportunities for ‘the archive’” (138).
- Argument: “chapter 4 argues that memory became conflated with storage through analogies to analogies: through analogies to cybernetic neurons, to genetic programs, to what would become ‘analog’ media itself. Through these analogies (and their erasure), the new and the different have been reduced to the familiar. I uncover these differences and analogies not to attribute blame, but rather to reveal the dreams and hopes driving these misreadings: the desire to expunge volatility, obliterate ephemerality, and neutralize time itself, so that our computers can become synonymous with archives. These desires are key to stabilizing hardware so that it can contain, regenerate, and thus reproduce what it ‘stores.’ Further, they are central to the twin emergence of neoliberalism and computer programs as strategic games” (139).
- Von Neumann’s “First Draft of a Report on the EDVAC” as another “mythic, controversial, and incomplete” foundation of digital culture (140); this section presages Chun’s later work on hypothesis in Updating.
- Von Neumann differentiates analog from digital by how they produce errors (142); analog as “analogy” machine, but analogous to what? “Depending on one’s perspective, analog computers either offer a more direct, ‘intuitive,’ and, according to Vannevar Bush, ‘soul-satisfying’ way of solving differential equations or they are imprecise and noisy devices, which add extra steps—the translation of real numbers into physical entities. The first, the engineer’s perspective, views computers as models and differential equations as approximations of real physical processes; the second, the mathematician’s perspective, treats equations as predictors, rather than descriptors of physical systems—the computer becomes a simulacrum, rather than a simulation” (151).
- “Memory, then, which enables a certain causality as well as an uncertainty as to time and place, threatens to overwhelm the system, creating networks that crowd out the new. A neural circuit, if it persists—programmability—makes prediction possible. It, however, also puts in jeopardy what for McCulloch is most interesting and vital about humanity: the ability to learn and adapt to the unknown, that is, the future as future” (156).
- “According to William Poundstone, the last anecdote of von Neumann’s ‘total recall’ concerns his last days, when he lay dying of cancer at Walter Reed Hospital, a cancer caused by his work on nuclear weapons (the drive for nuclear weapons also powered the development of digital electronic computers; American computers and neoliberalism are both reactions to Nazism)” (161): To the extent that I am interested in nuclear weaponry, it is as a potent exemplar of how digital computing from its earliest moments has been entangled in the unconscious remaking of the planet in the service of a political-economic project.
- “This notion of an actual object is not outside of language, even if it is outside ‘literary description,’ for, to von Neumann, producing an object and describing how to build it were equivalent” (163); shades here of Harman.
- “Crucially, memory is not a static but rather an active process. A memory must be held in order to keep it from moving or fading. Again, memory does not equal storage. Although one can conceivably store a memory, storage usually refers to something material or substantial, as well as to its physical location: a store is both what is stored and where it is stored. According to the OED, to store is to furnish, to build stock. Storage or stocks always look toward the future. In computer speak, one reverses common language, since one stores something in memory. This odd reversal and the conflation of memory and storage gloss over the impermanence and volatility of computer memory. Without this volatility, however, there would be no memory. To repeat, memory stems from the same Sanskrit root for martyr. Memory is an act of commemoration—a process of recollecting or remembering” (167). The storage and stock sentence recalls Serres’ theory of storage in The Parasite.
- “Thus, as Wolfgang Ernst has argued, digital media is truly time-based media, which, given a screen’s refresh cycle and the dynamic flow of information in cyberspace, turns images, sounds, and text into a discrete moment in time. These images are frozen for human eyes only. Information is dynamic, however, not only because it must move in space on the screen, but also, and more important, because it must move within the computer and because degeneration traditionally has made memory possible while simultaneously threatening it. Digital media, allegedly more permanent and durable than other media (film stock, paper, etc.), depends on a degeneration so actively denied and repressed. This degeneration, which engineers would like to divide into useful versus harmful (erasability versus signal decomposition, information versus noise), belies and buttresses the promise of digital computers as permanent memory machines. If our machines’ memories are more permanent, if they enable a permanence that we seem to lack, it is because they are constantly refreshed—rewritten—so that their ephemerality endures, so that they may ‘store’ the programs that seem to drive them. To be clear, this is not to say that information is fundamentally immaterial; as Matthew Kirschenbaum has shown in his insightful Mechanisms: New Media and the Forensic Imagination, information (stored to a hard drive) leaves a trace that can be forensically reconstructed, or again, as I’ve argued elsewhere, for a computer, to read is to write elsewhere. This is to say that if memory is to approximate something so long lasting as storage, it can do so only through constant repetition, a repetition that, as Jacques Derrida notes, is indissociable from destruction (or in Bush’s terminology, forgetting)” (169–70). Question re: Matt’s forensic materiality: where are the traces left behind from massive systems like nuclear media or finance capital infrastructure?
How do we do a forensic or media archaeological analysis at a planetary scale?
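The "constant regeneration" Chun describes is literal in dynamic RAM, where each cell's charge leaks away and reads correctly only if it is periodically rewritten. A toy simulation (my illustration; the leak rate and threshold are invented parameters, not hardware values) makes the point that the stored bits endure only through repetition:

```python
# Toy DRAM: each cell holds a charge that leaks every tick; a bit reads as 1
# only while its charge stays above a threshold. Parameters are invented.
LEAK, THRESHOLD = 0.2, 0.5

def tick(cells):
    """One time step: every cell loses a fraction of its charge."""
    return [charge * (1 - LEAK) for charge in cells]

def refresh(cells):
    """Rewrite every cell: read the still-legible bit, store it at full charge."""
    return [1.0 if charge > THRESHOLD else 0.0 for charge in cells]

def read(cells):
    return [1 if charge > THRESHOLD else 0 for charge in cells]

bits = [1.0, 0.0, 1.0, 1.0]          # the stored "memory"

decayed = bits
for _ in range(5):                   # five ticks with no refresh...
    decayed = tick(decayed)
print(read(decayed))                 # → [0, 0, 0, 0]: the ones have faded away

kept = bits
for _ in range(5):                   # ...versus a refresh after every tick
    kept = refresh(tick(kept))
print(read(kept))                    # → [1, 0, 1, 1]: endurance through rewriting
```

The unrefreshed cells fall below the read threshold and the data is lost; the refreshed cells "endure" only because they are destroyed and rewritten over and over, which is exactly the ephemerality-that-endures of the passage above.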
- “Repetition and regeneration open the future by creating a nonsimultaneous new that confounds the chronological time these processes also enable” (172).
Archive and Impact
- Chun’s impact is well-known; this book is most important for its challenge to hard-determinist Kittlerian materiality. I’m struck by her archive and methods: Chun spends much of her analysis in the familiar terrain of STS, roving over many of the same examples that power, say, Hicks or Edwards (the latter of whom is a major source for Chun). Indeed, there’s less critical code analysis than one might expect, which I take to also be part of the broader mapping work of this book: it’s pointing out holes in then-contemporary digital studies. Chun’s interests are in the ways that we have obscured the real history of the development of software as software in the service of neoliberal governmentality.
- In her critique of mapping, Chun makes some important interventions in the work of digital studies and studies of objects: cognitive mapping and the production of maps more generally have been subsumed into the information economy of neoliberal capitalism, which desires ever more information and encourages ever more visualization (careful to obscure the really shitty bits).
- Of course, the most important intervention that Chun makes for the study of media archaeology is in the book’s last chapter: the conflation of storage and memory and the importance of the archive for digital studies.