Monday, January 27, 2003
More and More Like the Machine
By Paul Ford
Some miscellaneous thoughts on software and soul, which I will try to bring into focus on forthcoming Mondays.
When I use software intensively, and especially when I program, I enter into the palaces of media memory established by the software developers; I internalize the concept of a document which Knuth pioneered in his TeX system. I grapple with the Microsoft Word paper clip. I break an image into a series of layers and alpha channels in Photoshop, pixels infinitely malleable, or I approach an image as pure data to be manipulated in a defined sequence by the command-line NetPBM utilities.
I am close to these tools, and every day I become slightly more like the machine I use. Part of this adaptation is unexpected: without wanting to, I dream of programming or see the landscape of my regular day in digital layers, ready to be peeled and blended. But much of the adaptation is voluntary, because the more I can “think” like my PC, the more it and I can interact without my feeling alienated from it, the more I can make it do, and the more productive I will become.
Each aspect of the computer I use was created by someone, from the circuits which process raw binary input and output to the series of programs which redraw the screen after I have moved a window with the mouse. I spend 10 hours a day, some days - maybe too many days - living in a world outlined by the programmers of Redmond or 1 Infinite Loop or in the basements of Helsinki universities, exploring the world they built. There are 400,000 files on my desktop, 400,000 different slices of text and libraries for programming and image-processing filters, each one the product of a set of decisions.
But that said, while I am cognizant of the effort that went into building the code I use daily, I feel no fundamental human connection with these programmers; there is no clear “I” of the programmer who addresses me, no way of reading Photoshop as I might read a Graham Greene novel. The programmer's face is the interface, the features of the face the windows and buttons of an application. We name our tools PhotoMunger, not “Donald” or “Mordecai”. They are qualified by their utility. And yet we name our computers, their hard drives, assigning our personality to them, making them mirrors of our quirks, or of our need for order.
Some exceptions: a few authors of software do not remain faceless, like Richard Stallman, who created the Emacs text editor and the GCC compiler; Donald Knuth, who created TeX; Larry Wall, the developer of Perl; or Guido van Rossum, the developer of Python. Most of these men (always men!) did their work below the interface, as designers of languages and systems, creators of tools that allow creation. To write a language, to abstract the computing process into a fundamental sequence of tokens which will be internalized and applied by others, is an individual act; thousands may add libraries, but Python is still Guido, Perl is still Larry, Emacs is RMS, and TeX is Knuth. The experts of abstraction who are able to tap into some zeitgeist do not lose their identities as their products become popular; their faces are not replaced by buttons. They exist in a textual realm of code, as authors of fundamental texts.
What I wish I could understand more fully is the way these things relate: the cultural processes whereby we got to the windowing interface from the abstraction of the Turing machine, the way our needs prompted the tools we use into existence. There are a lot of ways it could have gone, but it went this way instead. That means something, and it can tell us something about how humans think - at least I think it can.
Away from this site, I was writing about the need for a “genealogy of software,” or of “computing”; “computing” as a cultural practice. Too much of the history of computing is caught up in origin myths, in tales of great men doing great deeds in order to invent the future. But anyone who's worked in software can tell you that there is no origin; there is something like “progress” (which some will dispute, for some will always prefer the command line to the windowing interface, or feel that assembly language programming is more “real” than writing code in C++), and there are inventions, there are increasing levels of abstraction which ultimately bring the machine to the desktops of millions. But there is no single act which brings “computing” as we know it into being, and there is no culmination, no ultimate interface or perfect system. Marketers tell us that every interface is the best ever, the solution, the answer to our problems, the right way to do it. But “computing” is a social act, done alone but in interaction with the product of other minds, the combination of millions of individual actors using tools to complete tasks, and it has no more of an origin than language, or sexuality, or music; it is no more the story of John von Neumann than television is the story of Philo Farnsworth.
But perhaps genealogy is not the right approach; looking at a draft of a paper I wrote, Dru Jay questioned:
The idea of a genealogy of software seems to me a curious choice, given the genealogies I've recently been reading by Foucault: Discipline and Punish, History of Sexuality Vol. 1, Madness and Civilization. Each of these tracks the twists and turns of an idea and its institutions. Sexuality-not-sex, insanity and unreason. But none of them primarily tracks technology (in the broad or "mostly mechanical tools" sense); only secondarily. Based on this, my expectation would be a genealogy of ideas like "productivity", "information", or "communication" and how they formed and were formed by the development of computer technology. Conversely, a genealogy of printing would seem awkward, since the ideas in-forming it ("publishing", "literature") would always be a little bit out of the picture, forcing the author to constantly back up and fill in the essential formations which have made the particular uses of a printing press possible. Also: tracking an idea from the pre-computer age to the present seems to be a particularly fruitful way to understand the historical changes in an idea, through institutions and discourse.
Questions, then, perhaps naive: is there precedent for a genealogy of a technological practice? Are the shortcomings I mention not present in the case of software?
I'm not really ready to answer those questions now, although they feel important to me. I do think there is room for discussing software as a form of discourse, as something that happens in a social context, something which can alienate or appeal to users according to their backgrounds and beliefs. That we are able to create the computer makes it seem possible that we could mold it in our image, that if we can imagine it, it can imagine us. After reading and writing about software for years, I feel a real need to come up with a way to talk about it that goes deeper than the review of a new release of MSWord, but does not turn into a tutorial or functional specification - a way to talk about software in the same way a literary critic discusses a text.
Some sort of theory. Not yet, but maybe someday, after I read and write quite a bit more.