There’s lots to like about Lawrence Lessig’s book, Code 2.0—particularly, I find, the distinction he draws between “East Coast Code” (i.e., the law) and “West Coast Code” (i.e., computer hardware and software). He sees both as modes of bringing order to complex systems, albeit through different means. Lessig is also interested in how West Coast Code has come to be used in ways that strongly resemble, and sometimes even supersede, its East Coast counterpart, as in the case of digital rights management technology. “Code is law,” as he so aptly puts it.
I’ve been playing with something like Lessig’s East Coast-West Coast Code distinction in my ongoing research on algorithmic culture. As I’ve said many times now, “algorithmic culture” refers to the use of computational processes to sort, classify, and hierarchize people, places, objects, and ideas, as well as to the habits of thought, conduct, and expression that flow from those processes. Essentially we’re talking about the management of a complex system—culture—by way of server farms and procedural decision-making software. Think Google or Facebook; this is West Coast Code at its finest.
Perhaps better than anyone, Fred Turner has chronicled the conditions out of which West Coast Code emerged. In From Counterculture to Cyberculture, he shows how, in the 1960s, Stewart Brand and his circle of countercultural compadres humanized computers, which were then widely perceived to be instruments of the military-industrial complex. Through the pages of the Whole Earth Catalog, Brand and company suggested that computers were, like shovels, axes, and hoes, tools with which to craft civilization—or rather to craft new-style, autonomous civilizations that would no longer depend on the state (i.e., East Coast Code) to manage human affairs.
The deeper I delve into my own research, the more I discover just how complicated—and indeed, how East Coast—is the story of algorithmic culture. I don’t mean to diminish the significance of the work that’s been done about the West Coast, by any means. But just as people had to perform creative work to make computers seem personal, even human, so, too, did people need to perform similar work on the word culture to make it make sense within the realm of computation. And this happened mostly back East, in Cambridge, MA.
“Of course,” you’re probably thinking, “at MIT.” It turns out that MIT wasn’t the primary hub of this semantic and conceptual work, although it would be foolish to deny the influence of famed cybernetician Norbert Wiener here. Where the work took place was at that other rinky-dink school in Cambridge, MA: Harvard. Perhaps you’ve heard of it?
A good portion of my research now is focused on Harvard’s Department of Social Relations, an experimental unit combining Sociology, Psychology, and Cultural Anthropology. It had a relatively short existence, lasting only from 1946 to 1970, but in that time it graduated people who went on to become the titans of postwar social theory. Clifford Geertz, Stanley Milgram, and Harold Garfinkel are among the most notable PhDs, although myriad other important figures passed through the program as well. One of the more intriguing people I turned up was Dick Price, who went on to found the Esalen Institute (back to the West Coast) after becoming disillusioned by the Clinical Psychology track in SocRel and later suffering a psychotic episode. Dr. Timothy Leary also taught there, from 1961 to 1963, though he was eventually fired because of his controversial research on the psychological effects of LSD.
I’ve just completed some work focusing on Clifford Geertz and the relationship he shared with Talcott Parsons, his dissertation director and chair of SocRel from 1946 to 1956. It’s here more than anywhere that I’m discovering how the word culture got inflected by the semantics of computation. Though Geertz would later move away from the strongly cybernetic conceptualization of culture he’d inherited from Parsons, it nonetheless underpins arguably his most important work, especially the material he published in the 1960s and early 70s. This includes his famous “Deep Play: Notes on the Balinese Cockfight,” which is included in the volume The Interpretation of Cultures.
My next stop is Stanley Milgram, where I’ll be looking first at his work on crowd behavior and later at his material on the “small world” phenomenon. The former complicates the conclusions of his famous “obedience to authority” experiments in fascinating ways, and, I’d argue, sets the stage for the notion of “crowd wisdom” so prevalent today. Apropos of the latter, I’m intrigued by how Milgram helped to shrink the social on down to size, as it were, just as worries about the scope and anonymizing power of mass society reached a fever pitch. He did for society essentially what Geertz and Parsons did for culture, I believe, particularly in helping to establish conceptual conditions necessary for the algorithmic management of social relations. Oh—and did I mention that Milgram’s Obedience book, published in 1974, is also laden with cybernetic theory?
To be clear, the point of all this East Coast-West Coast business isn’t to create some silly rivalry—among scholars of computation, or among their favorite historical subjects. (Heaven knows, it would never be Biggie and Tupac!) The point, rather, is to draw attention to the semantic and social-theoretical conditions underpinning a host of computational activities that are prevalent today—conditions whose genesis occurred significantly back East. The story of algorithmic culture isn’t only about hippies, hackers, and Silicon Valley. It’s equally a story about squares who taught and studied at maybe the most elite institution of higher education on America’s East Coast.