
East Coast Code

There’s lots to like about Lawrence Lessig’s book Code 2.0—particularly, I find, the distinction he draws between “East Coast Code” (i.e., the law) and “West Coast Code” (i.e., computer hardware and software). He sees both as modes of bringing order to complex systems, albeit through different means. Lessig is also interested in how West Coast Code has come to be used in ways that strongly resemble, and sometimes even supersede, its East Coast counterpart, as in the case of digital rights management technology. “Code is law,” as he so aptly puts it.

I’ve been playing with something like Lessig’s East Coast-West Coast Code distinction in my ongoing research on algorithmic culture. As I’ve said many times now, “algorithmic culture” refers to the use of computational processes to sort, classify, and hierarchize people, places, objects, and ideas, as well as to the habits of thought, conduct, and expression that flow from those processes. Essentially we’re talking about the management of a complex system—culture—by way of server farms and procedural decision-making software. Think Google or Facebook; this is West Coast Code at its finest.

Perhaps better than anyone, Fred Turner has chronicled the conditions out of which West Coast Code emerged. In From Counterculture to Cyberculture, he shows how, in the 1960s, Stewart Brand and his circle of countercultural compadres humanized computers, which were then widely perceived to be instruments of the military-industrial complex. Through the pages of the Whole Earth Catalog, Brand and company suggested that computers were, like shovels, axes, and hoes, tools with which to craft civilization—or rather to craft new-styled, autonomous civilizations that would no longer depend on the state (i.e., East Coast Code) to manage human affairs.

The deeper I delve into my own research, the more I discover just how complicated—and indeed, how East Coast—is the story of algorithmic culture. I don’t mean to diminish the significance of the work that’s been done on the West Coast, by any means. But just as people had to perform creative work to make computers seem personal, even human, so, too, did people need to perform similar work on the word culture before it could make sense within the realm of computation. And this happened mostly back East, in Cambridge, MA.

“Of course,” you’re probably thinking, “at MIT.” It turns out that MIT wasn’t the primary hub of this semantic and conceptual work, although it would be foolish to deny the influence of famed cybernetician Norbert Wiener here. Where the work took place was at that other rinky-dink school in Cambridge, MA: Harvard. Perhaps you’ve heard of it?

A good portion of my research now is focused on Harvard’s Department of Social Relations, an experimental unit combining Sociology, Psychology, and Cultural Anthropology. It had a relatively short existence, lasting only from 1946 to 1970, but in that time it graduated people who went on to become the titans of postwar social theory. Clifford Geertz, Stanley Milgram, and Harold Garfinkel are among the most notable PhDs, although myriad other important figures passed through the program as well. One of the more intriguing people I turned up was Dick Price, who went on to found the Esalen Institute (back to the West Coast) after becoming disillusioned with the Clinical Psychology track in SocRel and later suffering a psychotic episode. Dr. Timothy Leary also taught there, from 1961 to 1963, though he was eventually fired because of his controversial research on the psychological effects of LSD.

I’ve just completed some work focusing on Clifford Geertz and the relationship he shared with Talcott Parsons, his dissertation director and chair of SocRel from 1946 to 1956. It’s here more than anywhere that I’m discovering how the word culture got inflected by the semantics of computation. Though Geertz would later move away from the strongly cybernetic conceptualization of culture he’d inherited from Parsons, it nonetheless underpins arguably his most important work, especially the material he published in the 1960s and early 70s. This includes his famous “Deep Play: Notes on the Balinese Cockfight,” which appears in the volume The Interpretation of Cultures.

My next stop is Stanley Milgram, where I’ll be looking first at his work on crowd behavior and later at his material on the “small world” phenomenon. The former complicates the conclusions of his famous “obedience to authority” experiments in fascinating ways, and, I’d argue, sets the stage for the notion of “crowd wisdom” so prevalent today. Apropos of the latter, I’m intrigued by how Milgram helped to shrink the social down to size, as it were, just as worries about the scope and anonymizing power of mass society reached a fever pitch. He did for society essentially what Geertz and Parsons did for culture, I believe, particularly in helping to establish the conceptual conditions necessary for the algorithmic management of social relations. Oh—and did I mention that Milgram’s Obedience to Authority, published in 1974, is also laden with cybernetic theory?

To be clear, the point of all this East Coast-West Coast business isn’t to create some silly rivalry—among scholars of computation, or among their favorite historical subjects. (Heaven knows, it would never be Biggie and Tupac!) The point, rather, is to draw attention to the semantic and social-theoretical conditions underpinning a host of computational activities that are prevalent today—conditions whose genesis occurred significantly back East. The story of algorithmic culture isn’t only about hippies, hackers, and Silicon Valley. It’s equally a story about squares who taught and studied at maybe the most elite institution of higher education on America’s East Coast.


Algorithms Are Decision Systems

My latest interview on the topic of algorithmic culture is now available on the 40kBooks blog. It’s an Italian website, although the interview appears both in the original English and in Italian translation.

The interview provides something like a summary of my latest thinking on algorithmic culture, a good deal of which was born out of the new research that I blogged about here last time.  Here’s an excerpt from the interview:

Culture has long been about argument and reconciliation: argument in the sense that groups of people have ongoing debates, whether explicit or implicit, about their norms of thought, conduct, and expression; and reconciliation in the sense that virtually all societies have some type of mechanism in place – always political – by which to decide whose arguments ultimately will hold sway. You might think of culture as an ongoing conversation that a society has about how its members ought to comport themselves.

Increasingly today, computational technologies are tasked with the work of reconciliation, and algorithms are a principal means to that end. Algorithms are essentially decision systems—sets of procedures that specify how someone or something ought to proceed given a particular set of circumstances. Their job is to consider, or weigh, the significance of all of the arguments or information floating around online (and even offline) and then to determine which among those arguments is the most important or worthy. Another way of putting this would be to say that algorithms aggregate a conversation about culture that, thanks to technologies like the internet, has become ever more diffuse and disaggregated.
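To make the idea of a decision system concrete, here is a minimal, purely illustrative sketch in Python. The signals (popularity, recency, authority), the weights, and the sample “arguments” are all invented for the example rather than drawn from any real ranking system, but the shape is representative: weigh a handful of signals, aggregate them into a score, and let the resulting ranking decide which argument prevails.

```python
# Toy illustration of an algorithm as a "decision system": it weighs
# competing "arguments" against a set of criteria and returns them
# ranked by aggregate score. Criteria, weights, and data are invented
# for illustration only.

from dataclasses import dataclass


@dataclass
class Argument:
    text: str
    popularity: float   # how widely shared the claim is (0-1)
    recency: float      # how recently it circulated (0-1)
    authority: float    # how credible its source is judged to be (0-1)


# The "procedure given a set of circumstances": fixed weights encode a
# judgment about which signals matter most.
WEIGHTS = {"popularity": 0.5, "recency": 0.2, "authority": 0.3}


def score(arg: Argument) -> float:
    """Aggregate the weighted signals into a single relevance score."""
    return (WEIGHTS["popularity"] * arg.popularity
            + WEIGHTS["recency"] * arg.recency
            + WEIGHTS["authority"] * arg.authority)


def reconcile(args: list[Argument]) -> list[Argument]:
    """Rank competing arguments; the top result 'wins' the conversation."""
    return sorted(args, key=score, reverse=True)


if __name__ == "__main__":
    candidates = [
        Argument("Claim A", popularity=0.9, recency=0.4, authority=0.3),
        Argument("Claim B", popularity=0.5, recency=0.9, authority=0.8),
    ]
    for arg in reconcile(candidates):
        print(f"{arg.text}: {score(arg):.2f}")
```

Even in a toy like this, the weights are where the politics live: change them and a different argument wins the conversation.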

Something I did not address at any length in the interview is the historical backdrop against which I’ve set the new research: the Second World War, particularly the atrocities that precipitated, occurred during, and concluded it. My hypothesis is that the desire to offload cultural decision-making onto computer algorithms stems significantly, although not exclusively, from a crisis of faith that emerged in and around World War II. No longer, it seems, could we human beings be trusted to govern ourselves ethically and responsibly, and so other means had to be found to do the job we seem incapable of doing ourselves.

A bunch of readers have asked me if I’ve published any of my work on algorithmic culture in academic journals.  The answer, as yet, is no, mostly because I’m working on developing and refining the ideas here, in dialogue with all of you, before formalizing my position.  (THANK YOU for the ongoing feedback, by the way!)  Having said that, I’m polishing the piece I blogged about last time, “‘An Infernal Culture Machine’: Intellectual Foundations of Algorithmic Culture,” and plan on submitting it to a scholarly journal fairly soon.  You’re welcome to email me directly if you’d like a copy of the working draft.


P.S. If you haven’t already, check out Tarleton Gillespie’s latest post over on Culture Digitally, about his new essay on “The Relevance of Algorithms.”
