Algorithms Are Decision Systems

My latest interview on the topic of algorithmic culture is now available on the 40kBooks blog.  40kBooks is an Italian website, but you can find the interview both in the original English and in Italian translation.

The interview provides something like a summary of my latest thinking on algorithmic culture, a good deal of which was born out of the new research that I blogged about here last time.  Here’s an excerpt from the interview:

Culture has long been about argument and reconciliation: argument in the sense that groups of people have ongoing debates, whether explicit or implicit, about their norms of thought, conduct, and expression; and reconciliation in the sense that virtually all societies have some type of mechanism in place – always political – by which to decide whose arguments ultimately will hold sway. You might think of culture as an ongoing conversation that a society has about how its members ought to comport themselves.

Increasingly today, computational technologies are tasked with the work of reconciliation, and algorithms are a principal means to that end. Algorithms are essentially decision systems—sets of procedures that specify how someone or something ought to proceed given a particular set of circumstances. Their job is to consider, or weigh, the significance of all of the arguments or information floating around online (and even offline) and then to determine which among those arguments is the most important or worthy. Another way of putting this would be to say that algorithms aggregate a conversation about culture that, thanks to technologies like the internet, has become ever more diffuse and disaggregated.
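To make the "decision system" idea concrete, here is a minimal sketch of my own (an illustration, not anything drawn from the interview or from any actual platform's code): a toy procedure that weighs the signals attached to competing "arguments" and surfaces the one it judges most worthy. The item names, weights, and values are all hypothetical.

```python
def reconcile(items, signals):
    """Score each item by summing its weighted signals; return items ranked
    from most to least 'worthy' -- the basic shape of a relevance ranking."""
    scores = {item: sum(weight * value
                        for weight, value in signals.get(item, []))
              for item in items}
    return sorted(items, key=lambda item: scores[item], reverse=True)

# Hypothetical inputs: three competing arguments, each with a list of
# (weight, strength) signal pairs gathered from the wider conversation.
ranking = reconcile(
    ["argument_a", "argument_b", "argument_c"],
    {
        "argument_a": [(0.5, 0.9), (0.3, 0.2)],  # total score 0.51
        "argument_b": [(0.5, 0.4), (0.3, 0.8)],  # total score 0.44
        "argument_c": [(0.5, 0.1)],              # total score 0.05
    },
)
print(ranking)  # ['argument_a', 'argument_b', 'argument_c']
```

The point of the sketch is only that the "reconciliation" happens in the choice of weights: whoever sets them decides, in advance, whose arguments will hold sway.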

Something I did not address at any length in the interview is the historical backdrop against which I’ve set the new research: the Second World War, particularly the atrocities that precipitated it, occurred during it, and concluded it.  My hypothesis is that the desire to offload cultural decision-making onto computer algorithms stems significantly, although not exclusively, from a crisis of faith that emerged in and around World War II.  No longer, it seemed, could we human beings be trusted to govern ourselves ethically and responsibly, and so other means had to be found to do the job we seem incapable of doing ourselves.

A bunch of readers have asked me if I’ve published any of my work on algorithmic culture in academic journals.  The answer, as yet, is no, mostly because I’m working on developing and refining the ideas here, in dialogue with all of you, before formalizing my position.  (THANK YOU for the ongoing feedback, by the way!)  Having said that, I’m polishing the piece I blogged about last time, “‘An Infernal Culture Machine’: Intellectual Foundations of Algorithmic Culture,” and plan on submitting it to a scholarly journal fairly soon.  You’re welcome to email me directly if you’d like a copy of the working draft.


P.S. If you haven’t already, check out Tarleton Gillespie’s latest post over on Culture Digitally, about his new essay on “The Relevance of Algorithms.”


2 comments

  1. […] on historicizing the emergence of an “algorithmic culture” (Alex Galloway‘s term) available widely already, so my role here is really just to point at it and say: “Look!” (Then […]

