Archive for Algorithmic Culture


I learned last month from Wired that something along the lines of what I’ve been calling “algorithmic culture” already has a name: culturomics.

According to Jonathon Keats, author of the magazine’s monthly “Jargon Watch” section, culturomics refers to “the study of memes and cultural trends using high-throughput quantitative analysis of books.”  The term was first noted in another Wired article, published last December, which reported on a study using Google Books to track historical, or “evolutionary,” trends in language.  Interestingly, the study wasn’t published in a humanities journal.  It appeared in Science.

The researchers behind culturomics have also launched a website allowing you to search the Google book database for keywords and phrases, to “see how [their] usage frequency has been changing throughout the past few centuries.”  They follow up by calling the service “addictive.”
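The kind of query the culturomics site performs can be sketched in a few lines. The corpus figures below are invented for illustration (the real service searches the Google Books database), and the function name is my own; the point is only that “usage frequency” means a word’s yearly count normalized against the total words printed that year:

```python
# Hypothetical yearly corpus data: (year, word) -> occurrence count,
# plus total words printed per year.  All numbers are invented.
word_counts = {
    (1900, "crowd"): 120, (1950, "crowd"): 90, (2000, "crowd"): 60,
    (1900, "algorithm"): 1, (1950, "algorithm"): 15, (2000, "algorithm"): 300,
}
yearly_totals = {1900: 1_000_000, 1950: 1_200_000, 2000: 2_000_000}

def usage_frequency(word):
    """Relative frequency of `word` per year: count / total words that year."""
    return {
        year: word_counts.get((year, word), 0) / total
        for year, total in sorted(yearly_totals.items())
    }

print(usage_frequency("algorithm"))
```

Plotted over centuries rather than three sample years, this is essentially the curve the site draws for any keyword you enter.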

Culturomics weds “culture” to the suffix “-nomos,” the anchor for words like economics, genomics, astronomy, physiognomy, and so forth.  “-Nomos” can refer either to “the distribution of things” or, more specifically, to a “worldview.”  In this sense culturomics refers to the distribution of language resources (words) in the extant published literature of some period, and to the types of frameworks for understanding those resources that such distributions embody.

I must confess to being intrigued by culturomics, however much I find the term to be clunky. My initial work on algorithmic culture tracks language changes in and around three keywords — information, crowd, and algorithm, in the spirit of Raymond Williams’ Culture and Society — and has given me a new appreciation for both the sociality of language and its capacity for transformation.  Methodologically culturomics seems, well, right, and I’ll be intrigued to see what a search for my keywords on the website might yield.

Having said that, I still want to hold onto the idea of algorithmic culture.  I prefer the term because it places the algorithm center-stage rather than allowing it to recede into the background, as does culturomics.  Algorithmic culture encourages us to see computational process not as a window onto the world but as an instrument of order and authoritative decision making.  The point of algorithmic culture, both terminologically and methodologically, is to help us understand the politics of algorithms and thus to approach them and the work they do more circumspectly, even critically.

I should mention, by the way, that this is increasingly how I’ve come to understand the so-called “digital humanities.”  The digital humanities aren’t just about doing traditional humanities work on digital objects, nor are they only about making the shift in humanities publishing from analog to digital platforms.  Instead the digital humanities, if there is such a thing, should focus on the ways in which the work of culture is increasingly delegated to computational process and, more importantly, the political consequences that follow from our doing so.

And this is the major difference, I suppose, between an interest in the distribution of language resources — culturomics — and a concern for the politics of the systems we use to understand those distributions — algorithmic culture.


Algorithmic Culture, Redux

Back in June I blogged here about “Algorithmic Culture,” or the sorting, classifying, and hierarchizing of people, places, objects, and ideas using computational processes.  (Think Google search, Amazon’s product recommendations, who gets featured in your Facebook news feed, etc.)  Well, for the past several months I’ve been developing an essay on the theme, and it’s finally done.  I’ll be debuting it at Vanderbilt University’s “American Cultures in the Digital Age” conference on Friday, March 18th, which I’m keynoting along with Kelly Joyce (College of William & Mary), Cara Finnegan (University of Illinois), and Eszter Hargittai (Northwestern University).  Needless to say, I’m thrilled to be joining such distinguished company at what promises to be, well, an event.

The piece I posted originally on algorithmic culture generated a surprising — and exciting — amount of response.  In fact, nine months later, it’s still receiving pingbacks, I’m pretty sure as a result of its having found its way onto one or more college syllabuses.  So between that and the good results I’m seeing in the essay, I’m seriously considering developing the material on algorithmic culture into my next book.  Originally after Late Age I’d planned on focusing on contemporary religious publishing, but increasingly I feel as if that will have to wait.

Drop by the conference if you’re in or around the Nashville area on Friday, March 18th.  I’m kicking things off at 9:30 a.m.  And for those of you who can’t make it there, here’s the title slide from the PowerPoint presentation, along with a little taste of the talk’s conclusion:

This latter definition—culture as authoritative principle—is, I believe, the definition that’s chiefly operative in and around algorithmic culture. Today, however, it isn’t culture per se that is a “principle of authority” but increasingly the algorithms to which is delegated the task of driving out entropy, or in Matthew Arnold’s language, “anarchy.”  You might even say that culture is fast becoming—in domains ranging from retail to rental, search to social networking, and well beyond—the positive remainder of specific information processing tasks, especially as they relate to the informatics of crowds.  And in this sense algorithms have significantly taken on what, at least since Arnold, has been one of culture’s chief responsibilities, namely, the task of “reassembling the social,” as Bruno Latour puts it—here, though, by discovering statistical correlations that would appear to unite an otherwise disparate and dispersed crowd of people.

I expect to post a complete draft of the piece on “Algorithmic Culture” to my project site once I’ve tightened it up a bit. Hopefully it will generate even more comments, questions, and provocations than the blog post that inspired the work initially.

In the meantime, I’d welcome any feedback you may have about the short excerpt appearing above, or on the talk if you’re going to be in Nashville this week.


How to Have Culture in an Algorithmic Age

The subtitle of this post ought to be “apparently,” since I have developing doubts about substituting digital surveillance systems and complex computer programs for the considered — humane — work of culture.

Case in point: about six weeks ago, Galley Cat reported on a new Kindle-related initiative called “popular highlights,” which had just rolled out onto the web for beta testing.  In a nutshell, Amazon is now going public with information about which Kindle books are the most popular, as well as which passages within them have been the most consistently highlighted by readers.

How does Amazon determine this?  Using the 3G connection built into your Kindle, the company automatically uploads your highlights, bookmarks, marginal notes, and more to its server array, or computing cloud.  Amazon calls this service “back up,” but the phrase is something of a misnomer.  Sure, there’s goodwill on Amazon’s part in helping to ensure that your Kindle data never gets deleted or corrupted.  By the same token, it’s becoming abundantly clear that “back up” exists as much for Amazon’s own sake as it does for your convenience, since the company mines all of your Kindle-related data.  The Galley Cat story only confirms this.

This isn’t really news.  For months I’ve been writing here and elsewhere about the back up/surveillance issue, and I even have an academic journal article appearing on the topic this fall.  Now, don’t get me wrong — this is an important issue.  But the focus on surveillance has obscured another pressing matter: the way in which Amazon, and indeed other tech companies, are altering the idea of culture through these types of services.  Hence my concern with what I’m calling, following Alex Galloway, “algorithmic culture.”

In the old paradigm of culture — you might call it “elite culture,” although I find the term “elite” to be so overused these days as to be almost meaningless — a small group of well-trained, trusted authorities determined not only what was worth reading, but also which aspects of a given selection were most worth focusing on.  The basic principle is similar with algorithmic culture, which is also concerned with sorting, classifying, and hierarchizing cultural artifacts.

Here’s the twist, however, which is apparent from the “About” page on the Amazon Popular Highlights site:

We combine the highlights of all Kindle customers and identify the passages with the most highlights. The resulting Popular Highlights help readers to focus on passages that are meaningful to the greatest number of people.

Using its computing cloud, Amazon aggregates all of the information it’s gathered from its customers’ Kindles to produce a statistical determination of what’s culturally relevant. In other words, significance and meaningfulness are decided by a massive — and massively distributed — group of readers, whose responses to texts are measured, quantified, and processed by Amazon.
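The aggregation step the “About” page describes can be sketched as a simple tally. The sample data and function name below are hypothetical (Amazon’s actual pipeline is proprietary, which is precisely the problem discussed later in this post); each tuple stands for one reader highlighting one passage:

```python
from collections import Counter

# Invented sample of uploaded highlights: (book, passage_id) pairs,
# one entry per highlighting reader.
uploaded_highlights = [
    ("book-a", 12), ("book-a", 12), ("book-a", 7),
    ("book-b", 3),  ("book-a", 12), ("book-b", 3),
]

def popular_highlights(highlights, top_n=2):
    """Combine all readers' highlights and rank passages by raw count."""
    counts = Counter(highlights)
    return counts.most_common(top_n)

print(popular_highlights(uploaded_highlights))
# Note: only the count survives aggregation; *why* each reader
# highlighted a passage (admiration, confusion, objection) is discarded.
```

Notice that the input records carry nothing but book and passage identifiers, which is the abstraction at issue here: whatever motivated each highlight never enters the computation.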

I realize that in raising doubts about this type of cultural work, I’m opening myself to charges of elitism.  So be it.  Anytime you question what used to be called “the popular,” and what is now increasingly referred to as “the crowd,” you open yourself to those types of accusations. Honestly, though, I’m not out to impugn the crowd.

To my mind, the whole elites-versus-crowd debate is little more than a red herring, one that distracts from a much deeper issue: Amazon’s algorithm and the mysterious ways in which it renders culture.

When people read, on a Kindle or elsewhere, there’s context.  For example, I may highlight a passage because I find it to be provocative or insightful.  By the same token, I may find it to be objectionable, or boring, or grammatically troublesome, or confusing, or…you get the point.  When Amazon uploads your passages and begins aggregating them with those of other readers, this sense of context is lost.  What this means is that algorithmic culture, in its obsession with metrics and quantification, exists at least one level of abstraction beyond the acts of reading that first produced the data.

I’m not against the crowd, and let me add that I’m not even against this type of cultural work per se.  I don’t fear the machine.  What I do fear, though, is the black box of algorithmic culture.  We have virtually no idea of how Amazon’s Popular Highlights algorithm works, let alone who made it.  All that information is proprietary, and given Amazon’s penchant for secrecy, the company is unlikely to open up about it anytime soon.

In the old cultural paradigm, you could question authorities about their reasons for selecting particular cultural artifacts as worthy, while dismissing or neglecting others.  Not so with algorithmic culture, which wraps abstraction inside of secrecy and sells it back to you as “the people have spoken.”