Amazonfail and Algorithmic Culture

I’m rather late in adding my two cents to the recent controversy over Amazon.com, which broke a little over two weeks ago.  For all that I write about the late age of print (and tweet, blog, etc.), my difficulty in keeping pace with the internet makes me suspect that I’m a Gutenberg guy at heart.

In any case, for those of you who may be even further behind than I, a PR disaster came crashing down around Amazon.com over Easter weekend.  Author Mark R. Probst, who writes gay-oriented fiction for young adults, noticed on Friday, April 10th that there were no sales rankings listed for two recently released — and quite popular — gay romance novels.  He later discovered a similar trend among hundreds of gay, lesbian, bisexual, and transgender (GLBT) titles on Amazon, including his own book, The Filly.  An initial inquiry into the situation brought this response from Amazon: “In consideration of our entire customer base, we exclude ‘adult’ material from appearing in some searches and best seller lists.  Since these lists are generated using sales ranks, adult materials must also be excluded from that feature.”

Needless to say, many people were outraged by the company’s apparent decision to classify GLBT books as “adult” and effectively to de-list them from its website.  The rest is pretty much history at this point.  Folks began Twittering en masse to #amazonfail, where details about — and inconsistencies in — Amazon’s listing process were revealed.  Among the more painful revelations?  As Feministe reported, A Parent’s Guide to Preventing Homosexuality and related anti-GLBT screeds continued to be listed and ranked.  Meanwhile, the LA Times blog Jacket Copy noted that Amazon hadn’t classified Playboy: Six Decades of Centerfolds as “adult” (duh) but had given the label to philosopher Michel Foucault’s provocative but hardly titillating The History of Sexuality, Volume I.

Once Amazon had a chance to regroup, it began issuing this statement:

This is an embarrassing and ham-fisted cataloging error for a company that prides itself on offering complete selection.  It has been misreported that the issue was limited to Gay & Lesbian themed titles — in fact, it impacted 57,310 books in a number of broad categories such as Health, Mind & Body, Reproductive & Sexual Medicine, and Erotica. This problem impacted books not just in the United States but globally. It affected not just sales rank but also had the effect of removing the books from Amazon’s main product search.

The Seattle Post-Intelligencer added: “Amazon managers found that an employee who happened to work in France had filled out a field incorrectly and more than 50,000 items got flipped over to be flagged as ‘adult,’ the source said. (Technically, the flag for adult content was flipped from ‘false’ to ‘true.’).”
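If that account is accurate, it’s worth pausing on how little it takes for such an error to scale.  The sketch below is deliberately a toy (the class names and the “adult” field are my invention, not Amazon’s actual schema): it shows how a flag stored once, at the category level, can silently re-label every title filed beneath it.

```python
from dataclasses import dataclass

@dataclass
class Category:
    name: str
    adult: bool = False   # one field in a hypothetical cataloging tool

@dataclass
class Book:
    title: str
    category: Category

    @property
    def adult(self) -> bool:
        # Titles inherit the flag from their category, so a single
        # edit to the category propagates to every book under it.
        return self.category.adult

romance = Category("Gay & Lesbian > Romance")
catalog = [Book("The Filly", romance)]   # imagine tens of thousands more

romance.adult = True      # the 'false' -> 'true' flip, entered once
print(catalog[0].adult)   # True: the book is now "adult" everywhere
```

No individual record was edited incorrectly; one upstream value changed, and everything downstream changed with it.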

Some people are understandably skeptical of Amazon’s explanations.  Though the company has admitted to making a huge mistake and taken steps to rectify the situation, regaining the trust of its customers will undoubtedly take time.  Clearly the whole situation was hurtful to a great many people, and a disaster for Amazon.com.

I wonder, in retrospect, what it all might tell us about the late age of print.

If Amazon is to be believed, the root of the problem lies not with any one person per se (the “ham-fisted” employee in France notwithstanding) but with what Alex Galloway, a professor at NYU, calls “algorithmic culture.”  By this he refers to the offloading of the work of culture — the sorting, ordering, classifying, and judging of people and things — from human beings to machines.  You might think of algorithmic culture as an operational layer that sits on top of another, informational layer — call it database culture.  Put the two together and you realize just how much cultural work actually takes place more or less independently of human action.
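To make the two layers concrete, here is a minimal sketch, again in Python and again with invented names rather than anything drawn from Amazon’s systems.  The list of records stands in for the database layer; the search function, which classifies, filters, and ranks with no human reviewing individual outcomes, stands in for the algorithmic layer.

```python
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    sales: int
    adult: bool   # metadata the algorithm trusts absolutely

def search(query: str, records: list[Record]) -> list[Record]:
    """Sort, classify, and judge: cultural work done entirely in code."""
    hits = [r for r in records if query.lower() in r.title.lower()]
    visible = [r for r in hits if not r.adult]   # classifying
    return sorted(visible, key=lambda r: r.sales, reverse=True)  # ranking

db = [Record("The Filly", sales=4200, adult=True)]  # mis-flagged upstream
print(search("filly", db))   # [] -- the book simply vanishes
```

Notice that no one “decides” to suppress the book at query time; the decision was made, in effect, the moment the metadata was written.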

I’m currently teaching a graduate seminar about mass culture.  In these days of interactive media and extraordinary customization, it’s become popular — even required — to rail against mass culture as dehumanizing, repetitive, and worse.  But a question I always insist my graduate students confront is, “What did the mass culture paradigm do well in its day?”

The Amazon situation from a few weeks ago poses an analogous scenario.  It’s become de rigueur among many to decry traditional cultural work as “elitist,” given how it sets up a privileged few to determine what’s worth paying attention to, and why.  The assumption seems to be that if we could just make the process more open and democratic, we’d move further in the direction of a more inclusive public culture.

The folks over at #amazonfail, and indeed all those who chimed in on the book ranking and listing controversy, have begun to show us that algorithmic culture has its weaknesses, too, and that there may be benefits to a more “traditional” approach to cultural valuation and classification.  If nothing else, the latter has an immediate doer behind the deed, who can be questioned about her or his choices.  Algorithmic culture may provide for more “democratic” forms of participation, particularly in the areas of tagging and reviewing.  On the flip side, accountability exists at a much further remove.  If handled improperly, algorithmic culture can open large swaths of material to the threat of “global replace,” in which a zero becomes a one and all hell subsequently breaks loose.
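The “global replace” scenario is easy to picture in database terms.  The snippet below is a contrived illustration using Python’s built-in sqlite3 module (the table and column names are mine, not Amazon’s): a single UPDATE statement with the wrong value re-flags every matching row at once, and nothing in the operation itself asks whether that is what anyone intended.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, category TEXT, adult INTEGER)")
conn.executemany(
    "INSERT INTO books VALUES (?, ?, 0)",          # all start non-adult
    [(f"Title {i}", "Romance") for i in range(57310)],
)

# One statement, one mis-set value, no per-item review:
cur = conn.execute("UPDATE books SET adult = 1 WHERE category = 'Romance'")
print(cur.rowcount)   # 57310 rows flipped in a single stroke
```

In a real catalog, reversing the damage is rarely as simple as running the statement backwards, since legitimately flagged rows and mistakenly flagged ones now look identical.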

2 comments

  1. […] Amazonfail and Algorithmic Culture: Thoughts on relying upon algorithms as part of the decision-making process. […]

  2. […] What that means, then, is that Amazon does not subscribe to the liberal sensibilities with which book culture has long been associated.  In other words, it holds little regard for the sanctity of property (other than its own), privacy, or free expression.  For Amazon these are values only insofar as they can contribute to the company’s value stream.  When they don’t, or when they prove too costly, those values are dispensed with algorithmically. […]
