March 17, 2009

Thinking the Unthinkable

Clay Shirky’s widely circulated post, “Newspapers and Thinking the Unthinkable,” has me thinking about the “unthinkable” in humanities scholarship. According to Shirky, in the world of print journalism the unthinkable was the realization that newspapers would not be able to transfer their scarcity-of-information business model to the internet. Publishers’ inability to imagine a business model for a world in which information is easily distributed led to the crisis newspapers face today. He writes,

The unthinkable scenario unfolded something like this: The ability to share content wouldn’t shrink, it would grow. Walled gardens would prove unpopular. Digital advertising would reduce inefficiencies, and therefore profits. Dislike of micropayments would prevent widespread use. People would resist being educated to act against their own desires. Old habits of advertisers and readers would not transfer online. Even ferocious litigation would be inadequate to constrain massive, sustained law-breaking. (Prohibition redux.) Hardware and software vendors would not regard copyright holders as allies, nor would they regard customers as enemies. DRM’s requirement that the attacker be allowed to decode the content would be an insuperable flaw. And, per Thompson, suing people who love something so much they want to share it would piss them off.

In our world, we can easily draw parallels between newspaper publishers and, for instance, journal publishers or the purveyors of subscription research databases (indeed, the three are often one and the same). I’m sure you can point to lots of others, and I’d be very happy to hear them in comments. But what interests me most in Shirky’s piece are his ideas about how the advent of the unthinkable divides a community of practitioners. These comments hit a little closer to home. Shirky writes,

Revolutions create a curious inversion of perception. In ordinary times, people who do no more than describe the world around them are seen as pragmatists, while those who imagine fabulous alternative futures are viewed as radicals. The last couple of decades haven’t been ordinary, however. Inside the papers, the pragmatists were the ones simply looking out the window and noticing that the real world was increasingly resembling the unthinkable scenario. These people were treated as if they were barking mad. Meanwhile the people spinning visions of popular walled gardens and enthusiastic micropayment adoption, visions unsupported by reality, were regarded not as charlatans but saviors.

When reality is labeled unthinkable, it creates a kind of sickness in an industry. Leadership becomes faith-based, while employees who have the temerity to suggest that what seems to be happening is in fact happening are herded into Innovation Departments, where they can be ignored en masse. This shunting aside of the realists in favor of the fabulists has different effects on different industries at different times. One of the effects on the newspapers is that many of their most passionate defenders are unable, even now, to plan for a world in which the industry they knew is visibly going away.

Again, we can probably point pretty easily to both “realists” (who get it) and “fabulists” (who don’t or won’t) in academic publishing. But the analogy extends deeper than that. There are strong and uncomfortable parallels within our own disciplines.

The question is this: Just who are the pragmatists and who are the radicals in our departments? Maybe those of us who spend our time taking digital technologies seriously aren’t radical at all. Maybe those of us in digital humanities centers (read: “Innovation Departments”) are simply realists, while our more traditional colleagues are the fabulists, faithfully clinging to ways of doing things that are already past. Listening to some colleagues talk about the dangers of Wikipedia, for instance, or the primacy of the university-press-published, single-authored monograph, or problems of authority in the social tagging of collections, it certainly sometimes feels that way. Conversely, what we do in digital humanities surely feels pragmatic, both in our day-to-day work and in our broader focus on method.

Obviously we can’t and shouldn’t divide scholars so neatly into two camps. Nor do I think we should dismiss traditional scholarship as casually as some uncritically celebrate the digital. Yet it’s worth thinking of ourselves, for a minute, as realists rather than revolutionaries. If nothing else, it may keep us focused on the work at hand.
