Twitter, Downtime, and Radical Transparency


Listeners to the most recent episode of Digital Campus will know that I’m a fairly heavy user of Twitter, the weirdly addictive and hard-to-describe microblogging and messaging service. But anyone who uses it regularly will also know that the company’s service architecture has not scaled very well. During the last month or so, as hundreds of thousands have signed up and started “tweeting,” it has sometimes seemed like Twitter is down as often as it’s up.

Considering the volume and complexity of the information they’re serving, and the service’s somewhat unexpected popularity, I tend not to blame Twitter for its downtime. As a member of an organization that runs its own servers (with nowhere near the load of Twitter, mind you), I sympathize with Twitter’s situation. Keeping a server up is a relentless, frustrating, unpredictable, and scary task. Yet as a user of Twitter, I still get pretty annoyed when I can’t access my friends’ tweets or when one of mine disappears into the ether.

It’s clear, however, that Twitter is working very hard to rewrite its software and improve its network infrastructure. How do I know this? First, it seems like some of the problems are getting better. Second, and more important, for the last week or so, Twitter has been blogging its efforts. The Twitter main page now includes a prominent link to the Twitter Status blog, where managers and engineers post at least daily updates about the work they’re doing and the problems they’re facing. The blog also includes links to uptime statistics, developer forums, and other information sharing channels. Twitter’s main corporate blog, moreover, contains longer posts about these same issues, as well as notes on other uncomfortable matters such as users’ concerns about privacy under Twitter’s terms of service.

Often, an organization facing troubles—particularly troubles of its own making—does everything it can to hide the problem, its cause, and its efforts to fix it. Twitter has decided on a different course. Twitter seems to have realized that its very committed, very invested user base would prefer honesty and openness to obfuscation and spin. By definition, Twitter users are people who have put themselves out there on the web. Twitter’s managers and engineers have realized that those users expect nothing less of the company itself.

As a Twitter user, I find that the company’s openness about its difficulties has made me more patient, more willing to forgive the occasional outage or slowdown. There is a lesson in this for digital and public historians. Our audiences are similarly committed. We work very hard to make sure they feel like we’re all in this together. We should remember this when we have problems, such as our own network outages (CHNM is experiencing one right now, btw) and technical shortcomings.

We are open with our successes. We should be open with our problems as well. Our audiences and partners will reward us with their continued loyalty and (who knows?) maybe even help.

Sunset for Ideology, Sunrise for Methodology?

Sometimes friends in other disciplines ask me the question, “So, what are the big ideas in history these days?” I then proceed to fumble around for a few minutes trying to put my finger on some new “-ism” or competing “-isms” to describe and define today’s historical discourse. Invariably, I come up short.

Growing up in the second half of the 20th century, we are prone to think about our world and our work in terms of ideologies. Late 20th century historical discourse was dominated by a succession of ideas and theoretical frameworks. This mirrored the broader cultural and political discourse in which our work was set. For most of the last 75 years of the 20th century, Socialism, Fascism, Existentialism, Structuralism, Post-Structuralism, Conservatism, and other ideologies vied with one another broadly in our politics and narrowly at our academic conferences.

But it wasn’t always so. Late 19th and early 20th century scholarship was dominated not by big ideas, but by methodological refinement and disciplinary consolidation. Activities like philology, lexicology, and especially bibliography, denigrated in the later 20th century as unworthy of serious scholarly attention, were taken very seriously by 19th and early 20th century scholars. Serious scholarship was concerned as much with organizing knowledge as it was with framing knowledge in an ideological construct. Take my sub-discipline, the history of science, as an example. Whereas the last few decades of research have been dominated by a debate over the relative merits of “constructivism” (the idea, in Jan Golinski’s succinct definition, “that scientific knowledge is a human creation, made with available material and cultural resources, rather than simply the revelation of a natural order that is pre-given and independent of human action”), the history of science was in fact founded in an outpouring of bibliography. The life work of the first great American historian of science, George Sarton, was not an idea, but a journal (Isis), a professional society (the History of Science Society), a department (Harvard’s), a primer (his Introduction to the History of Science), and especially a bibliography (the Isis Cumulative Bibliography). Tellingly, the great work of his greatest pupil, Robert K. Merton, was an idea: the younger Merton’s “Science, Technology and Society in Seventeenth Century England” defined history of technology as social history for a generation. By the time Merton was writing in the 1930s, the cultural climate had changed, and the consolidating and methodological activities of the teacher were giving way to the ideological and theoretical activities of the student.

I believe we are at a similar moment of change right now, that we are entering a new phase of scholarship that will be dominated not by ideas, but once again by organizing activities, both in terms of organizing knowledge and organizing ourselves and our work. My difficulty in answering the question “What’s the big idea in history right now?” stems from the fact that, as a digital historian, I traffic much less in new theories than in new methods. The new technology of the Internet has shifted the work of a rapidly growing number of scholars away from thinking big thoughts to forging new tools, methods, materials, techniques, and modes of work which will enable us to harness the still unwieldy, but obviously game-changing, information technologies now sitting on our desktops and in our pockets. These concerns touch all scholars. Our Zotero research management tool is used by three quarters of a million people, all of them grappling with the problem of information overload. And although much of the discussion remains informal, it’s no accident that Wikipedia is right now one of the hottest topics for debate amongst scholars.

Perhaps most telling is the excitement that now (or really, once again) surrounds the library. If you haven’t been to a library conference lately, I suggest you do so. The buzz amongst librarians these days dwarfs anything I have seen in my entire career amongst historians. The terms “library geek” and “sexy librarian” have gained new currency as everyone begins to recognize the potential of exciting library-centered projects like Google Books.

All of these things—collaborative encyclopedism, tool building, librarianship—fit uneasily into the standards of scholarship forged in the second half of the 20th century. Most committees for promotion and tenure, for example, must value single authorship and the big idea more highly than collaborative work and methodological or disciplinary contribution. Even historians find it hard to internalize the fact that their own norms and values have changed over time and will change again. But change they must. In the days of George Sarton, a thorough bibliography was an achievement worthy of great respect, and an office closer to the reference desk in the library an occasion for great celebration (Sarton’s small suite in Study 189 of Harvard’s Widener Library was the epicenter of history of science in America for more than a quarter century). As we tumble deeper into the Internet age, I suspect it will be again.

[Image credit: Alex Pang; Quote: Jan Golinski, Making Natural Knowledge (Cambridge University Press, 1998), p. 6.]

Twitter as a tool for outreach

In an earlier post I wrote about the early buzz around Omeka, both in the forums and among education, museum, public history, and library bloggers. One thing I didn’t mention—and frankly did not expect—was the buzz about Omeka on Twitter, the popular SMS-centered microblogging, won’t-get-it-till-you’ve-used-it social networking platform.

Twitter has been getting a lot of attention lately as a tool for use in the classroom, including an insightful blog post and front-page video segment on the Chronicle of Higher Education website by University of Texas at Dallas professor David Parry. It turns out Twitter has also been a great way to build a community around Omeka—to get in touch with possible users, to keep in touch with existing users, to give the product a personality, and to provide information and support. Among other things, we have been answering technical questions using Twitter, connecting far-flung users with Twitter, and pointing to blog posts and press coverage on Twitter. Because the barrier to participation is so low—Twitter only allows messages of 140 characters or fewer—people seem more willing to participate in the discussion than if it were occurring on a traditional bulletin board or even in full-length blog posts. Because every posting on Twitter is necessarily short, sweet, informal, and free from grammatical constraints, I think people feel freer just to say what’s on their minds. Because Twitter asks its users to respond to a very specific and very easily answered question—”What are you doing?”—it frees them (and us) from the painstaking and time-consuming work of crafting a message and lets people just tell us how they’re getting on with Omeka. And because Twitter updates can be sent and received in many different ways from almost anywhere (via text message, on the web, via instant message), the Omeka Twitter community has a very active, very present feel about it.

I’m very encouraged by all this, not just for the narrow purposes of Omeka, but for digital humanities and public history outreach in general. Interactivity, audience participation, and immediacy are longstanding values of both public history and digital humanities, and Twitter very simply and subtly facilitates them all. The experience of the last week has proved to me that we should be doing this for all future projects at CHNM, not just our software projects like Omeka and Zotero, but also for our online collecting projects like the Hurricane Digital Memory Bank, our public exhibitions like the forthcoming Gulag: Many Days, Many Lives, and our education projects like the forthcoming Making the History of 1989.

For now, if you’d like to join the Omeka Twitter community, you can sign up for a Twitter account and start following Omeka. If you’re not quite ready to dive in head first, or if you just want to keep an eye on what other Omeka followers are doing, you can simply subscribe to the “Omeka and Friends” public feed. Finally, if you want to see what I’m up to as well, you can find me on Twitter at (no surprise) FoundHistory.

U2's Kite

I’ll stick with music for one more post.

“Kite” is one of my favorite of U2‘s more recent songs. In keeping with the title, Edge’s guitar is alternatingly lilting and soaring, and Bono’s vocals are more than usually impassioned. The chord progression is classic rock simple, but the rhythms are changeable and complex. In many ways “Kite” marks the high point or crescendo of the band’s return to its rocking roots on its 2000 release All That You Can’t Leave Behind.

However, the best moment for me—and not coincidentally the one that will be most interesting to Found History readers—is the song’s nearly spoken-word epilogue:

Did I waste it?
Not so much I couldn’t taste it
Life should be fragrant
Rooftop to the basement

The last of the rock stars
When hip hop drove the big cars
In the time when new media
Was the big idea

That was the big idea

Buried just beneath the surface of this apparent afterthought is what amounts to an historical apology to fans.

Emerging from the same late-70s post-punk, post-prog crucible as The Police and The Clash, U2 outlasted its equally talented competition to become what many consider the band of the 80s. The high point of this success was undoubtedly 1987’s landmark release The Joshua Tree, which by almost any measure must rank among rock’s greatest achievements.

If the Joshua Tree launched U2 into the pantheon of rock and roll, it also presented the band with the familiar problem of finding a suitable second act. The Joshua Tree was haunting and profound, but it was also chokingly serious and unsustainably earnest. U2’s 1988 follow up album-cum-tour film Rattle and Hum presents a band that has taken itself too seriously. 1991’s sometimes brilliant, faintly inane, and thoroughly self-regarding release Achtung Baby gave us a band on the verge of collapse. Often mistaken to be a love song, the album’s biggest single “One” is in fact a desperate plea to keep the band together.

Yet what U2 recognized in Achtung Baby that nearly all top bands miss is that they had to stop taking themselves so seriously. It wasn’t at all obvious how to do this: the history of rock doesn’t provide many good examples of humility. 1993’s Zooropa, 1997’s Pop, and the band members’ string of unremarkable solo projects took the inanity of Achtung Baby to new heights. In the increasingly fragmented media and music environment of the mid-1990s that now included Rap, House, Grunge, Electronica, Alternative and many more, U2’s brand of Led Zeppelin-style superstardom just seemed all the more ridiculous.

In fact, consciously or not, U2 had hit upon an ingenious reinvention strategy. The only way to combat the overwhelming earnestness of The Joshua Tree and the art house self-seriousness of Achtung Baby was to tear the band down and rebuild from the ground up. Pop in particular was a scathing, humiliating, almost self-flagellating parody of wealth, fame, technology, and rock itself. Yet at the time it just seemed like a bust. Most late-90s observers would have determined that U2 was finished.

2000’s All That You Can’t Leave Behind was therefore and by all accounts a renaissance. With songs like “Beautiful Day” and “Stuck In A Moment You Can’t Get Out Of,” the band returned to the tested formula of big vocals, socially aware lyrics, building guitars, and rock-solid bass and percussion that served it so well in the mid-1980s. (You find hints of U2’s earlier work throughout the album, for example the tinkling piano at the end of “Walk On” is clearly a throwback to 1983’s “New Year’s Day.”) This time, however, tempered by experience, they did it without so much (though admittedly still some – hey, they’re rock stars) ego and condescension. For longtime fans—like me, if you hadn’t already guessed—it was a welcome return to the band’s roots.

This is the history told in the last two stanzas of “Kite.” In the first of these, Bono acknowledges the band’s mid-90s collapse and explains that its absurdist turn was at least partly intentional: “Did I waste it? / Not so much I couldn’t taste it / Life should be fragrant / Rooftop to the basement.” In the second, he provides fans with a rationale: “The last of the rock stars / When hip hop drove the big cars / In the time when new media was the big idea / That was the big idea.”

Together I think these two stanzas prove I’m not totally off my rocker in pointing to the historical work being done by U2 in this song. In them Bono recognizes the essential anachronism of a 1970s stadium rock band (the “last of the rock stars”) in a world of satellite television, iTunes, and general media fragmentation (in the mid- to late-90s marginal hip hop artists truly “drove the big cars” viz. the Notorious B.I.G. and Coolio). He also recognizes that the only way to deal with the historical predicament in which U2 finds itself is alternatingly to embrace and reject that new media culture—that was the “big idea.”

In terms of found history, “Kite” shows not only that U2 has thought about and understands its place in the larger sweep of rock and roll history, but also that its longtime fan base expects fidelity to that history or at least some explanation when it deviates from it. “Kite” also suggests that at this late stage, die-hard fans may well expect U2 to make history and autobiography as much as they expect it to make good music.

What is a Museum?

This one comes from Found History reader Tim, who wanted to hear my thoughts on NPR’s recent story about the Museum of Online Museums (MOOM), a directory of online collections. Aside from being a treasure trove of found history, MOOM raises the question—at least for NPR’s editors—of what constitutes a museum. Should we or should we not call MOOM’s listings “museums”?

Arguing the affirmative is Jim Coudal, one of MOOM’s founders, who points to one of two definitions of “museum” in Webster’s dictionary: “a place where objects are exhibited.” Arguing the negative is Wilson O’Donnell, director of the museology program at the University of Washington, who says that calling MOOM’s listings “museums” is “like calling Wikipedia an encyclopedia.” I actually take issue with both lines of reasoning, but ultimately I come down on the side of Coudal and MOOM.

You could say that Coudal and O’Donnell make converse mistakes. On the one hand, Coudal employs a definition that is too vague and too broad and leaves the museum without a distinct identity. If anyplace that displays objects is a museum, then we should consider department stores, the Home Shopping Network, the fun house at the county fair, the row of expensive whiskeys behind the bar, the auto show, and a million other things “museums.” Historians of museums know that our modern notion of the museum was born out of a 19th century “exhibitionary culture” that included things like World’s Fairs and department stores, as well as museums. But no one mistakes Macy’s for the Met.

O’Donnell, on the other hand, makes the opposite mistake, attempting to reify and dehistoricize the museum. In fact, things called “museums” have been around in one form or another for 400 years, and for most of that time they have borne little resemblance to our modern museums. I’m not sure whether it is Wikipedia’s amateurism or its unfamiliar digital format that irks O’Donnell, but the truth is that for much of their history, museums were both largely amateur endeavors and existed in formats that would be unfamiliar to us today. Many of the great European museums (the Ashmolean and Pitt Rivers museums in Oxford are good examples) were founded as private collections in private homes and were organized around criteria and displayed in formats that today would seem very foreign indeed.

For my part, I’d pick Webster’s second definition: “an institution devoted to the procurement, care, study, and display of objects of lasting interest or value.” I probably have to think about this more, but to me it’s not the simple act of display, nor is it “professionalism,” that makes something a museum. Rather it is the collection and display of stuff with a preservative intent and historical mindset that makes a museum. That is, by my definition, MOOM’s “museums” are really museums … and all museums are pieces of found history.

Apologies to Tim for the long delay in answering his very good question.

Calendars as Timelines

Jeremy had a post yesterday about the buzz over timelines at CHNM. For the last year or so, we have been talking a lot about timelines, all of us coming to the topic at slightly different angles. Jeremy, for instance, is especially interested in the user interface challenges that online timelines present, and he’s toying with some solutions in CSS/XHTML/JavaScript and emerging data standards like HEML (Historical Event Markup and Linking) and the hCalendar microformat. I’m most interested in the centrality of timelines to public understanding of history.

In many ways, timelines are the general public’s favored mode of representing historical change. Timelines figure prominently in most history classrooms. They provide newspaper editors a column inch-saving shorthand for contextualizing current events (see, for example, the sidebar on this recent article in USA Today about the Balco doping scandal). And the most energetic among amateur historians—genealogists—traffic almost entirely in a particular type of timeline known commonly as the “family tree.”

Over the past year I have been trying to move forward a project at CHNM called “Timeline Builder,” which would provide an easy-to-use tool for people looking to generate online timelines. A public beta of Timeline Builder is up and running at CHNM Tools, and although it’s a little clunky, it will give you an idea of what we have in mind. (I should say that I have had very little—read “nothing”—to do with the actual building of this system. When I say “move forward,” I mean begging my more skilled colleagues to build it for me. Josh Greenberg over at Epistemographer has been especially generous, both intellectually and technically, and a summer intern, Josh West, has done most of the programming work to date.)

To launch an effective timeline builder we need two things. First, we need an elegant way to render timelines visually within the space of the browser. Here I think the work Josh West has done in Flash is great, and I’m hoping that Jeremy will be able to replicate the best features of his display mechanism (e.g. the slide and zoom functions) in CSS/XHTML/Javascript.

Second, and just as important, we need an easy and familiar way for people to enter events. As my coworkers can attest (and I’m sure to their annoyance), I had a brainstorm on this second point a couple of weeks ago: such a system already exists and is already in the hands of users. This system is called the calendar. Why should we invent a new standard and build an event builder system when people already have one on the desktop in their calendar applications? Isn’t a calendar just a “timeline” laid out on a grid rather than on a line? If we can offer some facility for people to upload .ics files created in their calendars to our server and then dump that event data into an online timeline, wouldn’t that be a lot better than inventing our own event standard and our own event-creation interface?
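Because .ics is a plain-text format, the upload-and-parse step is easy to imagine. Here is a minimal, purely illustrative sketch (standard-library Python only; it ignores time zones, recurrence rules, and most other iCalendar complications, and the sample events are invented) of pulling events out of an .ics file and sorting them for display on a timeline:

```python
from datetime import datetime

def parse_ics_events(text):
    """Extract (start, summary) pairs from iCalendar text.

    A deliberately simplified parser: it unfolds wrapped lines, then
    pulls DTSTART and SUMMARY out of each VEVENT block. Time zones,
    recurrence rules (RRULE), and other iCalendar features are ignored.
    """
    # Unfold lines: the iCalendar spec wraps long lines by breaking them
    # and starting the continuation with a space.
    unfolded = text.replace("\r\n ", "").replace("\n ", "")
    events, current = [], None
    for line in unfolded.splitlines():
        line = line.strip()
        if line == "BEGIN:VEVENT":
            current = {}
        elif line == "END:VEVENT" and current is not None:
            if "DTSTART" in current:
                events.append((current["DTSTART"], current.get("SUMMARY", "")))
            current = None
        elif current is not None and ":" in line:
            name, value = line.split(":", 1)
            name = name.split(";", 1)[0]  # drop parameters like DTSTART;TZID=...
            current[name] = value

    def to_dt(value):
        # Try the common iCalendar timestamp shapes.
        for fmt in ("%Y%m%dT%H%M%SZ", "%Y%m%dT%H%M%S", "%Y%m%d"):
            try:
                return datetime.strptime(value, fmt)
            except ValueError:
                pass
        return None

    parsed = [(to_dt(start), summary) for start, summary in events]
    # Sort chronologically, dropping anything we couldn't parse.
    return sorted((d, s) for d, s in parsed if d is not None)

# A tiny hypothetical calendar export for demonstration.
sample = """BEGIN:VCALENDAR
BEGIN:VEVENT
DTSTART:20060515T180000Z
SUMMARY:Taking Games Seriously forum
END:VEVENT
BEGIN:VEVENT
DTSTART:20060301T090000Z
SUMMARY:Timeline Builder planning meeting
END:VEVENT
END:VCALENDAR"""

for when, what in parse_ics_events(sample):
    print(when.isoformat(), what)
```

The point is simply that the data people already keep maps onto a timeline with almost no translation: each VEVENT becomes a point (or span) on the line, already dated and labeled.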

This possibility got me to thinking more broadly about calendars as digital objects and historical artifacts. The increasing universality of the .ics standard (currently used by Apple’s iCal and Mozilla’s calendar projects, and supported by both 30Boxes and Google Calendar) presents historians with an amazing opportunity. If we could develop strategies for collecting and preserving standards-based calendar data and the right tools for analyzing it, we could gain unprecedented insight into the daily and even hourly activities of historical actors. What if, for instance, we had the daily calendars of everyone at Los Alamos during the Manhattan Project in an identical format and we could nail those calendars to a single timeline for comparison? What questions could we answer about the extent to which those scientists worked collaboratively and/or individually? More immediately, I want this for our forthcoming Mozilla Digital Memory Bank project. I’m sure everyone at Mozilla keeps a calendar in the .ics format or in some other format easily exportable to .ics. What will we be able to say about the nature of that diffuse and complex community of developers if we are able to collect and easily compare who was where and doing what with whom when? Incredible.

So what does all this have to do with Found History? Since most people nowadays keep some kind of digital calendar, I’m also interested in the question of whether this calendar-keeping can be considered history-making. On the one hand, calendar-keeping is time-based, event-centric, and preservational. This argues the affirmative. On the other hand, calendar-keeping is largely future-focused (reminding us of upcoming events rather than past events), and while it’s concerned with preserving time-based information, I’m not sure it entails or encourages any interpretation of or reflection on that information. This argues the negative.

I’m still trying to sort out where I come down on this question. My guess is that it varies from person to person—that some people keep calendars with a historical or memorial purpose in mind, and others do it simply to keep from forgetting their next anniversary. In any case, personal digital calendars represent a historical resource of enormous potential breadth and depth, and we should all be thinking about ways to collect, preserve, parse, and present the information they contain.

A Long Time Ago in a Galaxy Far, Far Away …

The topic of this spring’s Washington DC Area Technology and Humanities Forum was just announced on CHNM News, and I couldn’t be more excited. On May 15, 2006 Mark Sample, Jason Rhody, and Michelle Roper will discuss “Taking Games Seriously: The Impact of Gaming Technology in the Humanities” at Georgetown University’s Car Barn. This is right up Found History’s alley.

The forum’s topic touches on something I’ve been thinking about for a long time: the extent to which fantasy and science fiction (both closely tied to gaming culture) are indebted to history for both substance and narrative structure and style—that is, the extent to which fantasy and sci-fi are written as history.

I don’t think it’s too much of a stretch to say that fantasy is just alternative history and science fiction the imagined history of the future. The sources seem to say as much. The original Star Wars, for example, is framed from the outset as a story from the past. Introduced by the words, “A long time ago in a galaxy far, far away,” the movie (and its sequels and prequels) goes on to present a plot based loosely in Roman history (“the Republic” vs. “the Empire”) and characters based loosely in Greek epic (Han Solo as the unseasonal hero, for example). Each Star Trek episode reproduces an entry in Captain Kirk’s diary, invariably beginning with a reading of the “star date.” Tolkien’s The Lord of the Rings is presented as a history of the third age of “Middle-earth” and even begins with an explanation of “archival” sources in its “Note on the Shire Records.” A professor of Anglo-Saxon literature and language at Oxford and an expert in the chivalric romances of the Middle Ages, Tolkien borrowed heavily from the genre, which was itself a kind of fiction masquerading as true history. Finally, like Star Wars, The Lord of the Rings also has its prequels in The Hobbit and The Silmarillion. Indeed, the “prequel” seems a distinctive feature of science fiction and fantasy, and is yet another giveaway of the genres’ preoccupation with the past.

I first noticed the connection between sci-fi and history in my doctoral research, which examined the history of inter-war interest in science’s past, both in higher education and in more popular contexts such as World’s Fairs and museums. Among the most important figures in this story are George Sarton and Charles Singer, the founding fathers of academic history of science in America and Britain respectively. Exploring the correspondence of these endlessly-fascinating giants of early-20th century history, I noticed that both men (themselves close friends) enjoyed long personal acquaintances with H.G. Wells, the renowned author of War of the Worlds, The Island of Doctor Moreau, and other science fiction classics. This led me to look more closely at Wells, and it turns out that while we remember him only for fiction, he and his contemporaries may rather have identified him as an historian. In fact, in terms of total number of words, Wells probably wrote more history than he did fiction, and his thousand-page Outline of History: Being a Plain History of Life and Mankind easily went to as many editions in the author’s own lifetime as the sci-fi books for which he is better remembered. Moreover, during his lifetime Wells traveled the world on paid speaking engagements, where he usually spoke on topics in history, religion, and ethics, rather than reading from his fictional works. Thus in Wells we see that sci-fi and fantasy are tied not only to history internally and textually, but also externally in the circumstances of their production and the interests of their authors.

Of course, I’m not the first person to make these connections. More recent authors of science fiction and fantasy most certainly have. Neal Stephenson, for example, definitely recognizes the connection, switching easily and expertly between stories set in the future (Snow Crash, etc.) and stories set in the past (his incredible Baroque Cycle). He sometimes even carries characters over from the past into the future (the mysteriously immortal Enoch Root, for instance). Another example is The Years of Rice and Salt by acclaimed science fiction author Kim Stanley Robinson, which in its account of what might have happened had the Black Plague destroyed European civilization entirely, is really alternative history rather than science fiction.

I’m not a gamer, so I can’t speak at length about how historical models play out in video games. But it seems to me that at least one genre of fantasy and sci-fi games, in which players retrace a highly-authored (albeit forked) narrative through a historically-inspired space (e.g. the Myst and Zelda franchises), seems ripe for this kind of analysis. I’m really interested to see what the excellent panel at the Tech & Humanities Forum has to say about that.

Finding History in the September 11 Digital Archive

Because it follows from some talks I’ve given in the past, this may be cheating on my resolution to start writing more. But I think it really belongs here on Found History, so I’m going to post it anyway. In some ways my work on the September 11 Digital Archive inspired this blog, and I think I should explain how.

If there was ever a time when public history could be defined simply as history written for the public, that time is surely past. The counterculture movements of the 1960s, 70s, and 80s, the postmodernist turn, the culture wars of the 1990s, and now the Internet have made our publics aware of multiple narratives and competing sources, and wary of our authority as historians. Our publics are now instinctively attuned to the discursive nature of history, and they are unwilling to sit quietly at the receiving end. Public history—as it’s now commonplace to say—demands a “shared authority.”

This new reality is more easily accommodated by our intellects than by our institutions. Archival and library collections, for instance, remain inherently authoritative—archivists and librarians collect and manage collections and publics are (or are not) given access to these materials. The situation is much the same in most other historical outlets. In museum exhibitions, for example, curators exhibit collections and publics are exhibited to. While trends toward “interactivity” have done something to alleviate this situation, in most cases professionals still set the terms, telling the public where, when, and how they may interact with historical materials and predetermined content. This does not always sit well amongst an increasingly sophisticated and choosy public. New forums such as the Internet allow not only for pre-determined interactivity, but also for real authorship, and an experienced public now expects productive participation in our stacks and public programming.

The situation is all the more acute when dealing with topics in contemporary history. Certainly in the case of September 11, 2001, there is little we as historians can tell the public that they don’t already know for themselves. September 11 was undoubtedly the most experienced event in American history. There must be very few Americans who haven’t seen the collapse of the World Trade Center from every angle, in color and in black and white, in slow motion and in time lapse, set to music, set to speeches, and overlain with photographs of victims, their families, their attackers, and their elected officials. In many respects—and with no intended disrespect to those families directly affected by the attacks—we have all experienced September 11 equally. At this point nobody needs or wants an historical expert to tell him or her what it was all about. Five years after the attacks, a better role for historians and historical institutions may be simply to sit and listen.

The September 11 Digital Archive is in some respects an attempt to define this new role for the historical professions, to deal with the problem of “history as it happens”, and to accommodate the public’s new conviction that it should and will be heard. Specifically, the Archive works to collect stories, emails, voicemails, digital images, office documents and other “born-digital” materials relating to the attacks and their aftermath, not only from those directly affected by the attacks, but from the general public as well. Intended as an experiment to determine whether or not it is possible to collect large numbers of source documents over the Internet, the Archive has proven its hypothesis and now stands among the nation’s premier repositories of September 11 history.

Yet, though our collecting efforts were always focused firmly on the public, we didn’t fully anticipate the role the Archive would serve among that public. That role was to meet, at least in some small part, those new public expectations I described earlier—to provide an institutional location for public authorship of history and bottom-up interaction in historical endeavor.

As it stands today, the Archive has collected more than 150,000 digital objects. Some of these materials are truly unique in the history of collections—real time transcripts of wireless email conversations, Internet chat logs, digital voicemail recordings—and stand unambiguously as important primary source documents. Other materials are more easily recognizable—for example, the thousands of personal narratives, memorial objects and pieces of artwork produced and contributed in the aftermath of the attacks—but are less clear in their status as historical documents. On the one hand these narratives, memorial objects, and artworks are primary documents: that is, they are contemporary representations of historic events. On the other hand it is clear that many of these materials were created with a real historical self-consciousness: that is, the people who contributed these materials were very much aware of their participation as actors in the historical process. In this sense, these materials stand not as primary documents, but as secondary narratives or works of historiography.

In fact, many contributors come right out and say so, and the ones who don’t often let on in other ways. All indications point to the fact that people are creating materials specifically to be placed in the Archive. Our logs show that our contributors return over and over again to review their contributions, to see where they stand in the Archive and how they are being categorized, displayed, and used. Moreover, this is true not only of the stories we solicit, but also of the digital artworks and digital animations people submit to the Archive. In both cases, there’s a clear concern about ownership and authorship and, by extension, about participation in making history. Look at these images and read these stories, and you’ll see our contributors wrestling not only with their grief and anger, but also with September 11’s place in history, whether among the pyramids of the ancients or the iconic images of the First and Second World Wars. In this way the September 11 Digital Archive is not simply composed of passive remnants of the past, but rather stands as an institutional location for the active and intentional historical participation of the general public. Visitors to the Archive do not come to receive history; they come to navigate historical sources, to engage historical discourse, and to produce their own. In this the Archive points toward new ways of accommodating our sophisticated public’s sophisticated expectations. From the outset, we saw the Archive as an experiment, and like any good experiment, the unintended outcomes have been easily as interesting as the hypothesized results. One of these is a treasure trove of found history.

Finding History

With the launch of CHNM blogs this week, I thought I’d better get in the game.

In fact, I have been thinking about this site for quite a while. Since at least my undergraduate days, I’ve been interested not only in history and historiography, but more specifically in the kinds of history done by non-professionals outside the university, the museum, or the publishing house. My undergraduate senior thesis, which examined heroic narratives of geology’s early history among scientists, was the first organized expression of this interest. This spark caught fire in my first job out of college, where my work with the Colorado Historical Society consisted mostly of traveling to small towns pegged for inclusion in the State’s historical marker program and working with community groups to write narratives that were both historically sound and acceptable to local sensitivities. (I often say that getting an American Legion Hall full of ranchers, farmers, Indians, ski bums, and hippies in Craig, La Veta, or Ault, Colorado to agree on a single interpretation of environmental, military, or agricultural history is the hardest thing I’ve ever done.) After Colorado, I took my expanding enthusiasm for the processes of amateur, community, and other kinds of non-professional history to Oxford, where my dissertation examined the practice of science history during the 1920s and 30s in the university, the museum, the industrial exhibition and World’s Fair, and other popular contexts. And though that dissertation spent a lot of pages on academic historians of science and science museum professionals, what fascinated me most were the kinds of historical processes, narratives, and uses devised by scientists, engineers, businessmen, politicians, enthusiasts, and members of the general public.

Now at CHNM, I’m constantly bumping up against this kind of non-professional history. Indeed, our own work in digital history is considered by many of our colleagues to be unconventional in its own right. But even more exciting to me than the particular digital work we do is the opportunity this work gives me to interact with a whole range of non-academic, non-professional, and amateur historians. The uniquely public nature of web work demands contact with new and unintended audiences, and the increasing interactivity of digital history means increased participation by those audiences in making historical knowledge. Probably the best example of this at CHNM is the September 11 Digital Archive, where tens of thousands of people have come to share their personal histories and interpretations of 9/11. (Some examples of popular historymaking from the 9/11 collection will appear in a future post.) CHNM’s latest project, the Hurricane Digital Memory Bank, promises more of the same. Finally, I should also say that CHNM is run by the guy who literally wrote the book on popular engagement with history, so these sensibilities are really simply in the air.

To get to the point, I’m starting Found History as a place to log my encounters with this other history, to chronicle the myriad ways and places non-professionals do history — sometimes without even knowing it. Most of my posts will be textual or visual snapshots from TV, the web, and the world around me: those often unintentional forays into history by non-historians. A smaller number of posts will share my own thoughts on popular historical participation and practice. My ultimate aim is to foster a broader understanding of what history is and of who should be called a historian. Over the past couple of decades, we professionals have come to realize that we don’t do history only for ourselves. It’s about time we realized that we’re not the only ones who do it.