Looks Like the Internet: Digital Humanities and Cultural Heritage Projects Succeed When They Look Like the Network

A rough transcript of my talk at the 2013 ACRL/NY Symposium last week. The symposium’s theme was “The Library as Knowledge Laboratory.” Many thanks to Anice Mills and the entire program committee for inviting me to such an engaging event.

When Bill Gates and Paul Allen set out in 1975 to put “a computer on every desk and in every home, all running Microsoft software,” it was absurdly audacious. Not only were the two practically teenagers; practically no one owned a computer. When Tim Berners-Lee called the protocols he proposed, primarily for internal sharing of research documents among his laboratory colleagues at CERN, “the World Wide Web,” it was equally audacious. Berners-Lee was just one of hundreds of physicists working in relative anonymity in the laboratory. His supervisor approved his proposal, allowing him six months to work on the idea, with the brief handwritten comment, “vague, but exciting.”

In hindsight, we now know that both projects made good on their audacious claims. More or less every desk and every home now has a computer, more or less all of them running some kind of Microsoft software. The World Wide Web is indeed a world-wide web. But what did these visionaries see that their contemporaries didn’t? Gates and Allen, like Berners-Lee, saw the potential of distributed systems.

In stark contrast to the model of mainframe computing dominant at the time, Gates and Allen (and a few peers such as Steve Jobs and Steve Wozniak and other members of the Homebrew Computer Club) saw that computing would achieve its greatest reach if computing power were placed in the hands of users. They saw that the personal computer, by moving computing power from the center (the mainframe) to the nodes (the end user’s machine), would kick-start a virtuous cycle of experimentation and innovation that would ultimately lead to everyone owning a computer.

Tim Berners-Lee saw (as indeed did his predecessors who built the Internet atop which the Web sits) that placing content creation, linking, indexing, and other application-specific functions at the fringes of the network, and allowing the network simply to handle data transfers, would enable greater ease of information sharing, a flourishing of connections between and among users and their documents, and thus a free flow of creativity. This distributed system of Internet+Web stood in stark contrast to the centralized, managed computer networks that dominated the 1980s and early 1990s, networks like CompuServe and Prodigy, which managed all content and functional applications from their central servers.

This design principle, called the “end-to-end principle,” states that most features of a network should be left to users to invent and implement, that the network should be as simple as possible, and that complexity should be developed at its end points, not at its core. In other words, the network should be dumb and the terminals should be smart. This is precisely how the Internet works. The Internet itself doesn’t care whether the data being transmitted is a sophisticated Flash interactive or a plain text document. The complexity of Flash is handled at the end points, and the Internet just transmits the data.
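To make the principle concrete, here is a toy sketch in Python. It is my own illustration, not anything from the talk and not real networking code: the “network” function below simply moves opaque bytes, while all of the application logic lives at the two endpoints.

```python
# Toy illustration of the end-to-end principle. The function names are my own
# invention; this is just the shape of the idea: a dumb core that moves bytes,
# and smart endpoints that do everything else.

def network_transfer(payload: bytes) -> bytes:
    """The 'dumb' core: forwards bytes without parsing or interpreting them."""
    return payload

def sending_endpoint(document: str) -> bytes:
    """Endpoint complexity: encoding, formatting, application-level decisions."""
    return document.encode("utf-8")

def receiving_endpoint(payload: bytes) -> str:
    """The other endpoint decides what the bytes mean and how to present them."""
    return payload.decode("utf-8").title()

if __name__ == "__main__":
    print(receiving_endpoint(network_transfer(sending_endpoint("vague, but exciting"))))
```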

In my experience digital cultural heritage and digital humanities projects function best when they adhere to this design principle, technically, structurally, and administratively. Digital cultural heritage and digital humanities projects work best when content is created and functional applications are designed at the nodes, that is, when the real work is performed at the edges and the management functions of the system are limited to establishing communication protocols and keeping open the pathways along which work can take place, along which ideas, content, collections, and code can flow. That is, digital cultural heritage and digital humanities projects work best when they are structured like the Internet itself, the very network upon which they operate and thrive. The success of THATCamp in recent years demonstrates the truth of this proposition.

Begun in 2008 by my colleagues and me at the Roy Rosenzweig Center for History and New Media as an unfunded gathering of digitally-minded humanities scholars, students, librarians, museum professionals, and others, THATCamp has in five years grown to more than 100 events in 20 countries around the globe.

How did we do this? Well, we didn’t really do it at all. Shortly after the second THATCamp event in 2009, one of the attendees, Ben Brumfield, asked if he could reproduce the gathering and use the name with colleagues attending the Society of American Archivists meeting in Austin. Shortly after that, other attendees organized THATCamp Pacific Northwest and THATCamp Southern California. By early 2010 THATCamp seemed to be “going viral,” and we worked with the Mellon Foundation to secure funding to help coordinate what was now something of a movement.

But that money wasn’t directed at funding individual THATCamps or organizing them from CHNM. Mellon funding for THATCamp paid for information, documentation, and a “coordinator,” Amanda French, who would be available to answer questions and make connections between THATCamp organizers. To this day, each THATCamp remains independently organized, planned, funded, and carried out. The functional application of THATCamp takes place completely at the nodes. All that’s provided centrally at CHNM is the protocols—the branding, the ground rules, the architecture, the governance, and some advice—by which these local applications can run smoothly and connect to one another to form a broader THATCamp community.

As I see it, looking and acting like the Internet—adopting and adapting its network architecture to structure our own work—gives us the best chance of succeeding as digital humanists and librarians. What does this mean for the future? Well, I’m at once hopeful and fearful for the future.

On the side of fear, I see much of the thrust of new technology today pointing in the opposite direction, towards a re-aggregation of innovation from the nodes to the center, centers dominated by proprietary interests. This is best represented by the App Store, which answers first and foremost to the priorities of Apple, but also by “apps” themselves, which centralize users’ interactions within walled gardens not dissimilar to those built by CompuServe and Prodigy in the pre-Web era. The Facebook app is designed to keep you in Facebook. Cloud computing is a more complicated case, but it too moves much of the computing power that in the PC era was located at the nodes back to a central “cloud.”

On the other hand, on the side of hope, are developments coming out of this very community, developments like the Digital Public Library of America, which is structured very much according to the end-to-end principle. DPLA’s executive director, Dan Cohen, has described DPLA’s content aggregation model as ponds feeding lakes feeding an ocean.
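As a rough sketch of what that ponds-to-lakes-to-ocean pattern implies structurally (the class names and record shape below are my own illustration, not DPLA’s actual data model or harvesting API): local collections hold the content, intermediate hubs harvest metadata, and the central aggregator only indexes what flows up from the edges.

```python
# A rough sketch of the "ponds feed lakes feed an ocean" aggregation pattern.
# Class names and record shapes are illustrative only, not DPLA's actual
# data model or harvesting API.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Pond:
    """A local collection (a library, archive, or museum) that holds content."""
    name: str
    records: List[str] = field(default_factory=list)

@dataclass
class Lake:
    """A regional or service hub that harvests metadata from many ponds."""
    ponds: List[Pond]

    def harvest(self) -> List[str]:
        return [record for pond in self.ponds for record in pond.records]

@dataclass
class Ocean:
    """The central aggregator: it indexes what flows up but holds no content."""
    lakes: List[Lake]

    def index(self) -> List[str]:
        return [record for lake in self.lakes for record in lake.harvest()]

pond = Pond("Local Historical Society", ["daguerreotype portrait, ca. 1852"])
ocean = Ocean(lakes=[Lake(ponds=[pond])])
print(ocean.index())  # the work lives at the nodes; the center just connects it
```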

As cultural heritage professionals, it is our duty to empower end users—or as I like to call them, “people.” Doing this means keeping our efforts, regardless of which direction the latest trends in mobile and cloud computing seem to point, looking like the Internet.

[Image credits: Flickr user didbygraham and Wikipedia.]

An Unexpected Honor

Yesterday I received a letter from Google addressed to Robert T. Gunther at Found History. Gunther was the founder of the Museum of the History of Science at Oxford, where I did my doctoral work, and a major figure in my dissertation, so I am very honored to welcome him to the Found History staff. Despite having passed away in 1940, Dr. Gunther will, I hope, make significant contributions to this blog’s coverage of the history of scientific instrumentation.

E-Book Readers: Parables of Closed and Open

During a discussion of e-book readers on a recent episode of Digital Campus, I made a comparison between Amazon’s Kindle and Apple’s iPod which I think more or less holds up. Just as Apple revolutionized a fragmented, immature digital music player market in the early 2000s with an elegant, intuitive new device (the iPod) and a seamless, integrated, but closed interface for using it (iTunes)—and in doing so managed very nearly to corner that market—so too did Amazon hope to corner an otherwise stale e-book market with the introduction last year of its slick, integrated, but closed Kindle device and wireless bookstore. No doubt Amazon would be more than happy to hold the same eighty percent share of the e-book market that Apple now enjoys of the digital music player market.

In recent months, however, there have been a slew of announcements that seem to suggest that Amazon will not be able to get the same kind of jump on the e-book market that Apple got on the digital music market. Several weeks ago, Sony announced that it was revamping its longstanding line of e-book readers with built-in wifi (one of the big selling points of the Kindle) and support for the open EPUB standard (which allows it to display Google Books). Now it appears that Barnes & Noble is entering the market with its own e-book reader, and in more recent news, that its device will run on the open source Android mobile operating platform.

If these entries into the e-book market are successful, it may foretell a more open future for e-books than has befallen digital music. It would also suggest that the iPod model of a closed, end-to-end user experience isn’t the future of computing, handheld or otherwise. Indeed, as successful and transformative as it is, Apple’s iPhone hasn’t been able to achieve the kind of dominance of the “superphone” market that the iPod achieved in the music player market, something borne out by a recent report from Gartner, which has Nokia’s Symbian and Android in first and second place by number of handsets by 2012 with more than fifty percent market share. This story of a relatively open hardware and operating system combination winning out over a more closed, more controlled platform is the same one that played out two decades ago, when the combination of the PC and Windows won out over the Mac for leadership of the personal computing market. If Sony, Barnes & Noble, and other late entrants into the e-book game finish first, it will have shown the end-to-end iPod experience to be the exception rather than the rule, much to Amazon’s disappointment, I’m sure.

Briefly Noted: FOSS Culture; Digital Humanities Calendar; Guardian API; WWW Turns 20

GNOME Foundation executive director Stormy Peters has some advice on bridging the gap between institutional and open source cultures. Useful reading for digital humanities centers and cultural heritage institutions looking to participate in open source software development.

Amanda French has posted a much-needed open calendar of upcoming events in Digital Humanities, Archives, Libraries, and Museums.

The Guardian newspaper unveils an open API to more than 1,000,000 articles written since 1999.

20 years ago today: Tim Berners-Lee produced his first written description of the Web.

Motto

I came across this old quote last night while finishing up David Post’s In Search of Jefferson’s Moose: Notes on the State of Cyberspace. It seems a fair approximation of how things work (or should work?) in the new digital humanities:

“We reject: kings, presidents and voting. We believe in: rough consensus and running code.”

David Clark, “A Cloudy Crystal Ball: Visions of the Future.” Internet Engineering Task Force, July 1992. [PDF].

Briefly Noted for February 10, 2009

Jessica Pritchard of the American Historical Association blog reports on a panel at last month’s annual meeting that asked what it takes to be a public historian. Entitled “Perspectives on Public History: What Knowledge, Skills, and Experiences are Essential for the Public History Professional?” the panel was chaired by George Mason’s own Spencer Crew.

Going back a bit to the December issue of Code4Lib Journal, Dale Askey considers why librarians are reluctant to release their code and suggests some strategies for overcoming that reluctance. I have to say I sympathize completely with my colleagues in the library; I think the entire Omeka team will agree with me that putting yourself out there in an open source project is no easy feat of psychology.

The Bowery Boys, hosts of the excellent NYC History podcast, give us The History of New York City in Video Games, a thoroughgoing look at how New York has been pictured by game designers, from the Brooklyn of the original Super Mario Brothers to the five boroughs of Grand Theft Auto IV’s “Liberty City.”

John Slater, Creative Director of Mozilla, rightly notes that, however unlikely it may seem, t-shirts are important to the success of open source software. In his T-Shirt History of Mozilla, Slater shows us 50 designs dating back to the late 1990s.

Briefly Noted for April 8, 2008

Friend of CHNM Stan Katz provides some perspective on The Emergence of the Digital Humanities in his excellent Chronicle of Higher Education “Brainstorm” column.

Timelines.tv presents 1000 years of British history through a series of film clips organized along three parallel and interlinked timelines, one each for social, political, and national (English, Irish, Welsh, Scottish) history. Very high quality content (originally filmed for the BBC) distributed in a very popular format (the timeline). And a pretty slick website to boot.

Open Source Decade. Ars Technica recalls Tim O’Reilly’s 1998 “Freeware Summit” where “open source” first emerged as a term of choice in the free, open, libre, etc. software movement.

Briefly Noted for March 11, 2008

How to make a Leyden jar out of a two-liter Coke bottle, from MAKE Magazine.

Top Ten Moments in Sitcom History. I think you’d have to put Lucy and Ethel’s stint at the conveyor belt at the top of the table, but a good list nevertheless. (Thanks, Jerm.)

Prolific “junior ranger” Chance Finegan on the history of Mt. Rainier National Park.

Keeping with my management kick, here are 14 lessons from 37signals for good digital project management and organizational development.

Netscape RIP

So long Netscape. You were a good friend (for a while). Though official support for the first widely used web browser ends next week, Netscape’s hapless stewards at AOL have kindly left us a lasting(?) memorial. The Netscape Archive offers a brief history of the browser and a download page for discontinued releases of the software. But even the Archive’s creators acknowledge that you’re better off downloading Flock or Firefox.