
Association of Research Libraries (ARL®)

Scholarly Publishing on the Electronic Networks

Time Present and Time Past: An Introduction to the Papers of the Symposium


James J. O'Donnell

Classics Department, University of Pennsylvania

Background

To a student of ancient texts contemplating the impending transformation of our forms of discourse by electronic media, the scene that stretches before our imagination is very much like Yogi Berra's déjà vu all over again. We have been here before, more than once. Consideration of a piece of that past may make it easier to think about where we are.

Much attention has been paid to two transitions the written word has gone through in our culture's past: the adoption of writing itself, and the replacement of manuscript culture by the printed book. But to me there was a decisive third moment: when the papyrus roll began to be replaced, sometime in the first centuries of the common era,[1] by the codex manuscript book. The arrangement of words on a rectangular page and the binding of those pages together to form our familiar book shape was a decisive moment in shaping the way we would receive and process information. By comparison, the contribution of printing lay not in the organization of words on the page but in the way the traditional manuscript page could be standardized, duplicated, and distributed farther and faster than ever before.

I would suggest that the particular contribution of the codex form to the history of written literature was one that has increased significance today. Put briefly, a fiction that usually hides behind written text is that the text is a running sequence of words almost mimicking speech, and meant to be read in a linear, sequential way. Spoken discourse can only be received this way, and the written word on the papyrus roll gave strong encouragement to go on reading this way: unrolling and re-rolling the bulky book gave strong impetus to follow the words through from beginning to end, in the order the author ordained.

But in fact much of what we ourselves do with the written word is non-linear and defeats this fiction. If you pause to think of the last two or three books you held in your hand before this one -- perhaps even examine the way you are leafing and skimming through this one -- you will probably see yourself looking up numbers in a phone book, checking a reference in a bibliography, finding the population of Vancouver in an almanac, or trying to get the computer software manual to tell you how to arrange text in parallel columns. If in any of those cases, you had to begin at page one and read sequentially through the text until you came upon the fact you were seeking, you would be at a serious disadvantage. (And of course even in the boldly linear world of narrative fiction, non-linear reading is possible and practiced: sometimes we go back to reread an earlier page or chapter, and there are even people who flout the sternest law of linear narrative and go to the end of the murder mystery first to see how it comes out.[2])

The iconic image used here as a cover shows how deeply this habit runs in the world of the codex. The particular manuscript page comes from the Lindisfarne Gospels, a book produced on an island off the northeast coast of Great Britain in the early eighth century, but it represents with unusual clarity a tradition already several centuries old at that time. The page is from the so-called "Eusebian canon tables" that many early Gospel books display, and it represents a triumph of technique over narrative. Briefly, in each of the Gospel accounts of Jesus' life, the old manuscripts had a running series of marginal numbers identifying separate episodes or sayings. Then at the front of the manuscript, there were several pages like the one reproduced here.

This page has one column each for the four gospel writers, and in each column a series of Roman numerals. Take, for example, item 310 at the head of the first column: it parallels the number 191 in the second column, 297 in the third, and 69 in the fourth. If you pursue those numbers, you will see that next to the number 310 in Matthew you will find Mt. 26.64, "Hereafter shall ye see the Son of man sitting on the right hand of power, and coming in the clouds of heaven." Then, by the corresponding numbers in Mark and Luke you will find passages that closely resemble it. In the (often divergent) version of John, the marginal number 69 will take you to Jn. 6.62, "What and if ye shall see the Son of man ascend up where he was before:" not quite the same thing, but the parallel is instructive to contemplate.

Now if we imagine a medieval reader following those marginal numbers and this canon table, we can certainly see that he is approaching the texts in a way that differs from the naive reading of one who goes through each Gospel from beginning to end in sequence. To put the four versions side by side calls into question their differences and their similarities and challenges the reader to decide how to interpret the variations. None of the four Gospel writers could have expected such a reader, and none of them catered to his needs.

But what I mean to call to mind here is not the content but the technique. If you look again at the manuscript page and imagine it a colorful screen on your computer, it is an easy matter to think of "clicking" on one of those numbers and being led instantly to the appropriate page in the Gospel that the number marks; or vice-versa, of clicking on the marginal number in the Gospel, being led to this page, then clicking on a number in a parallel column to be led instantly to the passage most readily compared to the first. But if you did the search that way, you would call it "using hypertext hot links" and feel very brave and postmodern, when in fact it would be only the speed and facility of the reference, not the method or the content, that would be new.
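The canon table's mechanism, a shared section number linking parallel passages across four columns, is precisely the cross-reference table a programmer would build today. A minimal sketch in Python may make the point concrete; the section numbers and the two verse citations are the ones given above, while the data structures themselves are of course a modern illustration, not the manuscript's:

```python
# One row of the Eusebian canon table: parallel section numbers
# across the four Gospels (the example row discussed in the text).
canon_table = [
    {"Matthew": 310, "Mark": 191, "Luke": 297, "John": 69},
]

# Marginal section numbers point back to the passages they mark.
# Only the two citations given in the text are filled in here.
sections = {
    ("Matthew", 310): "Mt 26.64",
    ("John", 69): "Jn 6.62",
}

def parallels(gospel, number):
    """Follow a marginal number to the matching section numbers in
    the other three Gospels: the medieval reader's 'hot link'."""
    for row in canon_table:
        if row.get(gospel) == number:
            return {g: n for g, n in row.items() if g != gospel}
    return {}

print(parallels("Matthew", 310))
# -> {'Mark': 191, 'Luke': 297, 'John': 69}
print(sections[("John", parallels("Matthew", 310)["John"])])
# -> Jn 6.62
```

The lookup works in either direction, from any Gospel to the other three, which is exactly the to-and-fro clicking imagined above; only the speed is new.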

I go on this way at some length to make it easier to say two different things that may at the outset seem contradictory; human beings often have a very hard time holding two such propositions in mind at once.

First, it is reasonable to think that in the change of outward forms and techniques of information preparation, storage, distribution, and retrieval there can and will lie great change in the way users structure the information and thus implicitly structure the world they live in. To consider the power of the media to shape us is to think in almost determinist ways.

Second, at the same time, the intellectual trajectory of the culture that is making and shaping these instruments is one that has a life of its own. If we look to electronic information technology to make non-linear data more accessible and more useful, we should remember that we are continuing here an intellectual enterprise that is many centuries old. Even as the electronic environment transforms the culture, the cultural impulses that seek to shape that environment are profoundly, and in some ways reassuringly, conservative. To think in these terms is to restore a sense of freedom and control to the enterprise at hand.

Neither freedom nor determinism is a fully satisfactory description of human behavior, at least not as the alternatives are at present described: we need them both, as physicists needed both wave and particle theory to explain the various phenomena of light.

Symposium II: Visions & Opportunities

I hope by saying such things to give some comfort to those who find the present moment vertiginous and disorienting. The first symposium on scholarly publishing on the electronic networks was co-sponsored by the Association of Research Libraries, the American Mathematical Society, and the National Science Foundation in the spring of 1992. For one of the publishers present, the experience was "like being a deer caught in the headlights of an onrushing truck." The second symposium, now with the added and crucial sponsorship of the Association of American University Presses (AAUP), was held in Washington on 5-8 December 1992. The hallmark of the second symposium was a deeply impressive pragmatism on all sides. The shock of the new had been survived, and from all directions institutions and individuals were to be seen coming forward with well-formed experiments, prototype projects, and crisply formed questions about the ways and means of making the new technology serve both old and new demands from the scholarly and scientific community.

The keynote speaker on 5 December was Yuri Rubinsky, a leading developer of SGML-based (and thus network-friendly) editing software. Mr. Rubinsky adopts a deliberately visionary line, sketching the nature of the electronic text "the day after tomorrow." His list of seven characteristics of that future text, laid down almost by way of challenge, takes us far forward along the trajectory that can be plotted from the Gospel book's canon tables, but the continuity is still discernible.

The most pragmatic complement to Rubinsky's visionary performance is Susan Hockey's paper, talking in very practical terms about what it takes to make an electronic text that can be widely used in the Internet and NREN environments. The bulk of the presentations at the symposium show, with verve and imagination, current work in a wide variety of fields: real projects with real use in the contemporary scene.

For example, David Solomon of North Carolina State describes the start-up of a new electronic journal in statistics education, with detailed discussion of organization, funding, and operational structures. Bernard Rous of the Association for Computing Machinery outlines the efforts of a large learned society's publishing arm seeking to manage a transition from print-oriented to electronic publication. Michael Van Steenberg of NASA provides a similar overview of a government agency's venturesome plans. Terri Harrison and Tim Stephens of Rensselaer Polytechnic Institute, working in the social sciences discipline of communication studies, describe a project somewhere between a discussion group and a journal (or set of journals) called COMSERVE, with a wide variety of kinds of information and discussion available to specialists and students.

Other presentations talk about ways in which existing materials can be organized and accessed. From AT&T, the Right Pages project shows how existing journals can be scanned and accessed quickly, with information delivered to the user who needs it in a timely way; from the University of Virginia, David Seaman of their library's cutting-edge e-text facility demonstrates how they can bring together on-line resources of text, reference works (the OED), and images to give the traditional literary scholar new power and range of exploration.

The community of traditional publishers is represented in several untraditional ways. Ken Arnold of Rutgers University Press discusses the future of the scholarly monograph, offering a vision that integrates press and library involvement through technological linkage, while at the same time distributing demonstration disks of Rutgers' innovative "Floppybacks" program, offering inexpensive floppy disk editions of out-of-print backlist titles still in demand. (Arnold also remarked that one pressing need on university campuses is to educate and involve the senior administrative leaders in what have previously been thought the disparate and marginal concerns of libraries and presses.) Evan Owens of the University of Chicago Press provides a more nitty-gritty survey of what one press is actually doing with the electronic manuscript at various stages of the traditional publishing process. Finally, Elli Mylonas of the Perseus project, a CD-ROM based hypertext database comprising texts and images of Greek antiquity, shows how a user- and classroom-friendly teaching and learning tool has been created and is now being actively marketed in traditional and untraditional ways by Yale University Press.

Common concerns run through many of the presentations. Broadly speaking, economic concerns seem the most naggingly disconcerting. Questions of cost recovery are often brought into the open, and beyond them questions of intellectual property laws and the way the new technology makes those laws both more difficult to enforce and at the same time in many ways more necessary to protect and assure the possibility of creating and distributing important new work.

The last morning of the symposium saw a riveting sequence of presentations by Robert Oakley, Georgetown University Law Librarian, giving a lawyer's tour of the realities of the copyright law and its contemporary interpretation and application; then by Anita Lowry, e-text librarian from the Columbia University Libraries, describing in sometimes chilling detail the problems faced by a working librarian trying to assemble a useful collection in the face of a bewildering variety of restrictions and conditions placed on use; and finally by Joseph Esposito, Executive Vice President of the Encyclopedia Britannica corporation and head of the Merriam Webster Dictionaries, who took the participants on a detailed case study of how that commercial enterprise looks at a variety of electronic projects, both networked and disk-based.

To sketch the highlights and attach the majority of the papers does not exhaust the riches of the symposium, especially the informal discussions impossible to represent here. After lunch on the last day of the symposium, a panel of participants, including a publisher, a librarian, a journals publisher, and two working academics, gave brief presentations in turn of their concerns and their hopes as they prepared to return to their day-to-day responsibilities. Peter Grenquist (of the AAUP) had opened the proceedings on the first day with a quotation from Borges' famous short story, "Funes the Memorious," a reminder of just how astonishingly various the contents of our culture's storehouse have become already; to leave this symposium was to go away wondering just what will become of all of us when that storehouse begins to expand geometrically in all dimensions at once at ever-increasing rates of speed.

Endnotes

[1] James J. O'Donnell, "St. Augustine to the NREN: the tree of knowledge and how it grows," forthcoming in: Proceedings of the NASIG 7th Annual Conference, Serials Librarian, Winter 1993. Also: C. H. Roberts and T. C. Skeat, The Birth of the Codex (London: Published for the British Academy by Oxford University Press, 1983), revising a lecture originally published in the Proceedings of the British Academy 40 (1954): 169-204.

[2] The narrative world of the murder mystery is actually instructive in this way: consider how often the last chapter of the story is really a non-linear recursion to the beginning of the narrative, to repeat the main points of the story with suitable commentary and linkage made to bring out the hidden story that has been present all along.