Frank Quinn, a mathematician at the Virginia Polytechnic Institute and State University and a member of various American Mathematical Society decision-making committees, adds a further voice foreseeing radical change to the discussion.
Date: Thu, 25 Aug 1994 11:46:16 -0400 To: pub@math.ams.org From: quinn@math.vt.edu (Frank Quinn) Subject: electronic pub. in physics
Consequences of Electronic Publication in Theoretical Physics
Frank Quinn quinn@math.vt.edu
The development of electronic scholarly communication as a whole is still impossible to forecast. Theoretical physics, however, is further along in this development and definite trends are taking shape. The purpose of this note is to describe some of the trends, and some things to watch for in the future. Some of the changes, and some of the mechanisms, are special to physics. Nonetheless this is an illuminating "natural experiment" with important lessons for science and scholarship in general.
=== The collapse of traditional journals ===
It is widely expected that by 2010 the bulk of scholarly communication will be electronic. The wild success of new access tools suggests it may happen sooner (note 1). But in any case it will definitely happen much sooner in physics: a powerful mechanism is set to act. There is steady erosion of journal subscriptions due to pressure on library budgets. Cuts have to be made, and librarians are very concerned that the cuts cause as little damage as possible. They are aware of the physics preprint databases, and at the first sign of weakness in the defense of physics journals they will begin cutting them preferentially. The argument will be: "you really don't use the paper versions, and you have other, easier, access to the information. It is appropriate that we cut physics in order to protect backward areas which would really be disadvantaged by the loss of paper." (note 2)
This argument suggests a sudden decline in subscriptions in the next two to five years (depending largely on library financial problems). Some journals may reconfigure in electronic form, but the termination of the revenue stream will mean most of them will just die. After this happens most transactions in physics will take place through the preprint databases. These databases should remain essentially the same as they are now, except for advances in access tools. (note 3).
=== The advent of hypertext ===
A very significant new development is the addition of hypertext capabilities. Paul Ginsparg and others have developed tools which, with very little additional effort for authors, allow active references: selecting the reference on screen immediately calls up a copy of the other paper (note 4). For this the paper being referred to must reside in an electronic database, and the URL must be added to the reference. This is a dramatic and very attractive increase in functionality (note 1), and will have many consequences.
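To make the mechanism concrete, here is a rough sketch of what such an "active reference" looks like at the markup level, in the HyperTeX style of embedding an HTML anchor in a TeX \special (the archive number hep-th/9408999 and the reference text are invented for illustration):

```latex
% Sketch of an "active reference" in the HyperTeX convention:
% a hypertext-aware viewer turns the bracketed span into a live link
% that retrieves the cited paper from the electronic database.
% The identifier hep-th/9408999 is a made-up example.
\special{html:<a href="http://xxx.lanl.gov/abs/hep-th/9408999">}%
[7] A. Author, ``Title of the cited paper,'' hep-th/9408999.%
\special{html:</a>}
```

In a plain TeX viewer the \special commands are simply ignored, so the same source still produces an ordinary printed reference; this is what keeps the additional effort for authors so small.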
The first consequence of the new functionality is that papers not in the database will be cited less frequently, and even when such papers are cited, the citations will result in less-frequent retrieval, since they carry no active link. This reinforces the motivation to put papers into the database, so use will become more universal. There may also be a "filling out" of the archive as authors add older papers to encourage citation and to use self-citation as a way to lead readers to them. Both of these trends will accelerate the collapse of the paper journals.
Another consequence of hypertext citation is that readers will access papers without knowing where they are, or whether or not they have been published. Even if they have been published the hypertext link will frequently be to preprints, and any modifications made to final printed versions will be lost. In particular the prestige and quality filtering aspects of the traditional publication process will be hidden or lost. These factors will certainly reduce interest in, and benefits of, the traditional process.
A final consequence of hypertext citations has not yet arrived, but will provide another big advance in functionality. This is automatic forward referencing. It will be trivial to invert citations, and for each paper maintain a list of the papers which cite it. This will be a powerful tool for exploring the literature. To some extent this can be done now with the Science Citation Index. But it will be far superior to SCI in many ways: data will be available faster; it will give instant access to the citing papers (by hypertext links, if they are in the database); and it will be higher quality since citations containing a URL will not be "lost" because the cited paper cannot be located.
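The inversion Quinn describes is indeed trivial once citations are machine-readable. As a minimal sketch (in Python, with made-up paper identifiers standing in for archive numbers):

```python
# Sketch of "automatic forward referencing": invert a citation graph
# so that each paper carries a list of the papers that cite it.
# All paper identifiers below are invented for illustration.

from collections import defaultdict

def invert_citations(cites):
    """cites maps each paper ID to the list of paper IDs it cites.
    Returns the inverse map: paper ID -> list of papers citing it."""
    cited_by = defaultdict(list)
    for paper, refs in cites.items():
        for ref in refs:
            cited_by[ref].append(paper)
    return dict(cited_by)

# Toy literature: each later paper cites some earlier ones.
cites = {
    "hep-th/9401001": [],
    "hep-th/9402002": ["hep-th/9401001"],
    "hep-th/9403003": ["hep-th/9401001", "hep-th/9402002"],
}
print(invert_citations(cites))
```

A single pass over the reference lists suffices, which is why the "cited by" view can be kept current essentially for free as papers arrive in the database.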
=== Sociological consequences ===
So far there is only anecdotal information about sociological fallout, so the following discussion is more forecast than observation. But it is important to watch this development very closely. It will give us the first glimpse of upheavals soon to be visited on all of science.
The first thing to watch for is a decline in average quality of papers. There already seems to be a trend in this direction (in theoretical physics). It should accelerate as writers no longer worry about being subjected to a refereeing process (note 5). Inevitably, also, cranks and "flamers" will find the databases. Some filtering may be instituted to eliminate the worst offenders, but it will have to be minimal since there is no mechanism to pay for careful review.
The reactions to the decline in quality will be very revealing. With the loss of peer review as the first line of defense, the main literature-level opportunities for quality control will be selective citations and review articles. At present the custom is to cite all (known) previous work on a subject. Will this change? Will authors refuse to dignify defective papers with a citation? Or will they give a "dead" reference which does not link to the defective work, and does not show up in citation data? Review articles which sift and consolidate the primary literature are likely to become more important for quality control. "Acceptance" for citation in a major review article may serve as a replacement for acceptance in a journal.
The next thing to watch is the impact on the "reward structure." Currently there are still plenty of submissions to physics journals, presumably because "credit" is still attached to formal publication. Candidates for promotion are certainly still concerned about this. What will happen when the journals have thinned out enough so that fewer than 50% of physics papers can be published? Probably new "impact indicators" will emerge (note 6). The best candidates for such indicators are citation data, though this will bring a new set of problems (note 7).
Other adaptations to watch for are changes in work habits. Some of this is specialized to theoretical physics, and lessons from it will not be universal. Many theoretical physicists think of themselves more in terms of analytical skills than a specific subject. They are not anchored to a specific topic (by equipment, for instance), and from time to time change to different topics accessible to their skills (note 8). On a social level this shows up as "fads" in which areas are tremendously popular for a while and then are abandoned. Will lower quality mean that fads pass more quickly (note 9)? Or will the overhead of having to sort through more trash slow down the process?
=== Conclusion ===
Bernard Naylor has written:
"We are now seriously contemplating the most dramatic change in the working habits of scholars for some centuries, but I look in vain for convincing signs that our sector appreciates this and is collectively bending its mind to preparing for the consequences."
These words may haunt us in the next decade. It seems to be too late for theoretical physics to "bend minds and prepare," and this is somewhat foreign to the physics mind-set anyway. But the rest of the scholarly enterprise has much to learn by watching these bold pioneers, and may yet be able to prepare for the consequences.
NOTES
1) The new tools, particularly http and Mosaic, seem to be dramatically more attractive to users, and usage is exploding. See Science v.265 (12 August 1994) pp.895-901.
2) Bernard Naylor (Univ. Librarian, U. Southampton) has noted that so far physics journals are still being defended against cuts. (He is obviously watching, though). Paul Ginsparg (LANL) replied that it is still a bit early, but he expects a shift in the next few years. It may also be that Naylor is not distinguishing between theoretical and experimental physics: the change of attitude will come first in the theoretical areas.
3) Naylor has suggested that the impending demise of journals may trigger changes in the Net to "level the playing field" and allow commercial journals to compete "on an equal footing." This is unlikely to make any difference. Ginsparg has already demonstrated that it is politically impossible to close down the preprint database. The argument against closure will strengthen as journal numbers and access decline. Access charges might conceivably be instituted, but since there are no processing expenses in a preprint database, the charges would be far below what would be required to support a traditional journal. In fact access charges would accelerate the process. Subscriptions to the database would certainly be necessary, and this would make it even less attractive to also pay for journals containing essentially the same information.
4) For information about the hypertext tools see the following URL: http://xxx.lanl.gov/hypertex/
5) Stevan Harnad refers to good-quality writing resulting from the anticipation of being reviewed as the "invisible hand" of the reviewing process. In other words, reviewing is a "pump" as well as a "filter."
6) Theoretical computer scientists seem to have had some success in convincing deans and employers to accept non-traditional indicators of impact. In particular an abstract in the right conference proceedings is more prestigious than a refereed paper in a journal.
7) We may see promotion documents offering "245 total citations, including 33 from the Institute for Advanced Study, and two in a review article written by a Harvard professor" as evidence of quality. Unfortunately there are enormous and obvious opportunities for abuse of any such system.
8) There is concrete evidence that this characterization by skills rather than subject is correct: financial institutions have found that PhD training in theoretical physics is very effective preparation for sophisticated economic analysis.
9) Quality problems probably play a role in the "fad" phenomenon. As errors and guesses accumulate in theoretical analysis a point is reached where further work is pointless. The area is then abandoned by theorists until it can be cleaned up by more compulsive life-forms like experimentalists, mathematicians, and "distillers" (see Herring, "Distill or drown: the need for reviews," Physics Today 21 No.9 (1968) pp. 27-33).
[My comments on Frank Quinn's "Consequences of Electronic Publication in Theoretical Physics" are followed by some comments by Paul Ginsparg. -Stevan Harnad]
Some [paper] journals may reconfigure in electronic form, but the termination of the revenue stream will mean most of them will just die. After this happens most transactions in physics will take place through the preprint databases. These databases should remain essentially the same as they are now, except for advances in access tools. (note 3).
I must unfortunately disagree entirely with both of these predictions. The termination of paper publication need NOT mean the death of many or most refereed paper journals. These need only reconfigure (at much lower cost, and on the author-side subsidy model, rather than the trade model) as refereed electronic journals. There are already many ab ovo refereed electronic journals starting up; there is no reason to believe that, for paper journals that can no longer make ends meet in paper, taking to the skies will not be a preferable option to being interred. It is hard to imagine, in this era of information explosion in science and scholarship, that information-sources will prefer to implode rather than simply switch media!
The second point is related to this (and "peer review" is the key word linking the two): I am an enormous admirer of the electronic preprinting initiative in physics (mostly arising from the efforts of Paul Ginsparg), but I have to keep reminding everyone that this initiative is COMPLETELY PARASITIC at the present time on the (so far intact) paper flotilla, for which all these prepublication goods are ultimately intended. It is what I have called the "Invisible Hand" of that flotilla, namely, peer review, under which virtually all of these papers are destined to pass, that ensures the quality of what appears in the electronic archives -- and not just AFTER it has been refereed (when the authors of course quietly swap the refereed reprint for the preprint in the e-print archive) but even BEFORE, for all these papers are written with the expectation and intention (and necessity) of being submitted to peer-reviewed journals.
Human nature is such that if you were to pull that peer-control mechanism out from under this system quality would drop radically (just as Quinn predicts it would), but Quinn is simply wrong that peer review will be absent from the Net: Why on earth should it be? It is a completely medium-independent means of controlling the quality of human output. Just as the papers themselves migrate to the Net, so will peer review.
Having now challenged these two predictions (that the literature will simply shrink as paper journals die, rather than migrating to the Net, and that papers on the Net will just constitute a vast, unrefereed preprint archive, rather than the usual hierarchy of refereed journals, as in paper), let us see how the rest of the analysis fares.
There may also be a "filling out" of the archive as authors add older papers to encourage citation and to use self-citation as a way to lead readers to them. Both of these trends will accelerate the collapse of the paper journals.
The electronic archive will indeed "fill out" with the rest of the retrospective paper corpus that is worth recreating electronically, but not because of citation and self-citation motives but for scholarly/scientific reasons: I want and need the prior literature at my beck and call on the Net just as I want and need the current literature. All those incommensurable "value-added" features that the Net, with its speed, scope, interactivity and global interwebbing, provides for preprints, it can also provide for reprints, and offprints (and out-of-prints) of paper provenance.
Another consequence of hypertext citation is that readers will access papers without knowing where they are, or whether or not they have been published. Even if they have been published the hypertext link will frequently be to preprints, and any modifications made to final printed versions will be lost. In particular the prestige and quality filtering aspects of the traditional publication process will be hidden or lost. These factors will certainly reduce interest in, and benefits of, the traditional process.
This, if I might be permitted to point it out, is a rather circular prophecy: Once one has prophesied that peer review will not migrate to the Net (without giving any reason why not), it quite safely follows that quality and discriminability will decline. But is it not much more reasonable to suppose that peer review WILL migrate to the Net along with the journals themselves, and that it is trivially easy to devise a CODING system that will not only distinguish whether a paper is a preprint or a refereed reprint, but exactly where in the journal prestige hierarchy (corresponding to the rigor of the peer review each journal can be counted on to provide) a particular article is located? Such a prestige hierarchy currently exists in paper: Is there any reason it cannot take to the skies too?
The first thing to watch for is a decline in average quality of papers. There already seems to be a trend in this direction (in theoretical physics). It should accelerate as writers no longer worry about being subjected to a refereeing process (note 5). Inevitably, also, cranks and "flamers" will find the databases. Some filtering may be instituted to eliminate the worst offenders, but it will have to be minimal since there is no mechanism to pay for careful review.
These are excellent Darwinian reasons why, if the Divine Hand of peer review does not (for some reason) see fit to rise with it as the paper flotilla takes to the skies, then It will simply have to be re-invented up there.
The reactions to the decline in quality will be very revealing. With the loss of peer review as the first line of defense, the main literature-level opportunities for quality control will be selective citations and review articles... "Acceptance" for citation in a major review article may serve as a replacement for acceptance in a journal.
I do not believe for a minute, even in our absurdly populist age, that a popularity contest and box scores can or will replace the systematic scrutiny administered by editors and referees (imperfect as that is; see bibliography appended to these comments). Powerful electro-bibliometric analysis is a supplement, not a substitute, for peer review.
The next thing to watch is the impact on the "reward structure." Currently there are still plenty of submissions to physics journals, presumably because "credit" is still attached to formal publication. Candidates for promotion are certainly still concerned about this. What will happen when the journals have thinned out enough so that fewer than 50% of physics papers can be published? Probably new "impact indicators" will emerge (note 6). The best candidates for such indicators are citation data, though this will bring a new set of problems (note 7).
The present "impact indicators" are certainly insufficient, and the Net will indeed provide many valuable and informative supplements, including dynamic citation analysis, forward and backward, and probably even more sophisticated bibliometric measures of "air time" and "mileage" in the increasingly transparent and measurable embryology of knowledge. But up there with the other indicators will be the perfectly classical one, inherited from bygone paper days, namely, the altitude in the prestige hierarchy of the peer reviewed journal in which the paper was accepted. And the busy, rational lector will always be able to calibrate that all too finite reading time -- the difference will be that that that quality-tagged information will now be infinitely more easily accessible, once the lector has decide how to set those information filters.
3) Naylor has suggested that the impending demise of journals may trigger changes in the Net to "level the playing field" and allow commercial journals to compete "on an equal footing." This is unlikely to make any difference. Ginsparg has already demonstrated that it is politically impossible to close down the preprint database. The argument against closure will strengthen as journal numbers and access decline. Access charges might conceivably be instituted, but since there are no processing expenses in a preprint database, the charges would be far below what would be required to support a traditional journal. In fact access charges would accelerate the process. Subscriptions to the database would certainly be necessary, and this would make it even less attractive to also pay for journals containing essentially the same information.
I continue to preach, patiently, that the trade model is not, and never was, appropriate for no-market esoteric writing and reading. The true per-page costs of a fully quality-controlled (edited, peer-reviewed, copy-edited) ELECTRONIC literature will be so low compared to paper (less than 25% according to my estimate -- which of course becomes no more accurate by dint of my repeating it) that it will make much more sense for the institutions (Universities, Research Funding Agencies, Research Libraries) that currently support and subsidize scholarly and scientific research by paying huge research library costs to instead subsidize these minimal electronic page-charges up front, making the product -- the esoteric corpus -- free for all. There is absolutely no reason for many journals, or for peer review itself, to disappear in this transition. Quality control need only be re-implemented in the new medium. (And there is no reason at all why the traditional scholarly publishers should not likewise reconfigure to make themselves skyworthy too, in this new, rescaled, nontrade model for esoteric scholarly publishing.)
5) Stevan Harnad refers to good-quality writing resulting from the anticipation of being reviewed as the "invisible hand" of the reviewing process. In other words, reviewing is a "pump" as well as a "filter".
Not just good quality writing, but also good quality research, and reports you know you can trust (or trust as much as you could in paper).
Stevan Harnad Professor of Psychology Director, Cognitive Sciences Centre Department of Psychology University of Southampton SO17 1BJ UNITED KINGDOM
harnad@ecs.soton.ac.uk harnad@princeton.edu phone: +44 703 592582
Date: Thu, 25 Aug 94 16:16:53 -0600 From: Paul Ginsparg 505-667-7353 ginsparg@qfwfq.lanl.gov
From: quinn@math.vt.edu (Frank Quinn) Subject: electronic pub. in physics
A final consequence of hypertext citations has not yet arrived, but will provide another big advance in functionality. This is automatic forward referencing. It will be trivial to invert citations, and for each paper maintain a list of the papers which cite it. This will be a powerful tool for exploring the literature.
actually this has already been on-line for a while, courtesy of the spires-hep database maintained by the slac library. if you bring up the abstract view of an e-print from one of hep-th et al on the www interface, there is a link to "cited by" which brings up a list of papers that cite it (and the ones that are available electronically -- typically almost all for papers submitted in the past two years -- automatically appear as hyperlinks into the database).
=== Sociological consequences ===
my main comment here is that my stance re the utility of peer review is frequently misunderstood. we are aiming for a system in which much more stringent standards are applied, so that the truly significant is easily distinguished from the very good, average, irrelevant, and just plain wrong. my point has long been that the current journal system with its all-or-nothing accept or reject does not play that role, and hence we manifestly lose nothing by abandoning it. (i speak here of course of the much-maligned theoretical physics literature, and speaking of subversion many of its practitioners are as critical of its overall quality as is frank q.) the ultimate plan is to adopt a much more flexible system, with far more precise tools for extracting signal from noise. since this "experiment" is being conducted in a global goldfish bowl, details will be visible before too long (but not til i'm back from abroad).
Paul Ginsparg