Association of Research Libraries (ARL®)

Membership Meeting Proceedings

Austin, Texas
May 18-20, 1994

The Research Library the Day After Tomorrow

Task Force on Scientific and Technical Information

Richard West
California State University System

It is a pleasure to be here to report on the Task Force on a National Strategy for Managing Scientific and Technical Information. You have heard some of this before, since we have given interim reports. I do think it is important to give you the context, once again, about the charge and some of the approaches we took, because they are critical to the way we came to our conclusions.

The challenge of the task force was to look at specific ways of managing Scientific and Technical Information (STI). There are some things unique to this field that you will need to keep in mind before extrapolating to other disciplines. This is in part because of the currency of the information as well as the need for rapid exchange of information among scholars and scientists. Also, STI scholars and scientists are often more familiar with the international and national electronic networks than scholars and scientists in other disciplines, although this is changing as the Internet becomes more ubiquitous.

We were also asked to examine the document delivery strategy; specifically, to determine if there are economies of scale with respect to document delivery within the resource-sharing concept and to test those economies of scale within the existing model at the Canadian Institute for Scientific and Technical Information for applicability to the U.S. environment. The task force vision, basically, is that networks are extremely important to our future, that they hold great promise for improved productivity within universities in the scientific disciplines, and that our institutions need to be prepared to make investments for these new challenges in electronic networks. Dorothy Gregor addressed some of this, and I will repeat it for emphasis.

My view is that technology is now almost good enough to use. By that I mean that we now have very quick networks--probably not as quick as they need to be, but they are getting to the point where you do not have to wait for the technology before you start to use it. More importantly, we can now represent information as it appears on the printed page, with integrated graphics, pictures, and data. With current technology we can also represent multimedia concepts that have not been possible before. We have reached the point where we can get beyond the technology and become involved in some of the really tough discourse that deals with the economics of the electronic information environment. Those discussions are quite tough, and they are probably the real motivating factors; they are going to change our behavior.

First, I want to give you a sense of how we approached our work, that is, the analytical framework for the STI Task Force. We looked at three key elements. The first was the functions of the scholarly communication process independent of technological means: the steps involved in creating information and knowledge; moving it through an editorial process; publishing, distributing, and recording or indexing it--the information management component; and then having secondary users access and use that information. These are all steps or functions of the scholarly communication process. We thought it was important to outline this process before testing different models of electronic distribution.

The second element was the attributes of the three different models as tested against those functions. Some models do a better job with respect to performance attributes such as success rate and the costs associated with storage, distribution, and circulation. The cost factor was an important part of how we assessed and interpreted the models we wanted to discuss.

Finally, we looked at the roles and responsibilities of various players in the scholarly communication process: scholars, publishers, librarians, and secondary users.

The ARL community is probably quite familiar with the terms we use for the three models we tested against this framework: classical, modernized, and emergent. Basically, they represent different levels of electronic distribution, or the amount of electronic application in each model. The classical model is an extreme case--it exists nowhere in pure form these days--in which storage is traditional and print-based and access is strictly through the card catalog; everything is still print-based.

The modernized model is the application of technology to the traditional print environment; this is a natural step in the evolution of any technological innovation. The first thing you tend to do is automate the old processes. We started by automating the card catalog, and we transformed print-based journals into electronic formats. We view this as a modernization process. Document delivery is an example of the modernized model.

The emergent model is a collaboratory approach among scholars that incorporates different approaches to the organization of information and to the scholarship process. Examples of the emergent model are already beginning to appear. For example, within the global change community there is a great deal of collaboratory activity among institutions and scholars, across a number of disciplines, using very large databases as the scientific test bed for information exchange. Roles are changing, and it is less clear when something is published or disseminated, or whether it is simply an exchange among scholars of some piece of information on the network. In particular, it is unclear at what point the information databases get organized or designed. In fact, the management function of the scholarly communication process needs to move up in the cycle of collection and generation of scientific information.

As we looked at these models, we were able to report to the presidents that improved performance and productivity can be expected. We also recognized, however, that we are going to have all of these models present in our environments for many years; no single model will become dominant in that period. From the point of view of economics, we have cost pressures on all fronts, and we have to innovate. At the same time we have to maintain a realistic view. The impatient among us will say we need to move quickly to the emergent or modernized model. Rather, we will have to recognize that we are going to adapt our mechanisms and processes gradually to accommodate these new models, which may require behavioral changes.

With respect to our conclusions--and there are several in our report--we outline many different actions, some of which overlap with the other two task forces, just as we incorporated their activities into ours. Some members of the task force felt very strongly that the emergent model will change and reconceptualize how scientific research is done. It is more a scholar's view than a librarian's or information manager's view: it changes the way we exchange information and emphasizes that published information is already out of date. We still go through that process because of some of the reward structures that we have. There is a strong feeling that this is an exciting time, and that network-based information technology will change how scholars do their work.

The improving technology does away with the presumption that print must perform the functions of communication. Partly related to the emergent model is the fact that we no longer rely on print as the primary means--among the primary scholars, in any case--of exchanging information about scientific and technical work. Whether that information has to go through a print stage first or whether we can bypass the print stage is another whole discussion about the differences between the modernized and emergent models. If the technology is good enough to replace print images, then we can talk about ways of not having that information in print at all.

One of the things that I will talk more about is recognizing this technological change and reexamining how the functions get done in our traditional library environment as we move away from print as the exchange medium. As I said earlier, the economics of this environment are really the tough area, not so much the technology, although we have a lot to learn about how to deploy our technology. There is much good work going on in that area.

From my perspective, many of us have fallen into the trap of thinking that if there are savings to be had in this technology environment, they are going to come through the extension of the interlibrary loan or resource-sharing model. That has always been a tradition in librarianship: we conserve resources through coordinated development and cooperative sharing activities. As we examined these issues, we found serious market conditions in the scholarly information environment--copyright ownership, rights of use, access, pay-per-view--and we are not going to change the underlying property rights associated with ownership of intellectual property. Considering that the market for technical scholarship is imperfect rather than cost-based, we acknowledged that as the number of copies of a journal goes down in this field, the price per copy goes up. Many of you in this room have to acquire those documents regardless of the price, because they are such prestigious journals, and there is a value associated with that prestige. People who own these journals know that and can charge a price independent of cost-based factors. The studies in the Andrew W. Mellon report that Deborah produced showed that our costs for these kinds of journals have increased substantially above the rate of inflation.
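
To make that pricing dynamic concrete--this is an illustrative sketch, not a calculation from the task force report--suppose a publisher must recover a fixed first-copy cost F (editorial and production work) plus a marginal cost c for each copy from N subscribers. The break-even price per copy is then

    p(N) = c + F/N

which rises as the subscriber base shrinks. And because demand for the most prestigious journals is largely inelastic, owners can set prices above even this cost-recovery level, which is consistent with the increases just described.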

As we looked at this issue and surveyed the literature, we found that this marketplace is not competitive from a traditional economic point of view. Attempts to use the modernized or document delivery models have focused on scarce resources such as the printed journal article or document, because those are getting more expensive. This expense seems large, but in fact it is only 30 percent or so of our budgets. In fact, that price will keep going up, even if we have only one logical copy on the network, because of the pay-per-view strategy. We are not in a good bargaining position to reduce those costs, and it is an illusion to think we can save with that kind of strategy.

In the Task Force report, we emphasize that the document delivery strategies were short-term--high payoff, but short-term. In the longer term, the market conditions underlying the cost of materials would be the driving force with respect to those costs. So where do we go? The hope is that there are significant economies of scale and efficiencies in the other 67 percent of our costs, those associated with the way we conduct our business: storage, distribution, circulation, and the library environment. This is complicated by my earlier observation that we are going to have a mixed set of models here.
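
A bit of back-of-the-envelope arithmetic, using the rough proportions just cited and a 10 percent change chosen purely for illustration, shows why: if materials are about 30 percent of a library's budget and operations about 67 percent, then a 10 percent reduction in materials spending lowers the total budget by roughly 3 percent, while a 10 percent efficiency gain in storage, distribution, and circulation lowers it by nearly 7 percent.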

Scientific and technical journal pricing does not reflect a competitive, cost-based economic marketplace; these journals have the character of a "scarce good." Our objective is to create a cost-based marketplace for scientific and technical information. The economic issues surrounding content will determine whether the savings will extend to universities.

We have had print around for some time, so we are not going to get rid of all of our print material, even if we wanted to or thought that was cost-effective. But we do have to prepare for how it will be used. The payoff will be in the reduced costs of circulating, storing, and organizing that information as we do a better job. We will also become more familiar with, and better able to tap into, the access model for acquiring or getting access to information. The modernized and the emergent models can help identify savings in the storage, access, and circulation areas.

We discussed with the presidents that this has fairly significant implications from a collections point of view. What should one collect? In the report of the Foreign Acquisitions Task Force we talked about some areas where we will test this model by looking at the way collection decisions are made, who is going to hold these copies, and the issue of preservation of electronic information. The technology will cause us to reassess whether we should do things differently. There is a danger of collecting the same things and losing the kind of differentiation that we have now.

As we looked at the costs in this marketplace, we were very concerned about its lack of competitiveness. We looked hard for examples that would encourage a more positive strategy, and we talked about changing faculty reward structures. Institutions must invest in campus-wide networks and keep them current; in fact, the emergent and modernized models require this. We must support a public stake in the federal national information infrastructure initiatives. One way to exploit the modernized model, for example, is to use document delivery as a collaborative strategy among AAU institutions in the short term for scientific and technical information. We want to encourage experimentation with emergent-model projects among campus faculty, technology managers, and libraries. This includes exploring the appropriate institutional interest in university-based scholarship. We also need to establish a national repository of STI to encourage preservation of electronically created material and to allow testing of federally funded network dissemination of STI research.

Since so much scientific research is funded by the federal government, we suggested that people who accept contracts or grants with federal funds be required to put the results of their research on the network without charge. The work would go through peer review, editorial review, and all the functions of the scholarly communication process, but the faculty member would not release the copyright to the publishers as an exclusive right; it would be a nonexclusive right. The information could be placed in a database, allowing scholars to get access to current information. The network version of the information would have all of the editorial quality and trappings that a published journal does. Plus, you would eliminate the cost and the redundancy, from our point of view, of publishing something when in fact the information assimilation function has already been performed. There is a specific suggestion in the report about how to make this more operational using the federal government. One of the things we want to explore as a next step is whether we can get the National Science Foundation or some other agency to help us produce such a model.

We looked at the document delivery strategy, what we call the modernized model. We want to caution against too much excitement about long-term savings in that area. Although it improves access, the economics prevent it from having a long-term payoff.

We want to look at the changing relationship of ownership among institutions, faculty, and disciplines as one model, and at the funding source as another. We want to do whatever can be done to encourage a more cost-based strategy for the exchange of information. We do not have a particular model in mind that we believe is the right one to implement. But if we are going to collect more information, we are going to have to move money from our existing support or operational budget into our materials budget to acquire it. I do not see the cost per unit of information coming down.