
Association of Research Libraries (ARL®)

Scholarly Journals at the Crossroads: A Subversive Proposal for Electronic Publishing

XVII. Systemic and Structural Costs -- Networks & Connectivity


This chapter invokes a wider context for costs: How cheap will the infrastructure be? How expensive is a good network? Will universities and scholars have to pay more, and if so, how much?


Date: Mon, 5 Sep 1994 11:21:30 EDT From: Ann Okerson ann@cni.org To: Multiple recipients of list VPIEJ-L

The 12 August 1994 issue of the prestigious journal SCIENCE (of the AAAS organization in Washington, DC) carries a special section of articles on electronic networks/computing and science (pp. 879-914). It's worth a look for its generally upbeat overview of the kinds of important activities that broad-based network communications are facilitating and enhancing.

It is the lead article, "Culture Shock on the Networks," that reminds one of the recent "subversive" discussions here, though. The subtitle is, "An influx of new users and cultures could threaten the Internet's tradition of open information exchange, while commercialization is raising fears that pricing changes will squeeze e-mail and database browsing." The article expands on these themes and what it says is true: large economic and political forces and enormous growth are pressuring the system we have known, a system which is beginning a period of great change. We do not know what the future NII will look like.

Quoted is Rick Weingarten, executive director of the Computing Research Association here in DC and a strong advocate of the public interest. "What's the life expectancy of the culture of open information exchange if users have to pay a toll for every byte they send [NB: which, btw, is indeed the position of a number of publishing spokespeople]? ... We have to make sure that some public space is preserved. Otherwise, research, education, museums, and libraries really could get trampled."

"There is tremendous distrust and worry in the community about how this all working out," says Scott Shenker of the Xerox Palo Alto Research Center (PARC), who feels that some sort of usage pricing is inevitable.

The period of transition does indeed raise many such concerns and makes me, for one, less than sanguine that the indefinite continuation of the freely accessible world that Harnad, Ginsparg, Odlyzko, etc. imagine is much assured. If such access is not continued, then things will simply cost a whole lot more than all the projections -- which I want to believe but feel are somewhat unrealistic. It is important that we not only use the Internet for new ways of communicating research and scholarship and ideas, but that we also participate, however we can, in the telecommunications policy debates at whatever level we are able, so that widespread, cheap use can easily continue. The government doesn't just do things on its own -- it R US and the more voices that keep saying it, the better for the education, science, scholarly and library community.

Ann Okerson/Association of Research Libraries ann@cni.org


Date: Mon, 5 Sep 1994 13:01:44 -0400 From: quinn@calvin.math.vt.edu (Frank Quinn)

Dear Andrew,

Thanks for your comments. I would like to reply to two of your remarks. First, you write:

. . . you neglect the countervailing trends that are already visible. Peer review evolved to meet a real need. This need will continue to exist, and some mechanism will be set up to meet this need on the Net. The only question, in my opinion, is what this mechanism will be.

I quite agree with this. Our difference is in time scales. You have remarked that you tend to think of what the future might look like in 20 years. I concentrate on the next 10. I have no doubt that some mechanism will evolve. But if we simply let nature take its course there will be an unpleasant period AFTER the present system has declined, and we become acutely aware of the needs met by peer review, and BEFORE a replacement is in place. Two comments: the simple fact that it is unclear what mechanism will develop means it will be at least 10 years before it can be in place and credible. Maybe more like 20. The other comment is that evolution is a response to a change in environment. Relying on evolution commits us to feeling the pain before we begin to adapt. So my point is that we should be proactive to try to minimize the discomfort of the transition.

You also write:

A final point I would like to make is related to Paul [Ginsparg]'s comment that with electronic publications we should aim "for a system in which much more stringent standards are applied." It is hard to overemphasize the inadequacies of the present system.

and go on to discuss some of these inadequacies. Please remember that not all areas are the same in this regard. Paul has been quite open that the peer review system in his area has no credibility, and is so weak that dispensing with it would be little loss. I have had some contact with his area, and this is also my impression. So indeed ANYTHING he can do will be "much more stringent." This is certainly not the case in topology, and is really wrong in the most stringent area I have had contact with, algebra. These areas have a whole lot to lose.

I wonder, from your comments, if your field is more like theoretical physics than algebra in the effectiveness of quality control. Or perhaps you have not had enough contact with physics literature to appreciate what a real breakdown looks like. Anyway I urge you not to generalize too much from experiences in physics and psychology.

Best regards, Frank


From: "B.Naylor" B.Naylor@soton.ac.uk Date: Mon, 5 Sep 1994 18:15:21 +0100 (BST)

I'm interested to see that the question of charging for the use of the INTERNET has popped up again. I have to say that if it does once again go away it will only be temporary. I don't think that the argument about the flea on the tail of the dog will eventually carry decisive weight. From the way that dogs behave, I get the impression that they're well aware that fleas are about and they're going to "make them pay," in their case by some pretty fundamental disturbance.

One of the reasons why some librarians are perhaps more phlegmatic about this prospect is that we have been paying for online access to some journals (namely, secondary sources such as indexing and abstracting tools) for twenty years or more. We have already made the migration from an exclusively pay as you use or just in time tariff framework, to one which (via CD-ROM etc) allows us to mix pay as you use and pay up front (just in case) in accordance with what we perceive as our best interests, and the best interests of our users. We have even been paying for some information sources (e.g., in the field of law) which are crucial primary sources and not available in any other form except the electronic form, for about fifteen years. It's not clear to me why the growth (dramatic, I agree) in our ability to access information over the networks should be predicated on an assumption of a change in cost recovery practices which are already quite well established over more than a decade, albeit in a relatively small (but very important) part of the sector.

While I am on the ether, could I revert to the question of esoteric versus trade-scholarly, on which there has been some previous discussion? One factor making for differences in the debate between those on the two opposite sides of the pond is the great difference in the number of current journals taken. For example, the University of Wisconsin takes something like seven or eight times as many journals currently as the University of Southampton (something like 45,000 as against 6,000 according to the most recent figures I have seen). And many other big American research libraries are in the twenty-odd or thirty-odd thousand current subscriptions range. So it wouldn't surprise me if they carry a lot more esoteric material by comparison with their trade-scholarly accessions than we do. Certainly, not all our journals are used as heavily as the protestations of some of our scholars at the prospect of their cancellation might imply. But we are too inclined to forget this pretty stark difference between the big American research libraries (like Princeton I would guess, though they didn't cite their number of current journals in the source I used) and libraries like ours.

Another important concept that has to be weighed in this discussion (as I have mentioned in previous papers I have given) is the concept of redundancy. One simple way of pointing it up runs as follows: "If all papers worth publishing are to get published, it is inevitable that some papers not worth publishing will get published." The redundancy principle works in lots of walks of life (hospital beds, seats on trains etc etc); it is a factor of the human condition. The point that follows is that the esoteric papers should not be in any way separated off from the others; they constitute an essential part of the whole scholarly context. One should no more do that than one should say: "We'll have two constituencies of theatre, one where only the plays that are going to survive down the years will get mounted, the other for the rest."

One of the essential filtering processes will be: what are people prepared to pay for?

Bernard Naylor


From: Stevan Harnad (harnad@soton.ac.uk)

I disagree with Bernard on the same points that have come up before, so I will try to put it differently so as not to repeat myself:

From: "B.Naylor" B.Naylor@soton.ac.uk

One of the reasons why some librarians are perhaps more phlegmatic about this prospect is that we have been paying for online access to some journals (namely, secondary sources such as indexing and abstracting tools) for twenty years or more... It's not clear to me why the growth (dramatic, I agree) in our ability to access information over the networks should be predicated on an assumption of a change in cost recovery practices which are already quite well established over more than a decade, albeit in a relatively small (but very important) part of the sector.

The fact that we are in the habit of paying for things when we have no choice is hardly relevant to what we will be inclined to do when we do have a choice. But I must repeat: text for which there was a paying market in paper (such as indexing/abstracting tools, which become even more valuable in electronic form) will continue to have a paying market on the Net, and there is no reason it should not continue to be sold, on the classical trade model. (Hence the above example, besides being a minoritarian outlier in its proportion of the paper corpus, is also highly unrepresentative.) The issue is text for which there is no paying market, even on paper; text that the libraries and universities are ALREADY subsidizing now (but in a highly Rube Goldberg way, with hostage library budgets).

"If all papers worth publishing are to get published, it is inevitable that some papers not worth publishing will get published." The redundancy principle works in lots of walks of life (hospital beds, seats on trains etc etc); it is a factor of the human condition. The point that follows is that the esoteric papers should not be in any way separated off from the others; they constitute an essential part of the whole scholarly context. One should no more do that than one should say: "We'll have two constituencies of theatre, one where only the plays that are going to survive down the years will get mounted, the other for the rest."

Unfortunately, as pointed out when this same kind of inference was made by Frank Quinn, this reasoning is circular. There is very little correlation between the market-value of scientific/scholarly writing and its scientific/scholarly value. Hence "esoteric" does not mean of lesser epistemic value, it just means of lesser MARKET-value. Nor are redundancy and esotericity that tightly coupled. Most of it may be chaff, but what's not chaff is not measured by whether it's bought, but by whether it breeds: whether further knowledge is built on it. And the principle (if it's true, and it probably is) that we must be prepared to countenance a high chaff/wheat ratio in all fields of human endeavor in order to ensure the inevitable proportion of wheat -- again suggests that something other than market indicators might be desirable here.

One of the essential filtering processes will be: what are people prepared to pay for?

Indeed; and the likes of myself will always be looking out for the fate of what people are NOT prepared to pay for. Fortunately, we have in the skies a much stronger potential ally than in the Faustian medium of the paper trade...

Stevan Harnad


From: amo@research.att.com (Andrew Odlyzko) Date: Mon, 5 Sep 94 20:52 EDT Subject: Science article

I agree with [Stevan Harnad's] three claims, all of which say that Ann Okerson's alarm is not justified. The article in the August 12 issue of Science that Ann cites is rather confused on the two factors that are leading to changes in the way Internet is run: (a) participation of commercial organizations, and (b) growth of multimedia services (including Mosaic usage as well as the more exotic videoconferencing). It is factor (b) that is much more likely to force institution of a pricing scheme because of its dramatically higher bandwidth requirements. A better discussion than in the Science article of why this is so, and what kind of pricing schemes might be adopted can be found in

MacKie-Mason, J. K., and H. R. Varian, "Some Economics of the Internet," in Networks, Infrastructure and the New Task for Regulation, W. Sichel, ed., to appear. (Available together with other related papers at <http://www-personal.umich.edu/~jmm/papers.html#sei>.)

Some simple back-of-the-envelope calculations show that the fears that Ann and some of the experts quoted in Science express about the Internet being priced out of the reach of scholars are baseless. Reasonable videoconferencing systems run at about 400 kilobits per second. This is about 50,000 bytes per second, or 200 MB (megabytes) per hour. Now a typical paper is somewhere between 250,000 bytes (uncompressed PostScript) and 20,000 bytes (compressed TeX). In any case, the transmission of a one-hour videoconference takes about as much capacity as the transmission of between 1,000 and 10,000 papers. Further, as the Science article does point out, the videoconference transmission cannot tolerate any significant delays, whereas paper transmission can. Thus any rational pricing scheme will require substantially higher payments per byte for videoconferencing with a service guarantee than for a "best-effort" paper transmission that might be delayed by minutes. Thus we can expect that transmitting a paper might cost 1/10,000 or even 1/1,000,000 of the cost of a one-hour videoconference. (Some of the schemes discussed in [MacKieV] involve fees only for services with a service guarantee, which would let most scholarly communication go through for free.) However, videoconferencing cannot cost too much, or else it won't be used. Therefore scholarly electronic communication will have trivial costs.
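[Ed. note: a minimal sketch, not part of the original message, that redoes the back-of-the-envelope arithmetic above in Python. Every figure is one quoted in the message; the rough agreement with the "1,000 to 10,000 papers" range is the point.]

    # All figures are the ones quoted above; nothing here is new data.
    video_rate_bps = 400_000                           # ~400 kilobits per second
    video_bytes_per_hour = video_rate_bps / 8 * 3600   # ~180 MB, i.e. "about 200 MB per hour"

    paper_small = 20_000     # compressed TeX, bytes
    paper_large = 250_000    # uncompressed PostScript, bytes

    print(video_bytes_per_hour / paper_large)  # ~720   -> roughly 1,000 papers per videoconference hour
    print(video_bytes_per_hour / paper_small)  # ~9,000 -> roughly 10,000 papers per videoconference hour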

We might have charges based on bytes transmitted, as opposed to capacity of the link to the Internet, but if so, the charge per byte will be so small as not to merit attention, at least for the kinds of transmissions that are required for publishing of today's scholarly literature (*). If your department were charged for each word you wrote with a pen the department provides, would it affect how much scratch paper you filled with your jottings, if the total charge for a few months' work still came to the $1 cost of a cartridge refill?

Andrew Odlyzko

(*) The arguments above apply only to traditional publications, since that is all that is relevant in evaluating the feasibility of electronic versus print journals. Scholars will surely avail themselves of the novel services, such as videoconferences, and presumably their usage of such will be rationed by price. We can already see substantial loads on the network generated by genetic and astronomy data. Even mathematicians are becoming bandwidth hogs. For example, I cited the average paper as being 20,000 bytes in compressed form. However, my colleague David Applegate has now made available on the Internet proofs of the optimality of some Traveling Salesman Tours (an important combinatorial optimization problem he has been working on with collaborators across the country) that are 20 MB each, even in compressed form! (These proofs are not made to be checked by people, only by computers.) There will surely be many more such cases, as scholars do things electronically that are not possible in print. Technical and economic constraints will always be present, it's just that they have moved far enough away to enable print journals to be replaced by electronic ones at much lower cost.


From: amo@research.att.com (Andrew Odlyzko) Date: Mon, 5 Sep 94 21:51 EDT To: quinn@calvin.math.vt.edu Subject: Re: electronic pub. in physics

I don't think [Frank Quinn and I] differ much on what should be done. I agree with [Frank] completely that "we should be proactive to try to minimize the discomfort of the transition." However, the picture [Frank] presented in "Consequences of electronic publication in Theoretical Physics" seemed to be too bleak. Even if indeed the trends [Frank] describe[s] do continue uninterrupted in theoretical physics, without any countervailing forces coming into play, it's not clear how much that means for other areas, such as mathematics. [Frank himself] says that the refereeing system in theoretical physics is broken. If we accept that, then it is no wonder that there is no great rush to set up a rigorous system for electronic publication in that field. I do not think that mathematicians, say, should allow that to happen, and I have been arguing for an even more rigorous standard for e-journals.

I will spend a bit more time on [Frank's] second point:

ao> A final point I would like to make is related to Paul [Ginsparg]'s
ao> comment that with electronic publications we should aim "for a system
ao> in which much more stringent standards are applied." It is hard to
ao> overemphasize the inadequacies of the present system.

fq> and go on to discuss some of these inadequacies. Please remember that
fq> not all areas are the same in this regard. Paul has been quite open
fq> that the peer review system in his area has no credibility, and is so
fq> weak that dispensing with it would be little loss. I have had some
fq> contact with his area, and this is also my impression. So indeed
fq> ANYTHING he can do will be "much more stringent". This is certainly
fq> not the case in topology, and is really wrong in the most stringent
fq> area I have had contact with, algebra. These areas have a whole lot
fq> to lose.

fq> I wonder, from your comments, if your field is more like theoretical
fq> physics than algebra in the effectiveness of quality control. Or
fq> perhaps you have not had enough contact with physics literature to
fq> appreciate what a real breakdown looks like. Anyway I urge you not to
fq> generalize too much from experiences in physics and psychology.

It is true that I have not had too much contact with physics literature, but what I had did not inspire me with any confidence in its editorial and refereeing system. However, all the areas I have worked in (and there are quite a few, such as number theory, cryptology, probability theory, combinatorics, and a few others) have very stringent standards as to correctness. I would venture to guess their standards are at least as high as for algebra, at least for journal articles. (Some of these areas do use conference proceedings extensively, but those are recognized as not being as reliable as journals, even when they do become the dominant mode of communication.) The journals I cited as examples ("Discrete Mathematics," "J. Combinatorial Theory," and "Designs, Codes and Cryptography") are all in discrete mathematics, and all have stringent refereeing standards. Very few of their papers are incorrect. When I complained about "the inadequacies of the present system," I chose these journals precisely because they contain rigorously checked results. My point was that they fail to provide the signals as to significance of their results that are often touted as a great advantage of print journals (this claim is usually followed by the non-sequitur claim that therefore e-journals cannot replace print ones). Because of specialization (journals engage in "monopolistic competition," as economists call it), it is seldom that two journals are strictly comparable, and so the information that one can derive from where an article is published is "noisy."

Andrew Odlyzko


From: Ann Okerson ann@cni.org Subject: Re: Network Management To: amo@research.att.com Date: Mon, 5 Sep 1994 23:00:31 -0400 (EDT)

Andrew [Odlyzko wrote:]

I agree with [Stevan Harnad's] three claims, all of which say that Ann Okerson's alarm is not justified. The article in the August 12 issue of Science that Ann cites is rather confused on the two factors that are leading to changes in the way Internet is run: (a) participation of commercial organizations, and (b) growth of multimedia services (including Mosaic usage as well as the more exotic videoconferencing). It is factor (b) that is much more likely to force institution of a pricing scheme because of its dramatically higher bandwidth requirements.

The SCIENCE article made both points, but with respect to [Andrew's] (a) it is not participation of commercial organizations but the fact that the government/NSF is getting out of the network support business pretty much. They are continuing the process of handing the networks over to commercial organizations, a move which will be finished by next April.

Both (a) and (b) will be influential, or at least that is a view widely shared by policy makers and folks throughout the public interest sector here. SCIENCE is reporting that, not creating the concern.

Ann Okerson


From: Stevan Harnad (harnad@soton.ac.uk)

Steve Goldstein can correct me if I am wrong about this, but my understanding is that the NSF is now supplying only 10% of the cost of the backbone; when the universities, which now pay 90%, take this on, it will accordingly amount to 10% more than what they pay now. Because of the nature of network transmission, they have not found it necessary to pass on these costs to individual users so far, and I doubt that the additional 10% will change matters. It is indeed, as Andrew Odlyzko has aptly suggested, somewhat analogous to charging for ink used per word...

Stevan Harnad


From: Stevan Harnad harnad@ecs.soton.ac.uk Date: Tue, 6 Sep 94 18:31:34 BST Subject: Re: Naylor on Paying the Piper

Bernard Naylor B.Naylor@soton.ac.uk, quoting me, wrote:

Subject: Re: Naylor on Paying the Piper Date: Tue, 6 Sep 1994 09:58:17 +0100 (BST)

sh> The issue is text for which there is no paying market, even on paper;
sh> text that the libraries and universities are ALREADY subsidizing now
sh> (but in a highly Rube-Goldberg way, with hostage library budgets).

bn> This is an interesting use of the concept of "subsidy". I doubt whether
bn> the purchase of academic journals by libraries has any elements
bn> amounting to subsidy which economists could not point out are readily
bn> perceivable in other settings where goods with "value" are acquired in
bn> return for payment. If academe has a false sense of values in respect
bn> of journals (or some journals), then it should set about correcting
bn> that - as there are some tentative signs it is in the process of
bn> starting to do.

sh> There is very little correlation between the market-value of
sh> scientific/scholarly writing and its scientific/scholarly value.
sh> Hence "esoteric" does not mean of lesser epistemic value, it just
sh> means of lesser MARKET-value.

bn> I think the marketeers (who are my paymasters) would not
bn> entertain this assertion for a moment. They would say: "If it's
bn> worth having, it's worth paying for. People who try to deny the
bn> links between valuing something enough to want it and being
bn> prepared to pay for it are just trying to have their cake and eat
bn> it." No doubt, they wouldn't claim that everything is correctly
bn> valued in the market place, but they wouldn't see that as any
bn> reason for not letting markets work. On the contrary; they would
bn> say that the operation of the market should be reviewed in order
bn> to make it work better. As you say, the wheat/chaff ratio is a
bn> fact of life in so many areas. Paying for things (or not being
bn> prepared to pay for them) is one way of sorting out the one from
bn> the other which is well established. Naturally, people who write
bn> articles for scientific journals might like to think that this
bn> one area is so different that different processes should apply.
bn> I just don't think the case has been made.

I regret that I must keep disagreeing with my new Southampton colleague before we have even had a chance to meet nonvirtually, but there are two crucial points that are either being systematically misunderstood or have so far managed to escape notice:

(1) I have not for a moment suggested that, when there is something that people want and need that they can and must pay for, they should not or will not. What I am saying is that whereas a circumstance conforming to this did indeed obtain in the case of paper (esoteric scholarly/scientific) periodical publication, it no longer obtains in the electronic-only medium (and to keep speaking or thinking of it as if it did does not make it so; it is simply a failure to take a proper measure of the radically new circumstances): To spell it out: it is the "must" that no longer applies (given the true per-page costs of electronic-only scholarly periodical publication, which I estimate at below 25% of the per-page cost in paper). There is now a CHOICE available to the consumer that never existed before. Say whatever you want about market forces, if there is a way to get something (practically) for free, there is nothing (except duress or opacity) that will make consumers continue to pay for it. And that brings me to the second crucial point:

(2) Even NOW, in paper, the consumers (i.e., the readers) of the esoteric periodical corpus are NOT the ones paying for it (hence it is with justification that I say that their consumption is ALREADY "subsidized" -- by the university libraries, for the most part). All I am proposing is that this subsidy would be much more sensibly placed up-front, once the per-page charges shrink to their electronic scale: Let esoteric AUTHORS be subsidized for publishing, rather than esoteric readers for reading. The consequence will be more (and, if properly peer reviewed, better) esoteric publishing and a GREAT deal more esoteric reading (currently constrained by both the cost and the inconvenience of paper). And the entire scholarly/scientific community (as well as humanity as a whole, if you believe that learned inquiry is a good thing) will be the beneficiaries.

Ceterum censeo: market-value is not the proper measure of scholarly value (it's NOT just a matter of weeding out the bad buys among libraries' current periodical acquisition lists!). The essential esotericity of human inquiry is fundamentally at odds with mass-market thinking.

Stevan Harnad


From: amo@research.att.com (Andrew Odlyzko) Date: Tue, 6 Sep 94 06:52 EDT To: ann@cni.org (Ann Okerson) Subject: Network Management

As Stevan has already mentioned in the message he posted a couple of hours ago, the fact that "the government/NSF is getting out of the network support business" does not matter much. Too little money came from that source for it to be the dominant factor. If the Internet were to be occupied just by academic researchers, and there were no dramatic growth in the demand for new services, the present service providers would continue the existing policy of charging by the capacity of the link to the Internet that they provide. The costs of implementing tolls are considerable; the regionals have been doing well with old policies.

The two factors influencing the evolution of the Internet are (to quote from my earlier message)

(a) participation of commercial organizations, (b) growth of multimedia services (including Mosaic usage as well as the more exotic videoconferencing).

Each is leading to changes. Factor (a) yields a much higher growth rate than would prevail if only academic organizations were involved. It also leads to incidents such as the immigration lawyers' flooding news groups with ads for their services. To prevent that, some sort of access controls might be needed. However, the growth rates for traditional text transmissions from these new commercial entrants to the Internet are not all that dramatic, and might have been accommodated with traditional pricing schemes (by capacity size, or, in terms that librarians use, "just in case"). On the other hand, factor (b) appears to force the introduction of a pricing scheme soon. This would be so even if only academic researchers were involved. The reason is the dramatically higher bandwidth requirements of the new services. The videoconferencing example I cited in last night's message requires 0.4 Mbps (megabits per second). The Internet backbone operated until recently at 45 Mbps and the trans-Atlantic link at 1.5 Mbps (although they have probably both been upgraded by now). Relatively rich institutions have T1 links at 1.5 Mbps, and many poorer ones only 0.056 Mbps. Clearly the infrastructure we have now is not adequate to support videoconferencing on a large scale, and so some sort of control is needed. (There are also fascinating technical issues about congestion controls on networks with multimedia traffic, which require new routing schemes to be developed, but that is another issue.)
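[Ed. note: the following rough capacity check is an editorial sketch, not part of the original message; it simply divides the link speeds cited above by the 0.4 Mbps videoconferencing figure, ignoring protocol overhead and all other traffic.]

    video_stream_mbps = 0.4    # videoconferencing figure cited above

    backbone_mbps = 45.0       # recent NSFNET backbone speed
    t1_mbps = 1.5              # a "relatively rich" campus link
    slow_link_mbps = 0.056     # a poorer institution's link

    print(backbone_mbps / video_stream_mbps)   # ~112 simultaneous conferences for the entire backbone
    print(t1_mbps / video_stream_mbps)         # ~3-4 per well-connected campus
    print(slow_link_mbps / video_stream_mbps)  # ~0.14: such a link cannot carry even one stream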

I referred to the Science article as "confused" because it did not point out the relative importance of factors (a) and (b) to the changes that are taking place and are likely to occur soon. For example, when researchers talk about "loss of innocence," they often mean only the troubles with lawyers advertising on news groups, which is part of (a).

Here is one final argument that should allay the "dollars for every byte" concerns about prices for Internet services. According to that August 12 issue of Science, the Internet traffic is around 13 terabytes per month (tera here is 10^12), or around 150 terabytes per year. The total charges for running the backbone and the regional service providers seem to be around $200 M per year, with about $20 M coming from the explicit NSF subsidy that is being phased out. (There is also indirect government support for development as well as for access charges to the regionals, which often come at least partially from the overhead on government grants and contracts, but we'll ignore those, as they are being threatened with cutoff.) Hence if we tried to recover present costs by charging a uniform price for each byte, the charge per byte would be $ 1.5*10^(-6). A typical email message of 2,000 bytes would then cost all of $0.003. A paper of 50,000 bytes would be more, $0.075. When I sent out the latest draft of the "Tragic Loss ..." essay, which was almost 200,000 bytes, my mailing list had around 300 addresses, so this giant mailing of 60 MB would cost $90.00. Given the rapidly decreasing prices for networks, I feel it is safe to conclude that tolls on the NII are not going to be large enough to impede scholarly communication.
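[Ed. note: another editorial sketch, not part of the original message, reproducing the uniform per-byte recovery estimate above; the inputs are the figures quoted in the paragraph, and small differences from the rounded dollar amounts there are rounding only.]

    annual_traffic_bytes = 150e12    # ~150 terabytes per year
    annual_cost_dollars = 200e6      # ~$200 M per year for backbone plus regionals

    price_per_byte = annual_cost_dollars / annual_traffic_bytes
    print(price_per_byte)                   # ~1.3e-6 dollars, i.e. roughly $1.5 * 10^(-6) per byte

    print(price_per_byte * 2_000)           # a typical email message: about $0.003
    print(price_per_byte * 50_000)          # a 50,000-byte paper: about $0.07
    print(price_per_byte * 200_000 * 300)   # a 200 KB draft sent to 300 addresses: about $80-90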

Andrew Odlyzko


Date: Tue, 6 Sep 1994 11:33:16 EDT From: ghermanp@kenyon.edu (Paul Gherman) Subject: Cross subsidy To: Multiple recipients of list VPIEJ-L

[Andrew Odlyzko] suggests that we need not worry about the cost of transmitting text over the internet because the cost will be so much lower than the cost of transmitting video. He postulates that the cost of video will need to be kept low, and therefore the cost of text will be proportionately lower. Past practice would suggest quite the opposite, that the cost of text transmission will be close to the cost of video transmission, and the differential will cross subsidize the cost of video, keeping video affordable. The telcos see the real profits in video not text, so I suspect they will bump up the price of text to lower the cost of video.

Paul Gherman


Date: Tue, 6 Sep 1994 11:33:41 EDT From: "James O'Donnell" jod@ccat.sas.upenn.edu Subject: esoteric fleas To: Multiple recipients of list VPIEJ-L

I've been reading the debate on this list with interest, and while my heart is with those who look to a future of free information, my head is cautious. NSF privatizes the backbone, as Ann points out. What happens then if Rupert Murdoch decides to buy the company that supplies the backbone? Are we going to depend on the FCC to come in and remember that there are academics out there and cut us a special break? This isn't just marginal business news we're talking about, this is the biggest new money-making playground opened up since Japan reindustrialized after the war: the big boys are going to be taking this game very seriously, and they will gladly squeeze every esoteric flea for every penny we've got. We may be able to resist, we may be able to get some special breaks: but it won't come easily or automatically, and we must not be blase about it.

Jim O'Donnell Classics, U. of Penn jod@ccat.sas.upenn.edu


Date: Tue, 6 Sep 94 15:29:12 -0400 From: Hal Varian hal@alfred.econ.lsa.umich.edu

Sorry to butt in on the discussion, but I've been doing lots of work in this area and thought that I might be able to help. Those of you with Mosaic might want to look at my page on the "Economics of the Internet" at http://gopher.econ.lsa.umich.edu. The "Economic FAQs about the Internet" available there is especially relevant.

ghermanp@kenyon.edu (Paul Gherman) suggests:

Past practice would suggest quite the opposite, that the cost of text transmission will be close to the cost of video transmission, and the differential will cross subsidize the cost of video, keeping video affordable. The telcos see the real profits in video not text, so I suspect they will bump up the price of text to lower the cost of video.

The problem is that there is no way to do this: bits is bits. The current Internet treats all packets the same. Future protocols will probably want to treat different types of data differently, but whatever pricing scheme is invoked will have to be (warning, economics jargon coming) "incentive compatible" since it is trivial to disguise data: if video is subsidized, I can just create "video packets" that contain text.
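[Ed. note: a toy editorial illustration of the "bits is bits" point, using a purely hypothetical packet format rather than any real Internet protocol: the network can read a packet's claimed content type, but it cannot verify that the payload really is what the label says, so a tariff that subsidizes one content type invites relabeling.]

    def make_packet(claimed_type: str, payload: bytes) -> dict:
        # Hypothetical packet: a self-declared type plus opaque payload bytes.
        return {"type": claimed_type, "length": len(payload), "payload": payload}

    text = "Section 3 of my paper, compressed TeX ...".encode()
    honest = make_packet("text", text)
    relabeled = make_packet("video", text)   # same bytes, declared as the subsidized class

    # The payloads are identical; only the self-declared label differs, which is
    # why any per-type pricing scheme must be "incentive compatible" to work.
    print(honest["payload"] == relabeled["payload"])   # True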

"James O'Donnell" jod@ccat.sas.upenn.edu asks:

What happens then if Rupert Murdoch decides to buy the company that supplies the backbone?

There are currently four backbone suppliers (ANS, Alternet, PSInet, and SprintLink), and more entry is expected. In fact, all that it takes to enter the business is some money to rent a telephone line, a few routers, and---most importantly---some engineering expertise.

Hal Varian                   Hal.Varian@umich.edu
Dept of Economics            voice: 313-764-2364
Univ of Michigan             fax: 313-764-2364
Ann Arbor MI 48109-1220


Date: Sat, 10 Sep 1994 14:43:59 -0400 From: quinn@math.vt.edu (Frank Quinn)

This is another reply to the questions about charges for internet use. It seems to me that there are actually several similar questions being asked, and the answers suggested don't always address the intended question. Maybe this is why the issue keeps coming up. Anyway three variations are discussed here.

First, charges for use of the internet itself, without regard to the content of the message. Such charges may be coming, but I believe the right perspective is provided by thinking of them as postage, rather than subscriptions. Right now it looks free because our institutions have "bulk mail" arrangements. But even if it changes to a per-piece charge it will be small (note 1).

The second variation concerns charges related to the content. These are in the form of site licenses, individual connect charges, or delivery fees. None of these seem likely to become widespread. Individual charges shift the expense from libraries to the individuals, and the individuals I know will have no enthusiasm for this. Use would plummet since browsers won't pay. The "subversive proposal" mechanism would also take a toll: People would find other, free, ways to offer and obtain the information. As for site licenses, it has already been observed that the market for new journals in any format has collapsed. Expensive electronic startups (e.g. Online Journal of Clinical Trials) have not fared well for this reason if no other. This narrows the possibilities down to paper journals converting to electronic format and trying to retain the subscription base. Does anyone know of a successful example (scholarly journal)? The ones I know about (e.g. "TULIP") are still piggy-backed on paper subscriptions. In any case there is no mass movement in this direction, and publishers (at least) don't seem to have any confidence in it.

The third variation on the question is not so sharply formulated. Currently we pay a lot for access to information, through journal subscriptions. Soon, we are led to believe, we will have nearly free access to it all. Surely this is too good to be true, and someone will find a way to re-institute charges? Even if the arguments above are correct, isn't something else we can't foresee bound to happen?

Put this way we see good news and bad news. The good news is that yes it really will be essentially free. The bad news is that we will get what we pay for: the old system is expensive partly because it adds value during the transmission process, and the new system doesn't. As far as I can tell nothing has been improved by being sent out over the net. The problem is not that they will find another way to charge us. The problem is that the added value we used to buy, and were willing to pay for, may no longer be available for sale. (note 2)

=== notes ===

Note 1) I do NOT, however, buy the argument that, since the market will be driven by video-on-demand, our relatively insignificant bandwidth requirements must be almost free. The video-on-demand idea is shaping up in an unattractive way: see "Dreamnet," Charles Piller, Macworld October 1994. Some proposals have nearly all the center-to-user bandwidth dedicated to video, and the user-to-center bandwidth so spread out that it will scarcely match current internet capabilities. A very poor medium for scholarship! Maybe there is a bit of comfort though: with such poor functionality in the commercial versions the academic network will probably stay intact and separate for a long time.

Note 2) Most people feel some substitute will evolve for the value added by print publication (particularly quality control via peer review). Paul Ginsparg, for instance, seems to agree that the total "archive" will be of lower quality without quality control at the front end of the process. He suggests that instead of CONTROL, we will have GUIDES which will develop as an "overlay" on top of this archive, and point us toward the good stuff in it. "Overlay guides" already exist in the form of review and survey articles, and selective bibliographies. It is an attractive and hopeful idea that this activity should increase and diversify. But incentives and a support mechanism (e.g., a way to redirect current subscription budgets to pay for it) are still missing. Conyers Herring in 1968 (Physics Today) argued strongly that more reviews were needed even then. The situation is worse now, 26 years later, but reviews have failed to materialize in the necessary quantity. Why should this trend reverse itself when the literature goes electronic?


Date: Sun, 11 Sep 94 07:56 EDT From: amo@research.att.com (Andrew Odlyzko)

Here are some more matchbox estimates for you. The current traffic through NSFNet, the heart of the US part of the Internet, is about 15 TB (terabytes, or 10^12 bytes) a month. It is often claimed there are around 20 M denizens of the Net, but that seems to be an overestimate, and in any case, many Net users seldom use the US portion of the Net, or else have poor connections to it. However, there ought to be at least 5 M users with decent access. Now 15 TB divided by 5 M users yields average traffic per user of 3 MB per month. This is good for quite a lot of email messages and even a few file transfers, but not much more. A single serious "surfing the Infobahn" session with Mosaic, looking at some graphics, etc., can easily require the transfer of more than 3 MB of data all by itself. This suggests that:

(a) The Net in its present form could accommodate scholarly publishing only in a bare-bones form, without use of modern tools such as Mosaic. We do need a few more years of growth in its capacity before scholars' needs can be accommodated and yet still be just "a flea on the dog's tail."

(b) We do not need to await the arrival of videoconferencing to bring the Net to its knees. Wider usage of Mosaic will do that by itself. In particular, there may be a crisis on the Net in the near future. Traffic on NSFNet has been doubling every year for the last few years. However, WWW/Mosaic traffic is growing at astounding rates, doubling every 3 months or so, and already amounts to 5-10% of all Net traffic. Hence we may soon see serious congestion, followed by usage restrictions and tolls. This does not alter the long-term outlook that scholarly publication will be accommodated easily on data networks, but will surely be used by nay-sayers as a sign that we can't rely on communication to stay cheap.
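[Ed. note: an editorial sketch, not part of the original message, spelling out the two "matchbox" estimates above; the traffic, user, and growth figures are the ones quoted, and the doubling projection is a crude extrapolation.]

    monthly_traffic_bytes = 15e12      # ~15 TB per month through NSFNET
    users_with_decent_access = 5e6     # ~5 M users with decent access
    print(monthly_traffic_bytes / users_with_decent_access)   # ~3 MB per user per month

    # If WWW/Mosaic traffic is ~7.5% of the total and doubles every 3 months,
    # it alone matches today's entire traffic volume in roughly a year.
    share, months = 0.075, 0
    while share < 1.0:
        share *= 2
        months += 3
    print(months)   # 12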

My feeling is that a crisis caused by rapidly growing use of Mosaic can only be good in the long run. It will show that there is a lot of demand for electronic information that is presented in decent form, and will lead to more capacity. The more traffic on the Net, the lower the costs will be. We really do want the hoi polloi on the Net, to cover up our usage.

Even in the short term, imposition of tolls and different grades of service is likely to be useful, in that it might even out the traffic. I've been corresponding with Hal Varian in the last few days on the question of Internet traffic, and it appears that the main Internet links in the US are on average utilized at only 5% of capacity (judged on a monthly basis), although there are shorter measuring times when the utilization reaches 50% (and clearly there are bursts when it is at 100%). It should be easy to increase that usage by factors of 3-5 by charging fees for immediate transmission, and allowing free transit for traffic that can wait an hour or a day.

Andrew Odlyzko


[Ed. Note: the message that follows was not part of the discussion but it surfaced about the same time as the messages above were being distributed and was seen by a number of players. We include it with permission, for it casts light in some murky corners.]

Date: Tue, 13 Sep 1994 09:25:06 -0700 (PDT) From: BOBG@u.washington.edu Subject: Slightly revised--Latest version of Cost memo Q/A To: ann@cni.org

Robert G. Gillespie

NSFNET Pricing/Costs

1. Many people appear to use the Internet for free regardless of the distance or the amount of use. Who does pay?

Faculty members and others in institutions that have connections to the Internet through their regional networks ordinarily do not see any charges because most institutions have not chosen to recover their costs through usage charges to the end-user. The institutions are paying for the connections and services in these categories:

1) The institutions are investing in their local infrastructure (networks, workstations, operations, training, software, routers, etc.).

2) The institutions pay for access (connectivity) to the NSFNET through their regional networks (NWNET, SURANET, CICNET,...). Those fees are usually membership fees that provide operations, training and access. In addition the institution usually is paying the regional telephone company for the high speed line that connects the campus network to the regional network.

3) The institution may also be paying for the use of information services (commercial data bases) from an information service provider. In some circumstances those are flat fees governed by a maximum number of users.

4) NSF has been paying for the cost of the national backbone (to ANS, MERIT, MCI and others) that has interconnected the regional networks (and others) and also has paid for a portion of the regional network costs. Total costs for networking include local computer support and local network infrastructure, in addition to the wide area services. It is estimated that NSF's support defrays less than 10% of the overall networking costs spent by institutions.

For institutions that do not have direct connection and are using dial-up connections, faculty members and others may already be paying usage fees to Internet service providers. Of course those fees can be usage based or flat rate depending on their service provider's approach (and competition). Currently about one thousand out of thirty-six hundred institutions of higher education have direct connections to the Internet. A recent NCLIS survey of a sample of public libraries indicates that approximately 21% of public libraries have Internet connections, but many of these use dial-up connections.

2. How can a message sent across the world with a million bytes be treated the same as one sent within the state with a few lines?

There are several reasons for this. The major cost for lines and routing remains independent of the volume of use until it is necessary to add more bandwidth. Also, because of the way that packets may be routed through different elements and subnetworks, it would be very difficult and costly to keep track of the actual paths and use.

However, growing traffic because of increased use (more users, more bandwidth-intensive applications--video, Mosaic, etc.) means that the network bandwidth must increase or congestion and delays will occur. Other approaches to avoiding congestion involve establishing priorities or setting different rates for different types of service.

The costs of increasing bandwidth are closely related to the costs of switching (which track the computer chip costs). It may be possible to match the demand for bandwidth without increasing costs as the technological improvements in switching speeds lower costs for bandwidth. However, since there are other factors involved--for instance, regulatory oversight--there are no guarantees!

3. Who sets the policy for the way that persons connected with an institution or library pay?

The institution/campus/library determines how the cost should be recovered or not (just as for telephone costs or computer costs). Neither NSF nor the telephone companies set the rates for Internet services. Those are set by Internet service providers, which are either nonprofit (like most regionals) or commercial.

4. How will the transition to the new NSFNET affect charges to faculty members and others connected with an institution?

The faculty members and others are unlikely to see any changes. Charging is a policy decision for the local institution or library.

Institutions may see an increase in the cost of connection (in their membership fees for regional networks). This is not expected to be more than 10%, but there may be anomalies. While NSF has assumed that institutions will be able to absorb cost increases at this level, it is also providing transition funds for the regionals over four years to cushion the impact of those changes. The new arrangement replaces the current single national backbone, NSFNET, which now carries the traffic between regional networks, with an architecture in which that traffic will be carried by commercial network providers. NSF is funding interconnection points (Network Access Points, or NAPs) which will provide interconnection for all Internet network service providers.

5. What are some of the difficult issues ahead?

Guiding the Internet through another order of magnitude of growth and yet providing the stability and increased services necessary will be difficult. Some good questions that were discussed at the recent FARNET Workshop on the Transition from the NSFNET included:

1. How and who will ensure that interconnection is ubiquitous?

2. How will costs and charges be handled?

3. How will "seamless problem solving" be achieved?

4. Who will set the ground rules for technical interactions?

5. What is the attitude toward providing enhanced privacy and security?

6. What kind of policy framework/governance needs to be established? What are the policy framework options for disputes and resolution of issues? How will policy issues and resolution of disputes be handled for users?

7. What resources are needed to achieve interruption-free service?

8. How will the positive aspects of the Internet culture be preserved? What are they? Can they be scaled up?

