
The Changing Nature of Intellectual Authority

Peter Nicholson
President, Council of Canadian Academies

Today I will argue that what qualifies as intellectual authority in contemporary societies – who and what to believe – is changing fundamentally. I will speculate as to the reasons, and I will draw out some of the implications for institutions of knowledge brokerage, among which research libraries are of course prominent.

The thesis in a nutshell is this. People today are much less prepared to defer to the experts. But at the same time, we are being swamped with data and information – a glut that cries out for analysis and summary. So there's a dilemma. Who to turn to? Increasingly the answer is – Well, to ourselves of course, as individuals empowered by a world wide web that has rapidly evolved into a social medium. More specifically, it is a medium that today supports massively distributed collaboration on a global scale that – we can only hope – will help us make sense of it all.

How does this deep social and cultural transformation relate to the particular concerns of those of us in this room? I can do no better than quote from an ARL Task Force on "Collections and Access Issues" to the effect that transformation in libraries mirrors the ongoing change of the research institution, just as the transformation of the institution reflects broader societal and cultural changes. The transformation of the research library cannot be understood apart from this larger context and the cultural changes that shape institutional growth. [1]

My purpose in these remarks is to offer one outsider's perspective on what seem to me to be the deepest and most pervasive changes that are shaping the transformation not only of the library, but of all forms of intellectual authority in today's society.

Let me say at the outset that I am not particularly comfortable with the future I foresee. I am, after all, a charter member of the 'old guard' and will never really belong to the new. But I am also an optimist and a realist. The world has changed – and so must we.

The Decline of Deference

I want to begin with some very general remarks on contemporary attitudes toward hierarchical authority generally – of which intellectual authority is but one instance.

As President of the new Council of Canadian Academies – an organization that oversees expert studies of the science underlying important public questions – I, like you, am in the business of brokering intellectual authority. I admit to being a traditionalist in the sense that I believe intellectual authority should have a close correlation with expertise. And it should flow from the tried and true, though never infallible, processes of peer review and other forms of elite consensus building.

More than that, I am comfortable with hierarchies that are based on merit. And I am quite willing to defer to the well-established institutions in today's society since, on balance, I believe that their power is adequately constrained by the legal, economic and political structures of modern democracy.

But I also believe that the values that have shaped my world view – and that of my demographic peers throughout the industrialized world – are being eclipsed by a new paradigm. This new framework is shaped by technology – primarily information and communications technology; by globalization; and by a culture which, to an unprecedented degree, celebrates and empowers the individual.

One of the most significant symptoms of this profound shift is the widespread "decline of deference" to virtually all forms of traditional authority – the church, the school teacher, the family doctor, the business executive, the union leader, the politician, and not least, the intellectual. In short – out there on main street, mistrust and scepticism reign.

While all this is widely recognized, the truly fundamental reasons for it seem not to have been explained in sufficiently broad sociological terms. The explanations we do see typically cite the public revulsion that stems from specific cases – for example, scandals in the Catholic Church; or in businesses like Enron; or in politics – Watergate; the "sponsorship affair" here in Canada; the failure to find WMDs in Iraq; or to warn the British public of BSE. Take your pick.

The key point is that the decline in trust of – and therefore deference to – traditional sources of authority is a nearly universal feature of advanced societies. It transcends every specific, local instance. And it didn't just happen yesterday. Deference to hierarchical authority has been declining for at least the past 50 years and was in fact foreshadowed in the anarchist literature of the 19th century.

Clearly, therefore, we are witnessing socio-cultural change whose roots run very deep in the nature of industrial society – essentially a force of nature. But the only broad analysis we are being given is from the "postmodern" cultural theorists whose writings are largely inaccessible, even to most academics. I think we are owed a better explanation.

But for our purposes today, I believe one can simply take it to be a fact that societies formerly based on deference to authority, community loyalty, and the struggle for the material basics of life have largely given way to societies based on self-worth, consumer choice, and the search for personal fulfillment.

When these objectives are combined with the empowering tools of universal education, a rights-oriented political culture, and the Google search engine, we should not be surprised that people – and particularly younger people – regard ex cathedra expert authority with scepticism, if not outright hostility.

The paradox is that expert opinion is being sought and cited more than ever. But increasingly it is individuals themselves who weigh the various authorities and come to their own conclusion. Just ask doctors about their web-savvy patients. Or ask your own clients. The ARL Task Force to which I referred earlier cited a large survey which found that the dimension of service quality for which users have the highest expectation is "personal control" – i.e., services and tools that enable patrons to easily access information independently.

Role of the Media

Let me open an important parenthesis here on the role played by the media in shaping broader public attitudes toward intellectual authority. The prevailing ethic in journalism is that "fairness" requires that all views on an issue be presented, often without regard for the relative weight of authority of the various sources being quoted. The objective is simply to report point and counterpoint, with an increasing emphasis on sensationalism, official screw-ups, and conflict – i.e., those things that can attract at least fleeting attention, and advertising dollars, in an information environment that has become super-saturated.

The net effect is to create in the public mind an impression that experts can never agree, and expert authority is thereby diluted. A prominent case in point is medical journalism, where the daily reported advice keeps flip-flopping, whereas the full text of the journal articles would reveal the provisional nature of findings, statistical caveats, and so forth. The bottom line is that the mass media treatment of scientific and technical issues reinforces the prevailing scepticism as to the consistency and trustworthiness of expert authority.

There is an irony here. It is that commercial media have themselves become caught in the web of public mistrust and scepticism. The TV networks and major papers like the New York Times are now objects of intense scrutiny by an army of specialized bloggers to whom a sceptical public increasingly turns for the real scoop.

The Dilemma of the Information Age

Coming back to the main line of my argument, we find that while expert-based authority is being challenged, the volume of information and the economic significance of knowledge are exploding. Information technology itself – whose capacity continues its four-decade exponential improvement – is clearly a key part of the reason. But so too is the huge global expansion of knowledge-generating capacity, the more so as China and India and other giants plug into the economic and research networks of the industrialized world. These societies are adding tens, and soon hundreds, of millions of trained knowledge workers who will bring cultural and intellectual perspectives that are quite different from those of the West. We can therefore expect an unprecedented surge of innovation as the two worlds meet, finally on equal terms.

So it appears that we are facing a dilemma. On the one hand, the whole world is struggling to cope with an information explosion that shows no sign of letting up – quite the contrary. We need somehow to transform a data torrent into useful information and knowledge that can power economic progress and human fulfillment.

But, on the other hand, the agents we have relied upon traditionally to filter and manage information, and to broker formal knowledge – agents like research universities and their libraries, the serious media, and highly trained experts of all kinds – are less trusted as intermediaries than they once were. And even if that concern is perhaps overstated and too pessimistic, we need to ask ourselves whether these expert resources are really up to the task of managing the information glut anyway. Just ask journal editors and referees, or researchers in any dynamic field, how well they are keeping up. Ask yourselves.

Part of the response, of course, has been to deploy the same computer technology that is facilitating the information explosion in the first place to help cope with its management. In other words, the offence is also the defence. And that is happening on a massive scale, nowhere more so than in your own institutions. The digital library, and its expert intermediaries – despite a number of daunting technical challenges, not least being simply the very long-term preservation of digital assets – will be one key part of the solution indefinitely far into the future.

But my contention is that this will not be nearly enough. The sheer volume of information, its global origins, and especially the dynamic, real-time nature of information today will simply overwhelm centralized, and fundamentally bureaucratic institutions.

The infosphere, if I could use that term, therefore needs new and decentralized mechanisms of self-regulation and self-organization, much like a complex economy which, as Adam Smith realized, needs the guidance of an invisible hand.

Massively Distributed Collaboration

I believe that the outlines of just such a mechanism are already emerging in the multifaceted development of what cyber-prophet Mitch Kapor recently dubbed "massively distributed collaboration." [2] Probably the single best example is Wikipedia, the free, on-line, user-edited encyclopaedia that in just over five years has become one of the most-visited sites on the web. I will have more to say about that in a moment.

But something much broader is going on. The world wide web has already evolved into a social medium — what some are calling Web 2.0 — a global many-to-many meeting place, very unlike the one-to-many connections of radio, TV, books and newspapers. The latter media are inherently hierarchical — a communicator of one to an audience of many. The social web, on the other hand — like Thomas Friedman's new world — is flat. It is in tune with today's ethos. Just consider some of the manifestations:

In summary – and this is probably my key message – we are witnessing in these examples the convergence and mutual reinforcement of two of the great defining movements of the past half-century – one cultural, the other technological – i.e. the ascendancy of the individual together with the empowering technology of the computer, now enormously amplified by global networking – creating essentially a "cyber nervous system" for the entire planet.

This is an epochal development that will not be reversed. The job for all of us in this room is to figure out how to be a constructive part of it. In so doing, there is no place either for complacency or wishful thinking. What is demanded from libraries is transformation, not merely "change management." And this transformation requires that research libraries adapt rapidly not only to the breakneck pace of technology, but even more fundamentally to new information seeking and usage behaviours of students and faculty alike.

Wikipedia, etc.

In the remainder of these remarks I want to take a closer look at one important example of massively distributed collaboration (or MDC) – and specifically the on-line encyclopaedia movement, since this illustrates most directly how MDC is already transforming the nature of intellectual authority.

The flagship example is Wikipedia, founded only in January 2001, but already the site of nearly four million entries in almost 200 languages. There are more than 1.1 million articles in English, growing by about 1,500 a day. [3] The Encyclopaedia Britannica, by contrast, has about 65,000 articles in the print edition and 75,000 on-line. [4]

What is most amazing is that Wikipedia is doing all this on an annual budget of just $1.3 million, 60 per cent of which goes for the cost of computer hardware, leaving only about $500,000 to cover everything else! [5]

How can that possibly work? Well, for starters the articles are written, and re-written, by volunteers. The website is equipped with so-called "wiki" software that allows anyone with a browser to edit virtually any article at the push of a button. In the flat culture of Wikipedians, experts and dunderheads are equally welcome. The main editorial principle is that articles should reflect a neutral point of view. This is not a site for cranks and propagandists. Acts of deliberate vandalism are not tolerated and are usually corrected very quickly. On the other hand, decisions as to what is deemed to be unjustified bias are taken consensually, and this can be excruciatingly drawn-out in contentious areas.
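To make the editing mechanism just described a little more concrete, here is a minimal sketch – purely illustrative, and not Wikipedia's actual software – of an open-edit article in which anyone may overwrite the current text, every revision is retained, and vandalism is handled by reverting to an earlier version. All class and method names here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Revision:
    author: str  # anyone with a browser may contribute: expert or dunderhead alike
    text: str    # full article text as it stands after this edit

@dataclass
class WikiArticle:
    """Toy model of an open-edit, fully versioned article (hypothetical illustration)."""
    title: str
    history: List[Revision] = field(default_factory=list)

    def edit(self, author: str, new_text: str) -> None:
        # No gatekeeping: every edit is accepted and appended to the revision history.
        self.history.append(Revision(author, new_text))

    def current(self) -> str:
        # Readers always see the latest revision.
        return self.history[-1].text if self.history else ""

    def revert(self, to_revision: int, author: str = "patroller") -> None:
        # Vandalism is corrected by restoring an older version; the restore is itself
        # recorded as a new revision, so nothing is ever erased from the record.
        self.edit(author, self.history[to_revision].text)

# Example: a good-faith edit, an act of vandalism, and a quick revert.
article = WikiArticle("Intellectual authority")
article.edit("volunteer_1", "Intellectual authority is shifting toward the crowd.")
article.edit("vandal", "EXPERTS KNOW NOTHING!!!")
article.revert(to_revision=0)
assert article.current() == "Intellectual authority is shifting toward the crowd."
```

The design point this sketch tries to capture is that quality control happens after publication – through the revision history and the vigilance of other editors – rather than through gatekeeping before an edit is accepted.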

At first blush, it admittedly sounds a lot like "monkeys with typewriters." But in fact it's not. In a widely publicized and controversial head-to-head test with Britannica, reported last December in the journal Nature, expert reviewers determined that Wikipedia articles, on average, contained "only" a third more inaccuracies than their Britannica counterparts. More to the point, only eight serious errors were reported in the sample of 42 topics, with an equal number, four, attributed to each source. [6]

Having read the full text of the debate between Nature and Britannica over the methodology of the comparison, I would grant many of Britannica's objections, but would still conclude that the essence of Nature's findings remains intact. Wikipedia is surprisingly good, especially for a five-year-old; and even a source as well-researched as Britannica still contains a significant number of inaccuracies.

The real bottom line, of course, is that notwithstanding doubts about its reliability, Wikipedia has taken off like a rocket. We need to understand why.

Obviously, being instantly accessible and free – no ads at all – is a big plus. Whether the absence of a business model is sustainable remains to be seen. It will depend on the continuing commitment of Wikipedia's volunteer contributors. But some moderate, low-key commercialization would not, I believe, kill the concept. That's because the real power of Wikipedia is that it's in perfect synch with web culture – which mirrors today's attitudes, and even more so tomorrow's. Wikipedia is also in synch with globalization – 200 languages represented with much of the content original to each language, not simply translated. And Wikipedia – like Google, and blogs, and open source software – operates in synch with the rhythm of the web, incorporating new information continuously in real time, 24/7.

This last point is important, and is part of a much larger story. I can only summarize. The "half-life" of active information has been getting shorter and shorter due primarily to the sheer rate of information generation. There is more and more to process, but not more hours in the day, and not more raw individual brain power to apply. So we graze, or we gulp, and then we move on. The half-life is also shrinking due to the very nature of electronic technology which makes "overwrite" so easy and natural. We are all becoming addicted to the "refresh" button. Documents of every kind – certainly in my experience in business and government – are being revised continuously until the moment they become virtually obsolete. And as the shelf-life of any particular information product gets shorter – whether it's an e-mail or a position paper – fewer resources of time and money can be put into its creation. The ubiquitous deck of bullet points is the iconic example.

The result is a dumbing down of written communication. We can decry it – and I do – but it reflects a probably necessary trade-off in favour of easier and quicker absorption, unfortunately at the expense of nuance and rigor.

This has profound implications for how good is "good enough" when it comes to authoritative information. Where lives or fortunes depend on it, complete accuracy still matters as much as ever. But for most everything else, the tradeoff point is moving toward faster, not deeper.

This is a context in which massively distributed collaboration systems like Wikipedia excel. But the advocates of MDC claim more, and believe that it can be both faster and deeper.

They may have a point based on the old adage that two heads are better than one – and thousands or millions of heads are incomparably better. This thesis has been developed in fascinating detail by James Surowiecki in his recent book, The Wisdom of Crowds. It is also the validating belief of the open source software movement, summed up in the motto: "Given enough eyeballs, all bugs are shallow." [7]

Maybe. But in the case of specialized subjects where quality criteria are more judgemental (unlike software bugs), or where relevant expertise is spread very thinly, the "crowd" is unlikely to be sufficiently wise. So there will always be a secure niche for expertise in the traditional sense. Indeed, that conviction led Wikipedia's co-founder, Larry Sanger, to leave what he had created out of despair over the hostility toward expert authority that dominates Wikipedian corporate culture. Sanger is now creating a new on-line authority, Digital Universe, that seeks to provide both expertly created and collaboratively developed content. [8]

We should stay tuned, because the puzzle that the Larry Sangers of this world are trying to solve goes to the heart of the challenge facing research libraries. That challenge is to evaluate and integrate very different methods of ascertaining intellectual authority — ranging from the continuously-flowing, collaboratively-determined "truth" of Wikipedia and its ilk, to the timeless records of solitary genius.

The Infosphere as Ecosystem

This leads to my final point. We should be thinking of the infosphere as an ecosystem where different "species" are adapted to specific niches. Google, for example, delivers fantastic volume but the measure of relevance is still pretty crude. Blogs give you an up-to-the-minute read on what's hot. Wikipedia provides a great first cut at coherently organized material plus a good set of relevant links. But if reliability is a critical objective, then sources like Britannica, or research journals, or original documents become progressively more important.

It's horses for courses. There will never be one site to fit all, a point that is glaringly obvious but too often overlooked by the partisans of this source or that. So the relevant task is to educate the users of information – and we all are users – as to what is right for what purpose. Information, and the knowledge that can flow from it, is more than ever the lifeblood of our economy and culture, so we must all become far more sophisticated consumers of it.

It seems obvious to me that libraries and librarians should naturally be in the vanguard of this required movement. In the words of James O'Donnell, whose book Avatars of the Word anticipated several of the themes I have emphasized today: "… the value of the [library] will lie in the sophistication, versatility, and power of its indexing and searching capabilities." We still need – more than ever in fact – the library as a well-ordered institution, in contrast to the infochaos on the web. For as O'Donnell wisely observes: "… one of the most valuable functions of the traditional library has not been its inclusivity but its exclusivity, its discerning judgement that keeps out as many things as it keeps in."

So, at the end of the day, the social web and the library of the future are destined to be complements – co-habitants in the infosphere. But, it follows from the ecosystem metaphor that the infosphere will never be static. The species that inhabit it will compete and evolve – some colonizing more and more territory; others retreating into niches for which they are uniquely suited – all adapting in response to the surrounding cultural and technological environment.

It is clear that research libraries must transform themselves to remain relevant and vibrant elements of the infosphere. But thriving is not pre-ordained, and complacency would be the surest path to extinction. Because, as I have argued today, the cultural and technological environment of the infosphere has changed profoundly – and with it, so too has the nature of intellectual authority and the challenge facing those who would be custodians of it. . . . A shift in the ecology of knowledge is upon us.

Source Notes

  1. Collections and Access for the 21st Century Scholar: Changing Roles of Research Libraries; ARL Bimonthly Report 225; December 2002. (http://www.arl.org/newsltr/225/main.html)
  2. Presentation by Mitch Kapor at UC Berkeley; 9 November 2005. (http://www.sims.berkeley.edu/about/events/dls11092005) Kapor states: "The sudden and unexpected importance of Wikipedia represents a radical new modality of content creation by massively distributed collaborations. This talk will examine the intriguing prospects for application of these methods to a broad spectrum of intellectual endeavours."
  3. Internet Encyclopaedias Go Head to Head; Nature 438; 15 December 2005. (http://www.nature.com/nature/journal/v438/n7070/full/438900a.html)
  4. Who Knows?; The Guardian; 26 October 2005. (http://technology.guardian.co.uk/online/news/0,12597,1335892,00.html)
  5. Cited in the Wikipedia article on Wikipedia. It is reported that fourth-quarter 2005 costs were $321,000 ($1.3 million at an annual rate), with hardware making up almost 60%. (http://en.wikipedia.org/wiki/Wikipedia)
  6. Nature 438; op. cit. (http://www.nature.com/nature/journal/v438/n7070/full/438900a.html)
  7. Eric Raymond; The Cathedral and the Bazaar; First Monday; 2 March 1998. (http://www.firstmonday.org/issues/issue3_3/raymond/)
  8. Larry Sanger; Why Wikipedia Must Jettison Its Anti-elitism; Kuro5hin.org; 31 December 2004. (http://www.kuro5hin.org/story/2004/12/30/142458/25)