Libraries are sometimes erroneously regarded as rather passive repositories for the conservation of literary and scientific production. Such a picture was never correct, since there has always been a strong relation between libraries and the centres of book production. This relation is even more strongly present today in our modern universities and other centres of research. Our budget is always too limited to allow anything but an acquisition policy tailored as closely as possible to the specific needs of our customers. In the library lies the beginning of every research project, since it contains the necessary sources of information; the library also forms the ultimate goal of a research project, since this is the place where the publications that consolidate its results should be deposited. The activities of a research institute and its supporting library are so strongly intertwined that a librarian cannot afford to remain ignorant of the mechanisms through which scientific and scholarly information is established.
Certainly, there are remarkable differences in these mechanisms depending on the field of research. International communication has always been more frequent in the exact sciences than in the humanities, and so has the need for a fast exchange of information. The emphasis of the humanities lies more on the slower but longer-lasting medium of the monograph, whereas the exact and biomedical sciences are obsessed with publications in specialised journals with high impact factors. Nevertheless, there are already indications that some convergence in scholarly habits is taking place. In the rest of my paper, I will mainly concentrate on the so-called STM sciences (science, technology and medicine), where the journal crisis has its strongest financial consequences for the distribution of information.
A detailed analysis of the many roles that information may play in the process of scientific research was given by Robert Hayes1. For our description, the most relevant roles are:
These three roles give us, of course, only a very limited sketch of the use of a scientific library. An extensive collection of data can, e.g., also be very important for the use of a library by professionals. Medical doctors and lawyers like to have access to an up-to-date and correct set of scientific data, even if they do not use it for research purposes. Similarly, a scientific library can be used for educational purposes, and this use will be focussed more on the collection of relevant and well-structured data than on communication about ongoing research.
Let us now discuss in more detail some specific aspects of these processes of scientific information.
Secondary databases, especially in their electronic format, have become very popular awareness tools for the modern researcher. Through a careful selection of keywords, the beginning researcher tries to find the most relevant publications for his project. The more advanced researchers set up specific search profiles which, when applied periodically, notify them of all the important contributions in their field of interest.
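To make the mechanism of such a periodically applied search profile concrete, here is a minimal sketch in Python; the record structure, the profile terms and the matching rule are purely illustrative assumptions and do not describe any particular database or product:

    # Illustrative sketch: match newly added records against a stored search profile.
    from dataclasses import dataclass

    @dataclass
    class Record:
        title: str
        abstract: str

    # Hypothetical profile terms; a real profile would be maintained by the researcher.
    PROFILE = {"superconductivity", "thin films"}

    def matches(record: Record, profile: set[str]) -> bool:
        """A record counts as relevant if any profile term occurs in its title or abstract."""
        text = f"{record.title} {record.abstract}".lower()
        return any(term in text for term in profile)

    # In practice this list would be the batch of records added since the previous run.
    new_records = [
        Record("Thin films of high-Tc superconductors", "Growth and characterisation..."),
        Record("Medieval book production", "A survey of monastic scriptoria..."),
    ]

    for record in new_records:
        if matches(record, PROFILE):
            print("New relevant publication:", record.title)

Run against each periodic update of the database, such a filter is in essence the current-awareness service described above.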
On the other hand, long before the rise of electronic databases, the need for rapid communication in the fast-evolving fundamental sciences gave rise to the widespread practice of exchanging preprints within several specialised scientific communities (with theoretical physics as a typical example). An important consequence for the libraries is that the use of secondary databases is often less intensive than might be expected. Most researchers in these communities know the important centres in their domain, and they often have a direct exchange with them. Index and awareness journals refer only to published articles, which at that moment are already old news.
The classical sources of reference are mostly used at turning points in the research (when new directions or new areas of application are explored), or by weaker research groups with less developed international contacts. Exceptions to this statement are to be found in domains with strong competition (e.g. the biomedical sector), where one finds less openness. During ongoing research, secondary databases are mostly used for obtaining supporting information (e.g. properties of materials or accessories in the broadest sense).
A special mention should be made of the Citation Indices (or Web of Science). The original intentions of this product, as explained by Garfield2, covered a broad range of information delivery: obtaining a better understanding of the structures of science, finding applications of some new method, etc. In reality, we see that these Indices are used almost exclusively for a (purely quantitative) evaluation of researchers or research groups. Together with the increase in their usage, the criticism against them is also growing: results are only convincing for comparisons within a single research domain, there are problems with publications with many authors between whom no differentiation is possible, publications in fashionable domains obtain strongly distorted results, etc. More and more people are becoming convinced that other criteria should be taken into account in the evaluation processes.
The same phenomenon of partial lack of interest, mentioned for the secondary databases, can also be seen with respect to full-text journals. Librarians sometimes make cynical remarks about the fact that some professors seldom use the library. This does not necessarily mean that they are poor scientists; it may also be that they obtain all the necessary information either electronically or through their contacts with foreign colleagues. As a consequence, many leading scientists have no personal interest in their library's subscriptions to the journals in which they publish. Together with the outrageously high subscription prices, this has contributed considerably to the wave of cancellations in the specialised research libraries.
In spite of the fact that most researchers have their own communication channels for scientific information, they still expect the library to be able to provide them, in exceptional circumstances, very quickly with required documents to which they have found a reference. In such cases money is seldom an issue. As a result, we see our libraries evolving more and more towards service centres for document delivery and ILL. Researchers would certainly prefer extensive pay-per-view access to full-text databases over full subscriptions to these journals. A crucial point for the libraries is to prove that they can offer added value beyond the direct access that individual research groups could obtain to such databases themselves. This added value may, e.g., be of a financial nature (better conditions due to bulk orders), but one should also not forget the didactic responsibility of the library towards our student customers, who do not have access to large research funds.
Another possibility that could be developed in the future is the selective publication („print on demand“) out of a large database of articles that correspond with a certain research profile. Modern printing technology is quite capable of realising such a project. Specialised research teams would be prepared to pay a substantially higher price per page for such individually tailored publications than for the „package“ journals of today, in which often more than 90% of the articles are quite irrelevant for the local groups.
Maybe we can already draw from our analysis a preliminary conclusion about the spending of our acquisition budget. First of all, we should be aware of the distinction between, on the one hand, the flow of ephemeral scientific information (the communication about ongoing research) and, on the other hand, the publication of findings from matured research projects (very often in so-called review articles). It is important that the specialised research groups have access to the first kind of information, but we should not worry too much about building an extensive archival collection of those publications. The second kind of publication, however, is the real material from which we should build our knowledge base for the future. Permanent access to this information should remain available to all our users, and for a long time to come. A complicating aspect is the fact that review articles often refer to the original articles for details about experimental set-up or theoretical derivation. In the long run, the more this kind of ephemeral information shifts towards electronic distribution, the more the review publications will have to include such details in an exhaustive way.
Researchers are in general rather proud people, at least where their scientific output is concerned. Most of them (with the exception, maybe, of those in strongly applied fields) have little or no interest in a financial return from their publications. Their aspirations with respect to their publications are twofold: they would like to obtain maximal and rapid dissemination within the interested scientific community (in order to get professional recognition from their fellow researchers), but they also like to publish in a journal with a high impact factor (because this is what the academic evaluation commissions favour most). Up to 1990, most people maintained extensive address lists for the mailing of preprints; in this way they took care of the dissemination themselves. Preprint archives originated in high-energy physics (the study of the „elementary particles“) in 1991, and are nowadays the main carriers of the information in some fields of research. These databases are freely accessible over the Internet or by e-mail.3 Afterwards, articles are still published in the traditional journals for the sake of the academic recognition of the results. A consequence of this practice is, of course, that one is confronted with the absurd situation in which the researcher does not publish to be read, but only to obtain scientific recognition for his research contribution. When an article finally appears in printed form in a journal, all possibly interested people have already read it long ago... A librarian’s view would be that the communication has been separated from its archiving; the researcher, however, has only separated the communication from its scientific validation.
A special warning is in order here about the validation process of a publication. Its importance must not be underestimated or attributed solely to the vanity of the scientists, especially in the exact sciences. A book or an article in the humanities often expresses the individual point of view of the author, and another publication may well reflect a conflicting point of view. If such is the case in the exact sciences, at least one of the two must be wrong. Publication in a refereed journal somehow expresses the official recognition of the communication by the international scientific community. It is almost certain that without such a system the quality of a journal would rapidly decline. It is very important that the electronic publication media find their own way of scientific validation! The fact that such a validation system has not yet taken shape is one of the reasons for the slow breakthrough of new, purely electronic journals. Many editorial boards feel bound to their publisher; they are afraid that – in case they wanted to go an independent electronic way – they would have to start all over again from scratch in building up a scientific reputation and an equivalent „impact factor“.
The development of each domain of science proceeds through a multitude of scientific publications. As soon as a substantial piece of this development is achieved, an important task of compilation arises. These compilations result in the so-called „review papers“. Most often they are written at the invitation of an editorial board, and such an invitation is considered a recognition of the author’s reputation.
On the other hand, many experienced researchers unjustly consider the writing of a good monograph to be an inferior, merely didactic occupation. Books are undervalued as scientific publications. Libraries can contribute to the revalorisation of the monograph as a didactic instrument. (It is not only our younger students who would benefit: a well-written book can play an invaluable role also for young researchers, and even for the experienced researcher who wants to steer his research in a new direction.) Many libraries have cut their acquisition budgets for monographs in order to have more money for their expensive journals. They should realise that through the selective purchase of neglected books it may be possible to acquire more relevant material at a lower price per page than through many specialised periodicals. At the same time, they will have invested their money in building a collection that may still be useful in ten years’ time; such a future cannot be assured for the specialised journals…
This is neither the time nor the place to give a full analysis of the serials crisis; such an analysis can be found elsewhere4, and it certainly remains a matter of utmost concern for every research librarian. The unlimited greed of some publishers is certainly the main origin of this crisis. Nevertheless, the mechanisms of our scientific information have made this crisis possible, and they still form a very weak point in all our efforts to improve the situation. Through deep-rooted habits and procedures the academic world has accepted that this information is taken out of its hands and transformed into a commercial market product, for which the academic world itself is the most important customer.
The serials crisis is sometimes described in terms of the vicious spiral of price increases and subscription cancellations. Another vicious spiral exists, however, through which the researchers have allowed the publishers to build up their strong monopoly position. Authors of high-quality publications have always wanted to publish in high-quality journals. Their contributions further increased the reputation and the impact factor of these journals. This in turn increased the psychological pressure on other researchers to send more publications to the same journals, etc... The universities and the subsidising authorities reinforce this vicious spiral by pressuring researchers to publish more, and preferably in prestigious journals with a high impact factor. In this process, the researchers and the universities forget that by this mechanism they force themselves to pay ever-increasing prices for these scientific publications to companies that can claim no real merit whatsoever in these activities.
Originally, the most important scientific journals were published at a marginal cost by the so-called „learned societies“.5 Nowadays, even some scientific societies have realised that their journals constitute a possible source of income, with which they can support their other activities. Their intentions may be praiseworthy, but their actions are short-sighted. By neglecting to offer an affordable alternative for the overpriced journals, they effectively allow the commercial publishers to continue their blackmail of the research libraries and they hinder the free world-wide flow of scientific information.
As a consequence of this serials crisis, each library’s collection policy has become a painful compromise between the wishes of the researchers and the possibilities of the budget. Libraries have become more dependent than ever before on interlibrary loan systems. Researchers, who used to regard library problems as those of a trivial piece of infrastructure, like the provision of electricity and water, are nowadays becoming more conscious of the financial aspects related to their need for information. More and more they are accepting that the cost of information provision should be incorporated in their research budgets.
The arrival of electronic databases and journals has not yet solved all our problems, but it has certainly brought along a number of new possibilities. The electronic databases are not cheap. They therefore withdraw further scarce financial means from the already heavily burdened library budget. What is more, they often alert the researchers to valuable information in journals that are not present in their library, so that the financial problems are felt even more intensely.
On the other hand, the Citation Indices make it easy to measure the individual scientific relevance of each separate publication, independently of the journal in which it is published. In principle, the urge to publish in journals with a high impact factor should therefore diminish. One could even envisage a situation where publication in a cheap and therefore widely available journal leads to more citations and hence to a higher academic evaluation. Unfortunately, up to now this remains theory of which little practical use has been made.
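For readers who want the contrast made explicit: the impact factor invoked throughout this paper is a journal-level average, whereas the Citation Indices also provide per-article citation counts. The conventional two-year definition of the impact factor of a journal in year Y (the standard ISI formula, quoted here only as background) is

\[
\mathrm{IF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}},
\]

where \(C_Y(y)\) denotes the citations received in year Y by items the journal published in year y, and \(N_y\) the number of citable items it published in year y. An individual article’s citation count is independent of this journal-level average, which is precisely why an evaluation based on article-level citations need not favour the expensive high-impact journals.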
In the domain of the electronic versions of full-text journals (the so-called primary databases), there is still a lot of uncertainty about the evolution that can be expected.6 The possibility of establishing direct links between the references in secondary databases and the full texts in primary databases certainly offers completely new horizons for information delivery to researchers and students. We should be careful not to be tempted by the publishers into paying exorbitant prices for the convenience of unlimited access to all possible information. Reasonable alternatives, such as pay-per-view or selective access to a well-tailored set of relevant journals and/or articles, should be carefully studied.
In view of the enormous possibilities for the distribution of information that have been demonstrated by the Internet, we should reflect very urgently on the future channels of scientific communication. Internet2 is being realised in the United States7, and in Europe too the discussion8 about new high-bandwidth research networking has started. How will we use this to optimise the processes of scientific information? The serials crisis on its own is not the most important reason for this reflection, although it brings a special sense of urgency to the discussion.
Up to now, scientists have never clearly made the distinction between communication channels for the fast exchange of information on the one hand and those for the deposition of new knowledge on the other. Since I would like to focus my further analysis on the fast communication channels, we could for the sake of simplicity state that the long-term archiving of scientific knowledge should be established through review journals or through a reappraisal of the book or monograph, in whatever form this product may evolve in the digital age. An interesting format has always been to collect a series of highly relevant research papers and to bind them together with an introductory review text that leaves out all the details. Such a format is ideally suited for the digital age, where the collection of research papers can be constructed in a virtual way by means of a series of hyperlinks.
Since the traditional journal publishers have proved unwilling to use the lower distribution costs of web communication to reduce their subscription prices, the conviction has grown in the academic community that we should ourselves establish a new model for the fast and cheap (in principle free) electronic dissemination and archiving of our scientific results9. Over the past years several initiatives have been launched to arrive at new mechanisms towards this goal. Let us briefly describe some of them:
The electronic e-print archive „arXiv“ at the Los Alamos National Laboratory (LANL), which originated in 1991 from the high-energy physics community, is probably the best example of these initiatives; today it covers a very broad range of subjects in physics, mathematics and computer science. The submission of contributions is completely automated, and not subject to a system of peer review. In some fields of physics these archives have become the main channel for the exchange of scientific information, with over 50,000 users daily and 15 mirror sites around the world. For assessing the value of the contributions, researchers rely on the established reputation of other groups and on their own careful analysis of the papers.
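As a present-day illustration of how such an automated archive can be consulted programmatically, the following few lines of Python query the public arXiv web interface (the API shown is today’s arXiv query service, not something that existed in this form in the period described; the search category and result count are arbitrary examples):

    # Hedged illustration: retrieve recent e-print titles from the present-day arXiv API.
    from urllib.request import urlopen
    from urllib.parse import urlencode
    from xml.etree import ElementTree

    params = urlencode({
        "search_query": "cat:hep-th",   # example category: theoretical high-energy physics
        "start": 0,
        "max_results": 5,
    })
    url = "http://export.arxiv.org/api/query?" + params

    with urlopen(url) as response:
        feed = ElementTree.parse(response)   # the service returns an Atom feed

    ns = {"atom": "http://www.w3.org/2005/Atom"}
    for entry in feed.findall("atom:entry", ns):   # each <entry> is one e-print
        print(entry.findtext("atom:title", default="", namespaces=ns).strip())

The point of the sketch is simply that deposition and retrieval are fully automated, which is what makes daily usage on this scale possible.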
In order to justify the absence of peer review, the organisers state10 that counting refereed publications has long since been superseded by other means of evaluation (e.g. letters of recommendation). This may be true for the filling of academic vacancies or for the assessment of important grant applications, but it is certainly not yet a universal practice. Furthermore, we see that the researchers themselves continue to send their valuable publications to peer-reviewed journals, in parallel with their deposition in the e-print archives.
The electronic format offers new possibilities for the reviewing process. First of all, communication need not be delayed until the reviewing has finished, since the result of this process can be added to the database at a later stage. Furthermore, this result can be changed dynamically (in a positive as well as a negative sense), according to the evolution of scientific insight.
As for the financial aspects, Ginsparg makes the following comments: for disciplines where author and reader communities practically coincide (e.g. high-energy physics), free dissemination is the best option. Copyright can be justified in a situation where a small number of authors write for a large reader community, and this may lead to the payment of some kind of fee to the risk-taking research institution. The present system, where copyright is transferred to the low-risk publishers for an insignificant added value, is not sensible.
Similar initiatives have emerged over the years, e.g. in economics11 and in the cognitive sciences12. The co-ordinators of such initiatives met in October 1999 with the idea of setting up the framework for a „universal preprint archive“ (UPS) - later renamed „Open Archives“13 - that would include papers from all disciplines. For this purpose, they imagined the creation of a network of e-print archives, each of which should contain a submission mechanism, a long-term storage system, and furthermore a standardised mechanism that enables third parties to collect data from the archive. The guidelines for this interoperability mechanism - essentially an agreement about formatting and about standards for a minimal set of metadata - were written down in the so-called Santa Fe Convention14. The archives should be open to service providers for the selective collection of data through a so-called „harvesting interface“, and for this interface, too, they have described a protocol. The third-party service providers that they have in mind are cross-archive search engines, current-awareness services, linking systems, peer-review services, etc. Some of these services might be run on a commercial basis.
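To give an impression of what such a harvesting interface amounts to in practice, here is a minimal sketch of a harvesting client in Python. It follows the conventions of the OAI harvesting protocol that grew out of this work (verbs such as ListRecords and a small Dublin Core metadata set) rather than the original Santa Fe specification itself, and the repository address is a purely hypothetical example:

    # Minimal sketch of a metadata-harvesting client for an OAI-style interface.
    from urllib.request import urlopen
    from xml.etree import ElementTree

    BASE_URL = "https://archive.example.org/oai"   # hypothetical repository endpoint

    def list_records(base_url, metadata_prefix="oai_dc"):
        """Fetch one batch of records exposed through the harvesting interface."""
        url = f"{base_url}?verb=ListRecords&metadataPrefix={metadata_prefix}"
        with urlopen(url) as response:
            tree = ElementTree.parse(response)
        ns = {"oai": "http://www.openarchives.org/OAI/2.0/",
              "dc": "http://purl.org/dc/elements/1.1/"}
        for record in tree.findall(".//oai:record", ns):
            identifier = record.findtext(".//oai:identifier", default="", namespaces=ns)
            title = record.findtext(".//dc:title", default="", namespaces=ns)
            yield identifier, title

    if __name__ == "__main__":
        for identifier, title in list_records(BASE_URL):
            print(identifier, "-", title)

A service provider (a cross-archive search engine, an awareness service, a linking or peer-review service) would run exactly this kind of client against every participating archive and build its added-value service on the harvested metadata.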
Harold Varmus, Nobel Prize winner and director of the U.S. National Institutes of Health (NIH), proposed the E-biomed initiative in May 199915. He saw it as a natural extension of PubMed, which was offering free access to the bibliographic database in the biomedical sciences (including Medline). The initiative was presented as a community-based effort to establish a central electronic publishing site for this discipline. Some specific points of the original proposal were as follows:
Some months after the initial announcement, the scope of the database was enlarged to the whole domain of the life sciences (including agriculture) and the name was changed to PubMed Central. The proposal was widely acclaimed, but it also received strong criticism.
The NIH publishes on its website a series of comments16 received from interested parties. There is, e.g., a long statement by Stevan Harnad, in which he warns against mixing together too many different objectives. His advice is to forget about inventing new mechanisms for peer review or starting new journals; the main objective should be to offer a system for self-archiving by authors, following the example set by the physics community. Comments from the side of the publishers were, as expected, less favourable to the whole idea. Some claimed, e.g., that this was a take-over by the US government of an activity that should remain in the private sector. Other criticism was directed at the non-peer-reviewed part of the proposal, described as a repository of taxpayer-supported junk. The opposition did not come only from the commercial publishers, however; scientific societies are also afraid of losing a substantial part of their publication-related income… The Biochemical Society, e.g., argued that the creation of PubMed Central would jeopardise the survival of many commercial and society journals, leading to a de facto reduction in scientific communication. The American Association for the Advancement of Science (AAAS), publisher of Science, was also opposed to the proposal.
In any case, outside pressure for caution with respect to the proposal was considerable. A compromise proposal was made to start PubMed Central not as a self-archiving site for authors, but through the direct participation of publishers. In this way the NIH positioned itself as a partner of the established journals rather than as a competitor. By January 1, 2000, Ruth Kirschstein had succeeded Varmus as director, and PubMed Central started as a free online access point17 to two existing journals (Molecular Biology of the Cell and Proceedings of the National Academy of Sciences of the USA, available online two and four weeks respectively after the print publications). Ten more journals have been announced for inclusion, and flexible support for new „electronic-only“ journals is promised for the near future.
Recently, the Current Science Group (publishers of Genome Biology) announced a related initiative. Since PubMed Central does not for the moment accept publications directly from authors, the Current Science Group has started BioMed Central18, which accepts submissions of original research in all biomedical fields. All articles will be subjected to peer review and made accessible through PubMed Central. In effect, this is a new electronic-only journal, and ISI has been contacted about including the articles (and citations) in its databases. Articles that are still under review will be made accessible through a separate web site, if their authors explicitly request this facility.
Since LIBER is a European organisation, and both of the initiatives that I have described so far are American-based, it might be interesting to see what is going on on this side of the Atlantic. First of all, I should mention that there is a very strong involvement in the Open Archives from Herbert Van de Sompel, the library automation expert at the University of Gent in Belgium. As for PubMed Central, the European Molecular Biology Organization (EMBO)19 has taken an interesting initiative, although we will see that it suffers from the same ambiguities present in the American realisation. EMBO is a kind of international academy with approximately 900 individual members. One of its activities is the publication of the EMBO Journal. Driven by the enthusiasm of its executive director Frank Gannon, EMBO organised in July 1999 a meeting in Heidelberg with journal editors, publishers and scientific society representatives about electronic publishing, especially in view of the NIH initiative. Nobody from the library community was invited… Their main conclusions were:
In order to proceed with this proposal, and especially in order to solve the financial problems related to it, EMBO called a new meeting in January 2000 with additional representatives of National Research Councils, the European Commission, the European Science Foundation and CERN. The library community as such was again not present. This meeting confirmed the need for a European database and made an even stronger statement in favour of exclusively peer-reviewed contributions. Apart from some start-up money from EMBO, support would be sought from the European Commission. The final outcome of the meeting, however, was to proceed in a phased manner. In a first step, and in order to obtain the collaboration of the publishers in arriving at a complete database, access to E-Biosci would initially be limited to searching only. Scientists who want to look at the full text would be directed to the publisher’s site, where access would be arranged according to the existing rules of that publisher.
In some technical follow-up meetings this scheme has been further elaborated. An especially interesting aspect is the mention of the possibility that - like PubMed Central - E-Biosci will also host (new) electronic-only peer-reviewed journals (like BioMed Central). A Governing Board was selected (with a UK library representative?), and it is planned that the service will become available in the course of 2000. I sincerely hope that this E-Biosci initiative may evolve along its initial ideas into a great system for open access. Nevertheless, I have the unpleasant feeling that the traditional publishers, who want to protect their current income, have drowned the whole initiative in nice words. The present plan may even lead to a rise in their income, when the new search possibilities increase the need for access to the full texts, to which they will still hold the keys.
The Open Archives initiative is probably the most realistic one: it does not count on the questionable goodwill of the publishers to provide material, and it furthermore gives the library a prominent role as co-ordinator of the information. However, it would be very irresponsible, and would certainly lead to disaster, if one tried to replace the existing publication channels by an electronic mechanism aimed solely at the communication aspect of the publications while neglecting the quality control embedded in the reviewing mechanisms. We have discussed the important role of validation in the process of scientific communication. The Open Archives can only be the lowest layer of a structure that needs a reviewing layer on top of it. Today, the paper journals are still fulfilling this role for Ginsparg’s archive. Tomorrow, this may be done by virtual journals under the supervision of learned societies. In any case, this reviewing process is not the job of the library. We should limit ourselves to providing the basic structure that allows access, both for deposition and for retrieval; this is precisely what the Santa Fe Convention was intended for. On the other hand, libraries should be concerned about the quality control of the scientific information that they deliver. Therefore, we should strongly insist on the implementation of a peer-reviewing mechanism for the Open Archives.
Maybe our average librarian will feel more at ease with the other two proposals, since they follow more closely the trusted paths of the publications of the past. We should also bear in mind that there is quite some diversity in the attitudes and needs of scientists. New methods for fast communication that work well for the physicists may not (or not yet) be acceptable to people in the human or biomedical sciences. Therefore, it is not such a bad idea to start simultaneously with several different experiments. Making PubMed Central a successful enterprise, however, will require overcoming a lot of resistance from the commercial sector, which has to agree to make its journals available through this channel. The strongest argument here remains in the hands of the scientists, who can choose to submit their publications to the journals in this archive. The librarian has always acted as the intermediary between publisher and reader. I consider it an important task of today’s librarian to intervene as the intermediary between author and publisher, by convincing the former to think twice before submitting his publications to a journal that refuses to participate in a system of open access.
The common factor in all the proposals described is that the initial purpose was in each case to arrive at a mechanism of free access to scientific information, without any barriers. Since nothing in life is really free, the consequence, of course, is the necessity of a shift in payment from the reader to the writer. Seen from the point of view of our earlier analysis of the process of scientific information, where it is in the first place the scientist himself who wants his papers to be read, this is a very acceptable system. (Criticism about the injustice towards developing countries, which would not be able to publish their papers, is ridiculous. The needs of these countries in the field of scientific communication lie more on the receiving than on the producing end!) The rise of new information technologies is not intrinsically linked with free access or with the shift in the method of payment, but it makes them easier to implement.
The inherent ambiguity in all these proposals has to do with the involvement of the established publishers. This ambiguity is the extrapolation of the long-standing conflict of interest between the scholarly writer, who wants maximal dissemination of his results, and the publisher, who wants to restrict access to those readers who have paid for it. Obviously, it is not realistic to expect that the publishers will enthusiastically participate in a scheme that will lead to a reduction of their income. Two different strategies have been used to resolve this ambiguity. Open Archives goes directly to the authors and offers them a good alternative for their communication needs; the NIH and EMBO try to play along with the existing publishers and hope to find an acceptable compromise.
The people who took the initiative for nice schemes like PubMed Central and E-Biosci were perhaps too naive when they hoped that at least the scholarly societies would all embrace their proposal. Some very strong opposition was expressed precisely by some of these societies, which have become too dependent on the income generated by their publications. These learned societies should realise that (and I now quote Stevan Harnad) „there is currently a profound conflict of interest between the maintenance of their current revenue streams and what is best for science and scientists. This conflict will have to be resolved in the favour of science, rather than in the status quo in scientific societies and their sources of revenue.“
It is obvious, and some people at the second EMBO meeting have stressed this point, that the library community should get involved in these initiatives. LIBER, as the main European organisation for research libraries, is certainly an ideal partner for entering this game. Before we do so, however, we should start a discussion in our own ranks, to make sure that we have a clear idea about our goals and about the best ways to achieve them. The three recent initiatives that I have described are very different in set-up. We may want to collaborate with all three of them, but we might also come to a consensus about one system, which we would support most strongly. The international discussion is certainly not over yet, and our advice may carry quite some weight if it is really supported by our organisation as a whole. On the other hand, the final aim of our libraries is to serve the scientific community. So, whatever discussion is going on, it should be undertaken within a strong partnership with this community.
If the libraries neglect to reflect and to take appropriate measures regarding these new mechanisms for access to scientific information, we might be faced with the prospect described by Andrew Odlyzko20. According to his analysis, libraries in general spend twice the cost of their journal subscriptions on the accompanying services, such as shelving, preservation and lending. He predicts that publishers will happily forgo part of their present income if they can succeed in taking over, in an economically favourable way, the intermediating role of the libraries. This is already taking shape through consortium and even national licensing for access to large packages of electronic journals. Do we really want this access function to the scientific publications to be taken away from the library and placed in commercial hands, with all the future dangers of exploitation due to a new monopoly situation? We have arrived at a breakpoint in the history of scholarly communication, and it would be a tremendous neglect of our responsibility if we in LIBER were satisfied with a nice yearly talk about these problems, without discussing what possible action we can undertake. We cannot successfully conduct such discussions and corresponding actions on our own; we need the right partners. I repeat that these partners are first of all the researchers who write and read the publications. In my opinion – due to conflicting interests – it is not of much use to involve the commercial publishers in these discussions; the best we can hope for is to convince the scientific societies that it is to the benefit of science and research that they co-operate with us in this matter.
Peter Singer wrote a funny editorial for the February 22, 2000 issue of the Canadian Medical Association Journal21. He looks back from 2003 at the evolution of medical publishing during the preceding three years, now that Harold Varmus has received the Nobel Peace Prize (together with the WHO, for the launching of the Global Medical School, based on PubMed Central…). Of course, PubMed Central became very successful. A wave of innovation swept through the world of publishing, and those that did not succeed in adapting to the new situation went bankrupt. Authors and readers were delighted. Only „university promotion and tenure committees were less enthusiastic about the changes. In the good old days they could rely on the „brand“ of the journal in which an article was published. (…) Now they had to actually read articles and reflect on their worth…“ This editorial is fun to read, but it also makes you think: „I wish this would come true…“
We, as librarians, and LIBER, as a professional library organisation, should try to contribute towards the fulfilment of this dream!
At the beginning of this year, the Commission of the European Communities published a communication, „Towards a European research area“22. This communication was the initiative of Commissioner Philippe Busquin, who wants to set up a scenario for the creation of better overall framework conditions for research in Europe. In his preface, Mr. Busquin invites all those who feel concerned about the future of European research to come forward with ideas and suggestions with regard to his analysis and the proposed actions. Since libraries play an essential role in the research process, maybe LIBER should accept this invitation and join in the discussion. This means, however, that we should have some vision of the future role of our scientific libraries.
The library is not mentioned in the Commission’s communication, except for a passage about actions to be taken with respect to „the development of databases; access to advanced Internet services; promotion of the production of multimedia content and interactive uses“… Nevertheless, I found an interesting reference to the library in a follow-up document on „Research Infrastructures“23. In this text, the authors mainly have large facilities in mind, like CERN in Geneva. I quote: „In general, one may say that the role of research infrastructures in innovation is that of being reference places where essential instruments can be found and exchanges between diverse types of scientific actors take place. This role can be stated as the modern equivalent of European Middle Ages abbey-libraries where both instruments of knowledge (books) and know-how were exchanged.“ Further on, the text gives a clear indication that all kinds of instruments fulfilling such a role, like archives and electronic databases, may be recognised as research infrastructure.
Maybe we could bring the message that we have a vision of our libraries that goes far beyond that of a medieval abbey library. When we dream of a library-driven network of freely accessible electronic databases with peer-reviewed scientific publications, we are dreaming of the perfect infrastructure for the communication and exchange of ideas between researchers world-wide. We must certainly be able to demonstrate the positive influence that such an instrument could have on the scientific community in Europe.
On the other hand, such databases would still only fulfil the role of instrument for the fast communication about ongoing research. Shouldn’t it also be our ultimate dream to realise one day the traditional function of the library as the depository of all human knowledge? Already in 1945 Vannevar Bush24 expressed the hope that the new technology would make this possible. Since then, technology has indeed made enormous progress, but in matters of „content“, we are still far from the construction of a „global database of knowledge“. Such a database should satisfy a number of stringent requirements:
The fact that we are still far from the realisation of that dream25 can be illustrated by what happened in physics. There exists a rather primitive access point to physics subjects on the Internet, called TIP-TOP (The Internet Pilot to Physics). Some time ago, it was suggested in Physics Today (the membership journal of the American Physical Society) to transform these web pages into a large, freely accessible, hierarchical subject catalogue of electronically accessible literature in the domain of physics. However, nobody wants to do this: university people consider such a project an obstacle that would slow down their ongoing research, but libraries do not see it as their task either (maybe because of a lack of the necessary expertise). Such a project can only be accomplished through an intense collaboration between scientists and librarians, e.g. for physics through a partnership between the European Physical Society and LIBER. The necessity of such a partnership, between the libraries on the one hand and the academic world (both the individuals and the organisations) on the other, is precisely the main message that I wanted to convey.
1. Robert M.Hayes (UCLA): The Needs of Science and Technology in „Research Libraries – Yesterday, Today and Tomorrow“, ed. W.J.Welsh, Greenwood Press, London, 1993.
2. E.Garfield: Citation indexing: its theory and application in science, technology, and humanities, Wiley, New York (N.Y.), 1979.
3. Examples of such archives can be found at the following Internet addresses: http://babbage.sissa.it/, http://xxx.lanl.gov/, http://www.ma.utexas.edu/mp_arc/mp_arc-home.html.
4. See, e.g., „To publish and perish“, an essay that appeared in Policy Perspectives, 7 (1998) as a result of a meeting at the Johns Hopkins University of presidents, provosts and library directors of important universities from North America. This text is available electronically at http://www.arl.org/scomm/pewrept.html.
5. The first examples of these are the Journal des Sçavans and the Philosophical Transactions, which started being published in 1665, respectively in Paris and in London.
6. For a review of the history of the electronic journal, see Robin P. Peek and Jeffrey P. Pomerantz: „Electronic scholarly journal publishing“, Annual Review of Information Science and Technology, vol. 33 (1998), ed. by Martha E. Williams.
7. Information can be found at the web site of the National Partnership for Advanced Computational Infrastructures at http://www.npaci.edu.
8. See e.g. „High level requirements for broadband interconnection of national research, education and training networks and testbeds (RN1)“, a report of an EC Expert Panel (January 2000). This document can be found at http://www.garr.it/docs/rag-finale.pdf.
9. One of the earlier proponents of this idea was Stevan Harnad, director of the Cognitive Science Centre in Southampton. See, e.g., his contribution „A subversive proposal“ in Scholarly journals at the crossroads, Ann Okerson and James O’Donnell (eds.), Association of Research Libraries, 1995. The text is also available at http://www.arl.org/scomm/subversive/toc.html.
10. See, e.g., P.Ginsparg: „Winners and losers in the global research village“, Conference at UNESCO HQ, Paris 19-23 February 1996, available at http://xxx.lanl.gov/blurb/pg96unesco.html.
11. http://netec.mimas.ac.uk/RePEc.
12. http://cogprints.soton.ac.uk.
13. For more information, see http://www.openarchives.org; an experimental prototype of the system has been set up at http://ups.cs.odu.edu. This initiative was somehow anticipated in A. M. Buck, R. C. Flagan and B. Coles: „Scholars’ forum: A new model for scholarly communication“, http://library.caltech.edu/publications/scholarsforum/.
14. See http://www.openarchives.org/sfc/sfc_entry.htm.
15. See http://www.nih.gov/welcome/director/pubmedcentral/ebiomedarch.htm.
16. See http://www.nih.gov/welcome/director/ebiomed/comment.htm.
17. Go to http://www.pubmedcentral.nih.gov/.
18. See http://www.biomedcentral.com.
19. See http://www.embo.org/News.html.
20. Andrew Odlyzko: „Competition and cooperation: Libraries and publishers in the transition to electronic scholarly journals“, J. Scholarly Publishing, vol. 30 (1999), p. 163-185.
21. P. Singer: „Medical journals are dead. Long live medical journals“, Canadian Medical Association Journal, vol. 162 (2000), p. 517-518. Also available at http://www.cma.ca/cmaj/vol-162/issue-4/0517.htm.
22. EC Communication 99109-C (January 2000). This document can be found at http://europa.eu.int/comm/research/area/com2000-6-en.pdf.
23. See http://europa.eu.int/comm/research/area/infrastructures.pdf.
24. V. Bush: „As we may think“, The Atlantic Monthly, July 1945, vol. 176, p. 101-108 (available at http://www.theatlantic.com/unbound/flashbks/computer/bushf.htm).
25. It is still too early to evaluate from this perspective a recent initiative for an Interactive knowledge site called „Fathom“ (at http://www.fathom.com), launched by a collaboration between the New York Public Library, the British Library, the Smithsonian Institution’s National Museum of Natural History, Columbia University, the London School of Economics and Political Science and Cambridge University Press.
Raf Dekeyser
Universiteitsbibliotheek K.U.Leuven
Mgr. Ladeuzeplein 21
B-3000 Leuven, Belgium
Raf.Dekeyser@bib.kuleuven.ac.be