Open Access: Shaking the Basics of Academic Publishing
Although open access is not a new concept, the structural upheaval caused by digital technology is still turning academic publishing upside down. SOAP (Study of Open Access Publishing), a project funded by the European Commission that ran from March 2009 to February 2011, found that 8%–10% of scholarly articles published worldwide now appear in fully open access (OA) or hybrid journals (http://project-soap.eu). OA publishers, as well as traditional publishing houses, have established new business models. Both scientists and publishers will have to adapt or disappear.
During the 2011 Academic Publishing in Europe conference (www.ape2011.eu), speakers considered how classic scientific publishing can be heavily reshaped to adapt to the needs of the scientific community. Publishers should capitalize on the opportunities offered by digital technology to advance scientific information, communication, and collaboration processes and to help scientists cope with information overload.
Open access, which replaces subscription charges with free access to articles, typically funded by publication fees, is gaining traction worldwide. With the support of funding authorities, libraries, and scientific service institutions, scientists and scientific organizations envision a transformation of academic publishing from the current scheme into a science-centered open access system. The goal is to go further still, building a global e-research environment that combines specialized research software with information, communication, and collaboration tools at the researcher's desktop.
Scientific institutions, such as CERN, learned societies throughout Europe including the U.K. Royal Societies, and research organizations such as the Max Planck Society (MPG), support and promote the movement. But the roles in future academic publishing have not yet been assigned. Scientists, librarians, publishers, and funding authorities have learned that the transition from print to digital systems is a huge task. All parties involved in the production, publishing, documentation, and archiving of scientific knowledge face technical and political questions: How does globalized science affect national economic interests, intellectual property rights, and personal income from research findings?
DAVOS FOR RESEARCH, DEVELOPMENT AND INNOVATION
The European Commission has mandated that the High Level Expert Group on Scientific Data prepare a “Vision 2030” for scientific data e-infrastructures in Europe. John Wood, secretary-general of the Association of Commonwealth Universities, London, and chair of the group, listed “technical, linguistic, legal and special problems” as the main challenges within the “globalization of research.” Special problems, he explained, are “transnational access, diversity of national intellectual property and data legislation, repurposed data and ethical and privacy considerations.” He even noted the need for “green solutions” to supply the energy that transporting and using the data will require.
Wood said that the European Research Area Board (ERAB) recommends European authorities take the lead in addressing the global challenges. ERAB proposes “A Davos for Research, Development and Innovation” (RDI). The report feeds into the European Union (EU) 2011 capacities work program (e-infrastructures) and serves as background for preparing the EU's next framework program.
LESS EMOTIONAL DISCUSSION
Eefke Smit, director of standards and technology of the International Association of Scientific, Technical and Medical Publishers (STM), Amsterdam, is happy that the tone in the open access debate has changed: “By now the discussion is far less emotional. We seem to move towards solutions.” She pointed out that the topics of this year’s conference—peer review, organizational tasks, technologies, and markets—were not in themselves new, but the move to more rational discussions, more mobile technologies, and more experimenting was a very positive achievement. “We are moving from theory to facts, from thoughts and ideas to experiments,” Smit stated.
The classic scientific publishing scheme has to be heavily reshaped. Over the past 25 years, STM publishers have spent huge sums developing electronic products and platforms to make scholarly articles available online, and they have installed content management systems to handle editorial tasks. Today, the trailblazers (Thomson Reuters, Elsevier, and Springer Science+Business Media) still pursue their Golden Road digital strategies, but they are stepping cautiously toward the Green OA Road.
In a fascinating lecture, Adam Marshall, group head of marketing and customer services at the London-based provider of publishing and knowledge dissemination solutions, Portland Press Ltd., demonstrated how the software Utopia Documents (www.getutopia.com) brings papers in PDF to life. It links the PDFs to live resources on the web, turning static data into live, interactive content. These resources could be primary data sets from institutional repositories, collections of images, or animated models of molecular structures. Without affecting the underlying PDF file, the software provides functionalities to explore article content, copy lines of interest, make private notes, annotate a document for others to see, and discuss the content online. In collaboration with the University of Manchester’s Utopia team, Portland Press has turned the Biochemical Journal (BJ) into a semantic journal.
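The core idea of linking identifiers in a static document to live web resources can be illustrated in a few lines. This is a generic sketch, not Utopia's actual implementation; the DOI in the example is arbitrary.

```python
import re

# Hypothetical illustration: scan article text for DOIs and rewrite them
# as resolvable links, the kind of enrichment a semantic-journal tool
# layers over otherwise static PDF content.
DOI_PATTERN = re.compile(r'\b(10\.\d{4,9}/[^\s"<>]+)')

def link_identifiers(text):
    """Replace bare DOI names with https://doi.org/... resolver URLs."""
    return DOI_PATTERN.sub(lambda m: "https://doi.org/" + m.group(1), text)

print(link_identifiers("See 10.1042/BJ20091474 for details."))
```

A real tool would go much further, resolving each identifier to structured metadata and rendering it interactively, but the principle is the same: recognize an identifier, attach a live resource to it.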
MARK ZUCKERBERG OF STM PUBLISHING
Introduced as the Mark Zuckerberg of STM publishing, Victor Henning, co-founder and CEO of Mendeley Ltd., explained that Mendeley has evolved from a system that enables scholars to manage and share research papers into one that combines a desktop application and a website to help scientists and students manage, share, and discover both content and contacts in research. It provides Web 2.0 functionalities such as tags, related research, and a personal library. Henning believes that Mendeley is the world’s largest semantic research database, built from scratch.
Kevin Cohn, vice president of operations at Atypon Systems, Inc. in Santa Clara, Calif., may not be another Mark Zuckerberg, but he believes his company is setting new standards for digital content delivery, discovery, and monetization. Cohn presented WebKit as the springboard of the mobile era in academic information and publishing services. WebKit is an open source browser engine developed collectively by Apple, Research In Motion, Google, and others, and Cohn sees it as the emerging standard for academic mobile computing. It powers Apple’s mobile browser Safari, Google’s Chrome browser, and, as Cohn explained, “every modern smart phone screen.” Only Microsoft is absent. Next will be the iPad. Mobile computing is at the doorstep of academic publishing, and probably nearer than that.
Whether academics publish in OA or traditional journals, peer review remains essential for assuring scientific quality. But the process is changing. Scientists demand more transparency and new approaches to overcome delays caused by a peer-review bottleneck that, in spite of digitalization, is growing worse. Exponentially growing literature, globalized research, a shorter lag between research and written paper, a more competitive environment, and hyperspecialization that leaves fewer independent expert referees available all contribute to the delays, pointed out Bernd Pulverer, head of scientific publications at the European Molecular Biology Organization (EMBO) and chief editor of its journal.
In 2009, EMBO instituted a transparent electronic peer-reviewing process using Web 2.0 technologies. The software handles referee reports, editor communications, author rebuttals, and timelines, and supports management tasks with handling statistics. Since September 2010, the new peer-review process has been in place at all EMBO publications.
While EMBO uses Web 2.0 technology to advance the traditional peer-reviewing processes and make them more transparent, a radical step is being prepared by a pan-European group of researchers. Funded by the EU and coordinated by the Department of Information Engineering and Computer Science at the University of Trento in Italy, they have developed “an entirely new dissemination model and website where scientific communication meets the social web,” says Maurizio Marchese, a member of the team of scientists. The scientists call it Liquid Publications, designed to “help you to stumble upon what you need.” The group suggests that scientists should not submit their articles to be published in a journal but instead share them, along with data sets, presentation slides, and other materials, on the LiquidPub website (http://liquidpub.org).
To rank the articles, the scientists will look at how people are using these materials. They have developed new metrics and methods to analyze usage. The procedure—post the articles unreviewed and then analyze readers’ behaviors—is intended to substitute for traditional peer review and bring forth a system in which citation counts can be replaced by usage ranking. Marchese thinks this is “a knowledge dissemination system that better fits into the web era.”
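As a toy illustration of usage-based ranking (the signals and weights below are invented for this sketch and are not LiquidPub's actual metrics), one could weight deeper forms of engagement more heavily than casual views:

```python
# Invented weights for illustration only: an annotation signals far more
# engagement than a page view, so it counts for more in the score.
def usage_score(views, downloads, annotations):
    """Combine usage signals into a single ranking score."""
    return 1.0 * views + 3.0 * downloads + 10.0 * annotations

papers = {
    "A": usage_score(views=500, downloads=40, annotations=2),
    "B": usage_score(views=200, downloads=90, annotations=12),
}
ranked = sorted(papers, key=papers.get, reverse=True)
print(ranked)  # paper A ranks first under these particular weights
```

The hard research problem, of course, is not the arithmetic but choosing signals and weights that resist gaming and genuinely track scientific value.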
To make supplementary material and data sets clearly assignable to scholarly articles, leading national libraries in Germany, Denmark, France, the Netherlands, Canada, the U.K., and the U.S. initiated DataCite (www.datacite.org) as a global research data registration agency. DataCite aims “to establish easier access to scientific research data on the Internet and increase their acceptance.” By using digital object identifiers (DOIs) “initially but not exclusively,” as Jan Brase from the German National Library of Science and Technology (TIB) said, data sets are distinctively labeled in order to make them “legitimate, citable contributions to the scientific records.”
More than a million records are already registered with DOI names. A central metadata base should be live by mid-summer 2011. Cooperation with CrossRef will help with article look-ups, and DataCite is negotiating with additional publishers. DataCite is also cooperating with FIZ Karlsruhe to combine eSciDoc’s open source modular software systems, which were developed by the Max Planck Digital Library (MPDL) and FIZ Karlsruhe, with the DOI-registration interface GetInfo (www.getinfo.de) from TIB.
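A DOI name itself has a simple two-part shape: a prefix beginning with "10." that identifies the registrant, a slash, and a registrant-assigned suffix. A minimal sketch of checking that shape and building the resolver URL through which a registered data set is cited (the sample DOI below is hypothetical):

```python
def is_wellformed_doi(doi):
    """Check the basic shape of a DOI name: 10.<registrant code>/<suffix>."""
    prefix, sep, suffix = doi.partition("/")
    return (sep == "/" and prefix.startswith("10.")
            and prefix[3:].isdigit() and bool(suffix))

def dataset_citation_url(doi):
    """Build the resolver URL that makes a registered data set citable."""
    if not is_wellformed_doi(doi):
        raise ValueError("not a well-formed DOI name: " + doi)
    return "https://doi.org/" + doi

print(dataset_citation_url("10.5072/EXAMPLE-DATASET"))
```

The persistence, of course, comes not from the string but from the registration agency, which keeps the name resolving to the data set's current location even when the hosting repository moves.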
Malte Dreyer of MPDL gave some deep insights into what eSciDoc is meant to be for the scientists at MPG—an e-research infrastructure; a repository infrastructure; an infrastructure for virtual research environments (VREs); a publication infrastructure; a set of services, tools, and content models; an archive; and a growing set of applications for research. He also mentioned an astronomer’s workbench as a community of practice platform (www.escidoc.org).
RESEARCH, RIGHTS, BUDGET
New developments in open access, open source, and collaborative research suggest that scientists might come to manage their information and communication entirely by themselves, leaving no role for academic publishing. This is not true. There will be a life for publishers in the future. Academic publishing is more than collecting papers and data sets, as exemplified by the slogan at Springer’s website: “Your research. Your rights.” With it, Springer has put in a nutshell that knowledge is property, that knowledge is power, and that knowledge is money.
The market system is changing. Payment and pricing models are reversed in a gift economy, which partially explains how open access publishers cover their expenses. Public Library of Science (PLoS), a nonprofit organization of scientists and physicians, charges authors “a fair price that reflects the actual cost of publication” for publishing the article, obtains money from member institutions, and collects donations (www.plos.org).
The reversed pricing model is as good for traditional academic publishers as it is for open access publishers. Springer Science+Business Media, for example, offers a more flexible pricing model, letting authors decide whether to pay for publication themselves or have the publisher charge readers.
VIABLE MODELS FOR MONETIZING
Finding viable models for monetizing publishing expenditures has become a big issue. But the challenges do not stop here. Academic publishing is a challenging global business that takes care of more than just the technical management of publication and peer-review processes. It guarantees quality assurance; adjusts content to new technologies, new platforms, and users' needs; and markets content to make scientific findings visible worldwide.
Here’s one example: Eefke Smit reported that the Journal of Neuroscience no longer accepts supplementary material for manuscripts. Although the journal has been published electronically since 1996, authors are now asked “to host supplementary material on an external website and include in their article a footnote with a URL pointing to that site.” The amount of material associated with a typical article had grown dramatically and was adversely affecting peer review. “It has become increasingly clear that there are other costs to supplementary material as currently implemented,” Smit commented.
Academic publishers have traditionally regarded universities, libraries, and research institutes as their primary market. Springer’s Derk Haank speculated that he’d be interested in selling information to individuals.
OPENING UP ACADEMIC PUBLISHING
In his wrap-up, Herman P. Spruijt, immediate past president of the International Publishers Association (IPA), Geneva, concluded that there are many variations of publishing. In the future, articles will play a different role. “If an open source environment is seen as the ideal situation, publishers have to ask themselves how they can cooperate,” he said. He recommended looking more closely at the peer-review processes and “not take it for granted any more.” Publishers “have to do a lot of work. We are still experimenting.”
Information professionals need to pay attention to the experiments of not just the publishers but also the scientists whose attitudes and behaviors regarding academic and scholarly publishing are changing as well. As academic publishing faces reinvention and open access shakes its foundations, the implications for library collections, subject matter curation, and quality control will be profound.