ONLINE SEARCHER: Information Discovery, Technology, Strategies


Open Access Gains Traction
By Elaine Lasda
July/August 2021 Issue

The trend toward open access (OA) in publishing has been steadily growing since the Budapest Open Access Initiative in 2002. In the U.S., the OA movement began to be taken seriously in 2008, when the National Institutes of Health (NIH) became the first U.S. government funder to require grant recipients to make their research output openly accessible. Plan S, which started in Europe but has spread globally, is the latest initiative encouraging OA publication. As OA gains traction, the implications for researchers, publishers, and librarians are profound.

OA is a boon for researchers and libraries, given that journal subscription prices have escalated well above inflation over time. OA allows researchers to retain the rights to their scholarly output, while subscription journals generally hold the copyright to published articles. In science and research, “open” is a good thing, because it allows for reproducibility of results, transparency of methods and output, resource sharing, and collaborations, thus breaking down disciplinary silos.

As the research environment becomes more accessible, what does a “culture of open” do to the impact of research outputs? This is not a rhetorical question. According to Daniel Hook’s Feb. 24, 2021, post on the Dimensions blog, the year 2020 saw more research output published in OA formats than in subscription journals (dimensions.ai/blog/open-access-surpasses-subscription-publication-globally-for-the-first-time).

OA CONTENT IS NOW EASIER TO LOCATE

Hook’s blog post highlights Dimensions’ integration of the Unpaywall (unpaywall.org) OA classification tool. Our Research (ourresearch.org), the creator of Unpaywall and the new name for Impactstory, has seen its data integrated into a number of subscription databases. A Dec. 9, 2020, Scopus blog post by Rachel McCullough indicates that Scopus is using Unpaywall to help searchers identify OA materials (blog.scopus.com/posts/scopus-filters-for-open-access-type-and-55-million-more-oa-articles17-million-in-total). Web of Science touts its partnership with Our Research, in place since 2017, to provide an OA filter on search results (clarivate.com/webofsciencegroup/solutions/open-access). A 2018 article in Nature, “How Unpaywall Is Transforming Open Science,” notes that Dimensions “used Unpaywall from the get-go” (nature.com/articles/d41586-018-05968-3). It’s not just scientific databases that show interest in Unpaywall; library discovery systems have also integrated Our Research projects.
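
For searchers who want to experiment with the same underlying data these platforms draw on, Unpaywall also offers a free REST API. The short Python sketch below is mine, not something described in Hook’s post; it looks up the OA status of a single DOI. It assumes the requests package and a contact email of your own in place of the placeholder, and field names such as is_oa, oa_status, and best_oa_location follow Unpaywall’s documented response format.

    # Minimal sketch: query the Unpaywall REST API for one DOI's OA status.
    # Unpaywall asks for a contact email; replace the placeholder with your own.
    import requests

    def check_oa(doi, email):
        """Return a small summary of Unpaywall's OA record for a DOI."""
        resp = requests.get(
            f"https://api.unpaywall.org/v2/{doi}",
            params={"email": email},
            timeout=10,
        )
        resp.raise_for_status()
        record = resp.json()
        best = record.get("best_oa_location") or {}
        return {
            "is_oa": record.get("is_oa", False),   # is any free-to-read copy known?
            "oa_status": record.get("oa_status"),  # e.g., gold, green, hybrid, bronze, closed
            "best_oa_url": best.get("url"),        # link to the best open copy, if any
        }

    if __name__ == "__main__":
        # The Siler and Frenken article cited later in this column is a handy test DOI.
        print(check_oa("10.1162/qss_a_00016", "you@example.edu"))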

This is great, right? More research is available openly; it is easier to locate OA versions of many research articles; and the amount of OA research output is now greater than the amount of research output in subscription journals.

OA is not without its inequities and issues, however. For starters, there are different “colors” of OA publishing. One essential distinction among OA types is whether the research is made open through the journal publisher (“gold”) or archived as an article preprint in an OA repository (“green”).

A MATTER OF SUPPLY AND DEMAND?

To fund OA availability of research output, some journals levy an article processing charge (APC) to make an individual article open within an otherwise non-OA journal. So-called “hybrid” journals will often charge an institution for a subscription while also charging researchers at that same institution APCs if they want (or are required) to make their output OA. Furthermore, it is well documented that APCs do not simply reflect the processing costs of publication. Instead, the more prestigious the journal, the higher the APCs.

Here is where it gets interesting, or maybe flat-out concerning, from a research impact perspective. A recent analysis by Kyle Siler, University of Sussex, and Koen Frenken, Utrecht University, showed that 73% of titles indexed in the Directory of Open Access Journals (DOAJ) did not charge any APCs whatsoever. This may not be surprising, as DOAJ only indexes journal titles that are 100% “gold” OA, not hybrid.

Nonetheless, a mere 10% of DOAJ titles have a Journal Impact Factor (JIF). Clarivate’s Journal Citation Reports (JCR) only lists the JIF for a curated list of journal titles, the vast majority of which are available by subscription. Siler and Frenken’s DOAJ analysis indicates that 44% of all research articles (whether OA or not) are published in journals that do have a JIF (“The Pricing of Open Access Journals: Diverse Niches and Sources of Value in Academic Publishing,” Quantitative Science Studies, 1(1):28–59, Feb. 1, 2020; doi.org/10.1162/qss_a_00016; direct.mit.edu/qss/article/1/1/28/15570/The-pricing-of-open-access-journals-Diverse-niches).

JIFs are not by any means the be-all and end-all hallmark of journal quality. Far from it. There are many rallying cries against placing too much emphasis on the JIF. Examples include the Declaration on Research Assessment (DORA; sfdora.org); the frequent refrain in research impact circles urging “responsible use”; an Institute for Scientific Information research report published in 2019 espousing “Profiles, not metrics” (clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/dlm_uploads/2019/07/WOS_ISI_Report_ProfilesNotMetrics_008.pdf); and myriad peer-reviewed articles and scholarly commentaries pointing to the inadequacy of the JIF as a standalone barometer of quality. You get the idea. At the same time, aspiring researchers are all too often advised by their established colleagues to publish in a high-impact-factor journal. The research ecosystem still feeds on JIFs.

TRANSFORMING WHAT, EXACTLY?

You may have heard about transformative journal agreements, a concept that seems to be gaining traction at many research institutions. For a good, quick breakdown of transformative agreements, see Lisa Janicke Hinchliffe’s post on the Scholarly Kitchen blog (scholarlykitchen.sspnet.org/2019/04/23/transformative-agreements). The University of California (UC) System made news with its March 2021 contract with Elsevier. Essentially, this contract shifts the agreement between the two parties from a typical subscription model to one that is intended to promote OA. Instead of a subscription fee, the library now pays a “reading” fee that is around 10% of what it was paying for its Elsevier package. With the balance, the library pays part of UC researchers’ APCs when they publish with Elsevier, and the default for UC authors is that their content is OA (tinyurl.com/yphxwcm8). There is more to it than this, but that’s the general idea. You can get details on a large number of signed transformative agreements by consulting the ESAC Transformative Agreement Registry (esac-initiative.org/about/transformative-agreements/agreement-registry).
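
To see how the arithmetic of such an agreement shakes out, here is a rough, entirely hypothetical sketch. The figures below are invented to mirror the structure described above (a reading fee of roughly 10% of the old subscription spend, with the balance subsidizing part of each APC); they are not UC’s or Elsevier’s actual numbers.

    # Hypothetical illustration of a "read and publish" style agreement.
    # None of these figures are real; they only mirror the structure described in the text.
    old_subscription_spend = 10_000_000          # prior annual subscription payment (invented)
    reading_fee = 0.10 * old_subscription_spend  # roughly 10% retained as a "reading" fee
    publishing_fund = old_subscription_spend - reading_fee  # balance redirected to OA publishing

    average_apc = 3_000             # hypothetical average APC per article
    library_share_per_apc = 1_000   # hypothetical portion of each APC the library covers
    author_share_per_apc = average_apc - library_share_per_apc

    articles_supported = int(publishing_fund // library_share_per_apc)
    print(f"Reading fee: ${reading_fee:,.0f}")
    print(f"Publishing fund: ${publishing_fund:,.0f}")
    print(f"Articles whose APCs the library can help cover: {articles_supported:,}")
    print(f"Author or grant share per article: ${author_share_per_apc:,}")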

The University of Virginia Library blog, The Taper, had an interesting take on the UC/Elsevier agreement, noting that it shifts the playing field in several ways and will have the effect of increasing OA availability of UC scholarship (thetaper.library.virginia.edu/2021/03/19/four-concerns-about-the-new-uc-elsevier-deal.html). The focus of OA is to remove copyright stipulations that restrict access to research findings. The Taper goes on to say: “But copyright is only half (maybe less) of the dysfunction in academic publishing. The deeper, more insidious problem is the journal prestige economy (aka impact factor mania)—the academy’s reliance on journal reputation and metrics like journal impact factor in evaluating the quality of scholarship and of scholars.”

In other words, moving away from paywalled/subscription-based journal access does nothing to combat the misuse of journal ranking and JIF. Without some sort of grand shift in the research publication landscape, no matter the color, OA will never, in and of itself, confer prestige.

IMPERFECT SOLUTIONS

Proposed approaches to reconciling the misuse of journal rankings and journal-level indicators vary. A previous installment of Metrics Mashup discussed DORA, the Leiden Manifesto, and other statements about responsible use and the appropriate role of journal rankings. These provide useful rallying points, but they are essentially unenforceable honor codes.

Another suggested approach to reducing overreliance on journal indicators is to revamp the reward system for conducting research by appealing to researchers’ sense of idealism and pure science: instead of focusing on productivity and impact, focus on increasing knowledge for the benefit of humanity. Lofty, virtuous ideals certainly sound appealing, but it is difficult for funders to justify grant allocations without the prospect of concrete results that provide a good return on their investment.

Calls for publisher boycotts crop up every now and again as a solution to various inequities in the system. Boycotting a single publisher, however, does nothing to rectify the problem. As Shaun Khoo pointed out in October 2019, unless problems endemic to the entire system are addressed, the practices of the boycotted publisher will simply resurface in the behavior of another publisher down the road (the-scientist.com/news-opinion/opinion-boycotting--elsevier-is-not-enough-66617).

Some argue that journals no longer have a reason to be so exclusive in preserving their prestige. The shift to electronic and web-based publishing means that the journal is no longer constricted by the limitations of print media. Why can’t high impact factor journals simply publish more articles? This argument disregards the additional time, effort, and cost of adding content to a journal issue. Additional content would mean lining up more peer reviewers and would also create more copy-editing hassles, cause delays in turnaround, and place more chores on overtaxed editors who are not paid for their roles. At the same time, a publisher might feel justified in raising subscription rates or APCs, which seems to cancel out the benefit of adding content.

Finally, let us not ignore the proposal made by Björn Brembs, Katherine Button, and Marcus Munafò in 2013: namely, that we should do away with journals entirely in favor of an interoperable archive or repository system through academic libraries. Their argument is predicated on the idea that a digital environment renders the division of scholarship into subject-specific journals obsolete (“Deep Impact: Unintended Consequences of Journal Rank,” Frontiers in Human Neuroscience, v. 7; doi.org/10.3389/fnhum.2013.00291). There may be some merit to this. After all, newer tools in the bibliometrician’s arsenal, such as Dimensions, Altmetric.com, and PlumX, are publication or object-based, not journal-based.

Will one of these tools wind up winning out for research retrievability? I suppose anything is possible, but right now that does not seem very likely to happen. Google Scholar, as you may recall, relies partially on publisher agreements to populate its search results. Additionally, the creators of Microsoft Academic (MA), another article-based tool, announced that MA will cease being updated after December 2021 (microsoft.com/en-us/research/project/academic/articles/microsoft-academic-to-expand-horizons-with-community-driven-approach). This is not surprising given that it was meant to be a sandbox for natural language processing experiments by Microsoft, and furthermore, in my estimation, provided a relatively lousy end-user experience.

TIME WILL TELL

When the creators of Unpaywall, Heather Piwowar and Jason Priem, along with Richard Orr, did a study using a snapshot of data collected from their tool, they found exponential growth in the amount of open scholarship available on the web. In their analysis, posted as a preprint on bioRxiv, they found that 31% of all articles in 2019 were available openly and that 52% of article views were of OA articles. They also offered some conservative estimates of what the OA landscape would look like in 2025 (“The Future of OA: A Large-Scale Analysis Projecting Open Access Publication and Readership,” bioRxiv 795310; doi.org/10.1101/795310).

We have already seen an explosion in preprints and other openly available research content due to the COVID-19 pandemic and the need to share scientific results rapidly. In a very real sense, the pandemic has accelerated the shift to more open research and scholarship. Hook mentions in his blog post that Unpaywall was used to determine that more research output was OA in 2020 than was behind a paywall. Given how vaguely that comparison is described in the blog post, I am not certain it is apples-to-apples with the estimates in the 2019 Piwowar, Priem, and Orr analysis. Still, it would stand to reason that the movement to increased open research has outpaced their conservative estimates.

It further stands to reason that if 52% of article views in 2019 were of OA articles, the rate of increase in views has also outpaced the conservative prediction in their analysis. If more open research is available, and openly available research is more likely to be viewed, perhaps the problem of journal indicators will take care of itself. If open research content is more likely to be viewed, it is not a stretch to imagine that it will more likely be cited. It is therefore possible that citation counts will eventually shift to favor OA publications. From there, the metrics, including JIF, will have to be inclusive of OA or become obsolete. Right now, it is like looking at a row of dominoes, and the first one has tipped over. Will the rest follow suit? Time will tell if this bears out.

There are no quick answers to the many issues of risk and reward in the research game. Although OA moves research out from behind publisher paywalls, efforts need to be made to ensure that the shifting agreements and the focus on removing restrictions do not further entrench the existing rewards of prestige and desirability that come from overreliance on journal-level metrics.


Elaine Lasda is coordinator for scholarly communication and associate librarian for research impact and social welfare, University at Albany, SUNY.

 

Comments? Contact the editors at editors@onlinesearcher.net
