
ONLINE SEARCHER: Information Discovery, Technology, Strategies


Open Access: Progress, Possibilities, and the Changing Scholarly Communications Ecosystem
By Abby Clobridge
March/April 2014 Issue

Open Access and the Journal Impact Factor

For many researchers working in university settings, the most troublesome issue around open access is still the direct link between the impact factor of the journals in which their articles appear and whether they receive tenure or promotion. Many journals, particularly those published outside the global North (North America and Europe), are never assigned an impact factor; newer journals, including the many OA journals launched in the past few years, either do not yet have impact factors or have low ones. Impact factor is calculated as the average number of citations in a given year to the articles a journal published during the preceding 2 years. The backward-looking nature of this measurement inherently favors established journals. So, in researchers’ minds, the question becomes whether to publish in a traditional journal with a higher impact factor and a legacy of prestige or in a newer OA journal, which could be less helpful in securing tenure or a promotion.
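In concrete terms, the two-year impact factor for a given year is the ratio of citations received that year to the number of citable articles the journal published in the two preceding years. A minimal sketch of the arithmetic, with illustrative numbers only:

```python
def two_year_impact_factor(citations_this_year, articles_prior_two_years):
    """Two-year journal impact factor: citations received in the
    current year to items from the two preceding years, divided by
    the number of citable items published in those two years."""
    return citations_this_year / articles_prior_two_years

# Illustrative numbers only: a journal that published 100 citable
# articles in 2011-2012 and drew 250 citations to them in 2013 has
# a 2013 impact factor of 250 / 100 = 2.5.
print(two_year_impact_factor(250, 100))  # 2.5
```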

As Rice sees it, “In some ways, the biggest barrier to OA is the prestige that a number of traditional journals have. I can tell my colleagues until I’m blue in the face that they should publish in OA outlets, but no one is going to walk away from getting an article into Science or Nature or The Lancet or Cell just because they aren’t OA. And as a person in leadership, maybe I don’t even really want them to, given the visibility of those journals.” Ideally, Rice wishes those journals would switch to an OA model, rendering the dilemma moot. For now, tenure, promotion, grant evaluations, and the like all depend to a certain extent on having peers evaluate your work and your CV. “And,” Rice concludes, “there’s no denying that we give higher evaluations to work appearing in more prestigious journals; research has demonstrated this correlation. So, the system we have favors high prestige journals over low prestige journals. And since OA journals tend to be newer, they will tend to have lower prestige.”

Too often, researchers get stuck at this roadblock. Yet because of the dual path of open access, in most cases researchers can publish in the journal of their choice and still make their publications openly accessible. Most often, the issue is simply lack of awareness. For example, authors published in The Lancet or Cell automatically have the publisher’s permission to archive the postprint (i.e., the final draft post-refereeing) in an OA repository. Science allows authors to deposit both preprints and postprints. Nature is the most restrictive of this group, permitting only preprints to be deposited. (Note: These journal policies are based on information in the SHERPA/RoMEO database of publishers’ copyright and self-archiving policies, found at sherpa.ac.uk/romeo/search.php.) This is where a university’s OA policy can often help: Many OA mandates stipulate that the university retains a nonexclusive right to deposit preprints or postprints in its repository.
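Because the permissions differ journal by journal, it can help to see them side by side. The sketch below simply restates the policies described above as a small lookup table; the structure and function names are purely illustrative, and the entries reflect the SHERPA/RoMEO summaries cited in this article, which may since have changed:

```python
# The self-archiving permissions described above, restated as a lookup
# table. Entries reflect the SHERPA/RoMEO summaries cited in this
# article; None means the article does not say. Policies change, so
# always check sherpa.ac.uk/romeo for current terms.
ARCHIVING_POLICIES = {
    "The Lancet": {"preprint": None, "postprint": True},
    "Cell":       {"preprint": None, "postprint": True},
    "Science":    {"preprint": True, "postprint": True},
    "Nature":     {"preprint": True, "postprint": False},
}

def can_deposit(journal, version):
    """True/False where the policy is stated above, None otherwise."""
    return ARCHIVING_POLICIES.get(journal, {}).get(version)

print(can_deposit("Nature", "postprint"))  # False: preprints only
print(can_deposit("Science", "preprint"))  # True
```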

Open access and challenges to the status quo within scholarly publishing have highlighted the problems of using the impact factor as a proxy for quality. While many of these issues are not new, the changing landscape has led to more direct attacks by the community on its continued use. As Björn Brembs, neurogenetics professor at the University of Regensburg in Germany and a vocal critic of the impact factor, notes: “The IF [Impact Factor] is negotiated, irreproducible, and mathematically unsound. The empirical data indicate that any university basing any of their decisions on this number will risk their long-term standing and success.” (See Björn Brembs, Katherine Button, and Marcus Munafo, “Deep Impact: Unintended Consequences of Journal Rank,” Frontiers in Human Neuroscience, June 24, 2013, for further details: frontiersin.org/Human_Neuroscience/10.3389/fnhum.2013.00291/full.) “If the scientific community cannot base their most important decisions on evidence, it will be their undoing.” In sum, Brembs states, “At this point in time, the data suggests that throwing dice is at least as good as using the IF, if not better.”

Fighting Against the Impact of the Journal Impact Factor

Brembs is not alone in his frustration. In December 2012, a group of editors and publishers of scholarly journals met during the Annual Meeting of the American Society for Cell Biology in San Francisco and produced a set of recommendations, referred to as the San Francisco Declaration on Research Assessment (DORA). The introduction to DORA (iehost.net/pdf/SFDeclarationFINAL.pdf) states:

The outputs from scientific research are many and varied, including: research articles reporting new knowledge, data, reagents, and software; intellectual property; and highly trained young scientists. Funding agencies, institutions that employ scientists, and scientists themselves, all have a desire, and need, to assess the quality and impact of scientific outputs. It is thus imperative that scientific output is measured accurately and evaluated wisely.

The declaration summarizes many of the problems with impact factor, but also makes an important point about other types of research outputs not disseminated via journals, i.e., research outputs not yet included in a systematic way in assessments of research. The declaration highlights this growing area, stating that recommendations “for improving the way in which the quality of research output is evaluated … focus primarily on practices relating to research articles published in peer-reviewed journals.” However, the recommendations “can and should be extended by recognizing additional products, such as datasets, as important research outputs.”

DORA includes 18 recommendations for funding agencies, universities, and other institutions, publishers, and researchers. These recommendations are clustered around three themes:

(1) The need to eliminate the use of journal-based metrics, such as Journal Impact Factor, in funding, appointment, and promotion considerations; (2) the need to assess research on its own merits rather than on the basis of the journal in which the research is published; and (3) the need to capitalize on the opportunities provided by online publication (such as relaxing unnecessary limits on the number of words, figures, and references in articles, and exploring new indicators of significance and impact).

Initial signatories of DORA included Randy Schekman, one of the three researchers sharing the 2013 Nobel Prize in Physiology or Medicine and editor-in-chief of eLife, a new open access journal. As of the end of January 2014, more than 10,000 individuals and 400 organizations and institutions had signed the declaration. For Schekman, OA advocacy has continued: On Dec. 9, 2013, he published an opinion piece in The Guardian attacking the impact factor on a highly public stage and encouraging scientists to eschew “big brand” journals such as Nature, Cell, and Science in favor of publishing in open access journals. “Like many successful researchers, I have published in the big brands, including the papers that won me the Nobel prize for medicine, which I will be honoured to collect tomorrow. But no longer. I have now committed my lab to avoiding luxury journals, and I encourage others to do likewise.” His reasons are extensive and worth a read: theguardian.com/science/2013/dec/09/nobel-winner-boycott-science-journals.

Altmetrics

This last theme from the DORA recommendations alludes to the broad category of “altmetrics,” i.e., the various metrics beginning to emerge to measure the usage, reach, or impact of research. Most often, the term describes measures such as the number of times an article has been downloaded from a journal publisher’s website or a repository, the geolocation of downloads, views of abstracts versus downloads of the full text, or references to a publication via social media (Facebook Likes, tweets, or LinkedIn mentions). In today’s environment, social media metrics are becoming straightforward to track using newly developed tools such as Impact Story, Plum Analytics, the London-based company Altmetric, and the PLOS Article-Level Metrics App.
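As an illustration of how readily such attention signals can be pulled, here is a minimal sketch that queries Altmetric’s public v1 REST endpoint for a single DOI. The endpoint path and the JSON field names are assumptions based on Altmetric’s documented API at the time and may have changed:

```python
# A minimal sketch of fetching social media attention counts for one
# article via Altmetric's public API. The endpoint path and the JSON
# field names are assumptions -- verify against Altmetric's docs.
import json
import urllib.request

def attention_counts(doi):
    """Return a few social media attention counts for the given DOI."""
    url = "https://api.altmetric.com/v1/doi/" + doi
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return {
        "altmetric_score": data.get("score", 0),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "facebook_posts": data.get("cited_by_fbwalls_count", 0),
    }

# For example, the Brembs, Button, and Munafo article cited earlier:
print(attention_counts("10.3389/fnhum.2013.00291"))
```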

On the other hand, the more challenging—and richer—data points are those which are difficult or impossible to systematically capture at this point in time. As Brembs explains, “Altmetrics measure attention, and inasmuch as attention is desired, altmetrics can help such decisions. But more metrics are required, for instance, reproducibility, methodological soundness or utility if qualities beyond attention need to be valued.”

James Hardcastle, senior research executive at Taylor & Francis, spends much of his time analyzing bibliometrics, citation data, and altmetrics. Hardcastle has some concerns with altmetrics: “We use metrics like citations, downloads, and now social media because they are easy to capture. What we often want to understand is how research is linked back to practical outcomes, for example, [development of a new] drug treatment or [tracking how this research] has built upon others’ work.” While the altmetrics community is working to capture more of this information, Hardcastle says those efforts are very much still in development.

Furthermore, Hardcastle points out, “Metrics are only as good as the data they are based on.” In his view, all of the metrics currently in vogue are problematic in the context of a truly global research ecosystem. He offers this example: “African Journals OnLine lists 202 journals from Nigeria [ajol.info/index.php/index/browse/country?countryId=156], [but] Scopus indexes 17 Nigerian journals and only four have impact factors.” He adds that altmetrics have been created in London and on the East Coast of the U.S., largely based on tools developed in Silicon Valley. “Online download figures require users to have reliable internet connections and not share PDFs via email or local hosting. Whichever metric we use will not represent the impact in the global South.” This, he points out, “devalues research coming out of the global South and risks devaluing research targeted at outcomes” in this area of the globe.

Hardcastle also has concerns about how metrics affect motivation, noting, “By using counts of papers, we encourage people to ‘salami-slice’ their work. By using the Journal Impact Factor, we actively discourage the publication of duplicate or null studies. Altmetrics will have different issues around reward, but could we end up just rewarding scientists with the largest Twitter following or for writing the wittiest titles?”

Bjørnshauge also suggests that altmetrics are not a panacea and believes that looking at references to research should not be the only indicator of impact but rather one piece of the puzzle. Bjørnshauge thinks the question that universities should be asking is, “How would we measure the actual impact of our research, not only measuring the impact of research on research itself, but on society in a much broader sense?”

OA in the Humanities and Social Sciences

While much of the early dialogue surrounding open access focused on fields within the sciences, mathematics, and engineering, open access in the humanities and social sciences faces somewhat different issues. The Modern Language Association (MLA) has taken a leadership role among humanities and social science organizations in recent years, championing innovation within scholarly publishing.

Kathleen Fitzpatrick, director, scholarly communication, MLA, and visiting research professor of English, New York University, shares her thoughts on this environment within the humanities and social sciences: “One of the ways that the scholarly publishing landscape differs for the humanities and social sciences is its dramatically different funding model; while the level of grant support for the sciences has created the possibility of a fairly straightforward transition to gold open access through article processing fees, there is no comparable funding for research in the humanities and social sciences. As a result,” she says, “organizations like the Modern Language Association are investigating entirely new modes of scholarly communication, like MLA Commons, rather than focusing on a change in who pays for existing publications.”

Fitzpatrick asks researchers to think about the big questions when determining where to publish and how to disseminate their scholarship: “Humanities scholars, like scholars of all kinds, should be considering not just how their work might affect their own careers—that is to say, the relative prestige of the venues in which they publish—but also how their work might have the greatest possible impact on their fields as a whole and the culture at large. Those latter potentials are greatest where the work can be distributed as broadly as possible, where it can be openly discussed, and where it creates the potential for future collaborations.”

A social science researcher, Deborah Lupton, senior principal research fellow and professor, Department of Sociology and Social Policy, University of Sydney, Australia, also reflects on how the scholarly publishing landscape differs for the humanities and social sciences. Like several others interviewed for this article, Lupton mentioned a lack of awareness among researchers: “Scholars in the humanities and social sciences have yet to become aware of the issues related to OA. Many have no idea what it is or how to do it. Those in the STEM fields are way ahead of us in these matters.” She says, “OA journals are still very few in the HASS [Humanities and Social Sciences] compared with STEM fields and scholars in the HASS have not yet had to think about or deal with OA to any great extent. This is now changing as funding bodies are now beginning to mandate for OA for all researchers who they fund.”

When asked what issues related to scholarly publishing humanities and social science researchers should be considering, Lupton responds: “HASS scholars need to learn about OA and the complexities around OA in relation to copyright, intellectual property, and simply ‘how to do it.’” In her view, OA need not involve paying publication fees to journals; it can be achieved for free via repositories. “I have attempted to promote this form of self-archiving to my colleagues in the interests of opening up research and data, promoting one’s research and teaching, maximizing one’s academic profile, and increasing interest in one’s work from members of the public and potential students.”



Abby Clobridge is founder and consultant, FireOak Strategies, LLC.

Comments? Contact the editors at editors@onlinesearcher.net
