ONLINE SEARCHER: Information Discovery, Technology, Strategies


Page 2 of 3
Set Your Cites High: The Value of Quality Citation Information
By Amy Affelt and David Pauwels
Volume 40, Number 5 - September/October 2016

Do You Get What You Pay For?

When it comes to retro, you can’t get a lot more retro than the paid subscription services. Going back to Eugene Garfield’s original Science Citation Index, first published in 1964, which pioneered the notion of citation searching, libraries have subscribed to, and sworn by, subscription databases that facilitate cited reference searching. Four that we cover in this article are HeinOnline, EBSCOhost, Scopus, and Web of Science.


HeinOnline bills itself as “the world’s largest image-based legal research collection,” with more than 2,000 searchable law and law-related journals. When you click on an article of interest in HeinOnline search results, a notation states “cited by,” followed by the number of citing articles.

HeinOnline’s approach is decidedly old-school (read: reliable). The number of times an article has been cited equals the number of times that article’s citation appears in other sources within HeinOnline. It offers documentation listing the individual libraries—law journals, bar journals, and an index to foreign legal periodicals—within the database that offer citation searching. It also has several extremely reliable analysis tools that point to the articles included in the ranking. These tools include lists of most-cited authors, journals, and articles, as well as the “HeinOnline Hat Trick,” also known as ScholarRank, which ranks the 50 articles cited most frequently by law journal articles, cited most frequently by cases, and accessed most often by HeinOnline users. Search results can be sorted by each of these three categories.
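The appeal of this closed-universe model is that the count is fully auditable. As an illustrative sketch (not HeinOnline’s actual code, and with made-up article IDs), a “cited by” figure in a closed collection is simply the number of other documents in that collection whose reference lists include the article:

```python
# Toy closed collection: each document carries its own list of references.
corpus = {
    "smith_2010": {"refs": ["jones_2005", "lee_2008"]},
    "jones_2005": {"refs": []},
    "lee_2008":   {"refs": ["jones_2005"]},
    "park_2012":  {"refs": ["jones_2005", "smith_2010"]},
}

def cited_by_count(article_id, corpus):
    """Count how many other documents in the collection cite the article."""
    return sum(
        article_id in doc["refs"]
        for doc_id, doc in corpus.items()
        if doc_id != article_id
    )

print(cited_by_count("jones_2005", corpus))  # 3 citing documents
```

Because every citing document is itself in the collection, the count can always be traced back to the specific sources that produced it.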

EBSCO Databases

EBSCOhost databases cover a full range of subject areas, including academic journals in business, finance, higher education, and healthcare; certain journals support cited reference searching. For documentation purposes, you can download exact titles and dates of coverage. When you choose a single database that supports citation searching, you see a “cited references” link at the top of the EBSCOhost screen. If you are searching multiple databases simultaneously, the databases that support the feature appear in the drop-down list. After the search is conducted, articles that have cited references are differentiated in the results list with a checkbox in the left-hand column.

When articles of interest are located, two options for citation research appear: “cited references” and “times cited in this database.” A specific cited references search option is also offered, in which searches can be conducted by cited author, title, source, year, or a combination thereof. One way to narrow the search universe to present only results with cited references is to click on the “references available” search limiter.

EBSCO’s criteria for including cited references for journals include standards such as prominence of the journal in its field of study, circulation, usage, and number of references available. EBSCO also reviews the number of references available per article in each journal as a measure of citation-inclusion worthiness. This evaluation metric is unique in that it holds journals to a level of standardization for each article included in an issue and encourages publishers to adhere to uniformity.


The Scopus abstract and index database contains more than 60 million records from more than 22,000 journals, along with thousands of conference proceedings and book contents. A Scopus subscription does not come cheaply, but in terms of rigorous selection criteria, you definitely get what you pay for.

Scopus’ main selling point is its focus on content selection and quality. It takes a four-pronged approach to content selection criteria: All included journal titles must be peer-reviewed, published regularly, have English abstracts and Roman script references, and have a publicly available ethics and publication malpractice statement.

In addition to these stringent rules for inclusion, it has similar rules for journal exclusion. Each title is regularly reviewed against “red flag” benchmarks: high self-citation rates; few citations compared with peer journals; an “impact per publication” (IPP) score (“the average number of citations received in a particular year by papers published in the journal during the three preceding years”) that is half or less of the average IPP score; low article output compared with peer journals; a lack of pervasive abstracts; and fewer full-text links than peer journals. All of these are markers that target a journal for re-evaluation and possible discontinuation. Scopus also uses a “radar” system that looks for outliers by detecting oddities such as sudden growth in article output, shifts in “geographical diversity among authors and editors,” and changes signaling disproportion between received citations and self-citations.
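The quoted IPP definition lends itself to a short worked example. The sketch below (hypothetical numbers, not Scopus data or code) computes an IPP for 2016 from citations received that year by papers published in the three preceding years, then applies the half-the-peer-average red flag described above:

```python
def ipp(citations_received, papers_published, year):
    """Impact per publication for `year`, per the quoted definition.

    citations_received: {publication_year: citations counted in `year`}
    papers_published:   {publication_year: number of papers published}
    """
    window = [year - 1, year - 2, year - 3]  # the three preceding years
    cites = sum(citations_received.get(y, 0) for y in window)
    papers = sum(papers_published.get(y, 0) for y in window)
    return cites / papers if papers else 0.0

# Hypothetical journal: 90 citations received in 2016 to its 60 papers
# published 2013-2015.
journal_ipp = ipp({2015: 40, 2014: 30, 2013: 20},
                  {2015: 25, 2014: 20, 2013: 15}, 2016)
print(journal_ipp)  # 1.5

# Red flag: an IPP at half or less of the peer-journal average.
peer_average = 4.0  # assumed average for the journal's subject peers
print(journal_ipp <= peer_average / 2)  # True -> flagged for re-evaluation
```

Note that the three-year citation window is what makes the score comparable across journals of different sizes: it is citations per paper, not raw citation volume.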

The transparency of these requirements is extremely helpful when using Scopus analysis tools for citation research. Additionally, articles on Scopus are tagged according to the publisher’s identifying information. For example, when an article is received, the identifying information is separated into fields, with each item having its own list of references. When you click on “cited by,” Scopus generates a list of citing articles. This list contains only articles that are on Scopus; again, the citation and abstract come from the publisher.

There is also an additional feature, “view secondary documents,” which algorithmically creates a list that looks for other sources such as popular press, YouTube videos, and seminars that have cited an article listed on Scopus. While this opens the citation search up to a much larger universe of possible citations, that universe is unknown. Since the exact sources the algorithm is parsing cannot be documented (which is in direct contrast to the “cited by” feature, which parses through Scopus-contained documents), info pros who cite these secondary documents need to issue caveats.

The main advantage of using citation reference evidence in research reports or expert testimony is that we can point to the collection where the references were found and explain how we know that our evidence is sound. When using Scopus “cited by,” we have a pretty good idea of the sources of the literature that is being used. Since this is not the case with the secondary documents, we should proceed with caution when using those references and issue caveats accordingly. It also should be noted that if a secondary document is added to the Scopus collection at a later date, it is moved from secondary documents to “cited by” and is given a count of 1 in cited references.

Scopus offers analysis tools that provide a snapshot into the meaning behind citations. For example, a graphical representation of the number of articles by author by year might depict a rising star, a has-been, or an expert in a field who has published consistently during a period of time. In the same vein, you can further massage this data by topic to pinpoint experts on very specific knowledge within a particular field of study. Yes, we are relying on Scopus algorithms to provide this data, but we can be confident that it has the underlying documents that were used to make the calculations. Also, if a user or author finds a mistake, Scopus is highly responsive to that information and quick to make corrections. Scopus has an “author feedback wizard” online form for submissions as well as several email addresses set up to receive this feedback.


Amy Affelt is director, Database Research, Compass Lexecon and author of The Accidental Data Scientist: Big Data Applications and Opportunities for Librarians and Information Professionals (Information Today, 2015).

David Pauwels is senior information professional, Compass Lexecon.


