Vol. 42 No. 5 — June 2022
FEATURE

Curating OA Collections: Concerns and Considerations
by Linda Robinson Barr


[T]hese were critical conversations to have prior to any decision on any sort of OA offering.
Most discussions around OA have focused on scholarly communication support for promoting OA publishing through a variety of library-led efforts. Far less attention has been paid to the benefits OA can offer libraries that are looking to expand access to high-quality information resources while minding budget constraints. With more than half of new research articles published in 2020 openly available and the rate of OA publication continuing to grow (according to Lens.org data for journal articles with a 2020 publication year), there is tremendous opportunity for all types of libraries to provide discovery and access to content that would previously have been unaffordable.

Austin Community College (ACC) Library Services joined SirsiDynix’s CloudSource OA pilot project in 2019. CloudSource OA, according to the SirsiDynix website, is a curated collection of OA digital content from the world’s leading scholarly publishers. As a pilot project member, ACC Library Services had early access to the software and to the product’s curated collection of OA material, and we were asked to provide feedback to the company as it finalized the version that would go to market. This article is not meant to explain the functionality or pros and cons of SirsiDynix’s CloudSource OA product. Rather, it focuses on the main considerations and general matters involved in adopting any OA strategy for your library.

Our Situation

ACC is rapidly growing and currently boasts an annual enrollment of 70,000 students. We have 11 campuses served by one library collection split across them, a robust ebook and journal collection (58,000 ebooks and growing), 210,000 ejournal titles (including database content), and streaming media databases. In 2020, 11,000 students attended our librarian-led information-literacy sessions.

Prior to joining the pilot project, ACC Library Services’ faculty librarians were certainly aware that access to OA materials was scattered and that OA materials were difficult to serve up as a cohesive service with a consistent, accessible user experience for our students and faculty members. ACC faculty had a strong interest in open educational resources (OERs) and had been involved in a college-wide effort to adopt, use, and create a zero textbook cost (ZTC) curriculum, starting in 2016. Our work with OA materials would naturally extend from our prior efforts to assist classroom faculty members in the discovery and use of OERs.

In addition, the concept of supporting OA platforms and content generators was well-embraced within library services. As an example, the library initiated and continues to maintain a hub on our statewide institutional repository, OERTX Repository. However, as a community college, our focus is on teaching and not on research and publication; therefore, our faculty members’ experience with OA scholarly publishing is limited.

First Steps

Our first thought about the steps we would follow after joining the pilot was that we would review the features, set up access for students and faculty members, elicit responses and reviews, and then make a decision on whether to move forward with the product. However, we found that because we were relatively new to the OA space, we had to agree on a philosophy for using this tool, figure out our delivery strategy, determine instructional materials needs, and find out if ACC faculty librarians were on the same page about the need for a large database of curated OA materials.

There were some who, understandably, expressed the concern that support of this new tool would possibly be more trouble than it was worth, because it would involve developing new instructional materials, as well as taking responsibility for the support and oversight of how our instance of the tool was configured. Although the discussions about OA need and use and how best to offer the content took a great deal of our time, we felt that these were critical conversations to have prior to any decision on any sort of OA offering.

Beyond giving feedback to SirsiDynix on the pilot, our larger purpose in becoming pilot project members was the opportunity it provided to review the value of a tool that pulls together millions of OA resources, offering us a level of control over the large and increasing OA journal and book content available for our students and researchers. In order to determine the benefit to our students and faculty members, we created a team of faculty librarians to develop a cost-benefit analysis, establish goals, and construct a timeline to assist us in deciding whether to begin offering this particular OA service or any other. As we moved through the pilot project, we learned that there are many issues to be considered in advance.

A number of concerns were expressed, both philosophical and pragmatic, and many considerations arose as we worked through the project. The remainder of this article reviews our thought process through the series of questions that we challenged ourselves to answer.

Evaluation Criteria

ACC Library Services subscribes to a number of journal article, streaming media, and ebook databases. In reviewing the possibility of including OA-designated search tools in a structured fashion, our initial intent was not to cancel any paid subscriptions. However, we knew the topic of cancellation could not be ignored, as it is likely to come up in any such discussion. We intentionally decided, as a group, to set this question aside and not include it initially in the evaluation criteria, for two reasons. First, we did not want that consideration to distract from our primary intent, which was to determine if OA content would be valuable to our users. And second, the ability to measure user acceptance would not come until late in the process, when we could actually gather usage statistics from the new product, so it would be futile to consider cancellation options up front.

Our initial conversations centered around content quality, ease of access, and—most importantly, perhaps—the perceived need for such materials to support our community college curriculum.

Primary Concerns

Does a student need to know what OA resources even are? Does a searcher need to know whether the library paid for a resource in the results list? Does the distinction really matter? Why should they care? Of course, students have to be assured that the OA content is vetted and checked for veracity, but at what point in the search process do they need to understand whether a resource is OA?

As we deliberated these questions, it became clear that the issue of what a student needs to learn about the resources they are using has nuances, and there are some strong opinions about how OA resources should be displayed in the search results. To be clear, the answer to this question does and should vary depending on the type of library and the structure of programs within that library.

ACC Library Services currently provides access to a discovery system that is structured with the search results grouped in a multi-tabbed display by type:

  • Books, Movies, Music
  • OA Resources
  • Journals, Magazines, News

The multi-tabbed results display, which lists OA materials as a separate class of items, was and still is a sore spot for some librarians, as a number of them would prefer a single-list display. However, because each group of results utilizes somewhat different facets and limiters, a single search results display would require a great deal of alteration to our information-literacy instructional materials and training for our reference librarians. This isn’t to say we aren’t looking at this possibility, but it will take some time to implement should we choose to make this change.
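
To make the tabbed arrangement concrete, here is a minimal Python sketch of how a discovery layer might bucket result records into the three tabs listed above. The field names and the type-to-tab mapping are illustrative assumptions, not our discovery system’s actual configuration.

# Minimal sketch (illustrative only): group search-result records into the
# three display tabs described in the article. Field names and the mapping
# below are assumptions, not the real discovery system configuration.
from collections import defaultdict

TAB_FOR_TYPE = {
    "book": "Books, Movies, Music",
    "video": "Books, Movies, Music",
    "music": "Books, Movies, Music",
    "oa_article": "OA Resources",
    "oa_book": "OA Resources",
    "journal_article": "Journals, Magazines, News",
    "news": "Journals, Magazines, News",
}

def group_results_by_tab(records):
    """Bucket result records into display tabs keyed by material type."""
    tabs = defaultdict(list)
    for record in records:
        tab = TAB_FOR_TYPE.get(record["type"], "Journals, Magazines, News")
        tabs[tab].append(record)
    return dict(tabs)

if __name__ == "__main__":
    sample = [
        {"title": "Intro to Chemistry", "type": "book"},
        {"title": "Preprint on OA publication trends", "type": "oa_article"},
        {"title": "Local news piece", "type": "news"},
    ]
    for tab, items in group_results_by_tab(sample).items():
        print(tab, "->", [r["title"] for r in items])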

Content Coverage Considerations

Beyond the issue of evaluating the piloted software, which would allow us to search 40 million OA entries within our discovery system, we needed to evaluate the OA content offered and discuss the philosophy and use of OA in general for our community college curriculum and students. Some of the questions we struggled to answer included whether the content is useful, high-quality, and appropriate for our patrons.

The number of OA materials made available via the federated tool we were evaluating stood at more than 8 million when we started the pilot project and is now at 40 million, as previously mentioned. All materials were defined as OA, which included prepublication resources as well as peer-reviewed materials. ACC Library Services needed to decide whether to offer all materials or to use the dashboard provided by the vendor to limit what would be searchable (for example, excluding preprints and non-peer-reviewed items). While the CloudSource OA pilot interface did allow us to expand or limit the collection by these criteria, we spent a great deal of time trying to answer the question of which content would be most useful, and that discussion continues today. If we included all content types, wouldn’t there be just too much content for what our curriculum requires and our faculty members and students need? Isn’t it overkill?

Many people might say more information is always better, but in a situation in which including more results makes it harder for the user to sift through to get to the most appropriate information, it may not be better. We don’t know the answer. So, when we begin our user testing phase, we plan to test the lay of the land by initially putting no limits on what OA materials our users can access and then going from there.
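
As a rough illustration of the kind of include/exclude decision described above, the following Python sketch filters a set of OA records by content type and peer-review status. The record fields and limit names are hypothetical; they are not actual CloudSource OA dashboard settings.

# Illustrative sketch only: models collection-level limits of the kind a
# vendor dashboard might expose (exclude preprints, require peer review).
# The field and limit names here are hypothetical, not CloudSource OA settings.
from dataclasses import dataclass

@dataclass
class OARecord:
    title: str
    content_type: str      # e.g., "article", "book", "preprint"
    peer_reviewed: bool

# Hypothetical limits; per the plan above, user testing would start with
# everything switched off (no limits) and tighten from there.
LIMITS = {"exclude_preprints": False, "peer_reviewed_only": False}

def is_searchable(record: OARecord, limits: dict) -> bool:
    """Decide whether a record stays in the searchable OA collection."""
    if limits["exclude_preprints"] and record.content_type == "preprint":
        return False
    if limits["peer_reviewed_only"] and not record.peer_reviewed:
        return False
    return True

records = [
    OARecord("Peer-reviewed OA article", "article", True),
    OARecord("Unreviewed preprint", "preprint", False),
]
print([r.title for r in records if is_searchable(r, LIMITS)])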

About the CloudSource OA Pilot

At a talk I gave at Internet Librarian Connect 2021, co-presenter Carolyn Morris (from SirsiDynix) explained the pilot project’s goals. In response to the challenges inherent in curating OA collections, the CloudSource product aggregates OA content with enhanced metadata and provides a search API and collection management tools for librarians to curate their own collection and make it available for discovery.


Slides from the author’s deck tell why the library participated in SirsiDynix’s CloudSource OA pilot.

Issues for Training and Explaining

If we decided to go forward with offering this OA content, the next logical question would be how we should incorporate a discussion of these OA materials into our information-literacy efforts. While we have a robust information-literacy component in our library programs and a team assigned to review all new resources and implement teaching aids for them, we chose to defer full consideration of instructional needs until after the testing phase. The group charged with evaluating the pilot product will pull in our information-literacy team during the user-testing phase to get its input on best practices for incorporating a discussion of OA and OA resources into our information-literacy classes. However, we anticipate there will be some challenging questions for the team to work through: How do we teach the difference between reviewing search results from the OA resource versus the search results from our paid databases? Do we even need to distinguish between the two?

When students review search results, we certainly need to make sure they understand what they are seeing. In the case of this particular pilot program, the notations and icons associated with the OA citations differ from the symbols associated with the journal article records within our existing catalog and discovery system results list. So, that will need to be explained. It will be important to point out that Creative Commons license notations are key to OA resources, but such notations are not currently displayed for our paid resources (books, articles, and streaming media databases), even though some of those resources may be OA.
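
For illustration only, here is a small Python sketch of that display distinction, in which a record from the OA collection carries a Creative Commons notation while a record from a paid database does not. The field names and license codes are assumptions, not how our catalog actually stores license data.

# Hypothetical illustration: show a Creative Commons notation beside OA
# records only. Field names and license codes are assumptions, not the
# catalog's real data model.
CC_LABELS = {
    "cc-by": "CC BY",
    "cc-by-sa": "CC BY-SA",
    "cc-by-nc": "CC BY-NC",
    "cc0": "CC0 (public domain)",
}

def license_notation(record: dict) -> str:
    """Return the license text shown beside a result, or an empty string."""
    if record.get("source") != "oa":
        return ""                      # paid resources show no CC notation
    code = (record.get("license") or "").lower()
    return CC_LABELS.get(code, "")

print(license_notation({"source": "oa", "license": "CC-BY"}))    # CC BY
print(license_notation({"source": "paid", "license": "CC-BY"}))  # (empty)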

Lessons Learned

When reviewing the addition of OA databases and tools to library services, librarians need to vet the new tools by asking questions and establishing acceptance criteria. The steps prior to assessment and use include the following:

  • Develop multiple strategies for assessing, testing, and use.
  • Discuss, ponder, argue, and then agree on the approach and goals.
  • Review assumptions about OA, information literacy, and student search success.
  • Create a process analysis, a timeline, and goals.
  • Value and support differing opinions and approaches on the team.
  • Most importantly, ask questions.

Collection Concerns

There will naturally also be other questions that emerge about our collection once the OA materials are included, the most obvious being, what’s the overlap with the materials in our paid resources? As one of the first steps we took in the pilot project, we obtained an overlap analysis spreadsheet, which allowed us to see how much of the material available in the vendor’s OA collection overlapped with our paid database content. The immediate value of this analysis was not so much the overlap itself as how much of an addition and expansion our collection would receive from the OA materials. (As noted previously, we have no plans to cancel databases or subscriptions just because we are also offering some of the same content as OA.)
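
For anyone who wants to approximate this kind of comparison on their own, here is a minimal Python sketch of an overlap analysis based on matching DOIs. The CSV layout and sample rows are hypothetical stand-ins for the vendor-supplied spreadsheet and our own holdings export.

# Minimal sketch of a DOI-based overlap analysis. The two in-memory CSV
# samples below are hypothetical stand-ins for the vendor's OA title list
# and a paid-holdings export; the real analysis came as a vendor spreadsheet.
import csv
import io

def load_dois(csv_text: str) -> set:
    """Parse a CSV export and return the set of normalized DOIs it contains."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["doi"].strip().lower() for row in reader if row.get("doi")}

oa_csv = "title,doi\nOA article A,10.1000/a\nOA article B,10.1000/b\n"
paid_csv = "title,doi\nPaid article A,10.1000/a\nPaid article C,10.1000/c\n"

oa_dois = load_dois(oa_csv)
paid_dois = load_dois(paid_csv)

overlap = oa_dois & paid_dois    # content we already license
net_new = oa_dois - paid_dois    # expansion the OA collection would add

print(f"Overlap with paid content: {len(overlap)} titles")
print(f"Net new OA additions:      {len(net_new)} titles")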

But just in case, we should definitely ponder this final question: If we were actually to cancel overlapping paid resources, how reliably could we depend on OA materials being retained? Our group expressed concern that although the platforms supporting OA appear to be strongly committed to the retention and maintenance of OA materials, we simply do not have enough experience yet to fully understand the long-term retention patterns for OA materials.

The questions I addressed in this article are the same ones you might ask about offering any OA collection—maybe even one you curate yourself. See the sidebar below for a summary of the issues we pondered as food for thought as you explore the options for yourself.

OA Issues to Ponder

  • How to evaluate OA resources
  • How to present the content
  • How to teach about the content
  • How active to be in curation
  • How to sustain the collection
Linda Robinson Barr

Linda Robinson Barr (M.L.I.S., lbarr@austincc.edu) is the head librarian for technical services and library automation at Austin Community College Library Services. Barr has had the opportunity and honor to oversee a number of initiatives related to the use of improved software tools for access and delivery of electronic materials. She is a native Texan who has lived and worked all over the country in public, academic, and corporate settings.