Information professionals and their commercial partners have been long-term collaborators. We share an interest in improving access for our user communities and a drive to expand the diverse information ecologies we oversee. There is a widespread consensus that we benefit whenever we pool our collective wisdom to help our users navigate information landscapes. By working together, we can smooth the way for those users to discover the electronic resources they need with less pain.
Vendors are also key partners in studying access and discovery dynamics. OCLC, for instance, conducts extensive research about our profession and reports its findings openly in white papers (see also OCLC’s NextSpace magazine). This literature is well worth reading. Recently, I came across an OCLC white paper titled “Meeting the E-Resources Challenge: An OCLC Report,” which offers a snapshot of recent experimentation with e-resources. It intrigued me because it focuses holistically on the “user journey” of information discovery. It also presents a number of instructive case studies taken from libraries in the U.S. and U.K. (See oclc.org/content/dam/oclc/reports/pdfs/OCLC-E-Resources-Report-US.pdf; it’s free of charge, but registration is required.)
I’m going to use my column space in this issue of CIL to explore three themes from this OCLC white paper. In my view, the themes provide a useful focus for understanding how library systems, library vendors, and public service providers can improve the user journey when they join forces.
Understanding the User Journey
The OCLC paper makes its case for improving access on a simple but powerful assumption: Library users undertake, in effect, a “journey” through a series—you might say, a jumble—of websites, online services, and proprietary vendor platforms to find what they want. The journey itself is by no means simple. A search may start on the open web or perhaps in the OPAC, but it quickly jumps in any number of directions.
It can be a circuitous path. A library website will have electronic resource finders that list content by various facets such as title or subject, and it will have its own way of answering queries, such as using pop-up windows or other visual cues. Likewise, the OPAC may point to materials held off-site, offer interlibrary borrowing functions, provide book delivery systems (“paging”), embed full-text files, or use permalinks to ensure access to digital files that reside elsewhere. Time after time, this process of moving from one zone to another confronts the user with a number of technological platforms that may have only tenuous links to each other. Add to this the fact that many high-value databases have their own search-and-retrieval interfaces, and you end up with a journey that may involve a dozen or more technology platforms.
In a way, the very freedom the internet has spawned contributes to the growing complexity of the user journey. The internet encourages innovation to fill needs, and we are now used to lightweight and portable tools such as microblogs and social media. But the internet also relies on legacy systems that have been updated and rewritten to work in the open space of the internet, even as they serve specific public library systems or university campuses. We’ve been tinkering with our legacy systems for years, and OCLC clearly believes that the downside to tinkering is catching up with us. Now is the time, the report claims, to try to smooth out how these myriad systems can work together.
Tools to Enhance the Journey
Perhaps the timing is better than ever to reframe our goals for access. If we conceptualize user search and discovery as a journey through a flood of information, we can view information systems as separate islands in the stream, with each one presenting its own unique set of rules. That’s hardly a new metaphor for identifying access barriers—“information silos” is another favorite of mine—but nautical terminology draws our attention both to the stream itself and the islands it surrounds.
The challenge lies in how discrete domains of knowledge are linked to each other. User experience research has shown time and again that systems interoperability is one of the most crucial determinants of user success. Many library systems—whether open source, commercial, or consortial—did not start out with a strong emphasis on interoperability. But most developers are now playing a furious game of catch-up, OCLC included. Interoperability is now understood as an essential element of information architecture. Basically, we can’t do without it anymore.
Metadata also gains new importance alongside interoperability. In OCLC’s report, Maria Collins (head of acquisitions and discovery at North Carolina State University) says that “the effectiveness of our systems is increasingly about the quality of the data that describe the resources.” Among other things, she points out that MARC records can be used in new ways to increase discoverability.
Another case study shows how librarians at the University of Birmingham in the U.K. focused on building an interface that reduces the labor of learning all-new sets of tools and finding aids. They developed a single user interface to all library content and systems—effectively retiring the OPAC as an end-user tool.
These examples point to two important trends. First, librarians are continuing to examine legacy systems for new value, such as extending the usefulness of MARC or other taxonomies. Second, they are simultaneously pushing the limits of existing user interfaces in search of better solutions. Our users benefit from both of these trends.
Data: Let’s Synchronize
There is a lot of sizzle surrounding data these days. It comes from the excitement of Big Data, the breadth of new sources, enhanced modeling that can be done by nonspecialists, and the sheer speed with which data can be exchanged, to name just a few trends underway. But amid all the excitement, it has also become clear that the profession has a very specific challenge when it comes to data, and this OCLC white paper gets right to the heart of the matter.
Library computing environments have long required repetitive data entry, with users having to rekey the same information in different, non-interoperable systems, wasting their time and causing them aggravation. Medical records management has similar data structures and data input characteristics, exacerbated by laws mandating patient confidentiality and limiting how data elements are handled, stored, and maintained. In the case of library systems, there is no law to hold us back, but we keep soldiering on. OCLC’s report says: “Enough!”
Although reducing repetitive data entry is a top priority, it also matters how we perceive and process data. Social media demonstrates how much time can be saved whenever data are made portable and can travel to wherever they are needed. In the contemporary environment, the holy grail for library systems is to enhance data-driven metrics in all forms—usage reports, statistics, financials, and so on—so they can be repurposed on the fly.
The more portable the data, the better the chances of keeping all pieces of the information journey synchronized too. In the aforementioned report, Gregg Silvis (associate university librarian for information technology and digital initiatives at the University of Delaware) says that “[w]e need more sustainable processes for customizing and synchronizing data sets,” so they don’t fall out of sync. Whenever they do fall out of sync, the workload for our users increases.
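To make the synchronization problem concrete, here is a minimal sketch of detecting records that have drifted apart between two systems. It assumes (purely for illustration) that each system can export its holdings as a simple mapping from a record ID to a descriptive field; the system names and sample records are invented.

```python
# Hypothetical sketch: finding records that have fallen out of sync
# between two library systems. Assumes each system exports its holdings
# as a dict mapping record ID -> descriptive string (an assumption for
# this example, not a real export format).

def find_out_of_sync(system_a, system_b):
    """Return IDs missing from either system, plus IDs whose data differ."""
    only_in_a = sorted(set(system_a) - set(system_b))
    only_in_b = sorted(set(system_b) - set(system_a))
    mismatched = sorted(
        rid for rid in set(system_a) & set(system_b)
        if system_a[rid] != system_b[rid]
    )
    return only_in_a, only_in_b, mismatched

# Invented example data: an ILS export vs. a discovery-layer index.
ils = {
    "b1001": "Journal of X",
    "b1002": "Journal of Y",
    "b1003": "Journal of Z",
}
discovery = {
    "b1001": "Journal of X",
    "b1002": "Journal of Y (old title)",
    "b1004": "Journal of Q",
}

only_a, only_b, mismatched = find_out_of_sync(ils, discovery)
print(only_a)      # records the discovery layer is missing
print(only_b)      # records the ILS is missing
print(mismatched)  # records whose metadata disagree
```

Even a report this simple, run on a schedule, turns an invisible drift problem into a visible worklist—which is the spirit of the “sustainable processes” Silvis calls for.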
Collections: COUNTERing the Evidence
One of the most interesting aspects of “Meeting the E-Resources Challenge” is its assessment of evidence-based collection development. It is not too difficult to find colleagues who dislike this idea—after all, a research library must collect resources that may not receive high use but that must remain accessible for the long run. Even so, data-driven metrics tell us many important things about our collections, and interest in new ways to deploy these data is growing.
For example, actual usage could be linked with gate count statistics to demonstrate the importance of physical space for special collections—and we could use all the help we can get to preserve that space. Collection metrics can also play a proactive role in other arenas, perhaps as advance indicators of how the curriculum is evolving. This would be a high-value use of collections metrics.
There is a lingua franca for usage statistics that has been lurking in the background, at least for some of us: COUNTER (Counting Online Usage of Networked Electronic Resources). It’s a joint library/vendor/publisher venture. In this day and age, all info pros should become familiar with COUNTER as the foundational standard for collections usage statistics. COUNTER-compliant data can be modeled for acquisitions decisions in ways that show concrete evidence of usage activity. Publishers use it as a basis for pricing and licensing. Libraries can also use COUNTER-compliant reports (harvested with open source tools, if need be) to increase the value of library-generated statistics. Expanding COUNTER-compliant data may become a new outreach tool for educating users (not to mention administrators) about library collections.
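As a sketch of what “modeling COUNTER-compliant data for acquisitions decisions” might look like, consider a cost-per-use calculation. The report rows, column names, titles, and subscription costs below are all invented for illustration; real COUNTER reports define their own column layouts.

```python
# Hypothetical sketch: computing cost per use from COUNTER-style usage
# data. The CSV column names, titles, usage totals, and subscription
# costs are invented for this example.
import csv
import io

report = """Title,Reporting_Period_Total
Journal of X,1200
Journal of Y,30
Journal of Z,450
"""

subscription_cost = {  # invented annual costs per title
    "Journal of X": 2400.0,
    "Journal of Y": 1800.0,
    "Journal of Z": 900.0,
}

# Divide each title's annual cost by its reported usage total.
cost_per_use = {}
for row in csv.DictReader(io.StringIO(report)):
    title = row["Title"]
    uses = int(row["Reporting_Period_Total"])
    cost_per_use[title] = subscription_cost[title] / uses

# List the most expensive titles per use first—candidates for review.
for title, cpu in sorted(cost_per_use.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{title}: ${cpu:.2f} per use")
```

A figure like cost per use is exactly the kind of concrete evidence that can anchor a renewal conversation—though, as noted above, low use alone should never be the whole story for a research collection.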
A Better E-Journey
Librarians and vendors have increased the impact of their user research, and this bodes well for a better user experience. But if we want to create an e-journey that does not send users on an arduous “e-voyage,” we would do well to expand the common ground. OCLC’s white paper offers some exciting examples of how library systems can evolve if vendors and front-line service providers work together. These include enhanced interoperability, open access (OA) publishing, better link resolving, and conceptualizing web-scale access solutions. The key to success is a collaborative effort, and if anything, the future will call on all of us to forge deeper collaborations. Fortunately, we are a profession of collaborators, so we already possess the skill—and mettle—to thrive.