ONLINE Magazine
VOLUME 26 • NUMBER 5 • September/October 2002
Conducting User Surveys:
An Ongoing Information Imperative
by George R. Plosker
User surveys are often mentioned when information professionals strategize on expanding the reach and recognition of library services. However, user surveys tend to "slide off the plate" as information professionals go about their busy schedules. Yet remaining relevant to users is critical to library survival. What better way both to ascertain what users desire and to inform them of capabilities they are unaware of or don't use? The practical tips and suggestions that follow will hopefully whet your appetite for doing user surveys. Ideas on relating survey efforts to the business or academic planning cycle, working with functional departments on goal setting and project work, and segmenting the user base should make the process more palatable.

Given the new vernacular in libraryland—browsers, Web-speed, spidering, search engines, and Web directories—Internet concepts have become relevant to all front line information professionals. More importantly, user behavior and expectations have gone through extensive changes due to the altered information environment. Users expect 24/7 availability and the convenience of access from their own homes or offices.


In ARL libraries, according to Scott Carlson's article "The Deserted Library," published in the November 2001 issue of The Chronicle of Higher Education (pp. A35-A38), actual visits to the reference desk are clearly on a downward trend, while remote access to library subscription databases has increased. The old model of one-at-a-time queries, usually at or through the reference desk, leading to one-off needs fulfillment, is no longer adequate. With more than 175 million people on the Web and reference traffic declining, librarians should be determining what percent of their users come to the reference desk. How can populations of potential users be reached and made aware of the added-value services the library provides?

Understanding how these trends impact library users and library environments comes from collecting relevant and accurate data. While it is possible to intuit changes in user needs and expectations via simple observation of trends and behavior, it is always preferable to take a more scientific approach to determining the needs of users, both existing library users and potential groups of users who are not currently using library services. Remember, if you can't measure it, you can't improve it!


Several strategies are available to information professionals to determine key needs and obtain input on preferred solutions. The reference interview remains a valid avenue to determine what a user needs. Librarians might ask additional questions at the conclusion of the traditional reference interview to see if key users have ideas on enhancing and extending user services. Do they, or would they, use online databases if remote access were provided? Would e-mail reference service be useful? What about chat technologies—would patrons use a chat function to communicate with professional reference staff remotely?

Many tactics can be utilized to get closer to your user base. A classic approach is to segment your users. This should be a comfortable exercise for information professionals, as you are really "cataloging" your patrons into groups that share similar perspectives, motivations for using information, and, therefore, needs. Once these patrons are defined into groups, work to understand each constituency's goals and interests thoroughly. You can accomplish this by joining project and/or community teams made up of these groups. Investigate their planning cycles, volunteer to consult on needs related to a key initiative, and determine what their "hot buttons" are. Once you have the overview perspective that comes from being closer to the group and its objectives, you can even move to proactively anticipate needs by using alerting strategies. Surveys and audits become increasingly effective when targeted to smaller groups, becoming more focused and more precise and elevating the dialogue to actionable specifics.

Suzan A. Brown, in her forward-thinking article, "Marketing the Corporate Information Center for Success" (ONLINE, July/August 1997, pp. 74-79) provides specific illustrations of this approach in a corporate library environment. Brown segments her corporate users into administrative management, R&D practitioners, and scientists and engineers. Administrative management is concerned with the organization's overall productivity and viability, which means they may be interested in competitor activities, new business opportunities, and regulatory requirements. The R&D staff is concerned with recent scientific developments and technical information and is likely to have a special interest in patents. Scientists and engineers will be primarily interested in project-related information that is narrowly focused on previous research, but they will be open to serendipity.


In a public or academic environment, you can divide your users vertically by academic discipline or horizontally by grade or age level. Administrative units can be considered a separate segment, especially in the public library world where administrative units are often the funding body. Other public library constituencies include student populations, adult and distance learners, children and young adults, or ESL populations.

All professionals know that getting and staying close to your users/clients takes effort and time. While all of the tactics listed above are viable, the use of surveys and information audits will provide a framework and foundation for all other approaches. Why conduct surveys and audits? The survey or audit will do all of the following:

  • Reveal service issues and opportunities
  • Identify (unmet) needs
  • Have an implicit marketing function
  • Utilize limited resources in a more efficient manner
  • Obtain input for strategic planning

Brown comments, "...dealing with customer complaints revealed by the survey is imperative. According to a survey by American Airlines, one unhappy customer tells nine to 13 other customers about his or her bad experience, and only 4 percent of unhappy customers complain to the company. For every person who complains, 24 other unhappy customers do not communicate at all, and 75 percent to 90 percent of unhappy customers who say nothing will never do business with the company again. However, the study also showed that 82 percent to 95 percent of those who complain will come back if their problem is resolved quickly."

Brown also stresses the outreach aspects of doing a survey. "Although information needs and current customer satisfaction can be assessed on a one-on-one basis as customers visit the information center, this method omits all those potential customers who never think about consulting the information center. To reach potential customers, information centers may consider conducting a survey by mail, e-mail, or through an employee newsletter. The questionnaire can be designed to assess the needs of both current and potential customers and should address such issues as satisfaction with current services, other outside services used, perceived needs for other kinds of information, and how the information center can provide better service in general."


Hopefully you are now convinced that doing a user survey or information audit is a good idea. How do you get started? Here are several simple steps that will guide you in obtaining the information needed to improve the effectiveness of your library.

First, determine what topics need surveying. Perhaps you have recently implemented a new service and want to check on user reaction to the new database. A targeted approach is often preferable to attempting to survey all aspects of library service. This will keep the survey shorter and more manageable. It will also begin to orient your users to the idea of satisfaction surveys that can be built upon. Doing a 20-page survey with hundreds of questions is probably not a good place to start!

Once the topic(s) of the survey are defined, it is useful to do an informal gathering of information. Pre-test your survey questions by interviewing several core library users to see what they think. This interaction will bring forth the specifics of what the users are or might be interested in and will lead directly to the creation of the survey questions.

In addition to working with individual library users, you should conduct small "focus groups"/brainstorming sessions with key constituencies. During these sessions, you may wish to focus on these types of issues:

  • What do people perceive the role of the library to be?
  • How successful do they feel the library is in fulfilling that role?
  • How is the external/user environment changing or impacting the role of the library?
  • What are needs or issues specific to their group that the library can help with?
  • What changes do they anticipate for the future?

Take careful note of all informal information gathering so that all input can be worked into the survey design. Other data points to help define the survey include an analysis of the existing library image and environment. An examination of recent internal correspondence will reveal internal directions and issues. Looking at patron communications, outreach programs, and library publications will also provide useful perspective. Following this review, determine what trends or themes emerge.


At this point, you can move to a complete review of your informal data. It is useful to classify the data to help focus on the issues. Classification by source or constituency, by aspects of library operation, and by trends and impressions will help shape your survey questions.

User surveys can utilize several methods of data collection. There are essentially two ways to go: verbal or written surveys.

Verbal surveys can be done over the phone and often make use of trained interviewers. These are particularly well suited for in-depth input and can provide data quickly. If you are looking for feedback on specific programs, publications, or events, the quick phone survey will work.

Written surveys can also take many forms. They range from the mini-survey or "one-pager" consisting of a few easily answered questions, up to comprehensive, multiple-issue surveys. Your approach should be based on what you are trying to accomplish. What information would you like to obtain? Reviewing your informal data will point you in the right direction. Remember that staying in touch with your users is an ongoing process. You can start with a simple, single-issue survey and move on to more comprehensive surveys over time.


The first design step is to determine the mechanics and logistics of the survey. You want to answer questions such as these:

  • Who will be surveyed?
  • How will you format your questions?
  • How can you make your questions clear and relevant?
  • How will you measure validity?
  • How many questions?
  • How will you print and distribute? (Web?)
  • How will you collect completed surveys?
  • How will you calculate and distribute results?
  • How will the results be used?

The single-issue survey has a narrow focus. For example, a Silicon Valley special library surveyed sci-tech personnel only on the potential of e-books. The entire survey consisted of a few demographic questions and asked about thoughts on and potential uses of e-books. Going on the philosophy that shorter is better (10 minutes or less to complete), the survey had a 90 percent return rate. As a bonus, and despite the deliberately narrow scope of the survey, the library also received useful feedback on broader issues.

On the other hand, I have also seen a 15-page survey designed by an outside consulting organization that covered virtually all library issues and services. The issues of length and time to complete are important ones. Generally, needing more than 30 minutes to complete the survey is a negative, regardless of nuances in your environment.


Designing the questionnaire is the next step. Keeping your target audience in mind, how can you encourage users to complete and return the survey? (Hint: Food works.) How can you ensure that the questions are answered properly? A carefully stated cover sheet can help here. The cover sheet should explain what is expected, provide an overview of content, and describe the style of the questions.

To get at both simple and more complex issues, use open- and close-ended questions. Open-ended questions are essay-type questions that allow the user to provide more in-depth input and to express opinions and views.

However, open-ended questions require more time and effort to formulate. They certainly take more time to answer and can overburden your audience. Open-ended questions are also more difficult to quantify or tabulate. It is recommended that no more than 10 percent of total survey questions be open-ended. Put open-ended questions towards the end of your survey instrument.

Close-ended questions are "multiple choice." They are easy to answer and to tabulate. However, close-ended questions limit input and do not allow for probing into causes of problems or potential solutions.

Finally, the form should be pleasing to the eye and uncluttered. It is always a good idea to test your survey form with a small group and evaluate their comments and results. Make revisions based on their feedback.

While your process will shape and identify the specific questions you will ask, there are typical components of user surveys. There is often a demographic component that adds value by defining context. These questions typically include:

  • Department, job title, goals, duties
  • Frequency of use
  • What do you use? (document types, media: print, electronic, etc.)
  • Why do you use it? (project, specific search, professional reading, school assignment)

Other questions may get at the logistics or methodologies of using the library. Do patrons come to the library in person? Do users call in, send faxes or e-mails, or use remote service(s)? You may also want to know what other information sources are used.

"Single-issue" surveys literally can run the gamut of today's library services and include the evaluation of:

  • Reference service
  • Collection
  • Specific product
  • Online search
  • Specific market research report
  • Facilities
  • Potential new service(s)

Questions should be defined in terms of what is being measured or evaluated. Library surveys often attempt to obtain input on:

  • Awareness of library services
  • Relevance/pertinence of results
  • Ease of access
  • Quality of reference service
  • Effectiveness of outreach efforts
  • Responsiveness to requests
  • What services are being utilized
  • Unmet needs
  • Why services are not being used
  • Training needs
  • What documents/document types are most important
  • What access tools are most important?


As is often mentioned in today's budget-conscious era, measuring value is the key issue in evaluating library services. While it is difficult to quantify the impact of library services, it is becoming increasingly important to do so. Some of the best measures seen to date attempt to determine whether library use or access to information helped to:

  • Save time or money
  • Generate new ideas or insight
  • Increase depth of knowledge
  • Uncover or confirm details or facts
  • Expand perspective or add dimension
  • Increase confidence in decision making, including tracking the type, quantity, and importance of such decisions
  • Obtain more budget dollars or headcount
  • Improve process
  • Eliminate redundancy
  • Lead to a discovery
  • Limit competitive surprise
  • Meet individual or organizational goals

Outsell, an industry leader in the valuation of corporate libraries, describes user time saved as the key return-on-investment (ROI) metric in its article "Data Point: The Value of Corporate Libraries" (Outsell e-briefs, February 22, 2002), stating that users "save an average of 2 hours and 45 minutes each time they interact with the library. Multiplying this 2:45 by the number of library uses and knowledge workers' average salaries, the benefits become tangible and compelling. Outsell has seen the resulting value rise into the millions of dollars for individual organizations, easily covering the library's budget. Outsell analyst Roger Strouse notes that this data, when coupled with additional quantitative and qualitative measures, shows that special libraries can and do pull their own weight."
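Outsell's arithmetic is simple enough to sketch. The 2-hours-and-45-minutes figure comes from the e-brief quoted above; the annual usage count and hourly salary below are hypothetical illustrations, not Outsell data.

```python
# Sketch of the Outsell time-saved ROI arithmetic.
# HOURS_SAVED_PER_USE is the "2 hours and 45 minutes" per interaction
# cited by Outsell; the inputs in the example call are hypothetical.

HOURS_SAVED_PER_USE = 2.75

def annual_library_value(uses_per_year: int, avg_hourly_salary: float) -> float:
    """Dollar value of knowledge-worker time saved by the library in a year."""
    return uses_per_year * HOURS_SAVED_PER_USE * avg_hourly_salary

# Example: 10,000 library uses per year at a $50/hour average salary
value = annual_library_value(10_000, 50.0)
print(f"${value:,.0f}")  # $1,375,000 -- well into the millions Outsell describes
```

Even modest assumptions put the annual value above most library budgets, which is precisely the point of the metric.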


Assuming that your survey has been a success, it is now time to tabulate your data. Set aside focused time and a quiet place; this is "head-down" work, and you certainly don't want to lose those valuable survey forms! Once tabulated, checkmark each completed survey clearly. Always review calculations with a colleague to make sure you have not made any mathematical errors.

When reporting and distributing your data, both of which are recommended, provide raw data as well as percentages, such as overall and per-question response rates. Charts and graphs will add to the impact and readability of the results. Open-ended questions can be reported with excerpts and a count of similar comments.
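For a small survey, the tabulation described above can be done in a spreadsheet or a few lines of script. The sketch below assumes close-ended answers are recorded as simple strings per question; the question names and answer values are hypothetical.

```python
from collections import Counter

# A minimal tabulation sketch: overall response rate, then raw counts
# and percentages per close-ended question. All data here is illustrative.

responses = [
    {"q1_satisfaction": "satisfied", "q2_uses_remote_access": "yes"},
    {"q1_satisfaction": "very satisfied", "q2_uses_remote_access": "no"},
    {"q1_satisfaction": "satisfied", "q2_uses_remote_access": "yes"},
]
surveys_distributed = 5

# Overall response rate
response_rate = len(responses) / surveys_distributed
print(f"Response rate: {response_rate:.0%}")

# Raw counts and percentages for each question
for question in ("q1_satisfaction", "q2_uses_remote_access"):
    counts = Counter(r[question] for r in responses)
    for answer, n in counts.most_common():
        print(f"{question}: {answer} = {n} ({n / len(responses):.0%})")
```

Reporting both the raw count and the percentage, as the article recommends, guards against percentages that look impressive but rest on a handful of responses.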

Work your results into your strategic plans. You can utilize the data as a management tool in terms of defining approaches to meet defined needs. Use the data for marketing and public relations purposes, including briefings to senior management or local governing bodies. Consider releasing the results of your survey to local media, especially for public libraries. Write about your experience for the professional literature. Let others in the profession learn from what you've done.

George Plosker is vice president, Content Support and Training, for Thomson-Gale.
