A few months ago, I had a discussion with a library science Ph.D. student about a common problem among ESL students: they do a search but then often lack clear criteria for evaluating the results and choosing the best ones. It occurred to me that this is not just a difficulty for ESL students, but for all of us. Making sense of search results, especially when there are a lot of them, is a difficult task.
We’ve grown used to living with search results, muddling through far too many of them for the few gems available, and assuming there are no strategic ways to fix the outcome except to guess a whole new set of words. In fact, many of the problems with results arise from bad searches, so I will look first at strategic searching—the path to cleaner citation lists.
Ask a question
One reason why search results can look like trash is that database users treat the search itself as a means to discover what their mostly undefined goal is going to be. Some people think it's a viable plan to throw out some topic words and then cull through the citations for clues about a direction they might want to take. This, of course, is backward and utterly inefficient.
I’m a firm and unapologetic believer in making searches as intentional as possible. The kitchen sink method (tossing words into an engine and hoping it’s smart enough to make meaning emerge) is a sloppy way to go: it inevitably wastes time and limits effectiveness. My students roll their eyes when I tell them that they need to take the time to define their goal before they search, but that’s always the best route. I value the exercise of sitting down with a student over a search problem (which is really a results problem) and doing some back and forth until a light goes on and the goal suddenly makes sense. The student emerges from the experience with a “that’s it” moment, and once the light goes on, real searching can begin.
Having a clear goal, which should almost always be expressed in a single sentence rather than a monologue, also provides valuable terminology. Pulling search words, or their synonyms, right out of the statement of goal (question or thesis) means the search is going to be locked into the searcher’s goal. The result is a set of citations in which irrelevance is limited and time is saved. I put it to them like this: Would you rather muddle for hours over lots of irrelevant citations after a search or take a few minutes beforehand and plan for success?
Understand the nature of the data
Students may not understand the nature of the data they’re searching. Do they realize that a Google search brings up websites of varying character, while a book search brings up books? Though this seems obvious, an awareness of the kind of resources a search will return often pays off in a major way.
While an open web search might seem easy as pie, an inability to recognize the characteristics of results is a real deficit. Without knowing how to distinguish the many different types of results, finding a solid source in an environment as huge as the web is bound to lead to frustration. Students who seek scholarly articles with Google rather than Google Scholar or a subscription journal database pretty much doom themselves from the start.
Even when they have better search tools, users need to understand the landscape of knowledge. If you want to find a whole book on the narrow motif of “the dock in The Great Gatsby,” you’ll be out of luck. Books are blunt, broad-based instruments. When you need books in your bibliography, you need to find the ones that at least contain your goal, even if the whole book is not focused on it. Articles, on the other hand, rarely survey a broad topic. They are narrow and problem-based. Trying to locate a scholarly article devoted to detailing the whole history of homelessness in the United States will lead to frustration.
A search built on a non-existent question, or a failure to understand the nature of the resources being searched, explains much of the “now what?” problem. A single search error can plague an entire result set and doom the results to irrelevance.