Computers in Libraries
Vol. 44 No. 3 — April 2024

INFOLIT LAND

Finding Stuff Out: The Curse on Student Research
by William Badke


Blame Google. Blame AI. Even blame the information literacy movement. These days, we are cursed with an abundance of knowledge-seeking, and it’s detracting from serious thought. Every semester, I engage with a new flock of credit research course students, undergraduate and graduate, who are well-used to doing “research” and who see no need for further instruction. They rail against having to take my course, as this recent graduate student admitted in an email:

“I just wanted to reach out and apologize. I didn’t speak well of this course and referred to it with contempt. I spoke of it with dread to others. It was the ‘checkbox’ I had to complete in order to pursue my degree.”

This student changed his mind as the course progressed, but so many who hold such a view have a false conception of the research task, one that denies them the chance to become critical thinkers when they search for information.

To put it simply, Google and much of current generative AI are primarily intended to deliver packaged information around a topic. They answer the “what” without coming close to the “why,” the “how,” and the “should.” And this is in total accord with what many students actually believe research to be: a quest to find out stuff and report on it. What is Google for? To look things up and get an answer. What is AI for? To explain a topic eloquently, even to pose a problem, while rarely, if ever, doing analysis to choose among options or find a best practice. Google and AI are information vendors, not tools for critical thinking.

Even information literacy instruction, with its focus on one-shot sessions usually built around search technique, gives the impression that the main task is to use great skills to do a better job of finding stuff out. Our revered ACRL Framework for Information Literacy for Higher Education, in its “Research as Inquiry” concept, falls short of genuine critical thinking. Take this statement: “The spectrum of inquiry ranges from asking simple questions that depend upon basic recapitulation of knowledge to increasingly sophisticated abilities to refine research questions, use more advanced research methods, and explore more diverse disciplinary perspectives” (ala.org/acrl/standards/ilframework#inquiry).

Did you see it? Inquiry ranges from simple to sophisticated but, even for the latter, the goal goes no further than exploring “more diverse disciplinary perspectives.” Where is the closure? Where is the problem-solving? We are locked into knowledge-diversity exploration. Recognizing diverse perspectives is great, but it falls short of offering a solution to a problem.

I asked ChatGPT, “What is the best psychotherapy for anxiety disorders?” It gave me seven therapies but refused to suggest which was best. In fact, these days, any proposed solution to a problem can be viewed as dogmatism, indicating a refusal to allow for diverse views. Identifying options, however, is still in the territory of finding stuff out. It does not go as far as evaluating the evidence and coming to some kind of conclusion.

It’s about the purpose of information

Think of this. I go to the hardware store because I intend to buy a shovel. Why do I want a shovel? Because I’m going to put a frame around it, hang it on my wall, and show it off to my friends? Probably not. We don’t buy a shovel for the sake of admiring a shovel. We buy it because we want to dig. It’s a tool, not a piece of wall art to possess.

When “research” is seen as the task of finding information, then information is the goal: “I want to find this out so I can have it. I want to weigh options because I want to know what they are.” But information as a goal is limited and contradicts the way we use information in practice every day. I look up the hours for a business because I want to know when I can shop. I seek out someone’s Facebook page because I want to send that person a message.

Information is rarely a goal. It is a tool to do things with. Compiling a detailed account of the plight of the polar bear without ever suggesting a best practice to do something about that plight makes my “research” dead on arrival.

The dreaded research question

A student sent me a “research question” that I critiqued because it lacked a way forward. The question was: “What are the emotional and psychological effects of cyberbullying on adolescents, and how do they cope with these effects?” Beyond the obvious fact that this dual question needed to be singular, I pointed out that he was simply discovering the facts about the problem without seeking a way to move to a proposed solution.

He wrote back, “I’m surprised by the task’s difficulty and my lack of comprehension, but I appreciate the honest feedback.” His revised question was, “How can parents and schools collaborate to protect adolescents from the emotional impact of cyberbullying?” Bingo. The statement of the problem (an exercise in finding stuff out) had turned into an action item (using found information as a tool to solve a problem). What is more, the project now involved seeking out various solutions, evaluating them, and coming up with a proposal to address the problem. This is also known as critical thinking.

The finding stuff out problem persists. Take this graduate student: “For the longest time, I have been summarizing or synthesizing others’ thoughts and thinking that is research.” Research questions are indeed difficult. Finding out has gripped so much of the educational consciousness that problem-solving seems to be an alien concept.

But let’s not be dogmatic

We love diversity these days. We love the scholarly conversation that airs multiple ideas and debates them. But we are hesitant to bring about closure. Why? Because to choose one option among many means that we must dismiss ideas that deserve a hearing and have every right to belong to our ongoing understanding. Declaring, “You say this, and she says that, but what is the right answer?” seems unscholarly, since scholarship is the airing of views.

But this is where critical thinking shines. If you say measles vaccines cause autism and I can show you that the research behind your idea is fraudulent, I am not merely dismissing your view. I am answering it with evidence. To claim that my action limits diverse options and stifles the free exercise of ideas is to fly in the face of science. Just because I discover all the alternate views about something does not mean that I shouldn’t use evidence to dismiss some and affirm the most evidence-based option(s). This is not dogmatism. It’s scholarship.

Conclusions, of course, need to be held relatively loosely. Perhaps not all the evidence is in, or new findings will contradict what we once believed. Critical thinking rejects dogmatism even in the conclusions we make. But if we refuse to make conclusions, we are refusing evaluation, a key plank of critical thinking.

Research questions are hard

Research is not about finding stuff out, though that might happen in the process. It is not about laying out the options without using evidence to make a choice. It is about problem-solving, which is at the heart of critical thinking.

My graduate student’s comment, “I’m surprised by the task’s difficulty and my lack of comprehension,” is not unusual. To turn finding stuff out into a problem-based research question is a major learning challenge for most students. The concept—Information is a tool, not a goal—can help, but getting my students over the research question threshold requires more. I tell them that if they can find the answer by looking it up, it is not a real research question. I stress that a research question must pose a problem for which a conclusion is possible only after some effort.

But mostly I troubleshoot actual attempts at research questions. Here are some real-life examples.

Proposed question: What are the gender differences in the development of personalities?

My response: This is something you could look up and find an answer to without much analysis. It is also too broad. Try something like, “What is the best explanation for the higher prevalence of borderline personality disorder among women than among men?”

Proposed question: What are the most common mental health challenges faced by adolescents during the COVID-19 pandemic, and do these challenges compare differently to adult experiences?

My response: This is something you can look up and does not pose a problem to solve. Try: How should schools address the post-COVID mental health challenges of adolescents?

My proposed questions engage information as a tool to solve a problem rather than falling into the finding stuff out notion of research that is so prevalent in student research today. Certainly, students need to know stuff, but just possessing it is like displaying your new shovel on the wall. “Now I know something” is a far cry from, “Here is my concluding response to …”

Does it matter?

Maybe this is too fine a point. Students exploring topics to find stuff out are still using databases. They still have to find the best resources. They are learning something. Maybe it’s enough. Possibly it is, until their future boss says, “We have an opportunity here. I want you to explore the options and come up with reasons why one option is better than the rest.” Simply to identify the options is no longer nearly enough. The boss wants a conclusion. He wants your information to be a tool.

I fear that many of us have become sloppy. Teaching students to develop good, problem-based research questions that go beyond the known to advance a solution, a best practice, or a solid conclusion is clearly what real higher education is about. Take my own institution’s student learning outcome: “Cognitive complexity: Skills including critical and creative thinking, quantitative and qualitative reasoning, communication, research, and information literacy.” (Your institution probably has something similar.) How do we achieve even a portion of those aspirations without teaching students how to move beyond finding stuff out?

As information literacy instructors, it is our task to guide students into a deeper mode of thinking: not looking things up on Google, but treating information as a means to advance our understanding, solve problems, and make a better world.

William Badke
(badke@twu.ca) is associate librarian at Trinity Western University and the author of Research Strategies: Finding Your Way Through the Information Fog, 7th Edition (iUniverse.com, 2021).

Comments? Email Marydee Ojala (marydee@xmission.com), editor, Online Searcher.