ONLINE SEARCHER: Information Discovery, Technology, Strategies


The Social Life of Noise
By David Weinberger
July/August 2019 Issue

If you just bought Bob Woodward’s book Fear, about the first couple of years of the Trump administration, Amazon will then recommend Michael Wolff’s Fire and Fury, James Comey’s A Higher Loyalty, Omarosa Manigault Newman’s Unhinged, and Andrew McCabe’s The Threat. All four excoriate Donald Trump.

You can see why this seems to make sense from Amazon’s point of view: If you bought a fiercely anti-Trump book yesterday, you’re not that likely to buy a pro-Trump book today. And yet, it also seems wrong: Are you really all that likely to put down one 449-page book about how awful (or wonderful) Trump is and immediately pick up another? And even if you are, is that the reading behavior we, as a society, want to encourage, no matter what your politics?

Here’s a hint that the answer to this last question is no: Amazon’s list is not what a librarian is likely to suggest to you. Sure, if a reader went up to a librarian and said, “I just read Fear and it was great. What do you recommend?” the librarian may well suggest some of the books on Amazon’s list. But the librarian is also likely to then veer off course: “You know, you might like Garry Wills’s Nixon Agonistes, a biography of another problematic president that really takes you inside his mind.” Or What’s the Matter with Kansas?, because in 2004 it explained to Democrats the appeal of the sort of populism that seems to have swept Trump into office. Whatever the recommendations, the librarian is likely to include works that stretch the user’s viewpoint. They’re works that are not so different that the reader will reject them out of hand—an anti-Trump reader is unlikely to take the suggestion to next read Trump’s 2007 self-help book Think Big and Kick Ass—but that do not simply reinforce the reader’s current ideas.

Librarians often are good at this because they know what’s available, and they feel an obligation to promote social good. Amazon knows more about the available books but exhibits no real concern for nudging us out of our most comfortable zones. Its algorithms seem to be aimed at making the easiest possible sale by recommending the lowest-hanging fruit. You read an anti-Trump book? Here, have another. You read a science-fiction book about time travel? Here’s another six. You read a book about camping? Here’s another 12, plus 30 ads for camping equipment.

Amazon, of course, could adjust the AI algorithms generating its recommendations so that for every four it takes from the top of the list of Books We’re Most Likely to Buy, it takes one from further down the list. It could look among books bought by people who bought Fear but whose purchase history otherwise is not much like yours. It could look for books infrequently purchased by purchasers of Fear but that are ranked highly by those readers. If Amazon wanted to try to expand the viewpoints of its users, it could apply some of its amazing computer scientists to the task.
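To make that adjustment concrete, here is a minimal sketch in Python of the four-from-the-top, one-from-further-down idea, assuming we already have candidate books scored by predicted purchase likelihood and by how highly they are rated by purchasers whose histories differ from the reader’s. Everything in it is hypothetical: the Candidate fields, the diversified_recommendations function, and the toy numbers are illustrative stand-ins, not Amazon’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    buy_likelihood: float   # hypothetical: model's estimate that this reader will buy the book
    outlier_rating: float   # hypothetical: average rating from buyers whose histories differ from this reader's

def diversified_recommendations(candidates, n=10, explore_every=5):
    """Fill most slots from the top of the 'most likely to buy' list, but make
    every fifth slot an exploratory pick: a lower-likelihood book that readers
    unlike this one nevertheless rated highly (four from the top, one from
    further down the list)."""
    by_likelihood = sorted(candidates, key=lambda c: c.buy_likelihood, reverse=True)

    # Exploration pool: skip the obvious easiest sales, then rank what remains
    # by how well it was received by dissimilar purchasers.
    tail = by_likelihood[len(by_likelihood) // 4:]
    exploratory = sorted(tail, key=lambda c: c.outlier_rating, reverse=True)

    picks, used = [], set()
    safe, explore = iter(by_likelihood), iter(exploratory)
    for slot in range(1, n + 1):
        pool = explore if slot % explore_every == 0 else safe
        for candidate in pool:
            if candidate.title not in used:
                picks.append(candidate)
                used.add(candidate.title)
                break
    return picks

if __name__ == "__main__":
    # Toy data, purely illustrative.
    shelf = [
        Candidate("Fire and Fury", 0.92, 3.1),
        Candidate("A Higher Loyalty", 0.88, 3.3),
        Candidate("Unhinged", 0.81, 2.9),
        Candidate("The Threat", 0.79, 3.0),
        Candidate("Nixon Agonistes", 0.22, 4.6),
        Candidate("What's the Matter with Kansas?", 0.18, 4.4),
    ]
    for pick in diversified_recommendations(shelf, n=5):
        print(pick.title)
```

The point of the sketch is only that the exploratory slot is a tunable parameter: how often it appears, and what signal fills it, is a business and social choice rather than a technical constraint.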

But Amazon has not set that social goal as a priority, presumably because books that are not in a customer’s belief sweet spot are harder to sell; that’s what makes the sweet spot so sweet.

To Amazon, recommendations of books that searchers are less likely to buy represent a lost opportunity. A computer scientist might call them “noise”—from the customer’s point of view. A librarian might agree. But librarians—unfortunately stereotyped as the enforcers of silence—understand that such noise is essential to curiosity, sociality, and democracy.

For example, open stacks are a very noisy way of presenting books for consideration. While there is a well-worked-out order that dictates each book’s precise placement, a user browsing the stacks is searching for surprises. If she knew exactly what she was looking for, she’d be on a search-and-retrieve mission, not browsing. The vast majority of the books will be irrelevant to her. They are noise. But that noise is essential to the discovery of interests we didn’t know we had.

This distinction holds in the digital world as well. Searching is a transaction: “In exchange for my keywords I expect an ordered list of relevant responses.” Browsing is a stroll through rich chaos.

Noise is essential to belief. When we are too firmly entrenched in our beliefs, the sounds coming from those who disagree can be just noise. We can’t even make sense of why anyone would believe such nonsense. Librarians are often quite good at finding the books (or journals, DVDs, podcasts …) that we would otherwise tune out.

Getting us to tune in requires finding works that are just different enough that what at first might seem like noise resolves itself into something meaningful, just as when we’re introduced to a new type of music, it can take a while to find and appreciate the musical elements.

Sometimes we fail to find the music. But sometimes we succeed: The recommended book opens our eyes to a new thought, a new aesthetic, a new way of thinking about an issue we thought was settled. The space of a library is in that sense an invitation. A library that consisted of the single perfect book for us would be far less important than a library filled mainly with books we’ll never want to read.

Now, this doesn’t mean that a noisy library will always beat machine learning’s recommendations. Machine learning has the advantage of knowing more than can fit in the head of even the wisest human librarian. Its algorithms can be tuned so that they recommend works that are usefully inefficient, based on signals that expose us to attractive differences that would otherwise escape the notice of human minds.

But in so doing, machine learning systems will be confirming what libraries have known all along: If we’re going to live with one another successfully, we need lots of noise.


David Weinberger is a senior researcher at Harvard’s Berkman Klein Center for Internet & Society. His new book, Everyday Chaos: Technology, Complexity, and How We’re Thriving in a New World of Possibility, is published by Harvard University Press (everydaychaosbook.com).

 

Comments? Contact the editors at editors@onlinesearcher.net
