Vol. 10 No. 9 October 2002 
Information for Sale: My Experience With Google Answers 
by Jessamyn West, Proprietor

When your resume and business card say "freelance librarian," people are often interested in what you do for a living. Finding the right niche in a tough job market can be a challenge. When Google Answers started accepting applications for researchers for its online question-answering service in April, I thought I'd found my match. 

Still in beta at press time, Google Answers is a fee-based question-answering service. If you have a question, you can post it, set a price for it, and sit back and await a response. An answer can currently cost from $2.50 to $200 (originally $4 to $50), with the researcher receiving 75 percent of the amount bid, once the question is answered to the asker's satisfaction. The interface also allows for comments, so that people not approved as researchers, or who may not have the entire answer to a question, can chime in with additional information. This process of knowledge accumulation and storage has been likened to "a paid version of Usenet" without all the spam. Google owns the answers that researchers and commenters provide. Since Google also owns the Deja News Usenet archive, this direction seems like a logical progression for the company. 

How It Works

Once before, in August 2001, Google tried to start an answering service, called Google Questions and Answers. That earlier service had Google staffers e-mailing responses to questions for a flat $3 fee. It was up and functional for about a day; the demand may have overwhelmed Google's resources. In the new model, Google provides the interface, screens the researchers, and deals with the finances and any disputes that arise. It sends out a monthly researcher newsletter and keeps tabs on the answers, but otherwise remains fairly hands-off. Google pays researchers on a schedule tied to how much they have earned: monthly at the most frequent, yearly at the least. According to Google PR, about 500 researchers are working for the service now. I am one of them. 

I'm not sure when I first heard of Google Answers. One day my inbox seemed full of messages from people telling me to check it out. The interface had a fairly bare-bones setup where you could either ask a new question or read past questions. There were only about 300-400 questions in a database with no search features, so I spent a lot of time paging through them, 25 questions at a time, seeing what was there. The questions varied a lot: people needing help finding the name of a painting they had seen, wanting tips for a camping trip, asking for marketing data clearly needed for their jobs. The application process for researchers had three parts: Tell us about yourself, do some sample questions, sign the contract. 

I wrote them a letter stressing my library background and my M.Lib. degree. I discussed my personal reference philosophy and pointed them toward my Weblog for more information. I figured the hiring might be quite competitive, but it turned out to be sort of first come, first served. A week later, I received my sample questions. All were basic research questions but with some small twist, for example: 

"Name the movies in which Elvis plays a character who dies in the movie." 

Well, he only dies in one movie on-screen but is seen wandering off to his death at the end of another. The questions were clearly designed to separate the people who could use Google from the people who could process and synthesize information. The online helper documents were a bit spare and the examples given for "how to answer a question" contained the answers to several sample questions, even some included in the application process. I sent in my answers and waited, well aware that many other people, including friends of mine, were going through the same process. I signed and returned my contract via e-mail, and then waited some more until I received my researcher login. 

In the meantime, the buzz had begun. People began to use the service and review it. Librarians were wondering how the service Google Answers provided differed from what a public library offers. Discussions about the cost and value of information were springing up. After about 3 weeks of application details, I got my credentials and logged in. An interesting feature of the Google interface is that everyone who uses it, researchers and question askers alike, must obtain a login name. All Google "handles" are suffixed with the letters -ga. While only researchers can answer questions, anyone with a login may add their comments to a question; the system does not differentiate between the comments of a researcher and those of a non-researcher. Everyone is just called yournamehere-ga. This was the crux of the system's underlying machinations: reputation. 

Google itself trades on its reputation as the largest, most popular search engine to get the sort of large-scale response it requires to keep an operation like this afloat. Approved researchers must maintain their reputations in order to stay employed. People who pose questions can rank the responses they receive, and if a researcher gets too many low marks, they can be terminated. At some point, Google's application process slowed and then stopped completely. However, the researcher FAQ states that new researchers may be "selected" from the pool of people who comment, if they provide high-quality comments. This system thus encourages both a high degree of accuracy and eloquence on the part of the researchers, while at the same time encouraging other people to do the same work essentially for free, in the hopes that they might someday become researchers themselves. 

While it obviously serves Google Answers' best interests to keep paid researchers answering questions, so that Google can receive 25 percent of the question fee rather than merely a 50-cent posting fee, many questions still wind up being answered via the comments alone. Google Answers encourages synthesizing comments and adding to them in order to answer a question that seems already answered in the comments, but as it also notes in its newsletter, "One of the more common reasons given for a refund request is that a Researcher's answer didn't add substantial value to the comments already posted." Many question-askers also fail to withdraw their questions after finding satisfactory answers in the comments, thus exacerbating the issue. 

How It Worked for Me: The Software

The answering process on the backend is a bit more nuanced. Researchers log in to a special Researcher Center, which includes listings of past answers with the ratings they have received, an invoice listing, and a list of questions needing answers. To answer a question, a researcher must first "lock" it, claiming it for themselves. A question lock used to last for 1 hour, but Google wisely extended it to 2 hours in mid-July. The lock can be renewed by the researcher before the hours are up. When a question is locked, no one else can either answer or comment on it, although the locking researcher may request clarifying information on the question from the person who originally asked it. When a question is unlocked, it is free for any researcher to answer; Google wisely came out against the selling of locks early on in the game. The locking system and accompanying phenomena highlight how Google Answers diverges from non-fee-based information systems. 

When I first began my tenure at Google Answers, there was a plethora of questions and not too many researchers. You could examine questions at your leisure and select the ones that fell in your area of expertise. There was researcher camaraderie, and a few forums sprang up for researcher discussion and skill swapping, one of which unofficially became the primary means of communication among researchers. As the pool of researchers got larger and the pool of questions did not grow as quickly, questions became harder and harder to obtain and lock. Researchers no longer searched as hard for questions in their field of expertise; they would just lock a question first, then see if it was one they could answer. A dichotomy was spawned between researchers who were hoping to make a living and hobbyists like myself, who enjoyed the sport of answer-hunting and wouldn't mind a few bucks on the side. 

The software interface did not seem to be devised with fierce competition in mind. Less scrupulous researchers discovered that they could lock multiple questions simultaneously or devise scripts or bots to lock questions as soon as they became available, giving them a decided advantage. As these issues were raised to the Google Editors, they were dealt with, but the primary sanction remained reputation: don't cause trouble and you can continue to stay in the pool of researchers. 

How It Did and Didn't Work for Me: An Example

Of the questions I picked, some fell in my areas of expertise (technical support issues, historical facts, etc.) and some were just plain odd. One person offered a few bucks for someone to tell him a joke he hadn't heard before. Another wanted psychic advice, or barring that, a humorous reply. People used the service as an impromptu temp agency, offering a few dollars for someone to test drive a Web site or to make business appointments for them. While the service was intended to offer answers to factual questions, people tried to push the envelope any way they could. Debates arose in the forums over what to do if a question was unanswerable or required fee-based online resources, such as proprietary marketing data. Since the answer "There is no answer" can also be a legitimate response to a reference query, people would sometimes try to give informed answers to impossible questions in the hopes that their response would help enough to be considered worth the money. 

In the beginning months of Google Answers, the researcher would get paid for their answer, even if the asker requested a refund. This policy has since changed. Now, if a refund is requested within a specified amount of time, the researcher will not receive payment. The economics of this system discourage people from attempting to answer tough or tricky questions, since the questioner is the final arbiter of whether their question was sufficiently answered. This can sometimes get sticky because, as any librarian can tell you, sometimes patrons do not know what they want or don't understand the nature of reference sources. Giving good customer service in these instances becomes a very delicate operation, especially when the financial nature of the arrangement makes the customer believe they are always right. 

I had a particularly jarring episode that concerned the exact citation of a quotation commonly attributed to one author. When my extensive research netted no corroboration on the provenance of this quotation, other than the fact that Bartlett's had dropped it from its later editions, I summed up my research to the questioner with my annotated opinion that the author of the quotation could not be verified. I received a response via a "clarify this answer" feature essentially saying, "Attributed means he said it, and I want to know when!" Is the customer always right when the customer misunderstands vocabulary words? 

This particular episode didn't end there. My answers were then posted to a public mailing list, where the questioner tried to determine if they were worth the $4 he paid for them. Since my Google handle is a rough approximation of my name, this information got back to me and caused a lively exchange between me and the now-not-so-anonymous question-asker. I got a one-star rating out of five for that question. 

The Researchers at Google [Google always spells it with a capital R] are supposed to be both highly experienced and completely anonymous. Any personalizing information placed into a comment or answer is grounds for deletion of that information and possible suspension of the researcher. So I, as a Google Researcher, can say that I have a degree in Library Science, but there is no way for this information to be verified. Once a question is posted, it can be answered by any available researcher, or commented on by anyone. There is no way to direct a specific question to a specific researcher without assent from the entire community. Google Answers editors have begun attempting to discourage researchers from offering hasty, incomplete answers so that they can get to more questions, but scarcity of questions and competition for them among a larger, more anonymous pool of researchers have made that difficult to enforce. 

A researcher's specific credentials may help strengthen the answer to a question in their stated field of expertise, but they do nothing to further the researcher's standing at Google Answers, beyond keeping them employed. Google Answers does present a Researcher of the Week award, which confers a small amount of honor but no preferential treatment. In this way, the assembled experiences and abilities of the researcher pool add to Google's reputation (and, indirectly, I suppose, to more work for the pool of researchers), but there's no seniority, no hierarchy, and no tangible reward for talent other than fame. The most money still goes to the researcher who answers the most high-paying questions, and this often breaks down to the researcher who is either quickest on the mouse or least principled about the weak points in the site mechanics. 

The Economics of Selling Information

This may sound like an overgeneralization, but it seems that when you pay to have humans answer your questions, you often talk to so-called experts, and when you get answers for free, you talk to a librarian, a random stranger, or an open source aficionado. The main difference between the Google Answers model and the public/academic library model appears to be that when a librarian gives a patron a response to a reference query, the patron tends not to argue with her. If she tells the patron the question has no definitive answer, that response is more likely to be taken as fact rather than as a personal failing on the librarian's part. The fact that all library patrons share the librarians' time tends to encourage a polite acceptance that each patron's specific question is one of many needing to be answered. 

In the Google Answers arena, I have seen researchers insulted, sworn at, and otherwise degraded by people not happy with the responses they received, when you might think that simply not paying for the answer would be reprobation enough. The Google Answers standards of conduct include politeness and friendliness at all times and prohibit discussing Google policies or pricing with question askers. Catering and kowtowing to upset customers, at the expense of explaining to them that their question was priced too low or phrased too poorly, became a trade-off I had difficulty making. 

While I enjoyed my time at Google Answers, I was soured by people asking $4 questions and not being satisfied with the depth of the responses they received, responses that had clearly taken a fair amount of the researcher's time. One of the strict rules at Google Answers forbids discussing the amount of money offered for a question. If the questioner offers too little, the researcher should simply refuse to answer the question. Of course, in the competition for scarce questions, this rarely happened, except in extreme instances. It seemed indelicate or rude to point out to a questioner that if they had placed a higher price on a response, they might have gotten better research and more time from the researcher. Is the customer always right if they want skilled research for $4 an hour? This "customer is always right" philosophy that pervades marketplace interactions seemed to override personal senses of reasonableness in many cases. Google Answers is currently working on guidelines for what kinds of questions most appropriately fit into the various price ranges. Researchers will welcome this tool. 

The fact that there are people willing to answer a potentially difficult question for $1.87 does not mean that it is a good idea to encourage people to expect more research for less money, especially when they are supposedly interacting with experts. The Google Answers system prides itself on having talented workers, and yet at the same time it encourages (though does not force) them to frequently work for a fraction of the price that degreed, experienced experts could earn for the same work. While determining the free market value of this sort of information retrieval and presentation (most of which is available online, for free) is tricky, my experience working for Google Answers made me feel more often like I was being paid to do Google searches that the questioners didn't have the time or the skill to do, rather than using my research background and abilities to turn facts into actual knowledge. 

Note: Google Answers is a beta product. Therefore, policies and procedures that were accurate at the time of writing may have since changed. 

Jessamyn West is a freelance librarian and researcher. She runs the Web site and has made $176 working for Google Answers over 4-6 weeks. You can reach her at
© 2002