Keeping a Lid on Online Discussions
by Reid Goldsborough

Link-Up Digital

It's no secret that you can find some of the best and the worst in human nature in online discussions, whether in Yahoo Groups or Google Groups discussion lists, company Web forums, personal blogs, online communities, or elsewhere on the Internet.

The Internet can help bring people together, bridging gaps in distance, nationality, race, and religion to create greater knowledge and understanding.

But you also see some of the most malicious and inflammatory words imaginable. You find blatant racism, anti-Semitism, Christianity bashing, homophobia, and more--all designed to pit people against one another. Nasty arguments, known as flame wars, ignite and at times can seem like chimpanzees screeching at one another.

If you're in the position of overseeing an online discussion, it can take the wisdom of Solomon at times to keep things under control.

Some moderators see themselves as lords of mini-fiefdoms, abusing the power that moderation gives them and heavy-handedly ordering people around or warning participants not to do anything to anger them. Successful moderation requires a light touch and a heavy dose of tact, empathy, patience, and self-effacement.

Some of the best advice about online moderating can, naturally, be found online.

As a system administrator at Stanford University, Russ Allbery put together a Frequently Asked Questions (FAQ) archive about Usenet moderation (www.eyrie.org/~eagle/faqs/mod-pitfalls.html), but his advice applies to any online discussion forum. Among his sage words: "Are you able to be infallibly polite? Or at least know when you need to cool off a bit before responding?

"Remember, people expect anything they post to be approved, and you're going to have to reject some of it. They're going to be upset about that. Quite frequently they're going to be angry. Sometimes very angry. You don't get the luxury of losing your temper."

Joel Spolsky, co-founder of Stack Overflow (www.stackoverflow.com), a Web site about computer programming, has this to say about online discussions: "Any public discussion group elicits antisocial behavior from a small number of disruptive users, whether through boredom, maliciousness, or the desire to perpetrate a scam. As soon as you delete their posts, whether they're spam ads for mortgage refinancing or simply off-topic, people like this will log on under a different name to complain about censorship and prattle about their First Amendment rights."

Spolsky continues: "This creates a secondary effect of well-meaning people who didn't see the deleted post quoting Voltaire and complaining about censorship as well, and the downward spiral begins. If this happens too much, it will drive people away."

To avoid such a tragedy of the commons, Spolsky advises moderators to politely, and privately, explain to participants why their posts were inappropriate and, if possible, to move those posts to an off-topic area, away from the main discussion.

Recently, Anil Dash, co-founder of Activate (www.activate.com), a media and technology consulting agency, wrote this in a blog post: "We can post a harmless video of a child's birthday party and be treated to profoundly racist non sequiturs in the comments. We can read about a minor local traffic accident on a newspaper's web site and see vicious personal attacks on the parties involved. A popular blog can write about harmless topics like real estate, restaurants, or sports and see dozens of vitriolic, hate-filled spewings within just a few hours."

Dash believes the main solution is active moderators who monitor all discussion areas, guide the conversation, answer questions, delete comments when appropriate, and, in the worst cases, ban users.

You should groom moderators so they carry out their responsibilities well, and you should have enough of them. "The sites with the best communities have a really low ratio of community members to moderators," says Dash. Also, post your policy about what is and isn't acceptable behavior.

In a blog post advising organizations on how to best manage online communities that spring up at their sites, technology journalist Esther Schindler (www.bitranch.com/about-esther) advises against being too controlling:

"Far too many businesses set up an online community and then insist that they have to control the conversation. Criticisms are removed; questionable language rejected; conflict discouraged. All very Happy-Happy-Joy-Joy. And always a complete flop."

Instead of deleting posts that criticize an organization or its products or services, recognize that if these criticisms aren't allowed at your site, they will simply be posted elsewhere, says Schindler. If they're posted at your site, you have an opportunity to show your customers or clients that you're listening and that you recognize that, like everybody, you're not perfect. Then try to solve the poster's problem.


Reid Goldsborough is a syndicated columnist and author of the book Straight Talk About the Information Superhighway. He can be reached at reidgoldsborough@gmail.com or reidgold.com.

