Information Today
Volume 18, Issue 1 — January 2001
IT Feature
What’s Ahead for 2001?
Predictions from 13 information industry sages

As we do every year at this time, Information Today has asked a group of information industry movers and shakers about the present state of the industry, as well as where they think it’s going. The contributions below range from the serious to the not-so, but we’re sure you’ll find them all thought-provoking.  —The Editors

Timothy M. Andrews • CEO, Intertec • President and CEO, IndustryClick

2001 will be the year in which trade publishing finally comes to terms with the fundamental question: How can we establish ourselves online without destroying the value of our existing franchises? 

The Web is already firmly established on the broadcast model, with open access to users paid for by advertisers. Trade publishers are already familiar with this model. Advertising has always been our principal revenue component. 

For the people who pay the bill (advertisers), the online environment creates the opportunity for target marketing—the power to conduct interactive exchanges with a “market of one.” Publishers like Intertec understand this concept. We have been marketing targeted publications to niche interest groups for years. The online environment enables us to deliver an individually customized version of a targeted publication to each subscriber. 

The real breakthrough will be the creation of new online franchises that fully leverage traditional content. We need to treat the Internet as a new, more versatile delivery channel that expands an existing franchise. As banks and brokerage houses have discovered, businesses have to give customers channel choice, or say goodbye. 

The great advantage for trade publishers is that online delivery doesn’t really change our basic revenue model. For a long time, we have been providing “endemic advertising,” or advertising designed to generate leads and sales. As we generate more sales leads through the online channel, we can earn more revenue. 

Our challenge is to integrate new online media fully and seamlessly with our existing editorial, sales, and marketing efforts. While we are giving our subscribers a new delivery channel, we can also provide additional content through the Web site (with all its search and exchange capabilities), Webcasts, online trade shows, and on-demand programming. These new capabilities allow us to create complete vertical online communities for individual industries. Professionals will use the new communities for all their information needs, such as accessing content, training, finding career opportunities, and networking with peers. 

Vertical communities will draw new subscribers to our sites and create a remarkable array of new sales opportunities for our advertisers. The opportunity to generate new revenues is actually twofold. Advertisers will not only reach more readers, they will also capture an increased share of each customer’s spending. 

Migrating readers to online vertical communities makes it possible to offer multiple-media programs that provide powerful new marketing capabilities. These programs can include online interactive research and survey techniques, direct e-mail campaign capabilities, online trade shows, Webcasts, and infomercials. In addition, traditional advertising content will appear in both print and online, thereby increasing returns on creative investment. 

The great benefit of this model is that it doesn’t cannibalize the value of existing content, subscriber bases, and advertiser relationships. The integrated approach to online publishing delivers added value to all the players, increased revenues for advertisers and publishers, and new, more valuable content for our users. 


Susan Feldman • Director, Document and Content Technologies Program, IDC

No longer distracted by millennium fears, the information industry has been serving up new products with a vengeance. We’ve had a glimpse of the future these last few months. Here are some current and emerging trends: 

• Convergence—Suddenly, online information systems are hot stuff. Cyberspace is roiling as Web-tools vendors, structured database vendors, content providers, and document-management companies all thrash around looking for the best position. Every product we see has a slightly different combination of features—rich in some areas and deficient in others. All promise to add features they lack through “easy” APIs, gateways, development kits, or partnerships. Buyers need to know what their requirements are and why before they start to shop for new systems, or they will be sold software that doesn’t fit their needs. 

• Focus on improved search—In the ’80s and ’90s, information was hard to come by. Now seekers of enlightenment are swept away by the flood of it. New products all claim to fix this problem by focusing, personalizing, filtering, or otherwise slowing the information flood to a manageable gush. It will be tricky to determine how to get what searchers need without eliminating materials that are relevant and important. Anyone looking at these technologies should insist on some sort of dynamic updating of profiles; linguistic processing of results; expansion of queries; tools to find people, places, and things; and improved relevance ranking. 

• Content architecture: tools for structuring unstructured text—While this sounds like an oxymoron, any information professional will find techniques like categorization, taxonomies, and XML suspiciously familiar. All of these techniques make it easier to determine what a document is really about. Multitudes of companies are entering the categorization sweepstakes. Some rely purely on statistical methods and are completely automatic. Some are completely manual in their preparation of a taxonomy, and then categorize semiautomatically. Others mix manual and automatic approaches (a minimal sketch of both styles follows this list). This latter, mixed approach is probably best for deep, precise categorization, but it is slower than purely automatic approaches, which suit high-volume, shallow-categorization requirements like news feeds. Buyers need to probe beyond the marketing hype to find out precisely what is going on. I would, at the very least, look for some means to capture new terms as they appear in the literature. Any taxonomy or semantic network will become obsolete unless it is regularly updated. 

• Control over content, process, and even design for nontechnical staff—Because of the scarcity of IT staff, and the bottlenecks that scarcity has created, most new content-management products now hand control back to the creators of content, usually in the form of easy-to-use templates. Writers of serious works need no longer fear seeing their magnum opus on the Web flashing green and pink. Changes can be made quickly. 

• Collaborative tools that are being added to many new knowledge management applications—These enable colleagues to work together on a project remotely. 

• Wireless applications for information delivery—Everyone is talking about “any information delivered anywhere to any device.” Making it a reality is another question. New content-management products come enabled for multiple formats. The bigger question is what they are putting on a 2-inch-square cellphone screen. Choosing what to display and how to interact with such a clunky device for typing is not trivial. We have seen only one product—soon to be released—that grapples with this question successfully. The others just cavalierly say they are “WAP-enabled.” 
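
Returning to the categorization bullet above, here is a minimal, hypothetical sketch of the two styles being sold; the taxonomy terms, category names, and training snippets are invented for illustration and do not represent any vendor's actual product.

```python
# Manual vs. statistical categorization, sketched with toy data.
from collections import Counter

# "Manual" style: an editor-built taxonomy maps curated terms to categories.
TAXONOMY = {
    "interest rates": "Finance",
    "routing protocol": "Networking",
    "clinical trial": "Health",
}

def categorize_by_taxonomy(text):
    """Return every category whose curated term appears in the text."""
    lowered = text.lower()
    return {cat for term, cat in TAXONOMY.items() if term in lowered}

# "Statistical" style: score a document against word frequencies taken
# from previously labeled examples (a crude centroid classifier).
TRAINING = {
    "Finance": "bank bond market interest rates equity dividend",
    "Networking": "packet router routing protocol bandwidth latency",
    "Health": "patient clinical trial dosage symptom therapy",
}
PROFILES = {cat: Counter(doc.split()) for cat, doc in TRAINING.items()}

def categorize_statistically(text):
    """Return the single category whose word profile best overlaps the text."""
    words = Counter(w.strip(".,!?") for w in text.lower().split())
    scores = {cat: sum(min(words[w], prof[w]) for w in words)
              for cat, prof in PROFILES.items()}
    return max(scores, key=scores.get)

doc = "The routing protocol update reduced packet latency."
print(categorize_by_taxonomy(doc))    # {'Networking'}
print(categorize_statistically(doc))  # Networking
```

The manual table is precise but goes stale unless editors keep adding terms as they appear in the literature; the statistical profile retrains cheaply, which is why purely automatic approaches suit high-volume, shallow jobs like news feeds.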

This year’s ideal information system finds any information, anytime, anywhere, in any format. To this core capability, vendors have added work flow, task monitoring, Web design and maintenance, archiving, security, categorization, and XML structuring. Tools to analyze information, handle transactions, or protect intellectual property rights can also be added. The whole thing is embedded in a collaborative work environment that enables users to trade files, post comments, edit, suggest, or discuss. We are transferring some of the complexity of dealing with multiple technologies to behind-the-scenes processes. Hopeful, as always, of making this ideal system easy to use, we are still seeking the perfect (human) librarian. A tall order. 


Clare Hart • President and CEO, Factiva

During the millennium year of 2000, the buzz in the information industry centered on the coming of age of the intranet and the development of enterprise information portals. This year, the dominant theme emerging in the information industry is a shift from end-user research to an application-driven approach to the use of information. The roots of this movement lie in new technologies that enable companies in the information industry to bring sophisticated content-integration products to the corporate market. 

The companies leading these knowledge management initiatives recognize a shift in the way end-users access information, which has typically meant visiting and searching an external Web site. In the new application-driven workplace, the value proposition is the integration of relevant, reliable information into the work process, so that information is available in context at the decision-making point. This new approach to the use of quality information will dramatically increase productivity, saving time and money and ultimately increasing the overall intelligence of the organization. 

The trends that shape this movement are the globalization of content, increasing use and development of taxonomies, and content customization based on an intricate understanding of the human work process: 

• The globalization of content—Because business now spreads across oceans and borders, our customers know they have a responsibility to provide content to their employees that meets their local language and local content needs. It is counterproductive to build a content offering that does not meet the needs of a diverse and disparate employee base. Content providers will continue to fortify their offerings with content that supports users around the world, and the challenge to information professionals will be to meet these broadened needs as they become intranet and EIP content managers.

• Increased focus on taxonomies and indexing—The need for precise search, retrieval, and information push will continue to drive the development of accurate, granular indexing and XML coding. Companies will look to integrate content more tightly into their applications, and XML will become the glue that binds relevant information—from external news and business information to internal company information (a minimal sketch follows this list). As companies realize that XML and indexing initiatives will eventually lead to truly integrated information systems, taxonomies will be continually refined to manipulate information flow. Across platforms and applications—from wireless, e-mail, desktop, and Web to CRM—content will be cross-referenced to the extent that changes to content in one application will instantly signal other applications and platforms to adapt and update information accordingly. 

• Customization—In the new application-driven world, customization will be based on a thorough understanding of when, where, and how people use information. This year, information professionals will become experts in understanding the work processes of employees within their organizations. Essentially, information professionals, with the help of companies like Factiva, will build new products tailored to the needs of their internal users, rather than choosing the closest off-the-shelf product. 
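
As a concrete illustration of the indexing point above (see the second bullet), here is a minimal sketch of a news item carried as XML with granular index terms attached. The element names and codes are hypothetical, not an actual Factiva or industry schema.

```python
# Wrapping a news item with index terms so applications can route and
# cross-reference it by code rather than by free-text search.
import xml.etree.ElementTree as ET

item = ET.Element("newsItem")
ET.SubElement(item, "headline").text = "Central bank holds rates steady"
ET.SubElement(item, "body").text = "The bank left its key rate unchanged..."

indexing = ET.SubElement(item, "indexing")
for scheme, code in [("industry", "banking"),
                     ("region", "usa"),
                     ("subject", "monetary-policy")]:
    ET.SubElement(indexing, "term", scheme=scheme, code=code)

print(ET.tostring(item, encoding="unicode"))

# A consuming application selects by code, which is what puts the
# content "in context" at the decision-making point.
terms = item.findall(".//term[@scheme='industry']")
print([t.get("code") for t in terms])  # ['banking']
```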

As these trends evolve, companies and information professionals will implement information solutions capable of anticipating the information needs of end-users. Employees will have the information they need at their fingertips throughout their workday, in every environment and application in which they work. Human capital will be harnessed and leveraged as business information complements the vast amount of knowledge inside employees’ heads. This convergence of organizational intelligence will make the coming year the most innovative the information industry has ever experienced. 


Jay Jordan • President and CEO, OCLC

I have five predictions for 2001: 

1. To borrow a phrase from Federal Reserve chairman Alan Greenspan, the “irrational exuberance” surrounding the World Wide Web will become more rational. As the shakeout in the dot-com world continues, E-Economics 101 will start to look a lot like good old Economics 101. Value will prevail, whether print, electronic, or click-and-mortar. Nagging issues such as copyright, authentication, electronic ordering, and standards will continue to nag, only louder. As we begin to look at the Web with more realism, we will see that there is still plenty of room for exuberance. Libraries are aggressively adapting Web technology to their traditional roles, and they are applying the skills of librarianship to the Web itself. 

2. There will be more aggregation in the information industry. While the ease of developing and using Web-based search systems led publishers to develop their own Web sites, there will be a movement back toward aggregation through portals and links. In an increasingly hectic world, users demand easy, integrated access to information, and aggregation is a means to that end. 

3. The global nature of the Web will drive the information industry to become multilingual in its interfaces and its content. Users want services and data in their native languages. We will see more multilingual approaches in thesauri, classification systems, and authority-control systems. 

4. There will be increased focus on emerging standards such as XML, RDF (Resource Description Framework), and Dublin Core. XML is going like gangbusters. RDF is an emerging World Wide Web Consortium (W3C) standard designed to support metadata across many Web-based activities. The Dublin Core, a proposed global standard composed of 15 metadata elements intended to facilitate discovery of electronic resources (a minimal sketch follows this list), is being adopted by libraries and other organizations, such as museums and government agencies, that have to describe various knowledge objects. 

5. There will be increased collaboration among the players on the Web. The .edus, the .orgs, and the .govs will be increasingly partnering with dot-coms. If Y2K taught us anything, it was how interconnected the world has become. Now we must begin to leverage the power offered by a networked computing environment and abundant bandwidth. 
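
To make prediction 4 concrete, here is a minimal sketch of a Dublin Core record. The 15-element set and the namespace are the real Dublin Core 1.1 definitions; the resource being described is invented.

```python
# Building a Dublin Core description with the standard 1.1 namespace.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("metadata")

def dc(element, value):
    """Append one Dublin Core element (all 15 are optional and repeatable)."""
    ET.SubElement(record, f"{{{DC}}}{element}").text = value

dc("title", "Annual Report on Migratory Birds")  # invented resource
dc("creator", "Example Wildlife Agency")
dc("subject", "ornithology")
dc("date", "2000-11-30")
dc("type", "Text")
dc("format", "text/html")
dc("identifier", "http://www.example.gov/reports/birds-2000")
dc("language", "en")

print(ET.tostring(record, encoding="unicode"))
```

The remaining elements are description, publisher, contributor, source, relation, coverage, and rights; simplicity of this kind is exactly what makes the element set adoptable by libraries, museums, and government agencies alike.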

Finally, 2001 is a special year for libraries participating in the OCLC cooperative. It’s the 30th anniversary of WorldCat, the bibliographic database that contains more than 45 million records and 767 million location listings and that supports cataloging, resource sharing, and reference services in libraries around the world. As we celebrate 30 years of library cooperation, we are also embarking on a new strategy to extend the OCLC library cooperative and transform WorldCat from a bibliographic database to a globally networked information resource of text, graphics, sound, and motion. This transformation has already begun with the launch of the OCLC Cooperative Online Resource Catalog (CORC), which enables libraries to cooperate in cataloging and providing access to electronic resources. 


Dick Kaser • Executive Director, NFAIS

After re-reading my forecast from last year (https://www.infotoday.com/it/jan00/ahead.htm), I felt like the proverbial Prophet of Doom. A year ago, I sadly foresaw growing tensions between those advocating free information and those who must make a living producing it. Regrettably, I feel no need to eat any of those words. I personally have now spent the entire last year of the 20th century engulfed by those issues and the related matter of content ownership vs. fair use. 

As the new century finally, if anticlimactically, turns now, I desperately would like to look ahead and see something better for the information industry and the information community it serves. My first go at writing such a piece fell far short of that mark. So what I have to say now is my second attempt.

On the downside, we have a database/publishing industry that shows all the signs of decline, most notably the ongoing mergers, mergers, mergers—even when one can hardly imagine there could be any companies left to merge. We have what can only be described as an angry public, which, having learned that there is such a thing as intellectual property, clearly despises the concept. We have (or at least had) a government that believes (or at least believed) its own rhetoric about all that free and readily accessible digital information that’s retrievable at the touch of a button, when in fact we know now that even our voting system is still based on punch cards. Though the e-commerce bubble has burst, taking the stock market and our portfolios with it, many still want to believe that all those dreams of a new world of free information were actually true and not just the smoke-and-mirror magic of venture capital.

Well, here I go again cascading down into the dark abyss.

The challenge that we face is this: We have now spent close to 30 years talking among ourselves about what information users need and want. We are emerging now from a time when publishers, database producers, librarians, and information specialists made the decisions about what information services were of high quality. But technology has placed the ball not only within the users’ court, but within the users’ home court. We will not change them. They will change us.

This should not be interpreted as a dark thought, for if this is not the light at the end of the tunnel, I don’t know what is. Whatever role we used to play in the information transfer process, we have always said our mission was to serve these people. Once again we have the chance to prove it. Even though users may not appreciate all that we have done or can do for them, it is still our mission to serve them. And that is the light we must remain fixed upon.


Rob Kaufman • President and CEO, netLibrary, Inc.

Is the Internet Revolution over yet? In the past year we’ve seen the hype surrounding e-business, e-publishing, and e-everything exposed for what it always has been: hype. During this time, netLibrary and its employees have been dedicated to a Digital Evolution, not a Digital Revolution. We believe strongly in the ability of new technologies to help create, deliver, and manage valuable frontlist content more efficiently. We also believe strongly in the role publishers have to play and in the continued empowerment of librarians who, in addition to being netLibrary’s customers, are uniquely qualified to manage and deliver content in the educational and corporate markets.

What does Digital Evolution mean to traditional participants in the information business? We’ve heard about the Internet’s disintermediation of publishers and librarians through the promise of free and easy information. But we’ve also discovered that with free information on the Internet, you get what you pay for. Responding to the failure of free content initiatives, an initial wave of so-called “subscription pool” models required end-users to pay for pools of backlist and self-published content. Now those models, too, are fading from the scene due to poor content quality and poor content organization. End-users are demonstrating a renewed appreciation—and demand—for quality digital content that has been vetted and edited by a publisher and organized by a librarian for optimal access and use.

In support of this Digital Evolution toward quality and well-managed content, netLibrary is focused on helping publishers and librarians remain ideally positioned to succeed in the digital future. 

For centuries, publishers and librarians have selected, organized, and managed the world’s content. At netLibrary, we take comfort in knowing that their success has been based on serving information needs rather than on hype, eyeballs, and clickthroughs. We support a library’s continued role as the world’s best and best-utilized information portal, and we look forward to continued collaboration with libraries’ familiar partners in content management—companies such as Data Research Associates (DRA); epixtech, Inc.; Follett Software Co.; Innovative Interfaces, Inc.; and SIRSI Corp. These partners have created highly robust online systems that libraries use to handle search and circulation requests for tens of millions of library patrons each day.

The time-honored roles of libraries and publishers as gateways to the world’s content are more meaningful today than ever. Access technologies have opened floodgates to vast repositories of content—such as the Internet—without deciphering content quality. At netLibrary, we challenge the value of content repositories, absent the vetting of publishers and the guidance of skilled information scientists: librarians. Publishers and librarians remain the critical links between those who bring knowledge to the world and those who seek it. Tools may change, techniques may vary, and technology alternatives may expand. Through it all, publishers and librarians are uniquely positioned to provide people with quality content appropriate for their needs.


Knight Kiplinger • Publisher, Kiplinger Publishing Organization • Editor in Chief, The Kiplinger Letter and KiplingerForecasts.com

The coming year will mark the takeoff of pay-per-read purchasing by both information professionals and consumers looking for quality content online. It will be made possible by the spread of convenient new ways to charge online information purchases to a central account, such as Qpass.

In the B2B world, the subscription model—a flat, annual fee for unlimited access—will continue to be cost-effective and convenient for corporate researchers and enterprisewide intranets. Pay-per-read will join the subscription method as a parallel, alternative way to buy occasional articles from a wide array of specialty publishers.

In the consumer information market, the subscription model has bombed, and the advertiser model isn’t looking so hot, either. Plunging banner-ad rates make it unlikely that most Web sites can be supported by ads alone. So the next, and probably last, model to be tested is pay-per-read. This one will finally work.

As it begins to catch on, consumer publishers (especially magazine publishers) will start charging for the voluminous content that they have foolishly been giving away for nothing on their open-access Web sites. 

Professional researchers in corporate, scientific, and legal fields have long been willing to pay for must-have information, typically on a subscription or time-online basis. But consumers have not been willing to pay, for the simple reason that they don’t have to. They’ve been spoiled by the availability of so much free information on the Web. But as the freebies become history, and online purchasing of individual articles becomes easy and inexpensive, consumers will have little choice but to accept the new order.

The legal battles won by copyright holders in the Napster and MP3.com cases are a harbinger of things to come. Whether it’s reading material, music, or movies, pay-per-use will become the dominant consumer-purchasing model, coexisting with various flat-rate or time-based subscription plans in the B2B world. 

In this new order, content will once again be king—or at least a prince. Either way, it will cease to be a pauper, earning nothing and begging for the fleeting attention of fickle users.

The kind of content that will get the most attention is accurate, useful information from providers with a reputation for quality. In the world of pay-per-read, both professional and consumer users will opt for branded information over no-name, generic content. And gimmicky Web design will matter a lot less than content quality and utility—the ease of finding just what you need, when you need it.


Greg R. Notess • Creator, SearchEngineShowdown.com • Reference Librarian, Montana State University

As we look toward the information space in 2001, there is little doubt that change will continue and perhaps even increase. We have seen major changes in information ownership and information access, as the companies in the information industry consolidate even more under just a few corporate owners. Meanwhile, over on the wild frontiers of the Web, the search engines have seen significant changes, with much larger databases, continually changing technology, and new economic models.

So where does this lead for the future? With all the changes the Web has wrought, no one knows for sure, but here are some of my guesses. The information industry and the search engines will look for more partnerships and interaction. We may see more library-oriented databases utilizing the underlying technology from the general Web search engines. Meanwhile, more full-text content from the library side should make its way out to the free Web on portals, vortals, and other mortal-rhyming search sites.

2001 should be a good year for content, and consequently for the information junkies among us who love to consume it. While the Web’s information content has lost the press glamour it enjoyed a few years ago to the dot-com economy, tanking Internet stocks, shopping buzz, and vertical portal building, that content has at the same time expanded and improved with new free and commercial sources of full text.

Many sites are busy planning how to organize, index, and provide access to new collections of data. With database-generated sites, smarter use of XML, custom-built taxonomies, human and machine indexing, and a variety of search tools, many Web sites will provide better access to their growing collections of content. As sites like xrefer.com show, it is not just original content that is becoming available, but also information resources previously unavailable online for free.

And with all the new content, will it be harder to find what we need? I actually expect that the search engines will show continued improvements in coverage, search features, and relevance. The largest search engines are constantly analyzing link patterns and user behavior, which should lead to improved relevance. Meanwhile, many search engines are also exploring new features and capabilities. Most have now moved to a default Boolean AND operation, and those lacking full Boolean searching are adding more Boolean operators.
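
As a small illustration of that default-AND behavior, here is a sketch of posting-list intersection over a toy inverted index; the documents are invented.

```python
# Implicit Boolean AND: a query matches only documents containing
# every term, via intersection of the terms' posting lists.
docs = {
    1: "digital library catalog search",
    2: "web search engine relevance",
    3: "library web portal",
}

# Build the inverted index: term -> set of document IDs.
index = {}
for doc_id, text in docs.items():
    for term in text.split():
        index.setdefault(term, set()).add(doc_id)

def search_and(query):
    """Intersect posting lists, so every term must be present."""
    postings = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*postings) if postings else set()

print(search_and("library web"))  # {3}: the only doc with both terms
print(search_and("search"))       # {1, 2}
```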

At the same time, 2001 will be a year in which the search engine companies need to show that they can become profitable. While they will explore new options for generating revenue, I hope that the new search engine economy will not have a negative impact on professional searchers. Instead, we should see better access to ever more high-quality information.


Allen W. Paschal • President and CEO, Gale Group

As a CEO, my job is to marshal my company’s resources to best meet customer needs and demands. That means understanding how the dynamics of our industry impact our ability to achieve our goals.

Now, as 2001 gets underway, we’re reminded of what a difference a year can make.

Last January, the dot-com revolution was in full swing. A lot of folks bought into what, in retrospect, we can see was mass hysteria about the abilities of dot-com companies to succeed despite confused visions, unseasoned management, and elusive revenue and profitability. Then along came that little wake-up call—the April NASDAQ meltdown—and many a dot-com entrepreneur has been humbled. 

That leaves the information industry, not unlike many others, again focused on the need for stable, profitable businesses that can generate the cash flow and have the management in place to deliver today’s and tomorrow’s products. Many of our constituents in public and academic libraries appreciate the virtues of dealing with strong vendors who will be around next year and the year after that.

Indeed, the information industry was something like an accordion in 2000. The industry expanded as new players joined the fray. Now it’s contracting again, with the major companies for the most part stronger and more robust than ever. The consolidation will continue, as it naturally does in most maturing industries.

Which is not to say we’re not in for some big changes in 2001 and beyond. We still don’t know how the peer-to-peer (P2P) information-sharing phenomenon will unfold. Yes, Napster had its wings clipped a bit in 2000, but issues of copyright and intellectual property rights will require our attention. Peer-to-peer undoubtedly will have an impact on authors, publishers, and information users.

The information industry will remain highly competitive in 2001, with the major vendors vying for customer loyalty against a backdrop of price pressures and the imperative to innovate, especially in the print-to-Web migration. That’s the formula for exciting times ahead.


Joe Reynolds • President and CEO, Bell & Howell Information and Learning

The year 2000 will be remembered by many as the year of dispute and initial resolution of music copyright issues in the digital world. And, as of this writing, it looks like 2001 will be the year for the resolution of similar issues for the written word. 

The U.S. Supreme Court’s decision to hear arguments on freelance copyright issues this spring has raised the visibility of this issue far beyond the information industry and library groups. All parties—publishers, information providers, authors—want a fair, equitable resolution to the questions raised by freelance writers.

Simultaneously, researchers and librarians are encouraging the U.S. Copyright Office to support amending federal copyright law to explicitly allow for copying and distribution of electronic material. A group has asked the Copyright Office to urge Congress to revise the Digital Millennium Copyright Act to specify that a buyer of copyrighted electronic material can resell, lend, or share that material without the copyright holder’s consent (commonly known as the first-sale doctrine).

As a leading provider of online periodical and newspaper information, we are hopeful that resolution of these important and complex issues can begin in 2001, and that resolution will bring about the following: 

• Continued access to all information (no matter the original or archived format) so that gaps do not appear in the historical record

• The archiving of all information (no matter the format—electronic, print, microform, etc.), while respecting and acknowledging the original copyright and ownership/authorship of the content

But, in the meantime, we see a tremendous need to continue to digitize valuable historical materials to enhance and increase access. We have begun an aggressive digitization project for historical newspapers that will allow users to search a paper’s full run and gain a complete picture of a given issue. 

With a growing demand for relevant information available via the Web, it is incumbent on the information industry to continue a steady stream of new products that serve users’ and libraries’ needs. And, with a little luck, these products will be comprehensive and whole, with no gaps resulting from unresolved copyright issues.


David Seuss • CEO, Northern Light Technology, Inc.

2001 was quite a year for Web-centric companies. The true measure of how sour the financial mood had become was the revelation that Al Gore had lost the 2000 election because the investing public blamed him for inventing the Internet. Office space rental rates finally dropped for the first time in venture-capital (VC) meccas like Sand Hill Road and Silicon Alley. Offices of the previously rich and powerful VCs were remodeled into Salvation Army stores to serve the thousands of now impoverished dot-com millionaires who needed a helping hand. Also on the financial front, the NASDAQ announced a new market in which investors could choose stocks by randomly punching holes in the stock pages. Dubbed the “Butterfly Index” by stock market analysts, it outperformed professionally managed portfolios by wide margins. 

Information companies also had their problems in 2001. The search engine Ask Jeeves finally admitted that there was no real technology behind its service—it simply employed thousands of little old ladies (who had been clerk-typists before the advent of word processing) to type answers really fast on the fly. This explained the unusually high occurrence of “try chicken soup” in response to medical questions, investing questions, and even business research questions. AltaVista, which at the end of the prior year had announced a radical new strategy of becoming a search engine again rather than a shopping portal, announced it would once again focus on shopping in the year 2002. Upon realizing the confusion it had created, AltaVista elaborated on its strategy, stating it would focus on being a search engine in years ending in odd numbers and a shopping portal in years ending in even numbers.

The ramifications of Election 2000 were still being felt as the domain-naming service InterNIC added “.duh” to the list of approved top-level domains to accommodate demand from Palm Beach County. In the spring of 2001, a new study sponsored by Information Today concluded that the ballots in that same county were easier to understand and operate than all existing information industry user interfaces; the magazine then formed a new business by hiring the recently unemployed ballot-design team to develop a new, simpler vendor interface. 

The hot new market in 2001 was for enterprise information portals. That segment took a bizarre turn when Disney announced it was repositioning Infoseek/Go into the enterprise portal space. “We think corporate America needs a Mickey Mouse competitive intelligence portal, with plug-in Goofy gadgets for additional services,” said a Disney spokesperson. AltaVista said it was evaluating the EIP market as well, and might put it in the yearly strategy rotation. 

Upstart Northern Light confounded the industry at the end of 2001 by announcing that it had grown bored with the mission of indexing and classifying all human knowledge to a single, consistent standard and was going to reposition www.northernlight.com to be the biggest, bad-ass punk site on the Web. David Seuss, speaking for the company, said: “We’re feeling loose and our moves look good! Just watch our librarians rock.”


Chris Sherman • President, Searchwise.net • Web Search Guide, About.com

What trends can Web searchers expect to play out over the course of this year? There are several, some offering promising new developments, others portending more of the disruptive change and unfortunate failure that we’ve increasingly seen over the past year:

• Revealing the Invisible Web—The Invisible Web (those sites that search engines cannot, or more importantly, will not include in their indexes) will become increasingly visible over the coming year, for two reasons. First, the major search engines are showing signs of becoming more willing to index “difficult” formats such as PDF, Flash, and streaming media. And second, much of the high-quality material hidden away in Web-accessible databases will become easier to find as well (a minimal sketch of why such material is hidden follows this list). New technologies allow automated, intelligent data extraction from databases. New Web-centric query languages like WebSQL and new data format standards such as DSTP (Dataspace) have the ambitious goal of allowing searchers to query the entire Web as if it were a single, unified database.

• More visual navigation—Two excellent visual Web navigation maps emerged last year, both based on Open Directory Project data: WebBrain and map.net. Images can carry orders of magnitude more information than text, especially when coupled with interfaces that allow easy and rapid filtering and manipulation (imagine a computer-based simulation of brain surgery vs. reading about it in a medical textbook). Products such as Inxight’s TableLens and i2’s Analyst’s Notebook, and work from researchers like Ben Shneiderman, are showing the way.

• More “Ask an Expert” sites, especially from qualified information professionals, such as the Collaborative Digital Reference Service (CDRS) 
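
To illustrate the Invisible Web point in the first bullet: content behind a query form never appears to a link-following crawler, but any client that can issue the form's query reaches it directly. The endpoint and parameter names below are hypothetical, for illustration only.

```python
# Fetching "invisible" database content the way a user's form
# submission would; a crawler that only follows links never gets here.
from urllib.parse import urlencode
from urllib.request import urlopen

def query_hidden_database(term):
    """Issue the query a search form would submit (hypothetical endpoint)."""
    params = urlencode({"q": term, "format": "xml"})
    url = f"http://db.example.org/search?{params}"
    with urlopen(url) as response:  # live network call
        return response.read()

# A crawler sees only the empty search page at db.example.org; the
# records behind it surface only when a query like this is issued:
# print(query_hidden_database("coleoptera"))
```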

What’s likely to crash and burn this year? Gnutella and most public peer-to-peer (P2P) networks. Once touted as unstoppable alternatives to Napster, P2P technologies turn out to have serious problems with scalability, privacy, and security. P2P search and sharing will not disappear—rather, it will be used on a much less global scale, by smallish groups with common interests. Participation in these semi-private networks will often be available by invitation only.

Browser-bar clients will fall by the wayside. We’ve already seen the useful GuruNet vanish; look for most other browser bars to go the way of free Web access. It’s not that browser bars aren’t useful—the problem is that there are too many of them, and they are largely incompatible with one another. 

It’s 2001, so our look ahead wouldn’t be complete without a reference to HAL. HAL does indeed exist on the Web, as a quick check on any search engine reveals. Which leads us to our final wistful and somewhat melancholy prediction: At least one—if not two—of the “big eight” major search engines will shutter its virtual doors this year. The “grow fast, worry about profits later” game is over, and search engines are no less vulnerable to the ruthless economic realities that have forced the shutdown of scores of other dot-coms. In HAL’s inimitable words, “Dave, we’ve got a problem.”


Charles Terry • President and CEO, COMTEX News Network, Inc.

The Internet has been heralded as perhaps the closest thing to a perfect free market, which can be exhilarating and profitable for those who harness it and devastating to those who trust in yesterday’s ideas and business models. The side of the information superhighway is littered with the remains of business models that failed to see, and prepare for, a universe of empowered end-users with nearly limitless choices and few barriers to switching. In 2001 there will be more losers than winners. The winners will be those who understand and respect the end-user and therefore implement business models based on the premise that end-users alone establish content’s value. 

For Web sites’ information and business services, the challenges are threefold: acquire a reader/customer, retain him, and establish a viable business model. While attracting a user just takes a marketing budget—$80 per registered free user—retaining him as an ongoing customer is not so simple. Customers can leave on a whim, without a trace or feedback, and perhaps never return. 

So, in 2000 we learned that a marketing budget is not a business model. How profound! And we thought we were so smart with 100x revenue valuations. The profound revelation for 2001 will be, finally, that the customer matters above all else. 

What do customers and end-users want? Simple: they want what they want, when they want it, where they want it. 

To succeed, content sites must master the following formula:

• Offer a critical mass of content, to more fully meet the needs of the targeted end-user

• Provide the content in a context that increases its intrinsic value to the end-user

• Deliver said content on the appropriate device at the appropriate time

In the ’90s, aggregators like Dialog, LEXIS-NEXIS, and NewsEdge achieved the formula described above. But in 2000 and beyond, expectations are far higher. Today, users want information embedded in business applications so that the content is already in context, not buried in an abstract database. 

For a hint at who will do well in the coming year, let’s look to some of the success stories of the year now closing. OneSource delivered applications dedicated to specific functional end-users, each with a critical mass of content in context to meet the exacting needs of its users. Bloomberg, the grandfather of aggregation, showed that it has not fallen behind by aggressively meeting all three criteria. ScreamingMedia has enjoyed dramatic customer growth with its ability to deliver content to exacting specifications from a very large pool of content. The common theme in each is they aggregate disparate content and focus on the customer view of the world. The proof: OneSource, Bloomberg, and ScreamingMedia customers pay real money.

Who won’t make the grade in 2001? Destination sites and publishers that fail to deliver a critical mass of content in context when and where the customer wants it.

© 2001 Information Today, Inc.