Knowledge Leaders Discuss Roles as Creators of Information Future at SLA Annual

Library Hi Tech News

ISSN: 0741-9058

Article publication date: 1 September 1999


Citation

Hulser, R.P. (1999), "Knowledge Leaders Discuss Roles as Creators of Information Future at SLA Annual", Library Hi Tech News, Vol. 16 No. 9/10. https://doi.org/10.1108/lhtn.1999.23916iac.001

Publisher: Emerald Group Publishing Limited

Copyright © 1999, MCB UP Limited


Knowledge Leaders Discuss Roles as Creators of Information Future at SLA Annual

Conference overview and keynote address summary by Richard P. Hulser, with contributions by 18 others

Richard P. Hulser

Overview

With a number of us feeling like Dorothy and Toto, attendees at the 90th annual conference of the Special Libraries Association (SLA) had brief encounters with unusual winds, tornado warnings, rain, and hail. Rather than dampening spirits, the weather added to the whirl of excitement around the content and activities in Minneapolis.

The theme this year was "Knowledge leaders for the new millennium: creators of the information future". The sessions were divided into tracks, indicated by symbols next to each activity: evolving roles, practitioner's tool kit, knowledge management, and leadership. With topics ranging from knowledge management, benchmarking, and competitive intelligence to future and dead technologies, there was plenty of session content from which to choose.

The conference afforded an opportunity to participate in sessions from a number of interesting invited speakers and colleagues. This included a chance to sit in wonderment as Kathryn Sullivan, a former astronaut and current president of the Columbus, Ohio, Center of Science and Industry, aided by a photographic slide, described the view from her "office" window on the Space Shuttle. Her talk, part of the many events planned to celebrate the SLA Science-Technology Division's 75th anniversary, also touched on the challenges of information access and retrieval from space.

At a senior managers roundtable near the end of the conference, participants observed a more positive general feeling this year: less focus on financial problems or on what attendees would like to do, and more on what was actually being accomplished. An emphasis on stories such as case studies was in evidence, with some discussing how to align information professionals' work with their customers in order to be an interactive part of their organizations. According to these managers, there was also much focus on saving time. As an aside, it was noted that librarians' supposed fascination with technology was outclassed by the stronger pull of networking with colleagues.

Naturally, much of the content in the conference centered around knowledge management, with those sessions being the most well attended. It was clear that there were many definitions for knowledge management and it was suggested that perhaps a standard definition should be created and approved by SLA members. Details about many of the other sessions given at the conference will be covered in separate articles in this special issue.

The proliferation of compelling programs such as those already described presented quite a challenge to attendees, who had to squeeze visits to over 540 exhibitors, reported to be a record-breaking number in the history of SLA, into the remaining time during the day. With a variety of publishers, electronic content and systems providers, and consultants, the exhibits were brimming with something for each of the diverse attendees. Interestingly, while some new products were presented, none were strong standouts, particularly in the library automation and electronic content arenas. Rather, many of the vendors appeared to be focused on offering Web interfaces and honing details of their products, along with alerting attendees to vendor name changes or mergers.

Strategic alliance sessions were a new part of the conference, enabling vendors to describe their products and services as they applied to problems and needs of information professionals. These sessions were offered throughout the conference and clearly identified in the conference program.

A number of continuing education courses were also provided at the conference, with topics such as negotiating global contracts, patent searching, and management strategies for solo librarians. The courses were labeled with competency numbers matching those described in the Competencies for Special Librarians of the 21st Century included in the final program book. The courses were divided among basic, intermediate, and advanced levels. Field trips to a variety of libraries and information centers enabled attendees to get a firsthand look at their colleagues' environments and services.

All this activity was augmented by many elected and appointed leaders attending training sessions and conducting meetings to ensure the health and growth of SLA as a vital, living organization meeting the needs of the membership. This was topped off by enjoyment and networking at the many social events, including SLA division anniversary celebrations, sprinkled throughout the conference, and a closing gala at International Market Square, a five-story atrium with a glass ceiling and architecture reminiscent of New Orleans.

Overall, this was a conference filled with much to learn and experience. Attendees clearly left with a great deal of knowledge gained, while also taking advantage of the many enjoyable offerings of the Minneapolis and St Paul area.

Keynote Address

Speak the name of Laurence (Larry) Prusak, executive director of the Institute for Knowledge Management at IBM Corporation and keynote speaker of the SLA conference, and you quickly get a flurry of discussion. It is a rare occasion indeed when something said or written by Prusak doesn't raise the hackles of at least a few librarians and information managers. While there were many varied opinions about Prusak during this meeting, it certainly can be said that he provided thought-provoking ideas.

As he made his entrance, it was clear from Prusak's face that he was both amazed and impressed by the capacity crowd of thousands attending his keynote address. Working with no electronic presentation, no overhead transparencies, and no notes, he covered a variety of aspects of knowledge management. He promised early in the speech that he would address knowledge management as it applies to library and information science at the separate follow-up discussion session. Nonetheless, he touched on such issues during his main address as well, which was itself cause for many a discussion later.

According to Prusak, information by itself doesn't have a lot to do with innovation, and it is becoming a commodity. Knowledge management is the really valuable thing, along with what he called "wisdom management". Confusion over knowledge and its management carries tremendous cost, he indicated, with an estimated $7 to $10 billion wasted by organizations trying to deal with this confusion. As Prusak put it, "Americans don't like thinking through things", and that is why there has been such a waste of time and money in many knowledge management implementations.

So, then, what is all this? Prusak notes that "an organization needs data, information, and knowledge to exist". He defines each as follows: data is the recording of a transaction; information is a message, something that has a sender and a receiver, has bounds, and is intended to "inform" you so that you no longer see things the same way; knowledge is what the "knower" knows. Knowledge is the biggest expense in any organization, according to Prusak, yet also of the highest value. What you end up buying is the outcome of knowledge.

Prusak issued a challenge to help the audience understand the difficulties in harnessing knowledge. He asked, "How much of what you know about a topic could you write down or codify? What are the outputs of knowledge that people value?" Prusak stated that there is a 22 percent gain in efficiency if you get the right information to the right person at the right time, but that is all.

Much is learned by firms using knowledge, notes Prusak. He described some examples of such learning, such as the understanding that innovations come from the interface between workers and their work. "Knowledge capital" is complex because knowledge is invisible: it is sticky, local (tied to where it is), and contextual.

Prusak explained that there are three categories of issues in making knowledge work:

1) Visibility: knowledge appears in organizational strategy, in executives working with groups, and in aggregates of information such as knowledge maps or directories. "What's it worth to you to know what you know?"

2) Culture: changing people's behavior is the great sticking point in knowledge management; the way people are rewarded affects behavior.

3) Infrastructure: governance and knowledge centers; an organization must have trained people in order to get the worth it seeks.

Connections, not capturing [of data or information], are key, admonished Prusak. He had some suggestions for making connections happen. First, let knowledge networks self-organize: allow informal networks to grow. Second, nothing happens in knowledge without reflection; there needs to be space and time to reflect on the information gathered, and the use of cognitive models can help. The richness of networks is how many executives get ahead: people who succeed network all the time, and people learn from one another, live and through stories. Nothing happens without passion.

Earlier in his talk, Prusak indicated that technology amplifies knowledge. He then went on to point out that technology does not change behavior; it is just a tool. His underlying point is that people get mesmerized by technology or its capabilities and stray from their organizations' focus. He referred to this as "techno-utopianism". He further stated that a "techno-utopianism of executives" is evolving, in which organizational leaders allow their minimal understanding of technology and its capabilities to skew decisions for their organizations. Technology, he claimed, is the handmaiden to the culture, with the capability to change it. As technology evolves, Prusak notes, it is endangering the library profession and others: anyone who is an intermediary.

In an admittedly more controversial stance, Prusak intoned that "access does not equal value; it just means access". It is convenience, efficiency. "Hire smart people and let them talk to each other" is what he suggested. Prusak advised the audience to understand how organizations will work in the twenty-first century by studying economics, sociology, and other areas, not just library science. Staying focused only on library and information science would be detrimental to future progress, he advised, and with that thought his main address finished.

Follow-up Discussion Session

The keynote talk given by Laurence Prusak was followed by a separate, standing-room-only discussion session. Prusak was heard to remark later that he was impressed with the high caliber and breadth of the questions from the audience at this session.

There were a variety of questions from the audience asking about key authors, publications, and suggestions on what librarians could do to properly implement knowledge management in their organizations. Prusak strongly recommended reading World Development Report 1998/99: Knowledge for Development, sponsored by the World Bank Group, which, as its introductory paragraph notes, focuses on the role of knowledge in "advancing economic and social well-being". He indicated that the lessons expressed in this report can be applied to organizations.

Prompted by an audience member's question about what literature to read, Prusak said that few authors of knowledge management materials are worth reading. One he did recommend was James March of Stanford University. Books and other reading references cited can be found in the reference list at the end of this article.

Prusak indicated that people share information, but they don't share knowledge. He suggested "fire people who don't share". He further noted that signals and symbols are important: people won't share if there is no trust. Find someone senior and hide in his or her shadow, he suggested, in order to get work done. Or make enough money and say what you want!

"Ideas rule the world, not economics or technology", Prusak insisted. He was adamant that the machine metaphor used in organizations is what kills those organizations. Human beings have too many variables to re-engineer. He suggested that human resources professionals are not up on knowledge management, and therefore have no right to claim it.

Prusak is writing a book on what he calls "social capital". As he expressed earlier in his keynote address, people who share intellectual focus are successful. Synthesis and expertise are value; intermediation by itself is not. Questions such as "Whom do I call, and for what reason?" should be answerable. The way to get knowledge management working is to have people work with you and get to know what you know, as is done during apprenticeships.

When questioned about what the focus of the profession should be, and in particular about the Special Libraries Association research arm and its funding for projects, Prusak suggested a robust theoretical approach to library and information science acceptable to real executives. He suggested that information professionals write cases on successful corporate information centers, supported with analytics, but not use models of the information industry. He suggested creating a "case series" and sending it to business schools, or even co-publishing with those schools.

Interestingly, Prusak does not care for the concept or practice of telecommuting, particularly implementations in which people are assigned temporary cubicles on an as-needed basis. He is concerned that telecommuting prevents the natural networking needed to ensure successful sharing of knowledge.

Prusak ended the session with a question for the audience and points to ponder and investigate: What is the theory behind library and information science? He felt that articles should be commissioned on organizational use of information services, and that clearly written articles on the value of information services should be obtained and placed in non-library journals and newspapers.

Additional Resources

SLA Minneapolis Conference Program Committee (1999), "Laurence Prusak shares thoughts on success and knowledge management", Information Outlook, Vol. 3 No. 5, May, pp. 31-32.

Vaughan, D. (1996), The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University of Chicago Press, Chicago, IL.

Woodfill Park, M. (1998), InfoThink: Practical Strategies for Using Information in Business, Scarecrow Press, Lanham, MD.

World Bank Group (1998), World Development Report 1998/99: Knowledge for Development, Oxford University Press. Available at: http://www.worldbank.org/wdr/wdr98/index.html

Richard P. Hulser is Product Marketing Manager, Digital Library Technologies, IBM Corporation, New Haven, Connecticut. rphulser@aol.com

© Richard P. Hulser

Jim Smallwood

The New Web Order: The Changing Shape of the Information Environment

Reva Basch took a full house on an eclectic jaunt across the past decade's Web evolution. Her presentation, sprinkled with images of Darwinian evolution, the 1960s, and demographic trend-spotting, was well prepared and effectively delivered. Those curious about where the Web is headed were treated to an insightful overview as she and Mary Ellen Bates shared their thoughts on its current state and potential future trajectory.

The session opened with a call to "organize around chaos" by taking a broad view of Web evolution rather than getting caught up in dissecting each new development as it unfolds. From the broad view, a coherent picture emerges out of what looks chaotic up close.

To provide perspective, Basch detailed Web evolution since its inception in the early 1990s: an evolution that has taken us from batch processing, to interactive Web pages, to dynamic real-time interaction; and from ASCII text only, through text and graphics, all the way to television-like multimedia on the Web. One inevitable result of this evolution has been a fundamental shift in the definition of what constitutes information. An example: evapo-ware, Web content that is here today, gone tomorrow, never to be seen again. In other instances the same information may disappear momentarily only to re-emerge in an altered state at a new Web site. For Basch, this represents chaos in action.

But that kind of chaos is good. It forms the "primordial soup" or "raw material" of internet evolution. Its key ingredients include:

  • search engine proliferation and fusion;

  • confusion between commercial and non-commercial uses of the Web;

  • content clashes among entertainment, information, and disinformation;

  • unresolved copyright, quality, and intellectual property issues;

  • ever-growing trends towards disintermediation, disaggregation, and the over-hype of all things "Web".

Early evolutionary permutations arising from the primordial soup include value-added search tools that concentrate valuable content while carefully filtering and sorting it for easier consumption. Ask Jeeves, Google, and Northern Light are early entrants.

Other evolutionary prototypes include content aggregators such as Yahoo!, Infoseek, and Excite.

Branded Collections represent another new Web life form: NLM, PTO, Gale Group, and Electric Library are rudimentary examples.

This same primordial brew has spawned portals which themselves are already morphing into innovative new forms on corporate and collegiate LANscapes.

Gazing further out, we can see still other evolutionary hatchlings: Web-based data visualization (complete with topographic search result maps), collaborative filtering (seen already at Deja.com and Hotbot), and natural language processing (the real thing this time).

These new players will soon be supercharged by their evolutionary cousins:

  • nanotechnology (manmade molecular machines);

  • embedded systems;

  • wireless technology;

  • convergence.

The result of all this transfiguration is order from what began as chaos. Computers and computing are becoming ubiquitous. The Internet is becoming at once pervasive and invisible. It's going underground.

So what does all this mean for information professionals? At least two things. On the one hand, we are beginning to feel "as skittish as a cat in a room full of rocking chairs". On the other hand, as one of Basch's colleagues remarked, "Management values my work more than ever. It's like information professionals are the ones giving shots in an anthrax outbreak!"

From Librarian to Knowledge Leader: The Role of the Librarian in the Information Age

This well-attended session fell short of delivering its full potential, as not all of the scheduled participants were able to attend. In the end, the knowledge management (KM) roundtable discussion never materialized. A pity, because Cheryl Lamb, Resource Center Manager, Buckman Laboratories International, has certainly worked extensively in the KM arena during her eight years with Buckman Labs.

Her 40-minute presentation began with a somewhat surprising admission: Buckman Labs does not use Knowledge Management! They focus on Knowledge Nurturing, believing it is impossible to "manage knowledge". It's hard to disagree with that philosophy!

For Buckman Labs, the world of information consists of data, information, and organizational knowledge. That much is not unique. Where Buckman departs is its focus on deriving ideas and innovation from knowledge nurturing. To accomplish that the organization must develop skills in problem solving, knowledge sharing, and learning. Effectively applying those skills in problem-solving situations can lead to innovation.

To become Knowledge Leaders we must become promoters of the value and importance of tacit knowledge. We must actively promote people connections and gain deep knowledge of our companies' products in order to be in a position to really help our organizations. We must know the competition, the regulatory environment, and the technology. (As an aside, Lamb predicted that electronic document management may be "the next big thing for libraries".) Without a thorough understanding it is difficult to make a significant impact.

How does a company like Buckman exploit the power of Knowledge Nurturing? By taking a very systematic approach they have developed the following concept of how to expand organizational knowledge.

It starts with Catalysts, curious people not afraid to share what they know. They help to combine ideas in new ways or cross-pollinate ideas and concepts with other departments. The result: it becomes possible to both analyze and synthesize the results through discussion, clarification, and research. Next, the process of Capture begins, during which knowledge and insights gained through the preceding processes are stored and categorized so that they can be effectively accessed later. The final step is to put what has been learned into practice. Here, Lamb offered an insightful caveat: sometimes, best practices can lead to corporate mediocrity if not regularly reviewed for validity and purpose. The key is to keep the process dynamic.

It's easy to put forth such truths and ideas, but reality presents unusual challenges. These are typically manifested as barriers to communication within the organization. These barriers may be political, divisional, or even intellectual.

So what is the ideal system for nurturing knowledge? According to Buckman Labs it looks like this:

  • Reduce the number of times that knowledge is transmitted from the original source.

  • Give everyone equal access to the organization's knowledge base.

  • Allow each individual to enter knowledge into the system.

  • The knowledge system should function 24 x 7 x 365, worldwide.

  • The system should be easy to use, even for those not familiar with it.

  • The system should communicate in the user's native language.

Admittedly, this represents an ideal, but the key point is that sharing of tacit knowledge will generate the information required to update the organization's bank of explicit knowledge. Within that model emerge many roles for information professionals, from the application of technology, to promoting knowledge sharing, to acting as a catalyst for change.

Lamb's advice to those information professionals who would thrive in the coming years is to develop the value-added attributes of information. The information we provide must be relevant, current, inviting, easy to use and access, and of high quality.

And what about traditional "library" services? Interestingly, Lamb readily admitted that activities such as online research and document delivery are still a significant part of the work at Buckman. In her words, "Traditional services are here to stay". However, new skills are coming to the fore:

  • database creation;

  • facilitating knowledge sharing;

  • mentoring (knowledge nurturing);

  • proactive distance-learning initiatives.

Our challenge is to take advantage of information technology. We must develop and exploit the synergy between the traditional and new information practices. Finally, we need to measure the return on investment on knowledge.

The Buckman philosophy is probably best expressed by Jan Carlzon of SAS: "An individual without information cannot take responsibility. An individual who is given information cannot help but take responsibility".

Jim Smallwood is a Research Manager at GE Capital-Commercial Finance in Stamford, Connecticut. He is the webmaster for the Fairfield County Chapter of SLA, a member of the board of directors of the Hartford Critical and Creative Thinking Center, and unabashed "early-adopter" of the latest in information technology. james.smallwood@gecapital.com

Sammy Alzofon

New Directions and Changing Roles for News Librarians

Moderator: M.J. Crowley, Information Editor, The Star Ledger, Newark, New Jersey.

Panelists: Kathy Foley, Editor, Information Services, San Antonio Express-News, San Antonio, Texas; Lany McDonald, Director, Research Center, Time, Inc., New York; Nora Paul, Faculty Member and Library Director, The Poynter Institute for Media Studies, St Petersburg, Florida.

The three people on this panel have been in the news industry for many years and share a common trait: they know how to make things happen. Their skills and knowledge are widely known and made this a great session to just lean back and listen to. Moderator M.J. Crowley talked about the year 2000 and beyond; Lany McDonald discussed the "new economics" of our roles; Kathy Foley talked about management; and Nora Paul told us about "four things to get over".

M.J.: How do we make these things happen? Her timeline:

  • 2005: TV-computer convergence, XML, direct to press, zones and microzones with no final edition.

  • 2010: holographic memory, digital archives, warehousing whole pages and searching multimedia elements within the page.

  • 2015: full voice-recognition computers, agate disappears from print, circulation and distribution disappear, legislation dismantles the "paper of record".

  • 2020: some sort of portable electronic communicator tablet.

Paul and McDonald agreed that we need to make these things happen and that we must become the essential information intermediaries. What happens to the library as the lines between researcher and reporter, already blurring, disappear? "Blow it up and let the pieces fall where they may". Move to an "islands of information" concept in the newsroom.

McDonald: these changes register at a corporate level when we create multimedia intranet/Internet packages that begin to generate revenue. Never limit your availability. Partner to organize content. Take an active role in the business end of your company. Market yourself to your company. Weave yourself into everything regarding information content in your company. Then our primary employer moves beyond editorial as we become leaders in corporate profit centers. (I should note that she received Time Inc.'s 1998 President's Award, given to individuals "who have demonstrated excellence in generating ideas, solving problems, and delivering results".)

Kathy Foley elaborated on the theme of moving out of the library into the company at large, empowered with as many virtual resources as the job requires. She emphasized the power of the team approach and how to attain high visibility. We must manage the future. How? Again, by making things happen.

Paul provided the bridge: four things to get over. The Internet is here. We don't work for a single medium anymore. There is a profit imperative. It is our job to lead our companies in the information revolution. I've always found Nora Paul provocative, providing the catalyst but leaving you with the imperative to figure out how to get from here to there. This session was vintage Paul.

Participants were invited to share how they are making money, or would like to make money. What kinds of new staffing issues arise? Where is competent staff coming from? The issue of staffing is near to my heart because I see a move away from library school-trained staff and toward hiring from journalism and computer science. My concern is the lack of training for special libraries and librarians, and this session made it more evident. News library interns are coming from journalism programs, while the need for writing, programming, and management skills has further diluted the library science hiring pool. A library school student in the audience asked how she might be viewed by a potential news employer. Her question moved the discussion to the need for communication about industry skill needs versus current library training.

The closing quote of the session: "The best way to predict the future is to invent it" (Alan Kay).

NewsLib Live!

Moderator: Barbara Semonche, Library Director, University of North Carolina, School of Journalism.

Panelists: Linda Henderson, Library Director, Providence Journal, Providence, Rhode Island; Jim Hunter, Chief Librarian, Columbus Dispatch, Columbus, Ohio; Marcia MacVane, Reader Services Manager, Portland Press Herald, Portland, Maine.

The second annual NewsLib Live offered an overview of photo archiving past, present, and future, as well as the experiences of two newspapers in purchasing and implementing products that interface with a photo archive and an integrated photo/text archive.

Jim Hunter opened the breakfast session by asking "Why aren't we happy?" We are archiving text. We are archiving images. We are archiving images and text. A look backward isn't far: full-text retrieval is a creature of the 1980s. Full-color digital images came to us from 1990 to 1994, followed closely by full use of the Internet and then, by 1998, intranets. Hunter sees the nineties leading us to the technology now available to create "boutique archives": "small specific archives for small specific people". (Hunter always brings a slightly jaundiced eye to new technology. When I am considering the purchase of any technology, particularly relating to photos, I call him. He is an elegant devil's advocate.) The challenge is the year 2000 and beyond. The Dispatch currently stores 220,000 digital images, 45,000 of which are sports. Mass storage technologies for outtakes, newspaper Web sites, print files, and clipping files will be needed as document imaging capabilities improve.

The Internet, however, is the driving force for the future, both in number and in volume of pages, along with its audio and video elements. But the issue among newspaper Web site managers is not currently revenue, but rather the corporate mission of empowering people with information, as well as putting a presence out there on the Web. Hunter's message for those managing these changes: choose your allocation of sparse resources well, have professional staffing, determine the relative importance of archiving versus research, and be sure you determine what the basics of your library are and do them well.

Linda Henderson and Marcia MacVane teamed up to compare and contrast two experiences of bringing new systems in-house: one with a committee, planning, and advance knowledge of the project's strengths, limitations, and cautions; the other with little beforehand communication about the system vendor and the long-term problems the system introduction would create. It's a cautionary tale, heard over and over with little change in the tune; however, it bears repeating so that everyone hears the lessons, if only to spare one more library the hardship. Henderson and her committee knew they were purchasing a beta product and were prepared for a less-than-smooth system implementation. MacVane's text archiving was adversely impacted by a pagination system whose proposals and planning did not meet expectations. This is an extremely difficult situation, and she and her staff suffered the direct consequences of the system purchase. I have been involved in the planning and installation of two photo archiving systems and three text archiving systems. My observations:

1) The library must be included in all stages of the search, acquisition, and implementation of any system impacting the library; and

2) it is rare that a vendor fails completely.

The breakfast session was well attended and animated, particularly considering the hour. The new term for me came from the audience: shovelware. There was considerable discussion about archiving newspaper Web sites. Do we simply archive them regardless of content quality or durability as information? Do we archive them just because they are there, in other words shoveling content? Not necessarily, but we need to be there when the need arises. Another interesting thread of discussion revolved around the question, "Are photos that appear on the Web published?" The message of this session is clear: we are moving through changing technologies rapidly, but old issues of archive selection, installation, and implementation will remain with us. It is a reminder that new systems are not necessarily better systems and that, while what we archive, or feel we need to archive, is growing exponentially, the technologies for managing this information are still not turnkey.

Sammy Alzofon is Library Director with The Palm Beach Post, West Palm Beach, Florida. salzofon@pbpost.com

Judith Field

Strategies for Knowledge Management: Case Studies

This Library Management Division program was sponsored by Majors Scientific Books and featured two practitioners who have recently implemented knowledge management (KM) programs within their organizations: Jim Tchobanoff of Pillsbury and Tom Porter of Case.

The Pillsbury project was initiated in 1997 as the KM Incubator. Focus groups were formed to address the question of how the Technology Knowledge Center should look in 2000. From this input they developed a process to leverage Pillsbury's intellectual capital in a way that recognized both the capability of the tools and the company's culture. By May 1999, over 100 initiatives had been incorporated into the new knowledge management system: some available on the company's intranet, some accessible via Lotus Notes applications encompassing the food scientists' notebook archives, the patent database, and other intellectual capital resources. A third area will cover items relating to culture, including training modules.

Some of the initial results: turnaround time on the analyses of plant data was reduced by over 50 percent; time to approve product labels was reduced by 80 percent; and time to approve new ingredients was reduced by 95 percent. This new one-stop-shop virtual library, when completely mounted, will give clients access to a document warehouse in which the intranet, the Internet, and data repositories can be searched simultaneously, with pointers to relevant resources not in the warehouse. The document warehouse provides access to over 40 years of research and development: more than 105,000 documents across more than 140 Lotus Notes databases and four BRS databases, using the Dataware Knowledge Management Suite software. The biggest concern was the security protocol, which was tested again after the completion of the pilot project. Tchobanoff's concluding comments about the successful launch of this knowledge management program stressed again the importance of maintaining communication with your users and your Information Systems Department.

In the second case study, Tom Porter provided information about Case Corporation's Real-Time Current Awareness System, launched in December 1998. Development started in the first quarter of 1996 as an external current awareness tool for executives using the Dow Vision premier product. A year later this was expanded by adding Thomson Financial First Call; by late 1997, demand for the service had grown enough to make scalability an issue, complicated by the fact that Wavephone would no longer support the system. It became obvious that, with growing content and more customers, any new system would need enhanced personal and proximity filtering and would have to be technology independent.

After reviewing several products, Retrieval Technologies Inc. (RTI) was selected as the distribution system. During the selection process, 100 predefined corporate filters/profiles were developed and then tested in the latter half of 1998. Dow Jones is being used for targeted marketing and Verity as the spider that works both inside and outside the firewall.
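To make "proximity filtering" concrete, here is a minimal sketch (an invented example, not Case's actual implementation): a profile flags a story only when two of its terms occur within a set distance of each other, which cuts the false hits a simple keyword match would produce.

```python
def proximity_match(text: str, term_a: str, term_b: str, window: int = 10) -> bool:
    # True if term_a and term_b occur within `window` words of each other.
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if term_a in w]
    pos_b = [i for i, w in enumerate(words) if term_b in w]
    return any(abs(a - b) <= window for a in pos_a for b in pos_b)

# A hypothetical profile pairing a product term with a market term:
story = "Analysts expect combine harvester sales to rebound in the farm equipment market."
print(proximity_match(story, "combine", "market"))  # True
```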

The lessons that they learned were:

  • the need to select premier trusted content and that you get what you pay for;

  • that powerful filtering technology is just as critical, as knowledge resources continue to grow rapidly and time becomes a critical issue; and

  • the need to continuously market the value of your contributions to the further development of the system.

Knowledge Management: An Oxymoron?

This Petroleum and Energy Resources Division program was sponsored by Majors Scientific Books and featured Richard Fletcher of Energy Futures Research Associates. The first part of his presentation discussed how change is now an integral factor in our lives and how, in order to be successful and stay competitive, you must be willing to adapt quickly to new workplace cultures. He continued by discussing how the current merger mania has become a powerful change agent. These mergers are more than just mergers of staffs and production; they are also mergers of corporate cultures. While rapid integration is a key goal, it is the sharing of knowledge resources that will determine the ultimate success of a merger. The revolutionary innovations impacting the science and technology fields have been made possible by the rapid adoption, and corresponding adaptation, of the latest information technology.

Fletcher went on to say that the interaction of expanding global networks has produced a greater variety of information products. This, in turn, has increased the push for more and better computer and network devices that increase productivity and provide greater security. He feels that in the next ten years these products will become more and more ubiquitous as more everyday products are manufactured with embedded microcomputers. It will not be long before televisions, computers, and phone services merge into one seamless product. Current concerns about bandwidth will become a non-issue in the near future. Teleconferencing and other groupware tools will soon enable corporations to operate successfully as virtual organizations in the global arena.

These cultural and technological changes have created a sociological, technological, and economic value shift that has produced an even greater need for access to knowledge resources. It is this need that provides an opportunity for our profession to develop and promote knowledge management systems. To achieve this we must become students of our corporate environment; we need to step forward, demonstrate our effectiveness as facilitators and communicators, and take charge of the contents of the corporate knowledge management system. This is critical, since in the virtual corporate environment there will be little, if any, need for a physical library, and by promoting our talents in this way we can head off the question "Why do we need an information professional?"

Fletcher went on to stress that knowledge management is not just about creating a technological knowledge sharing system but it is also about connecting people.

This latter component is the issue most corporations do not understand. The power brokers feel that technology is the answer, and it is this lack of understanding of corporate culture that impedes rapid and effective adoption of a knowledge management system. Here he said again that if information professionals take a proactive role in exploiting our information skills, countless opportunities will appear, and we need to seize the moment to take a leadership role. Curiosity may lead us to new successes, and a sense of humor will help us through the rough days.

College and University Librarians Roundtable Breakfast

This session was sponsored by CIS and featured Sheila Curl from Purdue University, who discussed how Purdue was able to fast-track the implementation of its new Voyager library system. In October 1997 they held a series of all-staff meetings, reaching consensus by the end of the month to go ahead with a new system implementation the following summer. By the end of the year they had signed a contract with Endeavor Information Systems for its Voyager product, and in July 1998 the new system went into production. After providing this background, Curl shared some of the critical items that made the timeline work:

  • Total support of the Library Administrators.

  • Complete support from the entire library staff.

  • Providing regular updates in internal meetings.

  • Being user-centered in developing services.

  • Providing staff with the tools they need.

  • Knowing the characteristics that team members must share:

    - ability to set and meet deadlines;

    - ability to work as both a member and a leader;

    - need to be self-actualized;

    - need to have and share knowledge of the process and activities;

    - positive attitude toward change.

Several teams were created to facilitate the implementation: cataloging and authority control; acquisition and fiscal concerns; serials and binding; circulation/reserves/media scheduling; user interfaces, including the Web; staff and user training; and a technical team. Most of the members of the first three teams came from the technical services area, and two team leaders were clerks, which helped to get everyone on board with the project.

Curl then shared a list of what she called "The 10 Commandments of New Systems":

1) Negotiations will take longer than you think or plan for.

2) Never, never assume anything; put it into writing.

3) Buy more training, not less.

4) Know your operating system.

5) Upgrade your computers, now.

6) Relieve staff of "other" duties.

7) Know your old system and learn your new system.

8) Stress is inevitable.

9) Keep your sense of humor and share it as necessary.

10) The software is ready when the software is ready.

She said we need to remember that, for many, change is a scary process, so provide reassurances, or at least information, as needed. Her last point was that the public relations campaign for introducing a new system should start early and be strategic in nature.

Judith Field is Senior Lecturer, Library and Information Science Program, Wayne State University, Detroit, Michigan. aa4101@wayne.edu

Jean Z. Piety

Dead Technologies

Inserting the subtitle "and we didn't even know they were sick", three Association members entertained the audience with descriptions of what they felt were dead technologies. Nicknamed Larry, Moe, and Curly, Walt Howe of Delphi Internet Services, Richard Hulser, IBM Digital Library Technologist, and Stephen Abram, IHS Micromedia Limited, voiced their opinions, and audience interaction made for a lively session. As Walt Howe explained in his introduction, the previous session on hot technologies prompted this one on dead technologies.

Howe described chips with a hard-coded two-character year field as the millennium bug. He questioned whether automobiles were Y2K-compatible, citing the 1989-1990 Cadillac as the prime example. When he surveyed manufacturers, a few replied that there were minor problems, but most said nothing.
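For readers who never met the bug in the wild, here is a minimal sketch of that failure mode (in Python purely for illustration; the afflicted systems were typically COBOL programs or embedded chips): a hard-coded two-digit year works fine until the century rolls over.

```python
from datetime import date

def age_from_two_digit_year(birth_yy: int, today: date) -> int:
    # Legacy assumption baked into the chip: every two-digit year is 19xx.
    return today.year - (1900 + birth_yy)

print(age_from_two_digit_year(65, date(1999, 6, 1)))  # 34: correct in 1999
print(age_from_two_digit_year(0, date(2000, 6, 1)))   # 100: a "00" year lands a century off
```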

Paper provided his second example: plain old tree-based paper. Fewer people read newspapers, more people use palm-tops instead of day-timers, and more people file taxes electronically. He continued with some ancient history: back in the 1980s there were dot codes, WordStar and then WordPerfect, desktop publishing done only on Macs, and then Windows 3.0. What You See Is What You Get (WYSIWYG) gradually got better; finally, formatting codes arrived. HTML, used for Web formatting, added codes for graphics and pictures. The Web went commercial, and now HTML is dying, for people can use new Web publishing tools without learning codes.

His last dead technology focused on a big subject of the 1990s: Knowledge Management. Whatever happened, he added, to TQM (Total Quality Management), MBO (Management by Objectives), Theory XYZ, and operations research? He felt that management theories resemble waves in the ocean: they grow larger and larger, hit a crest, and then recede into the surface of the water.

In response, the audience added line text editor on UNIVAC machines as a dead technology. Someone suggested, "Kill PowerPoint". Another questioned the durability of acid-free paper.

Richard Hulser theorized that a variety of things would replace modems, though he did not elaborate on whether the mode would be mechanical, electronic, or fiber optic. He added that CD-ROMs were never meant to be networked. Dead hardware included the 1.44 megabyte diskette. Dissatisfaction is prompting the death of internal search engines, for none is adequate. He also felt that integrated library systems are dying, for vendors try to do more in a limited environment without asking for the strategic plan. He leans toward digital libraries, with new systems coming down the road.

He ended by saying that what is not dead is the ready-reference shelf of books that staff use to answer questions quickly. The audience asked whether the MARC record was dead. He responded yes, but not catalogers, for they will become "digital model constructionists". He predicted the demise of Z39.50, for it cannot handle digital cameras or videos, and closed with the admonition not to restrict yourself to what you know.

Stephen Abram presented a Canadian, across-the-border viewpoint. His visual example of computer technology aptly described the parts of a flush toilet. He feels that intelligence is dying, for too many information professionals and too many citizens use the Web with varying results. He also felt that the arrival of Y2K is a blessing, for it will kill DOS and Windows 3.1. Other deaths included e-mail and one-size-fits-all: there is too much e-mail and too little organization of it. There is no mass market for information. No one wants information. What is intuitive is meaningless. Text is no longer a tyrant, for text heads must become netheads. Free comes with a cost, for there can be viruses, stupidity, and misinformation. He showed a cartoon that included both an "information" booth and a "clarification" booth.

In his ecological sketch, he argued that the loss of the passenger pigeon caused Lyme disease. Are librarians and information specialists going that route? He projected the death of bandwidth. He felt that the joystick generation would become Generation "J". Will the term be "post-media" or "media-immersed"? Although he felt that children would talk to the Web comfortably, he closed by asking, "How do we increase intelligence in the world?" As the audience absorbed that thought, they asked no more questions and offered no more dead topics. The chuckles and mutterings showed that many had personal experiences with the dead examples, and the applause showed agreement with the speakers' theories.

Standards Update

Marge Hlava, chairing the Technical Standards Committee, gave a brief update on the Committee's work this year. The Committee members, appointed by the Association, reviewed a number of NISO (National Information Standards Organization) Z39 standards up for revision. The Committee also commented on several ISO (International Organization for Standardization) standards and followed actions on metadata and DOI (digital object identifier) standards. She introduced Albert Simmonds, retired from Bowker and now a consultant, who updated the audience with his expertise on metadata standardization. Last year's session provided Simmonds the opportunity to describe DOI and Bowker's representation on the international committee. This year he continued the thread by describing the complexity of metadata.

While convention employees tried to assemble the computer equipment to connect the laptop to the overhead projector, Hlava enlightened the audience by translating the alphabet soup of acronyms tossed around when talking about metadata and standards. ISBN means International Standard Book Number; its offshoot ISSN means International Standard Serial Number. IDF stands for the International DOI Foundation, composed of publishers. In explaining the DOI string, she reported that the left side contains the ISBN or ISSN, while the right identifies the specific publication with volume, issue, and pages. Other acronyms ranged from HTML (HyperText Mark-up Language) to CML (Chemical Mark-up Language).
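As a worked example (assuming only the standard rule that a DOI splits at its first slash into a publisher-level prefix and an item-level suffix), the DOI of this very article can be pulled apart as follows:

```python
def split_doi(doi: str):
    # A DOI divides at the first slash: the prefix identifies the registrant
    # (publisher); the suffix identifies the specific item.
    prefix, _, suffix = doi.partition("/")
    return prefix, suffix

print(split_doi("10.1108/lhtn.1999.23916iac.001"))
# -> ('10.1108', 'lhtn.1999.23916iac.001')
```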

She explained that the Technical Standards Committee looked at drafts submitted by ISO TC 46, the technical committee that issues international standards equivalent to the NISO Z39 standards in this country. Although the protocols on metadata are not really standards, the Technical Standards Committee tries to keep track of the comments on those protocols, too.

Simmonds reviewed the ISBN committee work. Although the ISBN is too weak to stand alone, pairing it with the DOI system strengthens the DOI initiative. He informed the audience that the IDF, headquartered in Geneva, Switzerland, has offices in Washington, DC, and Oxford, England. Besides the three founding members, there are 30 charter members, but none from the library community. OCLC may join, but fees run at $35,000 per year. More information may be found at www.doi.org; the contact is Dr Norman Paskin.

He described CNRI (the Corporation for National Research Initiatives), headed by Dr Robert Kahn. Think of it as operating a handle system, for CNRI resolves requests for DOIs and other intellectual property identifiers. Open problems include rights management for the SICI (Serial Item and Contribution Identifier): who owns the rights, the serial publisher or the author?

Further descriptions included ISBN International, a community of 120 interested local and national agencies, headquartered in Berlin. The group holds an annual policy meeting and is negotiating with the DOI community to be its rights agency. Next he described the DOI Registration Agency, which processes applications, assigns DOIs, manages the links between each DOI and its URL, promotes the DOI system, and develops and collects DOI metadata.

The DOI syntax may become a draft standard, containing registration and maintenance details. No DOI string should be registered without an accompanying set of descriptive metadata. The maintenance agency should provide the latest information about relevant metadata schemes and about any databases that aggregate metadata on referenced objects.

Adding to the list of acronyms, he explained "INDECS", a term meaning INteroperability of Data in E-Commerce Systems. Started by the music and motion picture industries, the initiative provides more information on its Web site at http://www.indecs.org/news/news.htm

The next acronym tossed to the audience was BASIC (the Book And Serials Industry Committees), not to be confused with BISAC (Book Industry Systems Advisory Committee); both are part of the Book Industry Study Group, Incorporated. For more acronyms and clarification, go to www.bookwire.com. Two committees in BASIC cover product data and rights, and both are very interested in identifiers. His favorite example for the ISWC (International Standard Work Code) is Gone with the Wind, for manufactured items connected with that title pay royalties to the Mitchell Foundation. He expects the ISWC to have a Web site by the end of summer.

A history of XML (eXtensible Mark-up Language) followed. XML started in 1996 and has proved better suited to Web work than HTML, for its mark-up is easier to learn and can be extended as needed. Its Web site is www.xmledi.com. Participants include HarperCollins, Baker & Taylor, Ingram, Pearson Educational, Barnes & Noble, Project MUSE, Amazon.com, RR Bowker, and OCLC.
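To illustrate what "extensible" buys, here is a hedged sketch (the record and its element names are invented for this example, not taken from the session): a book supplier can define its own descriptive tags rather than bending HTML's fixed set, and any XML parser can still read them.

```python
import xml.etree.ElementTree as ET

# A hypothetical trade record; every element name below is our own invention,
# which is exactly the point of an eXtensible mark-up language.
record = """
<book isbn="0-000-00000-0">
  <title>Gone with the Wind</title>
  <author>Mitchell, Margaret</author>
  <rights agency="Mitchell Foundation"/>
</book>
"""

book = ET.fromstring(record)
print(book.findtext("title"), "by", book.findtext("author"))
print("ISBN:", book.get("isbn"))
```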

Participants in the audience who work with standards know the difficulties of the alphabet soup of acronyms and the delays in the time frame for standards implementation. Internet expansion complicates the standards process. Simmonds concluded that DOI, with all its complexities, may work. He finished by offering a copy of his overheads through his e-mail address: awsimmo@ibm.net.

Hlava wondered whether the primary publishers are working in one direction while secondary publishers are working in another. She finished by expanding on the Technical Standards Committee's work and on current issues in standards development, such as accelerating the time frame for standards approval to less than seven years. Other issues include internationalization, for other groups will take over if the standards bodies lag or go in differing directions. She offered more Web sites, such as www.openebook.com. Another source is the National Center for Standards and Certification, reachable at ncsci@nist.gov. Hlava communicates with the Committee members and other interested parties at mhlava@accessinn.com. To find out more about NISO, go to www.niso.org

Jean Z. Piety is Head, Science and Technology Department, Cleveland Public Library, Cleveland, Ohio. Jean.Piety@cpl.org

John Piety

Academic Sci-Tech Librarians Roundtable and Breakfast

The moderator, Tina Kristowski, Chemistry Librarian at the University of Illinois, set up a flip chart with some proposed topics for discussion, and there was discussion! The topics were: E-Journals, Budgets, Reference, and Vendors.

Electronic Journals

Everyone agreed that e-journals cost money, and most libraries are not canceling print subscriptions. One comment was that e-journals ease the space problem: not by discarding existing print copies, but because new subscriptions would not require new shelving. A number of those present said that this coming year they must seriously consider canceling print subscriptions. Many mentioned that the post of electronic journal librarian has been established at their institutions. Some academic libraries are uneasy about the archiving of e-journals. Will it be handled title by title? By aggregator? Or can a major project like JSTOR do the archiving for journals not otherwise part of the JSTOR project?

Access is still a problem for many; both software and hardware pose problems. Cataloging of e-journals is such a problem that some libraries simply don't try; they offer alternatives to cataloging that give access to the journals but may not fit the rest of the cataloging system. The best course seems to be to set up policies and procedures before cataloging becomes a problem. In some cases the same journal title is available from more than one source, which is a problem in itself. In some places, public services librarians are maintaining the e-journal catalogs with holdings dates, since the cataloging systems often cannot hold that information. Maintaining Web pages with journal information is a good answer, but often labor intensive; the OCLC Cooperative Online Resource Catalog (CORC) program, which can generate Web pages without human intervention, provides a good solution for now. Those present were also concerned with usage statistics, particularly for e-journals: there is at present no way to compare use of print copies versus e-journals, and the publishers simply can't provide the data.
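As a minimal sketch of why automatic page generation helps (the titles, URLs, and holdings below are illustrative only): once e-journal records exist as structured data, the Web page that staff would otherwise maintain by hand can be regenerated on demand.

```python
# Illustrative records; a real list would come from the catalog or from CORC.
journals = [
    {"title": "Library Hi Tech News", "url": "http://www.emerald-library.com", "holdings": "1998-"},
    {"title": "Information Outlook", "url": "http://www.sla.org", "holdings": "1997-"},
]

rows = "\n".join(
    '<li><a href="{url}">{title}</a> ({holdings})</li>'.format(**j) for j in journals
)
page = "<html><body><h1>Electronic Journals</h1><ul>\n" + rows + "\n</ul></body></html>"

with open("ejournals.html", "w") as f:
    f.write(page)  # regenerate the page whenever the records change
```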

Budgeting

For some libraries, ratios and formulas are sacred cows in budgeting. The question was asked, "How many are stuck with the 80 percent to 20 percent ratio, subscriptions versus books?" A number of hands were raised. It was pointed out that in the science and technology fields this ratio is more often 90/10 or even 99/1. If the powers that be don't separate subject budgets but insist on all being alike, major problems can arise. A more comfortable, and realistic, budgeting process is to balance the entire budget at a specific ratio, such as 60 percent to 40 percent, without insisting that individual subject areas be uniform.
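As a worked illustration (figures invented): if an equally funded science and technology area runs 90/10 and a humanities area runs 30/70, the overall budget still balances at (90 + 30) / 2 = 60 percent subscriptions to 40 percent books, even though neither subject matches the campus-wide ratio.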

Reference

Is Current Contents necessary if a library has Web of Science? The consensus of those present was no. However, Current Contents generates reprint request cards, and many faculty researchers like that. Most of the libraries represented have Web of Science; those that do not cite cost as the primary obstacle. The Institute for Scientific Information (ISI) is putting together new Basic Web of Science packages for various needs, bundling a great number of resources, which may help the libraries that do not now subscribe. Faculty who have learned the technique are now using Current Contents with EndNote to make weekly additions to a personal research database.

Vendors

The time ran out before the group could discuss vendors and vendor relations.

John Piety is Science Reference Librarian, Grasselli Library, John Carroll University, Cleveland, Ohio. john.piety@nowonline.net

Priscilla Ubysz

What's Happening with the National Transportation Library?

The Transportation Division sponsored a well-attended and useful session on the current status and future plans of the National Transportation Library (http://ntl.bts.gov). Transportation reauthorization legislation passed in June 1998 mandated the development of a National Transportation Library (NTL) to "improve the ability of the transportation community to share information". According to promotional materials, "BTS is working with public and private organizations throughout the country to acquire transportation data in a variety of formats, ranging from databases, electronic media, maps, plans, reports, and other documents in the public domain. The NTL has implemented a search engine on their web site that indexes 110,000 documents from 14 transportation agencies".

Moderator Daniel Krummes, Institute of Transportation Studies, University of California Berkeley, introduced two speakers, Janice Bain-Kerr and Chris Berendes, both affiliated with the Bureau of Transportation Statistics (BTS). Bain-Kerr elaborated on the work currently being done to build collection development policies for the NTL and on collaborative activities with other Federal libraries. Although there is no official budget and planning policy as yet, the NTL is actively working toward the legislative mandate and cooperating with other government agencies to evaluate the first century of documentation on all modes of transportation. A primary step is to identify reports of major significance to digitize. Eventually the virtual digital collection will include text documents as well as non-text materials such as photographs and audio and video formats.

Bain-Kerr said a variety of collection policies were evaluated and will be incorporated into a formal collection development policy, which will encompass a global effort to capture current, enduring, and retrospective materials in the transportation world. The NTL will be decentralized in order to utilize the libraries and resources already in place throughout the world.

An immediate goal is to capture current DOT output, which is estimated annually to be approximately 4,000 to 5,000 items. Archiving documents will be addressed later and the NTL will work backwards to capture reports retrospectively. Working closely with DOT, DOT libraries, Transportation Research Information Services (TRIS), and Transportation Research Board (TRB), the goal is to load 50,000 documents by 2005 from both Federal and State agencies. They also plan to capture documents from local government agencies in the future.

A concentrated effort will begin shortly to obtain more international transportation materials. Records from outside Europe and North America have not been widely captured, but the NTL expects to begin pursuing more publications worldwide. Initially the focus will fall on materials from Japan and additional countries in the European Union.

The NTL has given NTIS access to its internal databases so that NTIS can verify that it is receiving all reports published by transportation agencies. The NTL also plans to link to TRIS reports that may be requested over the years yet are not necessarily candidates for digitizing.

The second speaker, Chris Berendes (christop.berendes@bts.gov), spoke about the status and issues of NTL.

DOTBOT: USDOT Search (http://search.bts.gov) was established last summer. It encompasses 150,000 documents from 200 sites and currently receives around 10,000 queries per day. It can be reached via USDOT, Federal Highway Administration (FHWA), Federal Aviation Administration (FAA), BTS, Federal Railroad Administration (FRA), and Office of the Secretary (OST). The site indexes a variety of formats, including Word, Excel, PDF, and HTML. Boolean logic is used for searching, but more advanced search techniques are also available: it is possible to search by phrase, narrow a search by agency, or sort results by date. Efforts are under way to provide contacts for each of the more than 200 links off the site.
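Berendes did not describe DOTBOT's internals, but a minimal sketch can suggest the kind of behavior reported: Boolean AND matching over indexed terms, narrowing by agency, and sorting by date. All records, field names, and titles below are hypothetical.

```python
from datetime import date

# A toy in-memory "index": each record carries agency, date, and indexed terms.
DOCS = [
    {"title": "Bridge inspection report", "agency": "FHWA",
     "date": date(1999, 3, 1), "terms": {"bridge", "inspection", "safety"}},
    {"title": "Runway safety study", "agency": "FAA",
     "date": date(1998, 11, 15), "terms": {"runway", "safety"}},
    {"title": "Grade crossing safety", "agency": "FRA",
     "date": date(1999, 1, 20), "terms": {"crossing", "safety", "rail"}},
]

def search(required_terms, agency=None, newest_first=True):
    """Boolean AND search, with optional agency filter and date sort."""
    hits = [d for d in DOCS if required_terms <= d["terms"]]
    if agency is not None:
        hits = [d for d in hits if d["agency"] == agency]
    return sorted(hits, key=lambda d: d["date"], reverse=newest_first)

for doc in search({"safety"}, agency="FAA"):
    print(doc["date"], doc["agency"], doc["title"])
```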

The digital library found at http://ntl.bts.gov contains more than 5,000 documents from 330 Web sites. The search engine can be found at http://search.bts.gov/ntlsearch/. Issues put forth by Berendes include:

  • Make the search power more accessible.

  • Focus by subject area.

  • Browse structure.

  • Contact point for each site.

He also asked, "What else should we be indexing within current mandate?"

NTL keeps track of the 70 or so messages it receives each day (librarian@ntl.bts.gov). Berendes explained the twofold benefits of capturing "who was asking what": NTL gets an idea of the interests of those using the site, and NTL will eventually be able to incorporate some of the questions into FAQs for the Web site.

The challenges in expanding the Web site include temporary URLs, the lack of metadata in most USDOT documents, a lack of standardization, and non-text document formats.

NTL is actively soliciting questions, comments, and suggestions from the global transportation community and working hard to provide the best possible access to the world of transportation documentation. Berendes said those responsible for the Library want to "connect you to a place where transportation professionals share their accomplishments and experience with the rest of the transportation community". The ultimate wish is that "the Library will become the primary source for transportation planners". The current NTL is well on its way to becoming that reality.

Disclaimer: Remarks presented do not necessarily reflect policies or views of the Bureau of Transportation Statistics, the US Department of Transportation, or other referenced Federal agencies, nor do they commit either to specific programs or courses of action.

Investing in the Stock Market

Jim Jubak, author of Jubak's Journal, offered advice for those interested in the stock market during this lively breakfast session moderated by Colin McQuillan of General Mills, Inc. Jubak's Journal is featured on the Microsoft Investor's Web site at http://www.moneycentral.msn.com.

On the strategy for investing in Internet stocks, Jubak gave the advice, "Buy on the rumor; sell on the lie". The market for these stocks is still being played out, and much of what is going on with the Internet currently is unexpected and unreported.

Jubak regaled the audience with several anecdotal stories on the perils, pitfalls, and possibilities for trading Internet stocks.

The advice to buy on the rumor was illustrated by a story about MCI and Skytel. A routine sweep of site names by an intelligent agent from companysleuth.com discovered that someone at MCI had filed a claim on the name Skytel. Within five minutes of the report being generated, the story appeared all over the stock bulletin boards and other Internet investing arenas. The next morning the Wall Street Journal, "a dead tree publication", printed MCI's response: MCI vehemently denied the rumor. Three days later, MCI bought Skytel. All of this happened outside the "normal" channels and is an example of how the Internet makes it possible for individual investors to receive information and create information at the same time.

Fraud is a risk, yet strangely enough, also expected when dealing with Internet trading. A half-hour before the market opened one day, a story dressed up as a Bloomberg page appeared on a free Web-page hosting site; it looked absolutely legitimate but ultimately turned out to be a fake. Somebody had downloaded a Bloomberg page, changed it to include the story, and then posted it. Ten minutes after it was posted, the already volatile stock was trading at a significantly higher price. There are two actions an investor can take in this situation: buy low and sell within hours after the truth is finally unveiled, still ending up with a profit, as was the case here; or check trusted sources before trading. Either option would probably not prevent the stock from gaining or investors from profiting, as the Internet trading culture is recognized as tolerating, and even encouraging, a lie. People expect the hype that is generated and work within this system.

Bloomberg did issue a denial, but it was viewed as bizarre: so weak that it was seen as a non-denial denial. Hours after the denials, rumors, and activity, Lycos and Yahoo! still had not withdrawn the story, and it refused to die. By noon everyone knew the story was a fake, but the trading frenzy continued and the stock ended the day higher.

Jubak believes that emotional bonds form between Internet investors and the stocks they pick; self-convincing communities have developed around particular stocks. He is inundated with messages challenging him whenever he suggests a sell or raises issues with a particular stock, as if he were suggesting that a fond family member was at risk. He feels that people have a great hunger for this type of community, which can now be found online.

We all have access to a tremendous amount of information on the Internet and about the Internet. As an organism, it has its own life and passion and is different from other environments. Jubak feels that the Internet is not controllable by any of us; the rules for the Internet are different, although he is not sure exactly what they are.

During the question-and-answer period, several pertinent questions arose. One of the first was about Internet auction stocks: Was there room in the market for all? Was eBay worth buying? How big was the auction market? Jubak feels that the market is huge and eBay especially is revolutionary. His realization was that the auctions are the new classifieds: as soon as the auction sites are big enough, they will launch into local sites and put traditional classifieds in jeopardy. He believes that among the Internet auction sites, the second- and third-tier companies just won't make it; only the biggest and most efficient will survive, and we will probably end up with one to three big companies with which to conduct our auction business.

Other brief answers to specific questions:

  • Priceline.com doesn't work, in his opinion. The company has a supply problem, the technology is lousy, it is fulfilling too small a number of orders, and a large proportion of the airline tickets it sells go for less than it paid for them. These elements don't add up to a successful company.

  • Companies like Wal-Mart will have to cannibalize their own business to keep alive by offering services and products via the Internet as well as retail locations.

  • With AT&T buying TCI we'll see high-speed access coming into the home regularly, even though some cities like Portland say they will regulate cable access. Jubak sees open access in the long term.

  • On day trading, Jubak believes there will be more and more trading, and night trading will soon be established as well; it is clear to him that 24-hour trading is on its way. Day trading has been profitable on the whole because the market is so good, but we may see a trend of day traders leaving owing to a lack of capital.

Jubak's Web site is certainly worth a visit. Included on the site are recent articles on companies of interest, Jubak's list of the 50 best stocks in the world for the long term, his top stocks for the next 12 months, links to a discussion group, and an archive of articles. If you're interested in investing right away, the site also offers the list of stocks Jubak is currently following, along with a list of recently dropped stocks. Jim Jubak can be reached at jjubak@microsoft.com.

Priscilla Ubysz is Research Analyst, UTC Information Network, United Technologies Corp., Hartford, Connecticut. Ubyszpm@corphq.utc.com

Barbara J. Arnold

Holy Land Antique Map Project

Presenters were HelenJane Armstrong and John Freund, both of the University of Florida, George A. Smathers Library, Map and Imagery Library, Gainesville, Florida.

HelenJane Armstrong opened the program by saying that this was a new area of cartography for her; she had been concentrating on the Caribbean map collection, but the history of mapping the Holy Land parallels the development of geography. The opportunity for a major gift from a donor led to research for verification, preservation, development of a home page, and construction of a large display case that keeps eight of the maps on physical display at all times.

The maps generally display an area from "Dan to Beer-sheva", measuring 50 miles wide and 150 miles long. The coast is usually shown from Sidon to Gaza or farther south to the Nile delta. Both sides of the Jordan River are shown, and many maps of Jerusalem are among the collection. These are antique and rare maps published in Europe during the Classical and Early Middle Ages, the Renaissance, and the Golden Age of Cartography, from 1493 to 1888. The University of Florida has a Jewish Studies Program, and the map collection already held some significant antique maps. This gift provided an opportunity to acquire 74 additional antique maps, a most outstanding collection. The gift was made by Dr James and Adina Simmons; Adina Simmons had collected the maps.

Using the collector's notes, a partial inventory, and consultation with a specialist in California, Armstrong set about creating an inventory of the collection, both for a bibliography and for security. Instead of compiling an exhibit catalog, it was decided to develop a cartobibliography. Verification was the real challenge: few antique maps are cataloged on OCLC or RLIN. However, there were several atlases in the Smathers Rare Book Collection, and Armstrong has a bibliography of other helpful sources for verification. An in-house publication of the cartobibliography, entitled "Antique Maps of the Holy Land", is available from the University of Florida, George A. Smathers Libraries, 204 Library West, PO Box 117001, Gainesville, FL 32611-7001. The cost, including tax and postage, is $40.

John Freund, the second presenter, talked about the preservation process, what the University of Florida learned contracting out the scanning of images, and what they finally decided to do themselves. Many of the maps had been framed with masking tape, acidic mats, and cardboard. Freund began this part of the project with an inventory and a document condition survey. All of the maps were removed from the frames; some were treated to stabilize (deacidify) their condition. Many needed to be cleaned and mended. Next, they were encapsulated in Mylar on all four sides with identifying labels.

When plans to digitize parts of the collection were proposed, the University of Florida first decided to contract out the photography. However, the number of maps included in this aspect of the project grew, the contractor was across town (which meant transporting the maps), and the quality of the contractor's photos was not adequate because the encapsulation interfered: all the maps had to be removed from the Mylar and then re-encapsulated. Freund decided the work had to be done at the library. He and his staff slipped the maps out of the Mylar just before taking the pictures and returned them to their encapsulation right away. Using a flatbed scanner, they directly scanned the small maps and photographed the larger ones on slide film to improve legibility. Freund's general recommendations for anyone starting a project like this one: do it in-house, scan the small maps directly, and scan them before encapsulation.

The homepage for the antique maps of the Holy Land, still under construction, is http://www.uflib.ufl.edu/maps. Select special collections and then antique maps. The maps are arranged by time periods. A viewer can click on the images to bring up enlargements for more detail.

Barbara J. Arnold is Admissions and Placement Adviser, University of Wisconsin-Madison, School of Library and Information Studies, Madison, Wisconsin. bjarnold@facstaff.wisc.edu

Susan Charkes

What Professionals Need to Know about Interface Design

Alison Head, Ph.D., presented a refreshingly different approach to evaluating information resources. Head, a consultant on Web and intranet development, and formerly Director of Information Management at The (Santa Rosa, CA) Press-Democrat, is the author of Design Wise: A Guide for Evaluating the Interface Design of Information Resources (1999: Information Today, Inc., Medford, NJ). Attendees of the packed session were treated to a succinct, engaging presentation on using interface design principles as criteria for evaluation of Web-based information services.

The explosion in the number and accessibility of information resources has resulted in the same content being available in many different formats. "Patrons" have become "end-users", and today's end-users have specific expectations for information resources: they want 24-hour access, no penalty for inexperience, and, above all, fun. Information professionals have become information coaches. The basic question asked of us is no longer "Can you get this for me?" but "Which is the easiest resource for me to use in order to get this?" To answer this question, traditional criteria such as cost, availability, and content still matter, but interface design is now just as important. Moreover, information professionals have become infrequent users of many services, whereas in the past they were often heavy users of a few services. Interface design issues therefore are as important for professionals as they are for end-users.

Head defined interface design as "what we have to deal with to interact with the system"; the design connects humans and machines. Key criteria for good design include: Does it meet users' expectations? Does it help users get their tasks done? Does it keep users focused and interested? Does it produce quality results? Does it keep users satisfied? A well-designed information resource requires less ongoing training, enhances information-seeking behavior, keeps users happy, and keeps them coming back.

How does one sort out the good designs from the bad? Head presented a three-step approach and illustrated her evaluation by using examples of DialogWeb 2.1, Insite2 from Gale, and a Hewlett-Packard Intranet (Head has worked with H-P on Intranet design).

The first question to ask is what the interface looks like. A design should help focus the users' attention to the parts of the screen that matter to the tasks they seek to accomplish, and help the users absorb information. Users want to use a tool, not to "have an experience". Simple, clean layouts are easier to use than cluttered ones. Graphics should help impart critical information, not distract from the task.

Second, one should ask whether there is ease of use, also called user-friendliness or usability. A usable design should be easy to remember and retain; it should be "forgiving" of user error or misunderstanding, or simply changing one's mind; and it should be easy to learn. Head suggested some indicators of usability: The interface should be intuitive: it should work like tools one has used elsewhere. It should be consistent from screen to screen. The user should not have to remember how to accomplish a task. And it should give feedback in response to user action.

Looking at the Dialog and Gale sites, Head pointed out some usability problems. For example, a common flaw in search screens is the lack of an "interrupt" button to stop a search that is in error or is taking too long: this is an "unforgiving" design. Users will typically resort to the browser "stop" button, which may or may not work; users have trouble learning and accepting the fact that browser navigation buttons are not consistently integrated into (or excluded from) interface design.

The third question to ask when evaluating the interface design is, how well are tasks supported? Users want to know what the interface is for and how to use it. What they need should be prominently displayed. The interface should be user-centered. Ask, who is doing the most work: the interface, or the user?
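Head's three questions lend themselves to a simple comparative checklist. The sketch below is an illustration only: the criteria paraphrase her three questions, while the rating scale, scores, and resource names are invented.

```python
# Rate a resource 0-5 on each of the three evaluation questions.
CRITERIA = ["appearance", "ease of use", "task support"]

def evaluate(name, scores):
    """scores: dict mapping each criterion to a 0-5 rating."""
    total = sum(scores[c] for c in CRITERIA)
    print(f"{name}: {total}/{5 * len(CRITERIA)}")
    for c in CRITERIA:
        print(f"  {c}: {scores[c]}/5")
    return total

# Hypothetical comparison of two resources under review.
evaluate("Resource A", {"appearance": 4, "ease of use": 2, "task support": 3})
evaluate("Resource B", {"appearance": 3, "ease of use": 4, "task support": 4})
```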

Head showed several examples of interface design that fall short on this criterion. For example, a login screen that has information about the vendor more prominently displayed than information about how to search does not help the user accomplish the typical task. Another common design error is making the user click through multiple screens before getting to the search input screen, making the primary task a chore.

Head concluded her presentation with some "words of wisdom": nothing is done by accident; even the worst-designed thing has had a lot of thought behind it. Good design is about users and their needs, not only about the technology.

This session was very well received, and with good reason. Head's approach helps one to understand the principles behind what users instinctively discern, and therefore to objectively evaluate and compare resources. Interface evaluation is critical to helping users select the right tools to accomplish their tasks.

The slides for the presentation are available at http://metalab.unc.edu/slanews/conferences/sla1999/interface/index.htm

Susan Charkes is Systems Librarian, Dechert Price & Rhoads, Philadelphia, Pennsylvania. scharkes@dechert.com

Patricia W. Boody

Annie Brookings: Identifying and Valuing Your Firm's Intellectual Capital

Annie Brookings considers that the value of an enterprise equals the tangible assets of the organization plus the intellectual capital (intangible assets). Intellectual capital within the organization consists of four major components: Market Assets, Human Centered Assets, Infrastructure Assets, and Intellectual Property Assets.

Market Assets are derived from an organization's relationship with its market and customers. They include brands, reputation, repeat business, distribution channels, favorable licensing, and contracts. Customers fall into several categories ranging from suspect to prospect to champion to customer to evangelist. The most favorable customers are those who finally become evangelists, actively promoting the products and services of a company of which they are not a part.

Human Centered Assets relate to the people within the organization. These assets comprise the expertise, creativity, problem-solving ability, leadership, and entrepreneurial and managerial skills of the employees. They may also include occupational assessments, psychometric testing, and personality profiling. Much of this area involves helping managers develop employees, encouraging continuous learning, and developing new competencies.

Infrastructure Assets include the technologies, methodologies, and processes that enable the organization to function. Examples include the corporate culture, management philosophy, methods of developing a sales force, financial structure, databases on customers, and communication systems such as e-mail. The information technology systems involved need to be flexible, helping the organization respond to changes in the marketplace.

Intellectual Property (IP) Assets are the property developed by the mind and protected by law. These take the form of patents, copyrights, designs, trademarks, trade secrets, proprietary technology, and know-how. Coca-Cola's formula is an example of a trade secret, with only two people each knowing half of the formula. Brookings' suggestion relative to IP assets was that knowing you own a patent is of little use unless it is paired with information concerning its potential.

Once an organization audits the assets in each component, each asset can be assigned a value reflecting its relative strength. Each value can then be plotted in the quadrant representing its component. This visual picture quickly shows which areas of the organization need special attention and development.
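A minimal sketch of the audit-and-plot idea might look as follows. The four component names are Brookings'; the assets, rating scale, and scores are hypothetical.

```python
# Rate each audited asset 0-10 for relative strength, grouped by component.
audit = {
    "Market":                {"brand": 8, "repeat business": 6},
    "Human Centered":        {"expertise": 7, "leadership": 4},
    "Infrastructure":        {"IT systems": 5, "methodologies": 6},
    "Intellectual Property": {"patents": 3, "trade secrets": 9},
}

# Average strength per component; a low average flags an area needing
# special attention, as the quadrant plot would show visually.
for component, assets in audit.items():
    avg = sum(assets.values()) / len(assets)
    print(f"{component:22s} average strength: {avg:.1f}")
```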

Questions raised by the audience included the following:

1) Does the process work better in some industries than others? Brookings replied that she has seen it used successfully in manufacturing and high-tech industries. Her caveat was that the older the company, the more difficult the process and the longer it may take to complete.

2) How many people should be involved in the process? Brookings considers it essential to involve the entire management team plus key players from customer support, marketing, and other areas of the company.

3) Can some of the numbers be negative? Yes; if an asset is getting weaker, its plotted position may move away from the center.

4) Since much of the effort involves changing corporate culture, what resource do you recommend? Brookings suggested Corporate Cultures by Deal and Kennedy.

Patricia W. Boody is Administrator of Corporate Research for TECO Energy, Tampa, Florida. pboody@ix.netcom.com

Jeanne Korman

Census 2000: The US Census Bureau and the New Millennium

This overview of the 2000 US census was presented by Peter Bounpane of the US Census Bureau and moderated by Bruce Calvin of the National League of Cities.

In his introduction to the US Census, Bounpane shared the information that the USA holds the record for the longest continuous census-taking in the world.

The presentation consisted of two parts: the plan for the 2000 census and plans for future census-taking.

In an effort to cut costs and simplify the count, the Census Bureau presented a plan to enumerate a 90 percent sample directly and extrapolate the remaining 10 percent of the population from those results, with a quality control sample taken afterwards. The plan was contested and ultimately went to the Supreme Court, which ruled that the results of a census conducted in this manner could not be used as a basis for reapportionment of the House of Representatives. The Court did allow a correction for other uses, but the Bureau abandoned the sampling concept.
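As a greatly simplified illustration of the extrapolation idea (the Bureau's plan involved far more careful statistical estimation than this single inflation factor, and the figures here are invented):

```python
# Hypothetical: 90 percent of households in an area are directly enumerated.
enumerated_households = 36_000   # counted in the 90 percent enumeration
coverage = 0.90                  # fraction of households enumerated

# Extrapolate the total by inflating the direct count.
estimated_total = enumerated_households / coverage
print(f"estimated households: {estimated_total:,.0f}")   # -> 40,000
```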

The 2000 census will use a mail-out, mail-back, self-response form. The Census Bureau feels this type of form provides better information than knocking on doors; if the form is not returned, a visit will then be made to the home. The questionnaire itself remains the same as that for the previous census. Five-sixths of households will receive a short form, and one-sixth will get the long form, which asks questions about such topics as education and travel.

There are five changes from the 1990 form. The form will be easier to complete and will be optically scanned. There will be a toll-free number for questions. It will be possible to complete the form over the telephone or via the Internet (short form only). An advance letter will alert the population to the arrival of the form. And instructions for completion will be available in languages other than English, among them Spanish, Chinese, and Korean.

The Bureau will be using paid advertising for this census and plans to target those ads to areas where the message most needs to reach. There will be an expanded outreach and promotion program using intermediaries to get to reluctant respondents.

A significant difference will be how people are asked about race. According to Bounpane, race in the USA is a social perception and is not based on anthropology. Interracial categories will be included, with more multi-racial choices; the question asks individuals to choose one or more races. The Bureau will tabulate each of the choices separately and in combination. The question now being addressed is how to use the results and how to compare these statistics with the 1990 results.

Two sets of statistics will be provided: the exact count and corrected counts. Levels of error have always been printed; the difference with the 2000 Census will be that corrections will be given.

The biggest change is how the data will be distributed. There will be minimal paper dissemination. CD-ROM will take the place of tapes and most of the information will be on the Internet as American Fact Finder. There will be one state set and one US set, which will be similar to Chapter 1 in other versions.

Redistricting data will be sent to the states by 1 April 2000, produced on computer discs, with paper files gone. Summary tape files 1 and 3 will be replaced by CD-ROM; everything else will come via the Internet. It will not be possible to download an entire file, but needed information can be printed or downloaded.

Some information that appeared in previous census materials will be missing. There will be table shells: the user clicks on the desired shell, then makes selections for table, geography, and universe. A preview will appear, allowing corrections to the request. These tables should cover 90 percent of the information used in 1990. If excluded information is needed, a level 3 query must be made; this level of query searches the entire database and should be used only as a last resort.

There are limitations on how far below the tract level a user can go: a block will not be accessible, and a message may appear indicating that the information is not accessible and is confidential. It is possible to purchase a public use sample of micro records for geographies of less than 100,000. The Census Bureau will also develop custom tabulations (cross-tabs) upon request on a purchase basis; the turnaround time for this type of request is slow and the cost, running to tens of thousands, is high.

Maps will be available on the Internet and will have a zoom in application. Maps will also be available for purchase on CD-ROM.

Sample information for a city, state, and Indian reservation is available for practice purposes on the Internet site, American Fact Finder. The Census Bureau is also working to put economic data from the last economic census on the site. The eventual goal is to link the economic census and the population census.

In the future, the Census Bureau proposes to use only the short form for the decennial census and to use the "American Community Survey" to gather long-form information. This would yield moving data that takes more analysis but should provide yearly figures. One clear benefit of the "American Community Survey" will be the ability to provide statistics on data that have been difficult to collect and count in the past; examples include daytime populations of urban areas and multiple residences such as summer homes and second homes.

Unfortunately, due to various problems, Bounpane was unable to provide the promised demonstration of the Bureau's automated data dissemination system.

Government Data: Here Today, Gone Tomorrow

Kate Pissley of the Michigan Electronic Library discussed the issues associated with the transfer of government publishing to the private sector in this session moderated by Michael Kolakowski of the Library of Congress' Congressional Research Service.

Several questions for evaluating the effect of this change were presented by Pissley, including why government publications have special authority. Answers include the fact that the government as publisher is a known entity and a unique authority with special means to collect information, and that detailed disclosure of data and methodologies is always provided.

Another question to consider is whether authority will be lost when government information is privately published. Pissley thinks this probably will not happen if the private publisher is reputable. In the case of statistical information, authority and usefulness could be diminished if less detailed disclosure of data and methodology is provided. Quality could go either way.

What will happen to variety and scope? The lack of copyright on government-generated information encourages variety, while exclusive contracts and copyright inhibit re-dissemination by other publishers and organizations. Will more obscure information that will not sell widely be published? Would some government information be published at all if private publishers had not stepped in?

Will the privately published information be more user-friendly? Private products often look better and are easier to use according to Pissley. Many government publications are not intended for public use. Additionally, printing regulations affect the "look" of government published materials.

There are budget issues to be considered. In the best-case scenario there will be reasonable prices and depository distribution. In the worst-case scenario, a monopolistic situation could lead to outrageous prices, licensing arrangements, and other unforeseen consequences. Pissley wonders if the materials are priced at what investment firms and corporations are willing to pay, and whether there will be reasonable pricing for the academic user. Also, will there be any options for public libraries?

The switch to private publication of government information will undeniably save money for the government entity that produced the information. Pissley suggests that overall, the government may not save because other agencies will have to pay for the information.

There are access issues to be considered. Private publishers do a better marketing job and also make access to publications easier than does the government. However, copyright issues could have a severe effect on the possibilities for free Internet access. Will networking of materials be allowed? Will costs affect access by less wealthy user communities?

Privatization is not necessarily a given. In some cases the government has a responsibility to publish its own information. Pissley feels that libraries will, in most instances, purchase the products and enjoy the improvements. In other cases, prices or licensing arrangements may be prohibitive. The private publisher should be encouraged to include methodology, and government agencies should make careful contracts that protect the public interest and avoid exclusive contracts.

Pissley left the audience with the thought that "you can be a voice for public access"; write a letter or have a discussion with your Congressperson.

Jeanne Korman is Manager of Library Services, Weil, Gotshal & Manges LLP, Miami, Florida. jeanne.korman@weil.com

Susan Fingerman

Waves of the Future: Digital Genres and Providing Information and Knowledge to the Virtual Office

The 1997 and 1998 recipients of the SLA Steven I. Goldspiel research grant presented the results of their research in two very different but relevant areas. Professor Andrew Dillon of Indiana University, the 1998 winner, rushed through a lengthy presentation on "Understanding Users in Digital Environments: A Longitudinal Study of Genre Information in Information Work". Claire R. McInerney, of the University of Oklahoma, the 1997 winner, followed with her findings on "Using Information in the Virtual Office: How Special Libraries are Serving Telecommuters".

Professor Dillon's research focuses on human-computer interaction (HCI). He is skeptical about the relationship of technology and learning, listing four basic myths about technology as follows:

1) naïve associationism: just linking is enough;

2) access is all;

3) exposure is learning; and

4) future technology will solve all the current problems with present technology.

In preparing for this research, Dillon and his students found only two studies in the last eight years that presented media as an effective teaching tool. Indeed, he gave two examples of contrary findings: first, that text is read on screen about 20 percent more slowly than on paper, and second, that technology is actually a hindrance to the slower learner.

The first phase of the research on new technology "genres" focused on the premise that the home page has become the first genre, or conventional format, created by the Internet (see his paper on this study at http://www.slis.indiana.edu/adillon/genre.html). Using two personal home page resource collections on the Internet, Dillon and his co-author identified common salient elements present on many pages, such as e-mail addresses, links, and graphics. They listed these in ranked order of occurrence. They then asked study participants to rank what they felt should be on a home page. The high correlation between the two lists indicated that home page design has become conventional and expected, and thus a new genre.
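The write-up does not say which statistic was used to compare the two lists; Spearman's rank correlation is one natural choice for a pair of rankings, sketched here over hypothetical ranks.

```python
def spearman_rho(rank_a, rank_b):
    """Spearman rank correlation: rho = 1 - 6*sum(d^2) / (n*(n^2 - 1))."""
    n = len(rank_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Hypothetical ranks for five home-page elements (1 = most common/important):
observed = [1, 2, 3, 4, 5]      # rank by occurrence on actual pages
preferred = [1, 3, 2, 4, 5]     # rank assigned by study participants
print(f"rho = {spearman_rho(observed, preferred):.2f}")   # -> rho = 0.90
```

A rho near 1 would indicate, as the study found, that what appears on home pages closely matches what users expect to find there.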

The second phase of the research was to ascertain how fast a genre can be created, since it is commonly thought that such an event is extremely slow to develop. Dillon and his students created an experiment to observe user interactions with an "ideal", or traditional, and a "non-ideal" newspaper format. Over 20 videotaped sessions, they watched as the "non-ideal" users struggled with the changing patterns of the pages (a different one at each session), eventually became used to the "non-pattern", and learned how to cope with it. This learning took place relatively quickly. However, at first the "ideal" users spent more time reading (and potentially learning), while the "non-ideal" users spent more time returning to the home page to figure out where to go next, displaying disorientation from too much cognitive overhead in dealing with both the format and the information. This research has many implications for both media designers and educators.

The Federal government estimates that 15 percent of its employees will be telecommuting by the year 2003. That's a lot of people, and Claire McInerney studied four organizations to explore the potential impact on information services. Through a combination of surveys, phone interviews, and one on-site visit, she profiled both telecommuter (user) needs and Information Center staffs' perceptions of how they were serving these users. The results of this study were published as "Working in the Virtual Office: Providing Information and Knowledge to Remote Workers", Library and Information Science Research, Vol. 21 No. 1, 1999.

The telecommuters found that the biggest barrier to work satisfaction was the technology itself. The difference between home and work infrastructure was a definite detriment to productivity. Corporate firewalls were a problem, as was the fact that telecommuters might be working totally different hours than Information Center staff; one Center has addressed this by staying open until 9 p.m. and adding Saturday and Sunday hours. Interestingly, in a ranking of corporate department support, the library often ranked higher than the MIS department. The collaborative attitude of Information Center staff was also valued, especially by telecommuters working alone.

One expected finding was that the Information Centers have put as much information as possible on organizational intranets, but McInerney also found that telecommuters had no idea of the source of this valuable information. She stressed the importance of branding in information presentation. For more information on this and other research, consult her Web page at http://faculty-staff.ou.edu/M/Claire.R.McInerney-1/

Susan Fingerman is with SMF Information Services, Ellicott City, Maryland. smfinfo@home.com

Michele Saunders

Managing Web Environments as a Core Strategy

The session began with a presentation by Andy Breeding, manager of desktop services and the WebLibrary for Information and Research Services, Compaq Computer Corporation. The focus of his presentation was how managing his company's Web environment has changed as a result of the merger between Compaq and Digital. The WebLibrary is "Compaq's intranet source of information on IT markets and technologies", he said. It contains news, market research, and technical information. Within the Library site are what Breeding referred to as sub-sites: market-focused knowledge environments that concentrate on a particular business unit's interests or particular technologies. Information analysts customize content for each of these environments. Custom news pages allow users to select topics of interest and create queries that are profiled against news feeds and market research data and reports. Vendor content is integrated to create a consistent interface for browsing, searching, and profiling against the data. The Library is heavily used, averaging 2,300 user sessions per day.

As part of Digital, Information and Research Services was chartered as a centralized corporate service. Compaq has different views and does not believe in supporting many central corporate services, so some services once provided by Information and Research Services, such as walk-in reference centers and the Lending Library, have been phased out or moved to individual business units. Since management is committed to Web-enabling the company, the focus has shifted almost solely to the Web. A positive effect of this new environment has been the expansion of WebLibrary content. Information and Research Services also finds itself having to negotiate over what is allowable as a centralized service. Entering this new environment has also meant adopting new intranet brand and infrastructure standards, such as logos and color palettes, and contending with the fact that the intranet is owned by marketing but run by information management.

Breeding noted three main challenges that Information and Research Services is facing:

1) Marketing to the entire company.

2) Focusing on the Web without losing sight of other values.

3) Finding better ways to measure usage to validate expenditure.

Information and Research Services is employing several strategies to market its services. It is "working to be pervasive through link placement": Webmasters and business units throughout the company are actively encouraged to link to any relevant sections of the Library. E-mail is being used to increase awareness, making use of existing distribution lists and the Reader's Choice service, which selects mail recipients based on function within the company, geography, and indicated interests. A concise definition of the WebLibrary is being developed for incorporation into the Web site and marketing literature. Examples of how other online services have defined themselves include Dow Jones, which emphasizes the "customizable" nature of its site, and Lexis-Nexis, which highlights its "depth, breadth, and reliability". WebLibrary staff are working with marketing consultants to determine which value attributes they wish to highlight.

The results of usability testing are the basis for re-categorizing and labeling the site to align more closely with users' information-seeking behavior. Although the focus is on the Web, the circulating collection has been reinstated under a streamlined, low-cost service model. Options for better ways to track usage are also being examined. Increased granularity of usage information is needed to accurately assess content value. Breeding made the significant observation that usage and value are not equivalent. Some information may be used by only a few business units, but could be highly important to the functioning of those units.

Adding value to corporate intranets was the topic of Pam Klein's presentation. Klein is with the Information Technology Services section of Business Information Services for Procter & Gamble (P&G). Business Information Services is the library function of the company and supports P&G worldwide. Service areas include library research, competitive intelligence, corporate archives, records management, demographics, and technical intelligence. Currently there are 19 libraries worldwide. One responsibility of Business Information Services is to manage external content for the research and development intranet, which involves supporting approximately 120 databases from external sources, negotiating licenses, developing training materials, and conducting training sessions. Business Information Services also manages the Global Knowledge Catalog, P&G's catalog of intranet metadata. Indexing content, creating a corporate taxonomy or subject classification scheme, and maintaining the attributes that identify content (the metadata) are all part of managing the catalog. This aspect of her unit's work was the focus of Klein's presentation.

The current structure of the intranet sites mirrors the organizational structure of P&G, with information organized functionally. For example, research and development has its own intranet called innovation net. Once the much-publicized Organization 2005 structure is implemented, the company will be organized around the business units and the products. New intranets are already emerging, reflecting this structure. Hair net, which contains information related to hair care products, is an example. New intranets such as hair net are now trying to mine content from the older intranets. It is hoped that the restructuring of the company will help to shift the corporate culture from a need-to-know toward a need-to-share sensibility. P&G employees are beginning to make use of technologies for team space, collaborative filtering and other modes of sharing information.

Klein asserted that in the future we would be seeing more work being done with metadata and the tagging of content in intranets as opposed to the Web. Intranets, even though they may grow quite large, are bounded sites which lend themselves more readily to control mechanisms. Value can be added to intranets by exercising some control. Klein proposed several ways to accomplish this:

  • Thoughtfully constructed site information architecture.

  • The development of controlled vocabularies and classification schemes.

  • Cataloging and indexing content.

  • Organization into categories similar to what Yahoo! and About.com have done.

If these issues are properly addressed, users will be able to navigate quickly to desired content.

For the Global Knowledge Catalog a corporate taxonomy, now in a testing phase, has been created. Content will be indexed according to the taxonomy and presented to the user in a browsable format. Content can be registered by any content owner and is manually indexed by Business Information Services staff. Attempts were made to purchase a taxonomy, but users found the vocabulary awkward because it did not match existing terminology at P&G. There were some difficulties in building and implementing the taxonomy. Vocabulary differences between business units had to be reconciled. Convincing users of the value of registering their content requires an ongoing effort. What level of granularity to adopt is still an unresolved issue. Currently there is no standard and items can be indexed as individual documents, sites, or databases. Tools for automating taxonomy creation and the registering of content could help to alleviate some of these difficulties.
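Klein did not describe the catalog's mechanics, but a minimal sketch of content registration with manual taxonomy indexing might look like the following; the taxonomy terms, records, and function names are invented for illustration.

```python
# A tiny corporate taxonomy: term -> broader term (None marks a top term).
TAXONOMY = {"consumer research": None, "hair care": "consumer research"}

catalog = []   # the registered, indexed metadata records

def register(title, url, terms):
    """Content owners register items; indexers assign taxonomy terms."""
    unknown = [t for t in terms if t not in TAXONOMY]
    if unknown:
        raise ValueError(f"terms not in taxonomy: {unknown}")
    catalog.append({"title": title, "url": url, "terms": terms})

def browse(term):
    """Return registered items indexed under a given taxonomy term."""
    return [r for r in catalog if term in r["terms"]]

register("Conditioner trial results", "http://intranet.example/doc1",
         ["hair care"])
print(browse("hair care"))
```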

Klein's closing comments reflected on both her presentation and Breeding's. She stated that technology should not be allowed to drive the process. Each tool needs to be evaluated, based on the business need. She also noted that librarians are now being called upon to deal with both content management and information architecture. Both presentations provided good examples of matching information management decisions to corporate needs and philosophies.

Michele Saunders is PHISA Classroom Services Coordinator, University Library, University of Michigan, Ann Arbor, Michigan. loumich@umich.edu

Mae A. Al-Hajjaj

From the CEO's Point of View: Making Sure Your Library Is in Sync with the Company

The relationship between the CEO of an organization and its library is of critical importance, but cultivating it is easier said than done. Not many organizational libraries and information centers can boast an open door policy with their upper management. Things are different at Highsmith Inc., according to its president and CEO, Duncan Highsmith.

Highsmith began the session with a historical background on his company. Founded in 1956 by his father, Highsmith Inc. is a distributor of library equipment and currently has 300 employees. Concerned about blasé attitudes toward the educational process, Highsmith Inc. established retail stores called "Mind Sparks", which provide parents, teachers, and students with educational toys, equipment, and materials.

In 1988, Highsmith Inc. hired its first librarian with the expectation that the library would add value to the organization.

Throughout the session, the attendees, mostly librarians and information specialists, were rewarded with a global perspective on upper management's expectations of corporate libraries. Duncan Highsmith is an exception to the rule; he is certainly a man of vision. He set a Corporate Code of Ethics that included such ideal standards as "respect all people, promoting unity, trust, pride, and dedication to our mission" and "to achieve a high quality of work life through involvement of all our people in an environment of openness and fairness in which everyone is treated with dignity, honesty, and respect".

With these in mind, Highsmith developed a set of characteristics for information systems. Based on internally and externally generated information, the characteristics aim to synthesize data processing functions with the responsibilities and deliverables of libraries. Highsmith divides information systems into transactional, tactical, and strategic functions, stating that each function respectively gives the library a routine, adaptive, or innovative role. The implementation of each function depends on the accessibility of the information. For example, as data processing is a transactional function, the library's coverage of it is routine: it has direct access and covers casual inquiries, simple reference, and browsing.

On the other hand, Highsmith stated that management information systems (MIS), whose function is tactical, call for in-depth library services such as academic study, directed research, and complex reference. This is a more adaptive role than the transactional function.

The aim of this entire framework is to create knowledge.

The overall concept, stated Highsmith, was based on his ambitious theory, "Life, the Universe, and everything." Its aim is to "alert the management of Highsmith Inc. to trends that could affect the company's fortunes" (Buchanan, 1999).

After his philosophically charged introduction, Highsmith gave the floor to his librarian, Lisa Guedea Carreno. She began her remarks by stating that the mission of the Highsmith Corporate Library is "to provide timely, relevant information that supports and strengthens business decision making, operations, productivity, and the goals of the accountable organization".

Guedea Carreno stated that the success of her "library was due to the vision set by the CEO of Highsmith Inc. and the simplicity of its implementation". She added that the role of the Library was not just to provide objective information, but to participate in the decision process and be part of the corporate culture. The intention is to have the strategic unity required to strengthen the organization, so that everyone knows and understands their boundaries.

Following the session, a number of questions and comments were raised; probably the most significant was a bank librarian's comment that Guedea Carreno was "so lucky to have a manager like Duncan!"

Reference

Buchanan, L. (1999), "The Smartest Little Company in America", Inc. Magazine, pp. 42-54. http://www.inc.com/incmagazine/archives/01990421.html

Mae A. Al-Hajjaj is Head, Center for Economic and Financial Information Resources (CEFIR), Economic Research Department, Central Bank of Kuwait, Safat, Kuwait. mae@cbk.gov.kw

Karen Bleakley

Hot Technologies: The Future Is Now

Within minutes of the session starting, it was clear that this would become an annual event. The room overflowed with people anxious to learn what's coming down the pipe in the world of technological innovation. Three speakers made their pronouncements: Michael Wilens of West Group, James King of Lexis-Nexis, and Hope Tillman, who filled in for Jayant Neogi of Norsham Technologies, whose technology is so hot he had to miss the conference in favour of a business opportunity.

All three speakers gave their views of which specific technologies are hot now and which will be shortly. The Norsham Technologies presentation spoke directly of the company's latest innovations in mass storage, a perennial hot technology, as the demand for expansion seems never to end. The other two presenters spoke of hot technology trends: general paradigm shifts that they are seeing within their own companies and within the information industry at large.

Jim King started off the session with a look at technologies that are hot now, or have at least been warm for some time, including network speed and bandwidth advances, distributed search, device integration, and streaming video, among others. King then described technologies now becoming hot, such as new human-computer interaction mechanisms (speech-based facilities, task-based integration), greater intelligence in search and retrieval, and superdistribution for commerce. Looking five years into the future, King suggested that information itself will truly be considered a currency and that personal communication devices (possibly embedded) will provide greater access to information. Key changes for the information industry, according to King, are metadata extraction and new forms of indexing, abstracting, and other content management technologies. King concluded by arguing that technology fads are ever-present but are nevertheless based on true value and produce tremendous results that are not always apparent: artificial intelligence, object technologies, client-server computing, push technology, knowledge management.

Michael Wilens started out by looking at technology's "track record" thus far, with the "top five" predictions that were somewhat off the mark. No. 5: digital cash replaces real cash; No. 4: AOL bites the dust; No. 3: network PCs inherit the earth; No. 2: Internet usage grows quickly; and No. 1: push technology wins. Wilens explained that when trying to predict the future, especially in the world of technology, one must look at three components: content, business models, and the technology itself. Content and technology are constantly changing and, for the most part, improving. The business models must also evolve rapidly enough to take advantage of technological and content changes. Wilens gave examples of migrating business models to watch in the future: intermediaries/aggregators such as Catalog City; supermediaries such as CD-NOW; metamediaries such as MSN Home Advisor; indimediaries such as Motley Fool; and infomediaries such as DoubleClick. Wilens also spoke of the technology life cycle, in which technology enthusiasts and visionaries make up the early market for technology products while pragmatists, conservatives, and skeptics make up the mainstream market. Wilens maintains there is a chasm in the life cycle between visionaries and pragmatists that sometimes results in difficulties in getting technology products into a company, or getting them adopted once introduced. Wilens then presented his two "really big deal" technologies: in-memory search and end-user communications.

Karen Bleakley is Manager of Knowledge Services for PricewaterhouseCoopers, Montreal. Karen.Bleakley@ca.pwcglobal.com

Beth Hanson

Overview of Data Mining and Its Importance to Librarians

Carolyn J. Crouch, Associate Professor, Department of Computer Science, University of Minnesota-Duluth, spoke on the historical perspective of data mining, gave definitions, and outlined problems, processes, and challenges in the field. She reviewed existing systems and completed her presentation with the impact of data mining on society. Crouch's talk was informative, concise, and left one with food for thought.

The explosive growth of databases has overwhelmed traditional methods of data analysis. Current databases include computerized transactions, such as credit card purchases and income tax returns, and health care records; Wal-Mart, for example, performs 20 million transactions per day. Earlier data management tools are now inadequate. Crouch stated that a new generation of decision support tool (the data warehouse) is needed, along with a method of analyzing the data automatically (data mining) to discover the useful information in it (knowledge discovery in databases, or KDD).

Four tasks need to be completed to move data from many different databases into a data warehouse (a minimal code sketch follows the list):

1) Extract data in many different formats from many different databases.

2) Transform this data into a format suitable for decision support.

3) Clean the data: in this step, errors are removed and missing values are identified.

4) Integrate the remaining data into a central warehouse.
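
To make the four tasks concrete, here is a minimal, hypothetical sketch of such a pipeline in Python. The source formats, field names, and cleaning rule are my own invention for illustration; they are not from Crouch's talk.

    import csv
    import io

    # Hypothetical source extracts: two systems exporting the same kind of
    # record (customer transactions) in different formats.
    SALES_CSV = "cust_id,amount_usd\n101,250.00\n102,\n103,99.50\n"
    LEGACY_CSV = "customer;total_cents\n104;12000\n105;4500\n"

    def extract(text, delimiter):
        # 1) Extract rows from a source in its native format.
        return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))

    def transform(rows, id_field, amount_field, cents=False):
        # 2) Map each source's fields onto one decision-support schema.
        unified = []
        for row in rows:
            raw = row[amount_field]
            amount = float(raw) / (100 if cents else 1) if raw else None
            unified.append({"customer": row[id_field], "amount": amount})
        return unified

    def clean(rows):
        # 3) Drop rows with missing values or obvious errors (negative amounts).
        return [r for r in rows if r["amount"] is not None and r["amount"] >= 0]

    def integrate(*sources):
        # 4) Load the cleaned, unified rows into the central warehouse.
        warehouse = []
        for source in sources:
            warehouse.extend(source)
        return warehouse

    sales = clean(transform(extract(SALES_CSV, ","), "cust_id", "amount_usd"))
    legacy = clean(transform(extract(LEGACY_CSV, ";"), "customer", "total_cents",
                             cents=True))
    print(integrate(sales, legacy))

In a production warehouse each step would of course be far more elaborate, with dozens of sources, richer schemas, and audited cleaning rules, but the division of labor is the same.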

For data mining to work, there needs to be knowledge and understanding of the data.

Crouch suggests that rather than devising and testing hypotheses against the evidence, data mining reverses the process. Thus, given a collection of data, it asks, "What hypotheses are supported by the data?" Data mining relies on methods for finding patterns in data. Crouch defines these methods as follows:

  • Classification maps a data item into a set of predefined classes.

  • Clustering places an item into a set of classes determined by the data; in other words, it produces natural groupings of data items based on their similarity to each other (see the sketch after this list).

  • Summarization provides a summary of the data.

  • Dependency modeling describes dependencies between items.

  • Link analysis determines relations between items.

  • Sequence analysis models sequential patterns to find deviations and trends over time.
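
As an illustration of the clustering task, the following bare-bones k-means sketch in Python derives two natural groupings from a handful of two-dimensional points. The data and parameters are invented for illustration and were not part of Crouch's presentation.

    import random

    def kmeans(points, k, iterations=20, seed=0):
        # Derive k "natural" groups from the data itself, rather than
        # mapping items into predefined classes.
        random.seed(seed)
        centers = random.sample(points, k)
        clusters = []
        for _ in range(iterations):
            # Assignment step: attach each point to its nearest center.
            clusters = [[] for _ in range(k)]
            for p in points:
                nearest = min(range(k),
                              key=lambda i: (p[0] - centers[i][0]) ** 2 +
                                            (p[1] - centers[i][1]) ** 2)
                clusters[nearest].append(p)
            # Update step: move each center to the mean of its cluster.
            for i, members in enumerate(clusters):
                if members:
                    centers[i] = (sum(p[0] for p in members) / len(members),
                                  sum(p[1] for p in members) / len(members))
        return centers, clusters

    # Two visually obvious groupings of (x, y) readings.
    data = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),
            (8.0, 8.0), (8.3, 7.9), (7.8, 8.2)]
    centers, clusters = kmeans(data, k=2)
    print(centers)  # one center near (1, 1), the other near (8, 8)

Classification would look similar in outline, except that the target classes would be fixed in advance rather than discovered from the data.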

The problem, as Crouch noted, is that the choice of a particular algorithm for a particular application is something of an art.

There are many challenges in managing data: massive data sets usually have high dimensionality; data and knowledge change constantly; data may be missing; systems integrate poorly or not at all; user interaction and prior knowledge vary; and data may be nonstandard or multimedia.

Crouch mentioned several existing systems as case studies. Under marketing there is the Management Discovery Tool system developed by AT&T and NCR. Lucent developed NicheWorks. For financial investment there are the Fidelity Stock Selector and LBS Capital Management. For fraud detection, Crouch listed the FALCON fraud assessment system, the Financial Crimes Enforcement Network AI System, and a system for detecting international calling card fraud. CASSIOPEE, by GE, is a well-known manufacturing and production system. The University of Helsinki is working on a large telecommunications network management system. As you would expect, there are many science systems; examples include POSS-II (the second Palomar Observatory Sky Survey), the Human Genome Project, Quakefinder, and KEFIR, a health care system.

KDD is fairly new; the first conference was held in 1989. What is the impact of KDD on us? An Equifax survey in 1994 found that 76 percent of US citizens feel they have lost all control over their personal information. A survey conducted this year would certainly raise that percentage. The only federal protection we have is the Privacy Act of 1974, which pertains to information collected on our exercise of First Amendment rights. Items not protected by federal law include medical records, insurance, credit card transactions, real estate, phone bills, criminal records, and most state and federal government records. Some credit data, educational records, and cable and video rental records are protected.

Currently over 200 information superbureaus compile, collate, and sell this unprotected information. Crouch notes that "you" provide the information by filling out forms and by answering telemarketers. Today there is no way to know all the places one's personal data appears. The risks are many. Not only can this information be used for purposes other than that for which it was originally collected, but also it can be incorrect or incomplete. The Internet will accelerate this process.

This talk was one of my favorites at SLA '99. Not only was it informative, but I considered Carolyn Crouch's remarks long after I left the conference. Her presentation is available online at http://www.d.umn.edu/~ccrouch/DM-ppt/index.htm

Emerging Technologies Roundtable Breakfast

SLA's Legal Division hosted this breakfast. There were two speakers, Denis Hauptly and Nathan Rosen. A third speaker, Nancy Lemon, was unable to attend the breakfast.

Denis Hauptly is Vice President of Technology Development, West Online. He started using online systems in 1971. His first system was OBAR, which later became Lexis. Hauptly stated that online systems were virtually unchanged from 1971 until 1997, when hypertext links appeared.

Hauptly believes the future will bring new applications in three areas. With Internet technology, we are now managing information; the relevant question is what can be done creatively with it. Wireless, Hauptly said, is a technology in search of a use case, one that breaks the mold of how data is used; he does not yet know what that use case will be. Broadband access will change the face of technology. All three technologies, Internet, wireless, and broadband access, will change the way we look at information in the next 12 to 18 months. As Hauptly said, "You can find anything on the Internet and you can't find anything on the Internet".

People perform research differently, so searching for information should be under the user's control and dictated by the user's comfort. In the near future, a system will go out to find discrete pieces of information and compile them in one place. According to Hauptly, the trick will not be in the technology, but rather in how the information is designed.

Nathan Rosen is Vice President of the Legal and Compliance Department, Credit Suisse First Boston. He defines emerging technology as that which is "coming into view" or "coming to fruition".

Rosen listed several trends he has observed. In the corporate environment there is a shift from WordPerfect to Word; a show of hands indicated that approximately half of those present had moved to Word, and of that number, a third did not like the move. There is also a shift from Netscape to Explorer, though by a show of hands most indicated that they keep both browsers loaded on their computers. The corporate world seems to be setting rules about what software to use.

Most people in the room read e-journals, but far fewer like them. Neil Gershenfeld, director of the physical sciences program at the MIT Media Lab, gave an interesting talk on Monday, "The News Industry in the New Millennium", in which he mentioned e-books and electronic ink. His Web page is www.media.mit.edu/physics

Rosen said there will be an increase in laptop use; laptops will be carried by the user and docked at work so that large monitors can be used. There will be an increase in e-mail security, with PGP (Pretty Good Privacy) and public-key encryption. Voice recognition software is getting very close to working. There will be more outsourcing. Companies will move to extranets as they handle customer service and send electronic billing to clients. As bandwidth expands there will be more personal videoconferencing. Telecommuting will increase.

There are four stages to new technology. First there is hype about what the technology will do. Disillusionment and problems follow. Realism sets in as the technology begins to work and builds a user base. Finally, productivity arrives as the technology is integrated with other tools. This process does not mean that new technology should be ignored. Rosen quoted Ms Frizzle: "Take chances, make mistakes, get dirty."

As Nancy Lemon was unable to attend, two of her articles were mentioned. They are:

  • "Managing demand: a strategy for the future" (1998), Marketing Library Services, March.

  • "Climbing the value chain ­ a case study in rethinking the corporate library function" (1996), Online, November/December. http://www.onlineinc.com/onlinemag

A general discussion ensued. A number of people are providing electronic routing. Rosen said he started by scanning pictures of family members for company employees and making screen savers of them; once people were comfortable with scanning, he began scanning documents. Others send tables of contents, people mark the articles they want, and those documents are delivered to the desktop. The electronic version of BNA was also discussed: some liked it, others did not. Many felt the price was too high; others were disappointed that they could no longer photocopy tables of contents and highlights. The breakfast was well received by all who attended.

Beth Hanson is Director of Virginia Technical Information Center, Virginia Tech, Blacksburg, Virginia. bfhanson@vt.edu

Howard Stephen McMinn

SLA Science and Technology Division's 75th Anniversary Speaker

The Science and Technology Division's 75th Anniversary speaker was Kathryn D. Sullivan, President and Chief Executive Officer of COSI, Ohio's Center of Science and Industry. Essentially, COSI is an interactive science museum with several locations in Ohio. Kathryn Sullivan is best known, however, as a three-time Space Shuttle astronaut and the first American woman to perform a spacewalk.

The session was unusual for a library conference, as there was hardly any discussion of library or information science issues. Although her talk contained little that related directly to libraries, librarians, or the management of information, it was inspiring in reminding us how far we have come in the twentieth century in terms of scientific advancement and technological achievement. Her talk also demonstrated how our understanding of the world in which we live has changed as a result of these advances. This inspiring talk played right into the Science and Technology Division's 75th Anniversary theme, "Looking Forward and Looking Back".

The session could have been viewed as a synopsis of her achievements, as it outlined her fascinating career. It was much more than that, however; it was an inspiring snapshot of some of the technological and scientific changes that have shaped, and will continue to shape, the future. Sullivan herself likened the presentation to showing the best of her vacation photos, and since the presentation was essentially a slideshow, this has some validity. Oh, but what photos... The best were from the various Space Shuttle missions in which she participated, accompanied by a discussion of the key elements of the missions. It was the frank and sometimes humorous observations and anecdotes accompanying the photos that really made the presentation. The informal, down-to-earth talk traced her career from her beginnings as a scientist working on deep-ocean geology to her career shift into the space program. There were photos of both the inside and outside workings of the shuttle missions (with some humorous shots of sleeping and playing in space). Some of the most spectacular pictures were those of the earth from outer space. The personal insight into these missions and activities almost made one feel as if he or she were there. Her activities with the space program included the Hubble Space Telescope deployment mission in 1990 and the ATLAS-1 Spacelab flight in 1992. She performed her first spacewalk, the first by an American woman, in 1984.

When her activities with the space program were complete, Sullivan's remarkable career continued with a return to her first love, the sea, as Chief Scientist of NOAA, the National Oceanic and Atmospheric Administration. More incredible slides and commentary, this time of unseen reaches of the ocean floor at remarkable depths, highlighted her discussion of some of the major activities undertaken by the organization during her tenure. In discussing the accomplishments of deep-sea exploration, she highlighted how much things had changed over the short course of her lifetime, including how humankind's view of space and of our planet had changed through the work of scientists like herself. This was illustrated by the discovery of living organisms, both plant and animal, in deep ocean regions far below the depths that sunlight reaches. Until these discoveries, light was deemed essential to the support of life in the oceans. The existence of life in these sub-regions of the planet, surviving on geothermal gases where the temperature gradients are unimaginable, has opened up a whole new view of the elements essential to supporting life. Her presentation was followed by a brief question and answer period, which included commentary on the information needs of scientists as well as plenty of questions that started with "What was it like ...". Overall, everyone I talked to left the session inspired and upbeat, as well as a little awed by Kathryn Sullivan and her accomplishments.

Howard Stephen McMinn is Assistant Director, Science and Engineering Library, Wayne State University, Detroit, Michigan. He is past chair, Science and Technology Division, Special Libraries Association. h_s_mcminn@wayne.edu

Susan S. DiMattia

Creating Value: The Year Ahead For SLA

Special Libraries Association (SLA), the premier international organization for information professionals, has several challenges in the 1999-2000 year. Individual members also have their work cut out for them. The transition from the old decade to the new, coupled with the dawn of a new millennium, form the perfect impetus for meeting the challenges head on.

The SLA Board of Directors has spent a considerable amount of time over the past two years taking a critical look at the governance structure of the association. Everything, from committees and their charges to bylaws and the name of the association, has been scrutinized, with significant member involvement. That review will continue in the coming year. Although change is always difficult to propose, implement, and accept, the Board's goal is to make SLA:

1) able to react in a timely manner to rapidly emerging opportunities; and

2) responsive to member needs, expectations, and concerns.

In an effort to mesh member needs with association objectives, a variety of member task forces will be at work during the year. One will study implementation of a merger of the Government Relations Committee and the Copyright Committee to encompass all of the traditional concerns, in addition to broader intellectual property issues, within SLA's increasingly global emphasis. The International Relations Committee will examine its charge in relation to the global issues that are part of nearly every avenue in SLA. The Affirmative Action Committee has requested a name change to the Diversity Committee and will be exploring ways to broaden and strengthen its partnerships with, and influence in, all of the units of SLA. An expanded Networking Committee has been given a one-year charge to survey member attitudes toward proposed changes in SLA programs and services made possible through enhanced technology. The objective is to make certain that the "virtualization" of SLA is a benefit, not a hindrance, to members.

My presidential theme is "Creating the Value Proposition." Through it a training opportunity will be offered at the June 2000 Annual Conference in Philadelphia. A companion toolkit will be created to assist individuals in establishing the value they bring to their workplace. Increasingly, special librarians and information professionals are challenged to defend their value in an Internet-enhanced world. "Why do I need a library when I have the Internet on my desk, where I can get everything I want and it's all free?" Versions of that question have frustrated thousands of SLA members and other information professionals for the past two or three years. It is time for individual members of SLA to take matters into their own hands to create the value proposition for their services. SLA will continue to do a fine public relations job on behalf of the profession at large, but individual minds will only be changed through one-on-one communication. The training and toolkit will offer tips on how to formulate a message that will be meaningful in a specific setting and how to communicate it effectively and often, until the value proposition for libraries and librarians is firmly planted in the minds of administrators and purse-string holders.

The SLA document Competencies for Special Librarians in the 21st Century has had a tremendous impact since it was published in 1996. In addition to mastering the points it outlines, special librarians should master all of the "7Cs": Competencies, Communication, Creativity, Correlation, Culture, Cheering, and Chutzpa. Mastering competencies without the skill to communicate them will not create the value proposition. Creativity will aid in identifying new, unexpected ways of approaching the issues to be faced. Correlation is the art of making connections and building interdependence, essential to creating and communicating a value proposition. Culture makes SLA unique. Concentrating on the culture that defines what special librarians are, and sharing personal points of view will enhance the tools for success. By cheering each other on and celebrating small victories, we learn from each other and encourage everyone to continue the fight to prove the value of the special librarian/information professional. Having the nerve to stand up and say, emphatically, "I have value," takes chutzpa. That takes practice, but it can be learned. Mastering all of the "7Cs" will take active participation by all SLA members, but the end result will be a stronger sense of the value special librarians and information professionals bring to the success of their organizations.

Susan S. DiMattia is Editor, Library Hotline and Corporate Library Update, and President, 1999-2000, Special Libraries Association, New York, New York. sdimattia@cahners.com

©Susan S. DiMattia.

L. Susan Hayes, 1998-1999 President

A View of SLA from the Top

The Annual Conference of the Special Libraries Association was an upbeat event this year. When I reported on the conference last year in this space, I reported the same thing, so I think we have a trend in the works. Special librarians received some very positive and far-reaching publicity in the business community during the past year, and I think this contributed to our high spirits as we arrived in Minneapolis. The business world seems to finally be valuing the work that we do. This meant that during the conference, we could focus on improving our skills. In addition, several session presenters were top-level officers of corporations who reinforced the need for aligning with the strategic direction of our employers.

Programs dealing with technology were well attended. We made a special effort to have several sessions that were booked at the last minute, in order to present late-breaking technological developments. As would be expected for the last conference of the 1900s, several speakers spoke about future developments. We will see how accurately and how rapidly their ideas come to fruition.

In contrast to the technology-based content were the speakers who emphasized the need for the human touch. It was almost as if this had been posted as a conference theme. Librarians' skills as keepers and communicators of "stories", knowledge presented in anecdotal form, were repeatedly encouraged. Packaged along with our evaluation and analysis abilities, these soft skills are gaining in value. People prefer to put their trust in other people more than in machines.

A focus on working in a global economy continues to be emphasized, both in discussions about our daily work and in discussions about the functioning of the association. SLA has members in 60 countries, and we want to improve our service to all our existing members as well as recruit new ones. Our first non-North American board of directors meeting will be held in the UK in October 2000, along with an international conference on special librarianship called Global 2000 (details on the Web page http://www.sla.org). In addition to our usual contingent of international attendees, we were joined this year by two dozen librarians from the US Information Agency. A new service that made its debut at the conference is the members-only section of our Web page, with its Who's Who membership directory. Because records can now be updated continuously throughout the year, members will have more accurate contact information for our all-important networking communications.

If we measure a conference by its numbers, this one was a success: attendance was up over last year and the number of exhibitors was at an all-time high (6,000 and 550, respectively). But I think another good measure is that intangible feeling one gets, not only during the programs but also in informal settings such as hallways and hospitality suites. Besides, the members took the time to tell me it was a good conference! I congratulate all those responsible for putting it together. From my viewpoint as president of SLA, it was an honor to be amongst the knowledge leaders of the information future.

L. Susan Hayes is Construction Project Manager, Nova Southeastern University, Ft Lauderdale, Florida. She is the Outgoing President of SLA. suzi_hayes@prodigy.net
