Citation
Mihram, D. (2000), "Evaluating Performance: Strategies for a Technology-Enhanced Learning Environment", Library Hi Tech News, Vol. 17 No. 1. https://doi.org/10.1108/lhtn.2000.23917aac.003
Publisher
Emerald Group Publishing Limited
Copyright © 2000, MCB UP Limited
Evaluating Performance: Strategies for a Technology-Enhanced Learning Environment
Danielle Mihram
This ambitious seminar (or, more descriptively, "workshop") dealt with two of the several issues associated with using technology enhancements to "add value" to higher education. The first was a review of strategies for using technology to help students employ their personal learning styles and learn how to use new ones. The second was an array of strategies for evaluating student performance in interactive learning environments.
The presenters, Karen L. Smith, Director, and Bonnie Warren, Assistant Director, Faculty Center for Teaching and Learning, University of Central Florida, assembled an impressive amount of information into a 58-page handbook (based on their research: http://reach.ucf.edu/~fctl/research/research.html#Model), which was distributed to each seminar participant. The seminar was divided into four major topics:
1. Who are the students? How do we meet their needs? (Data on individual learning styles; the "Learning Styles Inventory"; learning style insights.)
2. What technologies help individuals learn and perform?
3. Measuring performance and learning: What do we want them to do? How do we know they did it?
4. Putting it all together: helping faculty plan and create an interactive learning environment.
Because of the sheer mass of information gathered and planned for presentation, and because the seminar took the form of group "brainstorming" activities, the entire session proved rather chaotic at times. The seminar leaders had indeed pointed out that learning styles vary widely, ranging from "accommodators" to "convergers," and that, ideally, a teacher should vary his or her class presentation to address each of the following at some point during class:
- Accommodators, or dynamic learners (best learning style: opportunities for new solutions; interviews; mind-maps; hands-on work; group work).
- Assimilators, or analytic learners (best learning style: pros and cons; biases/critical thinking; identifying problems; researching).
- Divergers, or imaginative learners (best learning style: puzzling situations; role plays; brainstorming).
- Convergers, or common-sense learners (best learning style: reactions to real-world situations; experiments; solving problems).
One would have to conclude that the two presenters favored the "accommodators'" and "divergers'" learning styles, since much of the seminar took the form of group discussions or brainstorming aimed at determining which activities and options (together with their supporting technologies, for which extensive lists are provided in the handbook) would best suit the four learning styles.
Two technology models were suggested:
1. WebCT, a course management system, which includes several types of tools: a glossary, a discussion forum, a chat tool, E-mail, a tutorial, tool pages, URL references, and a calendar; and
2. MS NetMeeting, a conferencing tool (free from Microsoft at http://www.microsoft.com/netmeeting), which includes a Windows multimedia collaboration tool, a whiteboard program, a chat tool, Internet phone/audio conferencing, as well as application sharing and file transfer capabilities.
This writer (who discovered that she falls into the "assimilators" category) cannot quite determine whether it was all the brainstorming or the various group activities that led to the seminar's rushed conclusion. No time was left, for example, for one fundamental question to be asked or discussed: among this array of technological options, which tools are being used to measure effectively, and to assess convincingly, students' learning outcomes? Of course, this might be "Part II" of such a seminar, one that should be offered in the future.
Shining a Flashlight on Teaching, Learning, and Technology
This "task-centered" workshop aimed primarily at introducing participants to a step-by-step process for analyzing technology applications in education, assessing performance, and modeling resource use. This is the approach that the Flashlight Program offers in its training and consulting services.
The Flashlight Project began in 1993 with the support of several funding sources: the Annenberg/CPB Project, the Fund for the Improvement of Postsecondary Education (FIPSE), the American Association for Higher Education (AAHE), the Andrew W. Mellon Foundation, corporate sponsors, and other benefactors.
Stephen C. Ehrmann, Director of the Flashlight Program at the TLT Group, stressed at the very outset that asking the right question about a technology is an essential aspect of any successful evaluation of educational technologies. He noted that we too often hear the complaint, "Too bad that there has never been a study that definitively shows how much computers help learning". Replace the word "computers" with "books", and it quickly becomes clear that neither version of the statement means much: how the technology is used is what matters.
The Flashlight method re-aligns evaluators' thinking by focusing on the clear goals to be attained when technology is used. Examples include:
- Better learning: use the same tests as before, but examine the role of instructional technology (IT) in improving outcomes.
- Better goals (i.e. "new" goals): e.g. teach differently, such as teaching students to compose computer-generated music; compare and contrast tests, scores, and syllabuses.
- Access to education.
- Efficiency at the "micro" and "macro" levels of instructional technology.
- Attracting faculty.
- The fear of losing face or place.
Flashlight also offers tool kits to help experts and novices design their own evaluations.
An example of a successful Flashlight project was described by Patricia Harvey, who designed a pilot project for teaching English composition at Mount Royal College, Canada (http://www.mtroyal.ab.ca/olt). The consultants at Flashlight helped her frame the pertinent questions, and technology training was made part of the student learning goals.
Ehrmann's well-prepared, well-organized workshop was a cornucopia of information, anecdotes, tips, and just "good old common sense". For a sampling of the contents of his presentation, see "Studying teaching, learning and technology: a tool kit from the Flashlight Program", Active Learning IX (December 1998), pp. 38-42; the full text is available online at http://www.cti.ac.uk/publ/actlea/al9.html#contents.
Seven of the eight studies in the bibliography at the end of this article are available in full text at the URLs noted for each citation.
Flashlight also produces a free monthly newsletter about its programs and related research. To subscribe, send E-mail to listproc@listproc.wsu.edu with the message "SUBSCRIBE F-LIGHT yourname".
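For readers who would rather script the subscription than send it by hand, a minimal sketch in Python follows. It assumes access to an outgoing SMTP relay; the server name and sender address shown are illustrative placeholders, not part of the Flashlight instructions, and only the list address and message body come from the newsletter information above.

```python
# Minimal sketch: sending the listproc subscription message programmatically.
# The SMTP host and sender address below are placeholders; substitute your own.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "yourname@example.edu"            # placeholder sender address
msg["To"] = "listproc@listproc.wsu.edu"         # list server address from the article
msg["Subject"] = "subscribe"                    # listproc reads the body, not the subject
msg.set_content("SUBSCRIBE F-LIGHT yourname")   # replace "yourname" with your own name

with smtplib.SMTP("smtp.example.edu") as server:  # placeholder outgoing mail server
    server.send_message(msg)
```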
To learn about Flashlight products and services, including recent reports from users, see http://www.tltgroup.org/programs/flashlight.html
To arrange a workshop or talk, obtain an external evaluation, or chat about evaluation, e-mail Flashlight@tltgroup.org. To contact Ehrmann directly, e-mail ehrmann@tltgroup.org.
Assessing the Impact on Students of Online Materials in University Courses
The speakers, Nicholas C. Laudato, Associate Director, Instructional Technology, and Joanne M. Nicoll, Associate Director, Instructional Development and Distance Education, University of Pittsburgh, presented the results of a pilot project undertaken at the University of Pittsburgh to assess the impact of online materials used to enhance courses. They chose "Courseinfo", an "Integrated Software Suite for Unix or NT Web servers", for several reasons: its user interface, the ability to create effective materials with no knowledge of HTML, its adherence to standards (IMS, SQL, browsers, plug-ins), and its vendor support staff.
The pilot project concentrated on large classes in the sciences, and Courseinfo was adapted to suit the university's needs: a "Pitt Web Course Template" was created, and faculty training was implemented within Courseinfo. Support services were planned in advance: classroom technology (computer projection), student computer labs (with access to browsers and plug-ins), a 24-hour Help Desk (whose analysts were trained in Courseinfo), and Web server operations, including tuning and backup.
Twenty faculty members, 22 course sections, and 1,850 students participated in this pilot project in the Fall 1998 term. Evaluations included student and faculty surveys, Web server and Help Desk statistics, and anecdotal data. The success of this initial pilot has led to significant growth in the use of Courseinfo: by Fall 1999, 289 instructors and 9,840 students were using it in 350 sections of science courses.
For more information on this project, contact Joanne Nicoll (nicoll+@pitt.edu or nicoll@cidde.pitt.edu; (412) 624-3335) or Nick Laudato (laudato+@pitt.edu or laudato@imap.pitt.edu; (412) 624-3335).
CNI Update: Assessment of Technology in Teaching and Learning
The Coalition for Networked Information (CNI) was founded in 1990 by the Association of Research Libraries (ARL), CAUSE, and Educom. Paul Evan Peters was the founding Executive Director of the Coalition and served until his untimely death in 1996. Joan Lippincott, now Associate Executive Director, served as Interim Executive Director until the appointment of Clifford A. Lynch as the new Executive Director in 1997. The Coalition is now supported by a task force of about 200 dues-paying member institutions representing higher education, publishing, network and telecommunications, information technology, and libraries and library organizations.
The purpose of this session was to provide an update on the activities of the Coalition for "those who are not engaged in these activities", noted Lynch in his opening remarks. Much of the overview presented at the session is contained in the booklet Program 1998-1999, which was available to those who attended. The activities of CNI are grouped into three major "Themes", and both Lynch and Lippincott provided the audience with an update on the status of the initiatives listed below:
- Developing networked information content. Initiatives under way include: licensing museum information; the Networked Digital Library of Theses and Dissertations (NDLTD); moving metadata beyond descriptive information to support resource discovery (the Dublin Core descriptive metadata initiative; the Internet Scout Project, co-sponsored by the National Science Foundation); digital preservation; and scholarly communication.
- Transforming organizations, professions, and individuals. By means of collaborative workshops and discussions, CNI fosters the sharing of new strategies and helps shape policies and best practices; in collaboration with the EDUCAUSE National Learning Infrastructure Initiative (NLII), CNI will also work with both ARL and EDUCAUSE to explore institutional readiness factors and organizational roles to support distance education and digital instructional media.
- Building technology, standards, and infrastructure: authentication and authorization; identifiers for digital information; Z39.50; instructional management systems and libraries; and Internet2.
For detailed information on all of these initiatives, go to http://www.cni.org
Danielle Mihram is Assistant Dean for the Leavey Library and Director of the Center for Excellence in Teaching, University of Southern California, Los Angeles. E-mail: dmihram@calvin.usc.edu; Web: http://www.usc.edu/cet/