Welfare robotics

Kybernetes

ISSN: 0368-492X

Article publication date: 1 December 2000


Citation

Andrew, A.M. (2000), "Welfare robotics", Kybernetes, Vol. 29 No. 9/10. https://doi.org/10.1108/k.2000.06729iag.003

Publisher: Emerald Group Publishing Limited

Copyright © 2000, MCB UP Limited


Welfare robotics

The usefulness of computers able to judge the emotional state of their users is pointed out by Hashimoto et al. (1999) with reference to robotic devices meant to assist infirm or disabled users. This is a specialised application of the "personal robot" principle, particularly useful where there is difficulty with conventional means of communication, but of general applicability in providing truly user-friendly and sympathetic interfaces.

A project within IBM to investigate ways of making computers responsive to the emotions of their users is described by Davidson (1999). Ways of sensing emotional states electronically are well known, both to experimental psychologists and, notoriously, through their application in "lie detectors". Since the latter respond to a variety of physiological variables, the equipment is termed a "polygraph".

The input to a polygraph normally comes from electrodes and other gadgets attached to the subject and, although these need not be particularly cumbersome, an ordinary computer user would not want to be bothered with them. Less obtrusive ways of collecting data are being considered, the least obtrusive being voice analysis and automatic recognition of facial expression. Other possibilities are the monitoring of temperature and possibly the "psycho-galvanic reflex" (electrical changes in the skin associated with emotion) by sensors disguised as jewellery, or incorporated in the mouse.
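Purely as an illustration of the kind of data fusion such unobtrusive sensing implies (and not a description of the IBM work or of any real sensor interface), a minimal sketch might look like the following; all channel names, weights and thresholds are assumptions invented for the example:

```python
# Hypothetical sketch: combining unobtrusive measurements into a crude
# estimate of the user's arousal level. Channel names, weights and
# scaling are invented for illustration only.

def estimate_arousal(voice_pitch_var, face_tension, skin_temp_delta, skin_conductance):
    """Return a value in [0, 1]: 0 = calm, 1 = highly aroused.

    voice_pitch_var   -- normalised variance of voice pitch (0..1)
    face_tension      -- score from facial-expression recognition (0..1)
    skin_temp_delta   -- normalised deviation from baseline temperature (0..1)
    skin_conductance  -- normalised psycho-galvanic response (0..1)
    """
    # Weighted average; the weights are arbitrary placeholders.
    weights = {"voice": 0.35, "face": 0.35, "temp": 0.10, "gsr": 0.20}
    score = (weights["voice"] * voice_pitch_var
             + weights["face"] * face_tension
             + weights["temp"] * skin_temp_delta
             + weights["gsr"] * skin_conductance)
    return max(0.0, min(1.0, score))


if __name__ == "__main__":
    # Example reading: slightly raised pitch variance and skin conductance.
    print(estimate_arousal(0.6, 0.3, 0.1, 0.5))
```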

Assuming that the computer has obtained data bearing on the user's emotional state, there are many problems in deciding how such information can be used. The system might switch on soothing music, or in some way alter its responses, giving either more guidance or less if the data suggest which the user would prefer. It could also suggest that the user take a break. Experiments have been made with an artificial representation of a face, allowing the computer system to respond with indications of emotions to be attributed to it. As acknowledged in Davidson's article, emotional computing requires very careful handling and can be intolerable if badly planned. It is easy to imagine an infuriating situation where a computer system can be seen to be making wrong assumptions about the user's feelings.
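Again purely as an illustration of the kind of decision logic involved, and not a description of any actual system, a minimal rule-based sketch is given below; the state labels, thresholds and actions are all assumptions, and the first rule reflects the caution above about acting on doubtful inferences:

```python
# Hypothetical sketch of mapping an inferred emotional state to a response.
# When the inference is not confident, the system does nothing rather than
# risk an annoying wrong guess about the user's feelings.

def choose_response(arousal, frustration, confidence):
    """Pick an action given inferred arousal/frustration (0..1) and the
    confidence (0..1) of that inference. All thresholds are placeholders."""
    if confidence < 0.7:
        return "no_action"            # avoid acting on doubtful inferences
    if frustration > 0.8:
        return "suggest_break"        # user appears seriously frustrated
    if frustration > 0.5:
        return "offer_more_guidance"  # moderate difficulty: give extra help
    if arousal > 0.7:
        return "play_soothing_music"  # agitated but not stuck on a task
    return "reduce_interruptions"     # user seems absorbed: stay out of the way


if __name__ == "__main__":
    print(choose_response(arousal=0.4, frustration=0.9, confidence=0.85))
```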

The IBM project to investigate these aspects has the name "Blueeyes" and information on it can be found at the site: http://www.almaden.ibm.com/cs/blueeyes/
