Citation: Andrew, A.M. (1999), "The Handbook of Brain Theory and Neural Networks", Kybernetes, Vol. 28 No. 9, pp. 1084-1094. https://doi.org/10.1108/k.1999.28.9.1084.1

Publisher: Emerald Group Publishing Limited
This is an extremely impressive piece of work. Its size alone makes it outstanding: a great many pages, of almost A4 size, and hence quite a tome. The pages are well filled, mostly with smallish type in a two‐column format, so there is a great deal of material here.
The rate of proliferation of theories and research findings in the areas covered, in recent decades, is such that this survey is extremely welcome, and Michael Arbib was probably the best‐qualified and best‐placed person anywhere to bring it into being.
In his preface he says that the volume is inspired by two great questions: “How does the brain work?” and “How can we build intelligent machines?” He acknowledges that no single answer is given to either question, but the book reviews great progress in recent years in answering more specific related questions. Approaches to machine intelligence other than that through neural nets are included, and the term “neural net” is used very generally, sometimes referring to biological structures and sometimes to artificial nets that may have no demonstrable close correspondence to any biological example. Connectionist theories of another kind are also included, namely those employed in AI and by psychologists and linguists, where a network is visualised but with elements well above the single‐neuron level.
Following the preface, and some notes on “How to use this book”, the content is presented in three parts, of which the third occupies exactly 1,000 pages and hence constitutes the bulk of the book. The first part gives useful background information, and in view of the wide‐ranging interdisciplinary character of what follows, there will be few readers who do not benefit from perusing something here. The background material comes under three headings: “Introducing the Neuron”, “Levels and Styles of Analysis” and “Dynamics and Adaptation in Neural Networks”. It is developed in a fair degree of detail, with numerous forward references to Parts 2 and 3, to some extent trespassing on the “road map” function of Part 2. It is interesting to note that there are appreciative references to much early work, including papers that appeared in the influential collection by Feigenbaum and Feldman (1963), as well as, of course, the famous 1943 paper of McCulloch and Pitts, von Neumann's treatment in terms of cellular automata, the work of Turing, and so on.
Part 2 offers what are termed “Road Maps”, or guided tours around different sets of articles that appear in Part 3. There are eight maps under relatively general headings, referred to as meta‐maps, and then 23 road maps whose headings are subordinate to these eight: three subheadings under each of the eight main ones, except that the main heading of “Sensory Systems” has only the two subheadings of “Vision” and “Other”. The eight general headings are as follows:
1. Connectionism: psychology, linguistics, and artificial intelligence.
2. Dynamics, self‐organization, and cooperativity.
3. Learning in artificial neural networks.
4. Applications and implementations.
5. Biological neurons and networks.
6. Sensory systems.
7. Plasticity in development and learning.
8. Motor control.
Part 3 of the book then consists of a total of 266 separately authored articles, which, as the editor comments, cover a vast range of topics in brain theory and neural networks, from language to motor control and from the neurochemistry of memory to the applications of artificial neural networks in steelmaking. Because of the richness of interrelations between the topics, no attempt has been made to group the articles. They are simply placed in alphabetical order according to title (which emphasises the variety by producing some bizarre juxtapositions). A set of overlapping groupings according to subject‐matter is implicit in the road maps of Part 2.
The total number of contributors is no fewer than 342, and the 19‐strong Editorial Advisory Board includes some well‐known figures. The articles are intended to be self‐contained as far as possible, but ample links are given to related material elsewhere in the volume. The number of external references given with any article is limited to about 15, by editorial decision, and these are chosen so as to be helpful for further study. The articles have a pleasing uniformity of style, attributed to a process of editing and polishing by advisers and reviewers, which must have been a formidable task. The intention is that each article, possibly with attention to the related material to which links are given, should be intelligible to readers with varied backgrounds. The example of the Scientific American is quoted as indicating the desired level of presentation.
There is of course no compulsion to use the road maps, and the majority of readers already involved in the area will start by going directly to Part 3 and browsing for what interests them. There is a comprehensive subject index, and one function of the book is to serve as a conventional encyclopaedia so that, for example, to get a definition of just what is meant by, say, radial basis functions, along with a discussion of their significance, it is easy to find an article in Part 3 that provides exactly that. However, to use the book solely as a look‐up reference, ignoring the unifying threads embodied in the “road maps”, is to miss an important part of its value. The “road maps” provide a valuable overview of the “state of the art”, and reassurance to anyone who has lost faith in the fundamental tenet of Cybernetics: that there is much of importance that is common to the study of artifacts and of living systems.
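For readers unfamiliar with that particular example: a radial basis function network approximates a function as a weighted sum of localised bumps, typically Gaussians, centred at chosen points. A minimal sketch of the idea (the centres, width and target function here are arbitrary illustrative choices, not taken from the Handbook):

```python
import numpy as np

def rbf_features(x, centres, width):
    """Gaussian radial basis functions: one feature per centre."""
    return np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))

# Fit a weighted sum of Gaussians to y = sin(x) by linear least squares.
x = np.linspace(0, 2 * np.pi, 50)
y = np.sin(x)
centres = np.linspace(0, 2 * np.pi, 10)  # evenly spaced, an arbitrary choice
Phi = rbf_features(x, centres, width=0.7)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
max_err = np.max(np.abs(Phi @ weights - y))
print(f"max fit error: {max_err:.4f}")
```

Because the basis functions are fixed, finding the weights is an ordinary linear least‐squares problem, which is much of the appeal of the approach.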
On his page 14 the editor refers to the two groups of workers primarily concerned with these issues, namely those in neural computation, of whom most know little of brain function, and neuroscientists, of whom few know much about neural computation. The Handbook is designed to increase the flow of information between these scientific communities.
The range of topics included in the 266 articles is even wider than the list of eight headings might suggest. The various forms of analysis that have been applied to artificial neural nets are presented, both for feed‐forward nets and for the re‐entrant variety. These include, for feed‐forward nets, discussion of a highly general property called the Vapnik‐Chervonenkis Dimension of a net, and for re‐entrant nets they bring in considerations of modern dynamical theory, including the ideas of bifurcations, chaos and strange attractors. It should be mentioned that the level of mathematical sophistication required in places is, while not extreme, considerably greater than suggested by the reference to Scientific American.
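The dynamical vocabulary invoked for re‐entrant nets can be illustrated in miniature by the logistic map, the standard textbook example of a bifurcating and ultimately chaotic iteration (the parameter values below are the usual illustrative ones, not drawn from the Handbook):

```python
def logistic_iterates(r, x0, n):
    """Iterate x -> r*x*(1-x) and return the value after n steps."""
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
    return x

# Below the first bifurcation (r = 2.5) the iteration settles to the
# stable fixed point 1 - 1/r = 0.6.
settled = logistic_iterates(2.5, 0.2, 1000)

# In the chaotic regime (r = 4.0) two starts differing by one part in
# a billion end up in quite different places: sensitive dependence.
a = logistic_iterates(4.0, 0.2, 100)
b = logistic_iterates(4.0, 0.2 + 1e-9, 100)
print(settled, a, b)
```

A re‐entrant net is of course a far richer dynamical system than this one‐dimensional map, but the same phenomena of fixed points, bifurcations and sensitive dependence are what the articles in question analyse.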
On the biological side, there is much about functional neuroanatomy, under such headings as “basal ganglia”, “hippocampus” and “thalamus”. Some recent interesting findings are reported, including an analysis of the lamprey spinal cord as a chain of coupled oscillators, and a very interesting study of visual distance perception in frogs and toads that shows that a variety of different estimation methods are used, implemented by separate neural structures. An article on “Somatotopy: Plasticity of Sensory Maps” surprisingly makes no mention of the results of Pat Wall and his group (Wall and Egger, 1971), obtained mainly on rats, and instead refers only to later work on primates. The work of Wall and Melzack is, however, given prominence under the heading of “Pain Networks”.
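The lamprey analysis mentioned above treats the spinal cord as a chain of segmental oscillators whose coupling enforces a fixed phase lag between neighbours, so that a wave of activity travels along the body. A toy sketch of that idea using abstract phase oscillators (the coupling strength, lag and time step are illustrative assumptions, not the published biological model):

```python
import math

def simulate_chain(n_segments=10, steps=20000, dt=0.001,
                   omega=2 * math.pi, coupling=5.0, lag=0.2):
    """Chain of phase oscillators: each segment is pulled toward a
    fixed phase lag behind its anterior neighbour."""
    phase = [0.0] * n_segments
    for _ in range(steps):
        new = []
        for i in range(n_segments):
            dphi = omega  # intrinsic frequency, same for all segments
            if i > 0:     # nearest-neighbour coupling, head to tail
                dphi += coupling * math.sin(phase[i - 1] - phase[i] - lag)
            new.append(phase[i] + dt * dphi)
        phase = new
    # After settling, successive segments differ by ~lag radians,
    # i.e. a travelling wave runs down the chain.
    return [phase[i] - phase[i + 1] for i in range(n_segments - 1)]

diffs = simulate_chain()
print(diffs)
```

The point of the decomposition is that the travelling wave is not imposed by any central pattern but emerges from local coupling, which is what makes the oscillator‐chain view of the spinal cord attractive.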
These references to specific topics do not give a proper impression of the many ramifications that are explored in the 266 articles. As well as treating numerous aspects of cognition, neural computation and motor control they include references to philosophical issues and theories of consciousness. Treatment of the latter does not include any reference to quantum mechanics, thus showing a difference of viewpoint from, for example, that of Marcer and Schempp (1998). The related topic of the possible contribution of microtubules to neural functioning is acknowledged by a reference to Hameroff in the article on “Biomaterials for Intelligent Systems”.
The extreme richness of coverage of topics can be illustrated by returning to neurocomputing and artificial neural nets. As well as discussing the theory of the various kinds of net, accounts are given of ways of implementing neural elements, either by optical methods (which give the possibility of economical implementation of extremely large nets), or by analog or digital VLSI techniques. Software systems for the simulation of nets on conventional machines also receive attention, with discussion of stability considerations when using particular techniques for solution of sets of ordinary differential equations. Separate from this is an article on “Computing with Attractors” dealing with simulations according to the modern version of system dynamics which can involve bifurcations, attractors etc. In addition, a great many of the papers refer to applications of artificial neural nets.
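The stability considerations mentioned for ODE solvers are easy to demonstrate: the explicit Euler method applied to a fast‐decaying equation dx/dt = -λx is stable only when the step size h satisfies h < 2/λ; beyond that the computed solution grows even though the true solution decays. A minimal sketch (λ and the step sizes are arbitrary illustrative choices):

```python
def euler_decay(lam, h, steps, x0=1.0):
    """Explicit Euler applied to dx/dt = -lam * x."""
    x = x0
    for _ in range(steps):
        x = x + h * (-lam * x)  # each step multiplies x by (1 - h*lam)
    return x

lam = 100.0  # a fast time constant, as in a stiff neural model
stable = euler_decay(lam, h=0.005, steps=200)    # h < 2/lam: decays
unstable = euler_decay(lam, h=0.03, steps=200)   # h > 2/lam: blows up
print(stable, unstable)
```

Neural simulations routinely mix fast membrane dynamics with slow adaptation, so this kind of stiffness, and the choice between explicit and implicit methods that it forces, is exactly the issue the simulation articles discuss.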
Other topic areas are given a similarly rich coverage. It is probably unnecessary to elaborate further to show that this is a book that anyone concerned with Cybernetics will want to have on his shelves. Its weight is such that it should be on a shelf of robust construction and at a height that allows easy transfer to a reading desk.
References
Feigenbaum, E.A. and Feldman, J. (Eds) (1963), Computers and Thought, McGraw‐Hill, New York, NY.
Marcer, P. and Schempp, W. (1998), “The brain as a conscious system”, Int. J. General Systems, Vol. 27 Nos 1‐3, pp. 231‐48.
Wall, P.D. and Egger, M.D. (1971), “Formation of new connections in adult rat brains after partial deafferentation”, Nature, Vol. 232, pp. 542‐5.