## New Developments in Neural Computing

John Gerald Taylor, C. L. T. Mannion (editors); Institute of Physics (Great Britain); London Mathematical Society

Research in neural computing is advancing rapidly, with important developments being made constantly. In such a fast-moving field it is important for workers to have access to the most up-to-date results, and this book, containing new contributions from researchers all over the world, fulfills that need.

*New Developments in Neural Computing* comprises the proceedings of a workshop on neural computing held in London in April 1989. The book begins with four tutorials, intended for beginners in the field, introducing some of the major topics in neural computing. There follow fifteen contributed papers on a wide variety of topics of current interest, and four invited papers by acknowledged world experts in particular areas. Eduardo Caianiello (Italy), one of the founding fathers of the subject, writes on synthesising nets of binary decision elements; John Daugman (USA) discusses visual coding and Gabor functions; Rolf Eckmiller (Germany) covers visuo-motor control in robots; and Patrick Gallinari (France) discusses feedforward nets and their learning rules.

*New Developments in Neural Computing* presents the state of the art in neural net research, and so is an important book for anyone interested in the mathematics and physics of the brain, computer science, neurophysiology, and brain science in general. It will also be of interest to undergraduate students entering this new field.

