## Evolution, Learning and Cognition

This review volume represents the first attempt to provide a comprehensive overview of this exciting and rapidly evolving field. The book comprises specially commissioned articles by leading researchers in the areas of neural networks and connectionist systems, classifier systems, adaptive network systems, genetic algorithms, cellular automata, artificial immune systems, evolutionary genetics, cognitive science, optical computing, combinatorial optimization, and cybernetics.


### Contents

| Contribution | Page |
|---|---|
| Connectionist Learning through Gradient Following (R. J. Williams) | 3 |
| Efficient Stochastic Gradient Learning Algorithm for Neural | 27 |
| Information Storage in Fully Connected Networks | 51 |
| Neuronic Equations and their Solutions (E. R. Caianiello) | 91 |
| The Dynamics of Searches Directed by Genetic Algorithms | 111 |
| Probabilistic Neural Networks (J. W. Clark) | 129 |
| Some Quantitative Issues in the Theory of Perception (A. Zee) | 183 |
| Speech Perception and Production by a Self-Organizing Neural Network | 217 |
| Learning to Predict the Secondary Structure of Globular Proteins | 257 |
| Exploiting Chaos to Predict the Future and Reduce Noise | 277 |
| Scaling of Error Estimates | 296 |
| Experimental Data Analysis | 314 |
| Adaptive Dynamics | 322 |
| How Neural Nets Work (A. Lapedes and R. Farber) | 331 |
| A Neural Network Model for Visual Pattern | 233 |

