Graphical Models: Foundations of Neural Computation

Michael Irwin Jordan, Terrence Joseph Sejnowski
MIT Press, 2001 - Computers - 421 pages

Graphical models use graphs to represent and manipulate joint probability distributions. They have their roots in artificial intelligence, statistics, and neural networks. The clean mathematical formalism of the graphical models framework makes it possible to understand a wide variety of network-based approaches to computation, and in particular to understand many neural network algorithms and architectures as instances of a broader probabilistic methodology. It also makes it possible to identify novel features of neural network algorithms and architectures and to extend them to more general graphical models. This book exemplifies the interplay between the general formal framework of graphical models and the exploration of new algorithms and architectures. The selections range from foundational papers of historical importance to results at the cutting edge of research.

Contributors: H. Attias, C. M. Bishop, B. J. Frey, Z. Ghahramani, D. Heckerman, G. E. Hinton, R. Hofmann, R. A. Jacobs, Michael I. Jordan, H. J. Kappen, A. Krogh, R. Neal, S. K. Riis, F. B. Rodríguez, L. K. Saul, Terrence J. Sejnowski, P. Smyth, M. E. Tipping, V. Tresp, Y. Weiss.
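
To make the opening claim concrete, here is a minimal sketch (in Python, using the classic three-node rain/sprinkler/grass example with made-up probability tables; none of this comes from the book) of how a directed graphical model factorizes a joint distribution into local conditionals and supports inference by summing over that factorized joint:

```python
from itertools import product

# Illustrative (invented) conditional probability tables for the network
# Rain -> Sprinkler, Rain -> GrassWet, Sprinkler -> GrassWet.
P_rain = {True: 0.2, False: 0.8}                    # P(R)
P_sprinkler = {True: {True: 0.01, False: 0.99},     # P(S | R)
               False: {True: 0.40, False: 0.60}}
P_wet = {(True, True): 0.99, (True, False): 0.80,   # P(W=true | R, S)
         (False, True): 0.90, (False, False): 0.00}

def joint(r, s, w):
    """P(R=r, S=s, W=w) = P(r) * P(s|r) * P(w|r,s): the graph's factorization."""
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[r][s] * (pw if w else 1.0 - pw)

# Inference by brute-force summation over the joint:
# P(Rain=true | GrassWet=true) via Bayes' rule.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(f"P(Rain | GrassWet) = {num / den:.3f}")
```

In larger networks this brute-force sum is intractable; exploiting the same factorization to compute such quantities efficiently is what the belief propagation, Boltzmann machine, and variational methods collected in this volume address.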

Contents

Probabilistic Independence Networks for Hidden Markov Probability Models   1
Learning and Relearning in Boltzmann Machines   45
Learning in Boltzmann Trees   77
Deterministic Boltzmann Learning Performs Steepest Descent in Weight Space   89
Attractor Dynamics in Feedforward Neural Networks   97
Efficient Learning in Boltzmann Machines Using Linear Response Theory   121
Asymmetric Parallel Boltzmann Machines are Belief Networks   141
Variational Learning in Nonlinear Gaussian Belief Networks   145
Mixtures of Probabilistic Principal Component Analyzers   167
Independent Factor Analysis   207
Hierarchical Mixtures of Experts and the EM Algorithm   257
Hidden Neural Networks   291
Variational Learning for Switching State-Space Models   315
Nonlinear Time-Series Prediction with Missing and Noisy Data   349
Correctness of Local Probability Propagation in Graphical Models with Loops   367
Index   409
Copyright

About the author (2001)

Michael I. Jordan is Professor of Computer Science and of Statistics at the University of California, Berkeley, and recipient of the ACM/AAAI Allen Newell Award.
