## The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting

Now, for the first time, publication of the landmark work in backpropagation! Scientists, engineers, statisticians, operations researchers, and other investigators involved in neural networks have long sought direct access to Paul Werbos's groundbreaking, much-cited 1974 Harvard doctoral thesis, The Roots of Backpropagation, which laid the foundation of backpropagation. Now, with the publication of its full text, these practitioners can go straight to the original material and gain a deeper, practical understanding of this unique mathematical approach to social studies and related fields. In addition, Werbos has provided three more recent research papers, which were inspired by his original work, and a new guide to the field. Originally written for readers who lacked any knowledge of neural nets, The Roots of Backpropagation firmly establishes both its historical and continuing significance as it:

* Demonstrates the ongoing value and new potential of backpropagation
* Creates a wealth of sound mathematical tools useful across disciplines
* Sets the stage for the emerging area of fast automatic differentiation
* Describes new designs for forecasting and control which exploit backpropagation
* Unifies concepts from Freud, Jung, biologists, and others into a new mathematical picture of the human mind and how it works
* Certifies the viability of Deutsch's model of nationalism as a predictive tool--as well as the utility of extensions of this central paradigm

"What a delight it was to see Paul Werbos rediscover Freud's version of 'back-propagation.' Freud was adamant (in The Project for a Scientific Psychology) that selective learning could only take place if the presynaptic neuron was as influenced as is the postsynaptic neuron during excitation.
Such activation of both sides of the contact barrier (Freud's name for the synapse) was accomplished by reducing synaptic resistance by the absorption of 'energy' at the synaptic membranes. Not bad for 1895! But Werbos 1993 is even better." --Karl H. Pribram, Professor Emeritus, Stanford University
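The thesis's central mathematical tool, the chain rule for ordered derivatives, is what is now called reverse-mode automatic differentiation: derivatives of a final output are accumulated in one backward sweep through an ordered sequence of intermediate variables. Below is a minimal illustrative sketch of that idea; the tiny two-weight model and all variable names are invented for illustration and are not taken from the thesis.

```python
# A minimal sketch of the ordered-derivative chain rule behind
# backpropagation, applied to a toy two-weight model (illustrative only).

def forward(x, w1, w2):
    # An ordered sequence of intermediate quantities z1, z2, z3:
    z1 = w1 * x            # z1 depends on the input x
    z2 = max(0.0, z1)      # z2 depends on z1 (a simple nonlinearity)
    z3 = w2 * z2           # z3 is the final output
    return z1, z2, z3

def ordered_derivatives(x, w1, w2):
    """Sweep backward through the ordered system, accumulating the
    total (ordered) derivative of the output z3 with respect to each
    earlier quantity -- one pass, as in backpropagation."""
    z1, z2, z3 = forward(x, w1, w2)
    d_z3 = 1.0                               # d z3 / d z3
    d_z2 = d_z3 * w2                         # d z3 / d z2
    d_z1 = d_z2 * (1.0 if z1 > 0 else 0.0)   # through the nonlinearity
    d_w2 = d_z3 * z2                         # ordered derivative w.r.t. w2
    d_w1 = d_z1 * x                          # ordered derivative w.r.t. w1
    return d_w1, d_w2

# Check the backward sweep against a finite-difference approximation:
x, w1, w2 = 1.5, 0.8, -0.4
d_w1, d_w2 = ordered_derivatives(x, w1, w2)
eps = 1e-6
fd_w1 = (forward(x, w1 + eps, w2)[2] - forward(x, w1 - eps, w2)[2]) / (2 * eps)
print(abs(d_w1 - fd_w1) < 1e-6)  # → True
```

The point of the "ordered" bookkeeping is that each backward step reuses derivatives already computed for later variables, so the full gradient costs only one extra pass regardless of how many parameters the system has.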

