Foundations of Wavelet Networks and Applications
Traditionally, neural networks and wavelet theory have been two separate disciplines, taught and practiced independently. In recent years their offspring, wavelet networks, have emerged and grown vigorously in both research and applications. Yet the material needed to learn or teach wavelet networks has remained scattered across various research monographs.
Foundations of Wavelet Networks and Applications unites these two fields in a comprehensive, integrated presentation of wavelets and neural networks. It begins by building a foundation, including the necessary mathematics. A transitional chapter on recurrent learning then leads to an in-depth look at wavelet networks in practice, examining important applications that include using wavelets as stock market trading advisors, as classifiers in electroencephalographic drug detection, and as predictors of chaotic time series. The final chapter explores concept learning and approximation by wavelet networks.
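To give a flavor of the idea, and not as a reproduction of the book's own algorithms, the following minimal sketch builds a radial wavelet network in Python with NumPy: a fixed grid of dilated and translated Mexican-hat mother wavelets serves as the hidden layer, and only the linear output weights are fit, here by ordinary least squares. The choice of wavelet, grid, and target signal are illustrative assumptions.

```python
import numpy as np

def mexican_hat(x):
    # "Mexican hat" mother wavelet (negative second derivative of a Gaussian)
    return (1 - x**2) * np.exp(-x**2 / 2)

def wavelet_design_matrix(t, translations, dilations):
    # Each column is one hidden unit: the mother wavelet
    # translated by b and dilated by a
    return np.column_stack([
        mexican_hat((t - b) / a) for b, a in zip(translations, dilations)
    ])

# Target: a smooth signal with a little additive noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
y = np.sin(6 * np.pi * t) + 0.05 * rng.standard_normal(t.size)

# Fixed grid of 25 translations at a single dilation scale;
# the output weights w are the only trained parameters
translations = np.linspace(0.0, 1.0, 25)
dilations = np.full(25, 0.05)
Phi = wavelet_design_matrix(t, translations, dilations)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ w

print(f"RMS error: {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```

In a full wavelet network the translations and dilations are themselves adapted, typically by gradient descent (backpropagation), rather than fixed on a grid as in this sketch.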
The potential of wavelet networks in engineering, economics, and social science applications is rich and still growing. Foundations of Wavelet Networks and Applications prepares and inspires its readers not only to help ensure that potential is achieved, but also to open new frontiers in research and applications.
Separating Order from Disorder
Radial Wavelet Neural Networks
Predicting Chaotic Time Series