Learning from Data: Concepts, Theory, and Methods

An interdisciplinary framework for learning methodologies, covering statistics, neural networks, and fuzzy logic.

This book provides a unified treatment of the principles and methods for learning dependencies from data. It establishes a general conceptual framework in which various learning methods from statistics, neural networks, and fuzzy logic can be applied, showing that a few fundamental principles underlie most new methods being proposed today in statistics, engineering, and computer science.

Complete with over one hundred illustrations, case studies, and examples, Learning from Data:

* Relates the statistical formulation to the latest methodologies used in artificial neural networks, fuzzy systems, and wavelets
* Features consistent terminology, chapter summaries, and practical research tips
* Emphasizes the conceptual framework provided by Statistical Learning Theory (VC-theory) rather than its commonly practiced mathematical aspects
* Provides a detailed description of the new learning methodology called Support Vector Machines (SVM)

This invaluable text/reference accommodates both beginning and advanced graduate students in engineering, computer science, and statistics. It is also indispensable for researchers and practitioners in these areas who must understand the principles and methods for learning dependencies from data.
Contents
Regularization Framework  59
Statistical Learning Theory  92
Nonlinear Optimization Strategies  131