Statistical Learning Theory

Wiley, Sep 30, 1998 - Mathematics - 736 pages
Introduction: The Problem of Induction and Statistical Inference
Two Approaches to the Learning Problem
Appendix to Chapter 1: Methods for Solving Ill-Posed Problems
Estimation of the Probability Measure and Problem of Learning
Conditions for Consistency of Empirical Risk Minimization Principle
Bounds on the Risk for Indicator Loss Functions
Appendix to Chapter 4: Lower Bounds on the Risk of the ERM Principle
Bounds on the Risk for Real-Valued Loss Functions
The Structural Risk Minimization Principle
Appendix to Chapter 6: Estimating Functions on the Basis of Indirect Measurements
Stochastic Ill-Posed Problems
Estimating the Values of a Function at Given Points
Perceptrons and Their Generalizations
The Support Vector Method for Estimating Indicator Functions
The Support Vector Method for Estimating Real-Valued Functions
SV Machines for Pattern Recognition
SV Machines for Function Approximations, Regression Estimation, and Signal Processing
Necessary and Sufficient Conditions for Uniform Convergence of Frequencies to Their Probabilities
Necessary and Sufficient Conditions for Uniform Convergence of Means to Their Expectations
Necessary and Sufficient Conditions for Uniform One-Sided Convergence of Means to Their Expectations

Contents

The Problem of Induction and Statistical Inference
THEORY OF LEARNING AND GENERALIZATION

25 other sections not shown

About the author (1998)

Vladimir Naumovich Vapnik is one of the main developers of the Vapnik-Chervonenkis theory of statistical learning, and the co-inventor of the support vector machine method and the support vector clustering algorithm.