## Learning from Data: Concepts, Theory, and Methods

An interdisciplinary framework for learning methodologies covering statistics, neural networks, and fuzzy logic, this book provides a unified treatment of the principles and methods for learning dependencies from data. It establishes a general conceptual framework in which learning methods from statistics, neural networks, and fuzzy logic can be applied, showing that a few fundamental principles underlie most of the new methods being proposed today in statistics, engineering, and computer science. Over one hundred illustrations, case studies, and examples make this an invaluable text.


### Contents


| Section | Page |
| --- | --- |
| 3 Regularization Framework | 61 |
| 4 Statistical Learning Theory | 99 |
| 5 Nonlinear Optimization Strategies | 151 |
| 6 Methods for Data Reduction and Dimensionality Reduction | 177 |
| 8 Classification | 340 |
| 9 Support Vector Machines | 404 |
| 10 Noninductive Inference and Alternative Learning Formulations | 467 |
| 11 Concluding Remarks | 499 |
| Appendix A Review of Nonlinear Optimization | 507 |
| Appendix B Eigenvalues and Singular Value Decomposition | 514 |
| References | 519 |


### Other editions - View all

Learning from Data: Concepts, Theory, and Methods. Vladimir Cherkassky, Filip M. Mulier. No preview available, 2007.


### Common terms and phrases

adaptive algorithm applications approach approximating functions basis functions Chapter Cherkassky classification classification problems clustering coefficients data points data set decision boundary defined density estimation described difficult distribution empirical risk encoding equivalent error example falsifiability feature space final find finite samples first fixed flexible formulation Gaussian given goal high-dimensional hyperplane implementation indicator functions inductive principle input space interpretation iteration kernel learning machine learning methods learning problem linear estimators loss function mapping margin-based matrix misclassification model complexity model selection neural network noise number of samples output parameterization parameters penalization polynomial posterior probability prediction risk predictive learning principal curve priori knowledge procedure projection pursuit provides quantization reflects risk functional risk minimization set of approximating set of functions solution specific spline statistical stochastic approximation strategy support vectors target function training data training samples transduction unsupervised learning values Vapnik variance VC dimension VC theory wavelet