Introduction to Statistical Machine Learning

Machine learning allows computers to learn and discern patterns without being explicitly programmed. When statistical techniques and machine learning are combined, they form a powerful tool for analysing many kinds of data in computer science and engineering areas such as image processing, speech processing, natural language processing, and robot control, as well as in fundamental sciences such as biology, medicine, astronomy, physics, and materials science.

Introduction to Statistical Machine Learning provides a general introduction to machine learning that covers a wide range of topics concisely and helps you bridge the gap between theory and practice. Part I discusses the fundamental concepts of statistics and probability that are used in describing machine learning algorithms. Parts II and III explain the two major approaches of machine learning techniques: generative methods and discriminative methods. Part IV provides an in-depth look at advanced topics that play essential roles in making machine learning algorithms more useful in practice. The accompanying MATLAB/Octave programs provide you with the practical skills needed to accomplish a wide range of data analysis tasks.

- Provides the necessary background material to understand machine learning, including statistics, probability, linear algebra, and calculus
- Gives complete coverage of the generative approach to statistical pattern recognition and the discriminative approach to statistical machine learning
- Includes MATLAB/Octave programs so that readers can test the algorithms numerically and acquire both mathematical and practical skills in a wide range of data analysis tasks
- Discusses a wide range of applications in machine learning and statistics, with examples drawn from image processing, speech processing, natural language processing, robot control, biology, medicine, astronomy, physics, and materials science
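To illustrate the generative/discriminative distinction the blurb mentions, here is a minimal sketch on synthetic 1-D data (this is an illustration only, not one of the book's accompanying MATLAB/Octave programs; all data and parameter values are made up). The generative route fits a Gaussian model to each class by maximum likelihood and derives the class-posterior probability via Bayes' rule; the discriminative route models the class-posterior directly with logistic regression:

```python
import math
import random

random.seed(0)
# Synthetic two-class data: class 0 around -1, class 1 around +1
x0 = [random.gauss(-1.0, 1.0) for _ in range(200)]
x1 = [random.gauss(+1.0, 1.0) for _ in range(200)]

# --- Generative approach: fit a Gaussian model per class by maximum
# likelihood, then form the class-posterior probability via Bayes' rule.
def ml_gauss(xs):
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, math.sqrt(var)

mu0, s0 = ml_gauss(x0)
mu1, s1 = ml_gauss(x1)

def pdf(x, mu, s):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def gen_posterior1(x):
    # Pr(y=1 | x) assuming equal prior probabilities
    p0, p1 = pdf(x, mu0, s0), pdf(x, mu1, s1)
    return p1 / (p0 + p1)

# --- Discriminative approach: model Pr(y=1 | x) directly with logistic
# regression, trained by gradient ascent on the log-likelihood.
data = [(x, 0.0) for x in x0] + [(x, 1.0) for x in x1]
w, b = 0.0, 0.0
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (y - p) * x
        gb += (y - p)
    w += 0.1 * gw / len(data)
    b += 0.1 * gb / len(data)

def disc_posterior1(x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

For this symmetric toy data both models place the decision boundary near x = 0; the generative model additionally gives a full probabilistic description of each class, while the discriminative model spends all its parameters on the boundary itself.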
Contents
3 GENERATIVE APPROACH TO STATISTICAL PATTERN RECOGNITION  111
4 DISCRIMINATIVE APPROACH TO STATISTICAL MACHINE LEARNING  235
5 FURTHER TOPICS  341