The Elements of Statistical Learning: Data Mining, Inference, and Prediction

Springer Science & Business Media, Jan 1, 2001 - Mathematics - 533 pages
During the past decade there has been an explosion in computation and information technology. With it have come vast amounts of data in a variety of fields such as medicine, biology, finance, and marketing. The challenge of understanding these data has led to the development of new tools in the field of statistics, and spawned new areas such as data mining, machine learning, and bioinformatics. Many of these tools have common underpinnings but are often expressed with different terminology. This book describes the important ideas in these areas in a common conceptual framework. While the approach is statistical, the emphasis is on concepts rather than mathematics. Many examples are given, with a liberal use of color graphics. It should be a valuable resource for statisticians and anyone interested in data mining in science or industry. The book's coverage is broad, from supervised learning (prediction) to unsupervised learning. The many topics include neural networks, support vector machines, classification trees and boosting---the first comprehensive treatment of this topic in any book.

This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression & path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates.

Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools including CART, MARS, projection pursuit and gradient boosting.

  

What people are saying

User ratings

5 stars: 17
4 stars: 5
3 stars: 0
2 stars: 1
1 star: 0

Review: The Elements of Statistical Learning: Data Mining, Inference, and Prediction

User Review  - Danial - Goodreads

Unnecessarily dry and difficult to read through, but as a reference book with a solid index it hits its mark.


User Review  - Scott - Goodreads

A classic! One of the first books I read on machine learning. Comes at things from the statistics perspective, so I probably wouldn't recommend it as a first introduction. I would also recommend the updated electronic editions (freely available from Hastie's webpage).

Popular passages

Page 513 - Experiments with a new boosting algorithm. Machine Learning: Proceedings of the Thirteenth International Conference, Morgan Kaufmann, San Francisco, pp.
Page 513 - Proceedings of the Ninth Annual Conference on Computational Learning Theory, pp. 325-332. Freund, Y. and Schapire, R. (1997). A decision-theoretic generalization of online learning and an application to boosting, Journal of Computer and System Sciences 55: 119-139.
Page 516 - Another interpretation of the EM algorithm for mixture distributions.
Page 514 - Gelman, A., Carlin, J., Stern, H., and Rubin, D. (1995). Bayesian Data Analysis. London: Chapman & Hall.
Page 517 - Kohonen, T. (1989). Self-Organization and Associative Memory (3rd edition), Springer-Verlag, Berlin.
Page 509 - Barron, A.R. (1993). Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory 39.
Page viii - The quiet statisticians have changed our world - not by discovering new facts or technical developments but by changing the ways we reason, experiment, and form our opinions about it.
Page 509 - Anderson, J.A. and Rosenfeld, E. (eds) (1988). Neurocomputing: Foundations of Research. MIT Press, Cambridge, MA.
Page 518 - Madigan, D. and Raftery, A. (1994). Model Selection and Accounting for Model Uncertainty in Graphical Models Using Occam's Window.
