Advanced Lectures on Machine Learning: Machine Learning Summer School 2002, Canberra, Australia, February 11-22, 2002, Revised Lectures
Springer Science & Business Media, Jan 31, 2003 - Computers - 257 pages
Machine Learning has become a key enabling technology for many engineering applications and theoretical problems alike. To further discussions and to disseminate new results, a Summer School was held on February 11–22, 2002 at the Australian National University. The current book contains a collection of the main talks held during those two weeks in February, presented as tutorial chapters on topics such as Boosting, Data Mining, Kernel Methods, Logic, Reinforcement Learning, and Statistical Learning Theory. The papers provide an in-depth overview of these exciting new areas, contain a large set of references, and thereby provide the interested reader with further information to start or to pursue their own research in these directions. Complementary to the book, recorded video of the presentations during the Summer School can be obtained at http://mlg.anu.edu.au/summer2002. It is our hope that graduate students, lecturers, and researchers alike will find this book useful in learning and teaching Machine Learning, thereby continuing the mission of the Summer School.

Canberra, November 2002
Shahar Mendelson and Alexander Smola
Research School of Information Sciences and Engineering, The Australian National University

Thanks and Acknowledgments: We gratefully thank all the individuals and organizations responsible for the success of the workshop.
A Few Notes on Statistical Learning Theory
A Short Introduction to Learning with Kernels
Bayesian Kernel Methods
An Introduction to Boosting and Leveraging
Value Function Methods