Combining Pattern Classifiers: Methods and Algorithms
Covering pattern classification methods, Combining Classifiers: Ideas and Methods focuses on the important and widely studied issue of how to combine several classifiers together in order to achieve improved recognition performance. It is one of the first books to provide unified, coherent, and expansive coverage of the topic, and as such will be welcomed by those involved in the area. With case studies that bring the text alive and demonstrate 'real-world' applications, it is destined to become essential reading.
Recently I read the book Combining Pattern Classifiers: Methods and Algorithms by Ludmila I. Kuncheva. The book addresses the problem of using multiple pattern classifiers to improve classification both quantitatively (e.g. accuracy) and qualitatively (e.g. robustness).
Having taken three courses on pattern recognition and machine learning, I found that the book helped me get an overview of the whole subject of pattern analysis and classification. It starts with a compact but rich introduction to pattern recognition and the basic classifier types. Classifier design methods are examined along two dimensions: classifier vs. boundary design, and parametric vs. non-parametric learning. After the two introductory chapters on the discipline of pattern recognition, the concept of multi-classifier systems is introduced in chapter three. This chapter clarifies that the book is devoted mainly to combining classifiers at the decision level; topics such as building a diverse set of base classifiers are not discussed intensively. Combining classifier outputs is then treated in two categories: label outputs and continuous-valued outputs. Various algorithms for majority voting and fusion of label outputs are studied and compared throughout chapters three and four. Fusion of continuous-valued outputs is studied in chapter five.
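To make the idea of decision-level fusion of label outputs concrete, here is a minimal plurality-vote sketch. This is my own illustration of the general technique, not code from the book; the function name and tie-breaking behavior are my choices.

```python
from collections import Counter

def majority_vote(label_outputs):
    """Plurality vote over the label outputs of an ensemble.

    label_outputs: one predicted label per base classifier.
    Returns the most frequent label (ties are broken arbitrarily,
    by whichever tied label Counter encounters first).
    """
    counts = Counter(label_outputs)
    return counts.most_common(1)[0][0]
```

For example, `majority_vote(['a', 'b', 'a'])` returns `'a'`: the ensemble's decision is the label that most base classifiers agree on, regardless of how confident each one is.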
Selecting different classifiers for different regions of the input space is introduced in chapter six, where the well-known k-nearest-neighbor (kNN) method and its variations are studied. Bagging and boosting methods are the subject of the seventh chapter.
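As an illustration of the bagging idea treated in chapter seven, the following sketch trains several base classifiers on bootstrap samples and combines their labels by plurality vote. The `build_classifier` callback (a function that takes a training sample and returns a `predict(x) -> label` callable) is an assumed interface of my own, not the book's notation.

```python
import random
from collections import Counter

def bagging_predict(train, build_classifier, x, n_estimators=11, seed=0):
    """Bagging sketch: train n_estimators base classifiers on bootstrap
    samples of `train` and fuse their label outputs by plurality vote.

    train: list of training objects (format is up to build_classifier).
    build_classifier: assumed user-supplied factory; takes a training
        sample, returns a predict(x) -> label callable.
    """
    rng = random.Random(seed)
    votes = []
    for _ in range(n_estimators):
        # Bootstrap sample: draw len(train) objects with replacement.
        bootstrap = [rng.choice(train) for _ in train]
        predict = build_classifier(bootstrap)
        votes.append(predict(x))
    return Counter(votes).most_common(1)[0][0]
```

Because each base classifier sees a different resample of the data, the ensemble's vote tends to be more stable than any single base classifier trained on the full set, which is the effect the chapter analyzes.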
The two stages before and after classifier combination, i.e. feature selection as preprocessing and error correction as post-processing, are discussed in chapter eight. The importance of feature selection and feature space partitioning is analyzed and demonstrated through representative examples.
Most of the topics covered up to this point are supported by experimental results in different application areas. Chapters nine and ten provide theoretical views and analysis of the classifier combination methods and rules.
Chapter ten is devoted entirely to the diversity of classifiers in an ensemble. Diversity is studied both through simple observable strategies (e.g. training on different sections of the input space) and through statistical analysis tools and measures. Some open research directions in the field of classifier combination are also mentioned.
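One of the simplest statistical diversity measures for a pair of classifiers is the disagreement measure: the fraction of objects on which their label outputs differ. The sketch below is my own minimal rendering of that general idea, not the book's code.

```python
def disagreement(preds_i, preds_j):
    """Pairwise disagreement measure between two classifiers.

    preds_i, preds_j: equal-length sequences of predicted labels
    for the same objects. Returns a value in [0, 1]:
    0 means the classifiers always agree, 1 means they never do.
    """
    if len(preds_i) != len(preds_j):
        raise ValueError("prediction sequences must have equal length")
    differing = sum(a != b for a, b in zip(preds_i, preds_j))
    return differing / len(preds_i)
```

For instance, two classifiers that differ on half of the test objects get a disagreement of 0.5; averaging this over all pairs gives one crude summary of how diverse an ensemble is.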
As a student, I found this book helpful for gaining an understanding of the nature of pattern classifiers and of combination architectures. I recommend the book to those who have taken elementary courses such as statistical pattern recognition and machine learning. It will serve as a guiding resource for combining and using various familiar methods and algorithms.
2 Base Classifiers
3 Multiple Classifier Systems
4 Fusion of Label Outputs
5 Fusion of Continuous-Valued Outputs
6 Classifier Selection
7 Bagging and Boosting