Feature Extraction, Construction and Selection: A Data Mining Perspective

Huan Liu, Hiroshi Motoda
Springer Science & Business Media, Aug 31, 1998 - Computers - 410 pages
There is broad interest in feature extraction, construction, and selection among practitioners in statistics, pattern recognition, data mining, and machine learning. Data preprocessing is an essential step in the knowledge discovery process for real-world applications. This book compiles contributions from many leading and active researchers in this growing field and paints a picture of the state-of-the-art techniques that can boost the capabilities of many existing data mining tools. The objective of this collection is to raise the data mining community's awareness of research on feature extraction, construction, and selection, which is currently conducted mainly in isolation. This book is part of our endeavor to produce a contemporary overview of modern solutions, to create synergy among these seemingly different branches, and to pave the way for developing meta-systems and novel approaches. Even with today's advanced computer technologies, discovering knowledge from data can still be fiendishly hard due to the characteristics of computer-generated data. Feature extraction, construction, and selection are a set of techniques that transform and simplify data so as to make data mining tasks easier. Feature construction and selection can be viewed as two sides of the representation problem.
 


Contents

Less is More   3
Selection   5
1.3 Future Work   9
References   11
Feature Weighting for Lazy Learning Algorithms   13
2.2 Feature Weighting in a Lazy Learning Algorithm   15
2.3 A Categorization of Feature Weighting in Lazy Learners   16
2.4 Additional Contributions   27
2.5 Summary   28
References   29
The Wrapper Approach   33
3.2 Relevance of Features   35
3.3 The Filter Approach   36
3.4 The Wrapper Approach   38
3.5 Related Work   45
3.6 Future Work   46
3.7 Conclusion   47
References   48
Data-driven Constructive Induction: Methodology and Applications   51
4.1 Introduction   52
4.2 An Illustration of the Importance of the Representation Space   53
4.3 The Need for Constructive Induction   54
4.4 A General Schema for Constructive Induction   56
4.5 Experimental Applications   59
4.6 Conclusion   66
References   67
Selecting Features by Vertical Compactness of Data   71
5.2 The Vertical Compactness Criterion   73
5.3 Search Methods   75
5.4 Hybrid Search Algorithm   77
5.5 Experiments   80
5.6 Conclusion   83
References   84
Relevance Approach to Feature Subset Selection   85
6.2 Characterisation of Feature Subset Selection   87
6.3 A Relevance-based Algorithm for Feature Selection   90
6.4 Experiment and Evaluation   91
6.5 Comparison with Related Work   92
6.6 Conclusion   95
A Unified Framework for Relevance   96
References   97
Novel Methods for Feature Subset Selection with Respect to Problem Knowledge   101
7.2 Feature Subset Selection Problem in Statistical Pattern Recognition   103
7.3 Basic Situation with Respect to Problem Knowledge   105
7.4 Floating Search Methods   106
7.5 Feature Selection by Modified Gaussian Mixtures   108
7.6 Experimental Results   112
7.7 Subset Selection Guide   114
References   115
Feature Subset Selection Using a Genetic Algorithm   117
8.2 Related Work   118
8.3 Feature Selection Using a Genetic Algorithm for Neural Network Pattern Classifiers   122
8.4 Implementation Details   125
8.5 Experiments   127
8.6 Summary and Discussion   131
References   133
A Relevancy   137
9.2 Relevance of Literals and Features   139
The East-West Challenge   144
9.4 Handling Noisy Data   150
9.5 Noise and Relevance   151
9.7 Conclusion   152
References   153
Lexical Contextual Relations for the Unsupervised Discovery of Text Features   157
10.2 Collocational Expressions as Text Features   160
10.3 Discovery of Contextual Text Features   163
10.4 Results and Discussion   164
10.5 Conclusion   170
References   172
Integrated Feature Extraction Using Adaptive Wavelets   175
11.2 Wavelet Analysis of Spectra   179
11.3 Multivariate Prediction Models   183
11.4 The Data Set   185
11.6 Regression Applications   187
11.7 Future Directions   188
References   189
Feature Extraction via Neural Networks   191
12.2 Feature Extraction via Neural Networks   193
12.3 Illustrative Examples   195
12.4 Empirical Study and Analysis   199
12.5 CNF9a Revisited for Decision Tree Induction   200
12.6 Summary   202
Using Lattice-based Framework as a Tool for Feature Extraction   205
13.2 Galois Lattice   207
13.3 Feature Spaces   209
13.4 The IGLUE System   212
13.5 Experimental Results   213
13.6 Conclusion   216
References   217
Constructive Function Approximation   219
14.2 The Need for Features   220
14.3 Feature Spaces   221
14.4 Comparison of Three Function Approximation Algorithms   223
14.5 Related Work   233
References   235
A Comparison of Constructing Different Types of New Features for Decision Tree Learning   239
15.2 Four Types of New Features   241
15.3 A Single Algorithm for Creating Four Different Types of New Features   243
15.4 Experiments   244
15.5 Discussion   251
15.6 Related Work   252
15.7 Conclusion and Future Work   253
References   254
Construction Using Fragmentary Knowledge   257
16.2 Important Issues for Constructive Induction   258
Combining Neural Network with Iterative Attribute Construction   261
16.4 Discussion   269
References   271
Feature Construction Using Fragmentary Knowledge   273
17.2 Types of Fragmentary Knowledge   274
17.3 Incorporating Fragmentary Knowledge into Search   278
17.4 Bankruptcy Experiments   280
17.5 Turfgrass Management Experiments   283
17.6 Discussion   286
References   287
Constructive Induction on Continuous Space   289
18.2 An Illustrative Example   291
18.3 Growing the Tree   294
18.4 Related Work   298
18.5 Experiments   299
18.6 Conclusions   301
Evolutionary Feature Space Transformation   307
19.2 Approaches to Feature Selection and Construction   308
19.3 General System Architecture   310
19.4 Experimental Setup   315
19.5 Comparative Studies   320
19.6 Summary and Conclusions   321
References   322
Feature Transformation by Function Decomposition   325
20.2 Single-Step Function Decomposition   327
20.3 Finding the Best Feature Partition   329
20.4 Redundancy Discovery and Removal   330
20.5 Discovering Feature Hierarchies   331
20.6 Feature Construction   334
20.7 Related Work   336
20.8 Summary   338
Constructive Induction of Cartesian Product Attributes   341
21.2 A Wrapper Approach for Creating Cartesian Product Attributes   342
21.3 Experiments   344
21.4 Attribute Deletion   349
21.5 Related Work   350
21.6 Conclusions   351
References   352
Towards Automatic Fractal Feature Extraction for Image Recognition   357
22.2 Fractals and IFS   359
22.3 Extraction of Fractal Features   361
22.4 Experiments   363
22.5 Discussion and Future Work   370
References   372
Feature Transformation Strategies for a Robot Learning Problem   375
23.2 Robot Programming by Demonstration   376
23.3 Learning Structured Concepts   377
23.4 Constructive Induction   379
23.5 Experimental Results   380
23.6 Feature Transformation Strategies   381
23.7 Experimental Results   386
23.8 Conclusion   390
References   391
Interactive Genetic Algorithm Based Feature Selection and Its Application to Marketing Data Analysis   393
24.1 Introduction   394
24.2 Interactive GAs Feature Selection and Knowledge Extraction   395
24.3 Algorithm of SIBILE   396
24.4 Experiments   399
24.5 Concluding Remarks   404
References   405
Index   407