Semisupervised Learning for Computational Linguistics

The rapid advancement in the theoretical understanding of statistical and machine learning methods for semisupervised learning has made it difficult for nonspecialists to keep up to date in the field. Providing a broad, accessible treatment of the theory as well as linguistic applications, Semisupervised Learning for Computational Linguistics offers ...
Contents
Clustering  131
Generative Models  153
Agreement Constraints  175
Propagation Methods  193
Mathematics for Spectral Methods  221
Spectral Methods  237
Bibliography  277
Index  301


