Constrained Principal Component Analysis and Related Techniques

CRC Press, Oct 24, 2013 - Mathematics - 251 pages

In multivariate data analysis, regression techniques predict one set of variables from another, while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data.
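To make the contrast concrete, here is a minimal NumPy sketch (not taken from the book; the data and variable names are illustrative) of both building blocks: ordinary least squares for the regression side and a singular value decomposition for the PCA side.

    # A minimal sketch, assuming NumPy and randomly generated data (not from the book).
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))    # predictor variables
    Y = rng.standard_normal((100, 3))    # criterion variables

    # Regression: predict one set of variables (Y) from another (X) by least squares.
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    Y_hat = X @ B

    # PCA: find a low-dimensional subspace capturing the largest variability in Y.
    Yc = Y - Y.mean(axis=0)              # column-center the data
    U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
    scores = U[:, :2] * s[:2]            # component scores on the first two axes
    loadings = Vt[:2].T                  # variable loadings for those axes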

  • How can regression analysis and PCA be combined in a beneficial way?
  • Why and when is it a good idea to combine them?
  • What kinds of benefits do they provide?

Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.

The book begins with four concrete examples of CPCA that provide readers with a basic understanding of the technique and its applications. It gives a detailed account of two key mathematical ideas in CPCA: projection and singular value decomposition. The author then describes the basic data requirements, models, and analytical tools for CPCA and their immediate extensions. He also introduces techniques that are special cases of or closely related to CPCA and discusses several topics relevant to practical uses of CPCA. The book concludes with a technique that imposes different constraints on different dimensions (DCDD), along with its analytical extensions. MATLAB® programs for CPCA and DCDD, as well as the data used to create the book’s examples, are available on the author’s website.
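As a rough illustration of how projection and singular value decomposition fit together in a constrained analysis, the sketch below projects a data matrix onto the spaces spanned by row-side and column-side external information and then applies the SVD to the projected part. This is a simplified sketch of the general idea under assumed inputs Z, G, and H; it is not the author's MATLAB implementation, and the function name cpca_sketch is hypothetical.

    # A minimal sketch of constraining PCA by projection, under assumed inputs;
    # not the book's MATLAB code. G and H hold row- and column-side constraints.
    import numpy as np

    def cpca_sketch(Z, G, H, k=2):
        """Project Z onto span(G) (rows) and span(H) (columns), then take the SVD."""
        PG = G @ np.linalg.pinv(G.T @ G) @ G.T   # projector onto the column space of G
        PH = H @ np.linalg.pinv(H.T @ H) @ H.T   # projector onto the column space of H
        Zc = PG @ Z @ PH                         # part of Z explained by the external information
        U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
        return U[:, :k] * s[:k], Vt[:k].T        # component scores and loadings

    rng = np.random.default_rng(1)
    Z = rng.standard_normal((50, 8))             # data matrix
    G = rng.standard_normal((50, 3))             # external information about the rows
    H = rng.standard_normal((8, 2))              # external information about the columns
    scores, loadings = cpca_sketch(Z, G, H, k=2)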

About the author (2013)

Yoshio Takane is an emeritus professor at McGill University and an adjunct professor at the University of Victoria. He is a former president of the Psychometric Society and a recipient of a Career Award from the Behaviormetric Society of Japan and a Special Award from the Japanese Psychological Association. His recent interests include regularization techniques for multivariate data analysis, acceleration methods for iterative model fitting, the development of structural equation models for analyzing brain connectivity, and various kinds of singular value decompositions. He earned his DL from the University of Tokyo and his PhD from the University of North Carolina at Chapel Hill.
