Multivariate Analysis

Multivariate analysis deals with observations on more than one variable where there is some inherent interdependence between the variables. Most available books on the subject concentrate on either the theoretical or the data-analytic approach. This book not only combines these two approaches but also emphasizes modern developments, so, although primarily designed as a textbook for final-year undergraduates and postgraduate students in mathematics and statistics, certain of the sections will commend themselves to research workers. Broadly speaking, the first half of the book contains direct extensions of univariate ideas and techniques, including exploratory data analysis, distribution theory and problems of inference. The remaining chapters concentrate on specifically multivariate problems which have no meaningful analogues in the univariate case. Topics covered include econometrics, principal component analysis, factor analysis, canonical correlation analysis, discriminant analysis, cluster analysis, multi-dimensional scaling and directional data. Several new methods of presentation are used: for example, the data matrix is emphasized throughout, a density-free approach is given to normal theory, tests are constructed using the likelihood ratio principle and the union intersection principle, and graphical methods are used in explanation. The reader is assumed to have a basic knowledge of mathematical statistics at the undergraduate level, together with an elementary understanding of linear algebra. There are, however, appendices which provide a sufficient background of matrix algebra, a summary of univariate statistics and some statistical tables.
Contents
Chapter 1 Introduction | 1 |
Chapter 2 Basic Properties of Random Vectors | 26 |