## Multivariate analysis: methods and applications

**Structural Sensitivity in Econometric Models.** Edwin Kuh, John W. Neese and Peter Hollinger. Provides a pathbreaking assessment of the worth of linear dynamic systems methods for probing the behavior of complex macroeconomic models. Representing a major improvement upon the standard "black box" approach to analyzing economic model structure, it introduces the powerful concept of parameter sensitivity analysis within a linear systems root/vector framework. The approach is illustrated with a good medium-size econometric model (the Michigan Quarterly Econometric Model of the United States). EISPACK, the Fortran code for computing characteristic roots and vectors, has been upgraded and augmented by a model linearization code and a broader algorithmic framework. Also features an interface between the algorithmic code and the interactive modeling system (TROLL), making an unusually wide range of linear systems methods accessible to economists, operations researchers, engineers and physical scientists. 1985 (0-471-81930-1) 324 pp.

**Linear Statistical Models and Related Methods: With Applications to Social Research.** John Fox. A comprehensive, modern treatment of linear models and their variants and extensions, combining statistical theory with applied data analysis. Considers important methodological principles underlying statistical methods. Designed for researchers and students who wish to apply these models to their own work in a flexible manner. 1984 (0-471-09913-9) 496 pp.

**Statistical Methods for Forecasting.** Bovas Abraham and Johannes Ledolter. This practical, user-oriented book treats the statistical methods and models used to produce short-term forecasts. Provides an intermediate-level discussion of a variety of statistical forecasting methods and models and explains their interconnections, linking theory and practice. Includes numerous time-series, autocorrelation, and partial autocorrelation plots. 1983 (0-471-86764-0) 445 pp.

### From inside the book


Page 14


**Mean**-Corrected Sums-of-Squares and Cross-Products Matrix. It is often convenient to express the sums-of-squares and cross-products in **mean**-corrected terms. The **mean**-corrected sums-of-squares and cross-products matrix, which we will ...
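To make the snippet concrete (a minimal sketch, not the book's own code; the data values are invented for illustration), the mean-corrected sums-of-squares and cross-products matrix is X꜀ᵀX꜀, where X꜀ is the data matrix with each column centered at its mean:

```python
import numpy as np

# Hypothetical data matrix: 5 observations on 3 variables.
X = np.array([
    [4.0, 2.0, 0.60],
    [4.2, 2.1, 0.59],
    [3.9, 2.0, 0.58],
    [4.3, 2.1, 0.62],
    [4.1, 2.2, 0.63],
])

# Center each column at its mean, then form Xc' Xc:
# diagonal entries are the mean-corrected sums-of-squares,
# off-diagonal entries the mean-corrected cross-products.
Xc = X - X.mean(axis=0)
sscp = Xc.T @ Xc

# Dividing by n - 1 recovers the sample covariance matrix.
n = X.shape[0]
print(sscp)
print(np.allclose(sscp / (n - 1), np.cov(X, rowvar=False)))
```

Dividing the matrix by n − 1 yields the sample variance-covariance matrix, which ties this quantity to the covariance and correlation matrices used throughout the book.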

Page 108


By proximities we **mean** any set of numbers that express the amount of similarity or difference between pairs of objects. By objects we simply **mean** any things or events. MDS procedures provide information on the perceived relationships ...
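As one concrete kind of proximity (a sketch with invented coordinates and object labels, not an example from the book): a matrix of pairwise Euclidean distances between objects expresses their dissimilarity, and such a matrix is the typical input to an MDS procedure:

```python
import numpy as np

# Hypothetical 2-D coordinates for four objects (labels invented).
objects = ["A", "B", "C", "D"]
coords = np.array([
    [0.0, 0.0],
    [1.0, 0.0],
    [0.0, 2.0],
    [3.0, 4.0],
])

# Pairwise Euclidean distances: a proximity matrix in which a
# larger value means the pair of objects is less similar.
diff = coords[:, None, :] - coords[None, :, :]
prox = np.sqrt((diff ** 2).sum(axis=-1))

print(prox.round(2))
```

The matrix is symmetric with a zero diagonal, the two properties MDS procedures usually assume of proximity data.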

Page 220


The p variable regression line (surface) passes through the **means** of the criterion and explanatory variables, that is, through Ȳ and X̄. The **mean** value of the estimated Ŷ, given by Ŷ = Xb (6.2-16), will equal the **mean** value of the actual Y.
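The Page 220 excerpt states a standard property of least squares with an intercept: the fitted surface passes through the point of means, so the mean of the fitted values Ŷ = Xb equals the mean of the actual Y. A quick NumPy check with made-up data (not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 50 observations, 2 explanatory variables.
n = 50
X_raw = rng.normal(size=(n, 2))
y = 1.5 + 2.0 * X_raw[:, 0] - 0.5 * X_raw[:, 1] \
    + rng.normal(scale=0.3, size=n)

# Design matrix with an intercept column; b is the least-squares
# coefficient vector.
X = np.column_stack([np.ones(n), X_raw])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Fitted values Y_hat = X b: their mean equals the mean of y.
y_hat = X @ b
print(np.isclose(y_hat.mean(), y.mean()))

# Equivalently, the fitted surface passes through the point of
# means: plugging the column means of X into the fit gives y-bar.
print(np.isclose(X.mean(axis=0) @ b, y.mean()))
```

Both checks hold exactly (up to floating-point error) whenever the design matrix contains an intercept column, which is what makes the residuals sum to zero.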

### What people are saying

User review: This is one of the best books on Multivariate Statistics that I have ever read. I strongly recommend it to any scientist interested in multivariate statistics.

### Contents

| Section | Page |
| --- | --- |
| SELECTED ASPECTS OF MULTIVARIATE ANALYSIS | 1 |
| PRINCIPAL COMPONENTS ANALYSIS | 23 |
| FACTOR ANALYSIS | 53 |
| Copyright | |

12 other sections not shown

### Common terms and phrases

algorithm approach associated assumptions canonical correlation analysis canonical variate causal Chapter cluster column common factors computed conditional probabilities coordinates correlation matrix corresponding covariance matrix criterion data matrix defined deletion denoted derived space diagonal dimension dimensional discriminant analysis discriminant function discussed distance effects eigenvalues eigenvectors elements endogenous variables equation Euclidean distance example F-value factor analysis Figure given independent variables indicated individual KSI 2 KSI LAMBDA latent class model maximum likelihood mean measures method multiple multiple discriminant analysis multivariate normal Note null hypothesis objects observed variables obtained orthogonal overidentified parameter estimates posterior probability predictor variables principal components analysis probability problem procedure regression analysis regression coefficients regression model relationship residuals restrictions rotation sample scores shown similarity solution squared standard statistically significant stimulus space structure sums-of-squares Table techniques test statistic variance variance-covariance matrix vector zero