Multivariate Analysis: Methods and Applications

Structural Sensitivity in Econometric Models
Edwin Kuh, John W. Neese and Peter Hollinger
Provides a pathbreaking assessment of the worth of linear dynamic systems methods for probing the behavior of complex macroeconomic models. Representing a major improvement upon the standard "black box" approach to analyzing economic model structure, it introduces the powerful concept of parameter sensitivity analysis within a linear systems root/vector framework. The approach is illustrated with a good medium-size econometric model (the Michigan Quarterly Econometric Model of the United States). EISPACK, the Fortran code for computing characteristic roots and vectors, has been upgraded and augmented by a model linearization code and a broader algorithmic framework. Also features an interface between the algorithmic code and the interactive modeling system (TROLL), making an unusually wide range of linear systems methods accessible to economists, operations researchers, engineers, and physical scientists.
1985 (0-471-81930-1) 324 pp.

Linear Statistical Models and Related Methods: With Applications to Social Research
John Fox
A comprehensive, modern treatment of linear models and their variants and extensions, combining statistical theory with applied data analysis. Considers important methodological principles underlying statistical methods. Designed for researchers and students who wish to apply these models to their own work in a flexible manner.
1984 (0-471-09913-9) 496 pp.

Statistical Methods for Forecasting
Bovas Abraham and Johannes Ledolter
This practical, user-oriented book treats the statistical methods and models used to produce short-term forecasts. Provides an intermediate-level discussion of a variety of statistical forecasting methods and models and explains their interconnections, linking theory and practice. Includes numerous time-series, autocorrelation, and partial autocorrelation plots.
1983 (0-471-86764-0) 445 pp.
From inside the book
Page 157
... objects into subgroups on the basis of the inter-object similarities. As we indicated, the goal in many cluster applications is to arrive at clusters of ...
[Figure: data matrix of objects (O1, O2, ...) measured on variables (X1, X2, ..., Xp), partitioned into clusters]
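The snippet above describes grouping objects by their inter-object similarities, computed from an n-objects-by-p-variables data matrix. A minimal sketch of that starting point, using hypothetical objects and Euclidean distance (the specific data and variable values below are illustrative assumptions, not from the book):

```python
import math

# Hypothetical data matrix: each object (row) measured on p variables.
objects = {
    "O1": (1.0, 2.0, 0.5),
    "O2": (1.2, 1.8, 0.6),
    "O3": (5.0, 6.0, 4.0),
}

def euclidean(a, b):
    """Euclidean distance between two objects' variable profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Pairwise inter-object distance matrix: small distances flag objects
# that are candidates for the same cluster.
names = list(objects)
dist = {(i, j): euclidean(objects[i], objects[j])
        for i in names for j in names if i < j}

for pair, d in sorted(dist.items(), key=lambda kv: kv[1]):
    print(pair, round(d, 3))
```

Here O1 and O2 come out closest, so a clustering method would tend to place them in one subgroup and O3 in another.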
Page 168
... object joins a cluster it is never removed and fused with other objects belonging to some other cluster. Agglomerative methods proceed by forming a series of fusions of the n objects into groups. Divisive methods partition the set of n ...
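The agglomerative scheme the snippet describes, a series of fusions of the n objects into groups, can be sketched as follows. This is a minimal single-linkage illustration under assumed pairwise distances (the objects A-D and their distances are hypothetical), not the book's own code:

```python
import itertools

def single_linkage(dist, names, k):
    """Agglomerative clustering sketch: every object starts as its own
    cluster; at each step the two closest clusters are fused (single
    linkage: cluster distance = smallest object-to-object distance).
    Once an object joins a cluster it is never removed."""
    clusters = [{n} for n in names]
    while len(clusters) > k:
        _, i, j = min(
            (min(dist[frozenset((a, b))] for a in ci for b in cj), i, j)
            for (i, ci), (j, cj) in itertools.combinations(enumerate(clusters), 2)
        )
        clusters[i] |= clusters.pop(j)   # fuse: objects only ever join
    return clusters

# Hypothetical pairwise distances among four objects.
d = {frozenset(p): v for p, v in [
    (("A", "B"), 1.0), (("A", "C"), 6.0), (("A", "D"), 7.0),
    (("B", "C"), 5.5), (("B", "D"), 6.5), (("C", "D"), 1.5),
]}
print(single_linkage(d, ["A", "B", "C", "D"], 2))
```

With these distances the first fusion joins A and B (distance 1.0), the second joins C and D (1.5), leaving two clusters. A divisive method would run the other way, starting from one cluster of all n objects and splitting.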
Page 186
... objects that may have been incorrectly classified at an early stage in the clustering process. A second liability relates to what is called chaining: the tendency of hierarchical techniques to cluster together objects linked by chains ...
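The chaining liability mentioned above is easy to reproduce. In this small sketch (the one-dimensional points are an assumed toy example), single linkage fuses a chain of points one step at a time, ending with a single cluster whose endpoints are far apart even though no individual merge ever bridged more than a unit distance:

```python
# Single-linkage chaining in one dimension: each point sits 1 unit from
# its neighbour, so single linkage fuses the whole chain into one
# cluster whose diameter (9.0) dwarfs every merge distance (1.0).
points = [float(i) for i in range(10)]

clusters = [[p] for p in points]
merge_distances = []
while len(clusters) > 1:
    # single-linkage distance between clusters = closest pair of points
    d, i, j = min(
        (min(abs(a - b) for a in ci for b in cj), i, j)
        for i, ci in enumerate(clusters)
        for j, cj in enumerate(clusters) if i < j
    )
    merge_distances.append(d)
    clusters[i].extend(clusters.pop(j))

diameter = max(points) - min(points)
print(max(merge_distances), diameter)  # every merge is short; the cluster is long
```

Complete linkage or average linkage would resist this, because the between-cluster distance then grows as the chain lengthens.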
Contents
SELECTED ASPECTS OF MULTIVARIATE ANALYSIS | 1
PRINCIPAL COMPONENTS ANALYSIS | 26
FACTOR ANALYSIS | 56
12 other sections not shown