Multivariate Analysis: Methods and Applications

Related Wiley titles:

Structural Sensitivity in Econometric Models
Edwin Kuh, John W. Neese and Peter Hollinger
Provides a pathbreaking assessment of the worth of linear dynamic systems methods for probing the behavior of complex macroeconomic models. Representing a major improvement upon the standard "black box" approach to analyzing economic model structure, it introduces the powerful concept of parameter sensitivity analysis within a linear systems root/vector framework. The approach is illustrated with a good medium-size econometric model (the Michigan Quarterly Econometric Model of the United States). EISPACK, the Fortran code for computing characteristic roots and vectors, has been upgraded and augmented by a model linearization code and a broader algorithmic framework. Also features an interface between the algorithmic code and the interactive modeling system TROLL, making an unusually wide range of linear systems methods accessible to economists, operations researchers, engineers, and physical scientists.
1985 (0-471-81930-1) 324 pp.

Linear Statistical Models and Related Methods: With Applications to Social Research
John Fox
A comprehensive, modern treatment of linear models and their variants and extensions, combining statistical theory with applied data analysis. Considers important methodological principles underlying statistical methods. Designed for researchers and students who wish to apply these models to their own work in a flexible manner.
1984 (0-471-09913-9) 496 pp.

Statistical Methods for Forecasting
Bovas Abraham and Johannes Ledolter
This practical, user-oriented book treats the statistical methods and models used to produce short-term forecasts. Provides an intermediate-level discussion of a variety of statistical forecasting methods and models and explains their interconnections, linking theory and practice. Includes numerous time-series, autocorrelation, and partial-autocorrelation plots.
1983 (0-471-86764-0) 445 pp.
From inside the book
Page 162
... distance between two objects i and j. If we set r = 2, then we have the familiar Euclidean distance between objects i and j:

$d_{ij} = \left[ \sum_{k=1}^{p} (x_{ik} - x_{jk})^2 \right]^{1/2}$  (5.2-2)

If r = 1, then we have

$d_{ij} = \sum_{k=1}^{p} |x_{ik} - x_{jk}|$  (5.2-3)

which is ...
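Both formulas are special cases of the Minkowski distance of order r. A minimal sketch in Python, assuming objects are given as vectors of p measurements (the function name and sample points are illustrative, not from the book):

```python
import numpy as np

def minkowski_distance(x_i, x_j, r):
    """Minkowski distance of order r between two objects,
    each described by p measurements."""
    x_i = np.asarray(x_i, dtype=float)
    x_j = np.asarray(x_j, dtype=float)
    return np.sum(np.abs(x_i - x_j) ** r) ** (1.0 / r)

x1 = [1.0, 2.0, 3.0]
x2 = [4.0, 6.0, 3.0]
print(minkowski_distance(x1, x2, r=2))  # Euclidean distance (5.2-2): 5.0
print(minkowski_distance(x1, x2, r=1))  # city-block distance (5.2-3): 7.0
```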
Page 168
... distance rule that starts out by first finding those two objects having the shortest distance. They constitute the first cluster. At the next stage one of two things can happen: either a third object will join the already formed ...
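A minimal Python sketch of this shortest-distance (single-linkage) agglomeration, assuming a precomputed symmetric distance matrix; the function name, the `n_clusters` stopping rule, and the demo points are illustrative assumptions, not the book's:

```python
import numpy as np

def single_linkage(dist, n_clusters):
    """Agglomerate by the shortest-distance rule: repeatedly fuse the
    two clusters whose closest members are nearest each other."""
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single-linkage distance: minimum over all member pairs
                d = min(dist[i][j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        # either an object joins an existing cluster or two objects
        # (or clusters) fuse into a new one
        clusters[a] |= clusters.pop(b)
    return clusters

pts = np.array([[0, 0], [0, 1], [5, 5], [5, 6], [9, 9]])
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
print(single_linkage(dist, n_clusters=2))  # [{0, 1}, {2, 3, 4}]
```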
Page 182
... distance to the splinter group is less than its distance to the main cluster, it should be removed and fused with the splinter group. Once the composition of the two clusters has stabilized, that is, each object's average distance ...
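A hedged Python sketch of one such divisive step. The reassignment rule follows the passage (move an object when its average distance to the splinter group is less than its average distance to the main cluster, iterating until membership stabilizes); the seeding rule, names, and demo data are assumptions, not necessarily the book's exact procedure:

```python
import numpy as np

def splinter_split(dist, members):
    """One divisive step: seed the splinter group with the object whose
    average distance to the others is largest, then repeatedly move any
    object whose average distance to the splinter group is less than its
    average distance to the rest of the main cluster, until membership
    stabilizes."""
    def avg(i, group):
        others = [j for j in group if j != i]
        return float(np.mean([dist[i][j] for j in others]))

    members = set(members)
    seed = max(members, key=lambda i: avg(i, members))
    splinter, main = {seed}, members - {seed}
    moved = True
    while moved and len(main) > 1:
        moved = False
        for i in sorted(main):
            if len(main) == 1:
                break  # keep at least one object in the main cluster
            if avg(i, splinter) < avg(i, main):
                main.remove(i)   # remove from the main cluster ...
                splinter.add(i)  # ... and fuse with the splinter group
                moved = True
    return splinter, main

pts = np.array([[0, 0], [0, 1], [1, 0], [8, 8], [8, 9]])
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
print(splinter_split(dist, range(len(pts))))  # ({3, 4}, {0, 1, 2})
```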
Contents
SELECTED ASPECTS OF MULTIVARIATE ANALYSIS  1
PRINCIPAL COMPONENTS ANALYSIS  26
FACTOR ANALYSIS  56
Copyright
(12 other sections not shown)
Common terms and phrases
algorithm approach associated assumptions B₁ B₂ canonical correlation analysis canonical variate causal Chapter cluster column common factors computed correlation matrix corresponding covariance matrix criterion data matrix defined deletion denoted derived space dimension dimensional discriminant analysis discriminant function discussed distance distribution effects eigenvalues endogenous variables equation error Euclidean distance example F-value factor analysis Figure given independent variables indicated individual KSI 1 KSI LAMBDA latent class model LISREL maximum likelihood mean measures methods multiple multiple discriminant analysis n₁ n₂ null hypothesis objects obtained orthogonal parameter estimates posterior probability predictor variables principal components analysis probability problem procedure regression analysis regression coefficients regression model relationship residuals restrictions rotation sample scores shown solution squared standard statistically significant stimulus space structure sums-of-squares Table techniques test statistic variance variance-covariance matrix vector X₁ Y₁ Y₂ zero