Applied Regression Analysis

An outstanding introduction to the fundamentals of regression analysis, updated and expanded. The methods of regression analysis are the most widely used statistical tools for discovering the relationships among variables. This classic text, with its emphasis on clear, thorough presentation of concepts and applications, offers a complete, easily accessible introduction to the fundamentals of regression analysis. Assuming only a basic knowledge of elementary statistics, Applied Regression Analysis, Third Edition focuses on the fitting and checking of both linear and nonlinear regression models, using small and large data sets, with pocket calculators or computers. This Third Edition features separate chapters on multicollinearity, generalized linear models, mixture ingredients, geometry of regression, robust regression, and resampling procedures. Extensive support materials include sets of carefully designed exercises with full or partial solutions and a series of true/false questions with answers. All data sets used in both the text and the exercises can be found on the companion disk at the back of the book. For analysts, researchers, and students in university, industrial, and government courses on regression, this text is an excellent introduction to the subject and an efficient means of learning how to use a valuable analytical tool. It will also prove an invaluable reference resource for applied scientists and statisticians.
Contents
Basic Prerequisite Knowledge | 1 |
Fitting a Straight Line by Least Squares | 15 |
Checking the Straight Line Fit | 47 |
Fitting Straight Lines: Special Topics | 79 |
Regression in Matrix Terms: Straight Line Case | 115 |
The General Regression Situation | 135 |
Extra Sums of Squares and Tests for Several Parameters | 149 |
Serial Correlation in the Residuals and the Durbin-Watson Test | 179 |
More on Checking Fitted Models | 205 |
Multiple Regression: Special Topics | 217 |
Bias in Regression Estimates, and Expected Values of Mean Squares and Sums of Squares | 235 |
On Worthwhile Regressions, Big F's, and R² | 243 |
Models Containing Functions of the Predictors, Including Polynomial Models | 251 |
Transformation of the Response Variable | 277 |
Dummy Variables | 299 |
Selecting the Best Regression Equation | 327 |
Appendix 15A: Hald Data, Correlation Matrix, and All 15 Possible Regressions | 348 |
Exercises for Chapter 15 | 355 |
Ridge Regression | 387 |
Generalized Linear Models (GLIM) | 401 |
Mixture Ingredients as Predictor Variables | 409 |
The Geometry of Least Squares | 427 |
More Geometry of Least Squares | 447 |
Orthogonal Polynomials and Summary Data | 461 |
Multiple Regression Applied to Analysis of Variance Problems | 473 |
An Introduction to Nonlinear Estimation | 505 |