## Regression Analysis: Theory, Methods, and Applications

Any method of fitting equations to data may be called regression. Such equations are valuable for at least two purposes: making predictions and judging the strength of relationships. Because they provide a way of empirically identifying how a variable is affected by other variables, regression methods have become essential in a wide range of fields, including the social sciences, engineering, medical research, and business. Of the various methods of performing regression, least squares is the most widely used; in fact, linear least squares regression is by far the most widely used statistical technique of any kind. Although nonlinear least squares is covered in an appendix, this book is mainly about linear least squares applied to fit a single equation (as opposed to a system of equations).

The writing of this book started in 1982. Since then, various drafts have been used at the University of Toronto for teaching a semester-long course to juniors, seniors, and graduate students in a number of fields, including statistics, pharmacology, engineering, economics, forestry, and the behavioral sciences. Parts of the book have also been used in a quarter-long course given to Master's and Ph.D. students in public administration, urban planning, and engineering at the University of Illinois at Chicago (UIC). This experience, together with comments and criticisms from students, helped forge the final version.
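The linear least squares fit described above can be sketched in a few lines. This is a minimal illustration, not taken from the book: the small data set is hypothetical, and NumPy's `lstsq` is used in place of the normal-equations derivation the text develops.

```python
import numpy as np

# Hypothetical data: y is roughly linear in x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Design matrix X with a column of ones for the intercept.
X = np.column_stack([np.ones_like(x), x])

# Least squares estimate b minimizing ||y - Xb||^2,
# equivalent to b = (X'X)^{-1} X'y for full-rank X.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, slope = b

y_hat = X @ b          # predicted values
residuals = y - y_hat  # residuals used for diagnostics
```

For this data the fitted line is approximately y = 0.05 + 1.99 x; the residuals are what the book's later diagnostic methods (residual plots, Studentized residuals, and so on) examine.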


### Contents

Sections I–CLXIV, pages 1–345; section titles are not included in this preview.

### Other editions

Regression Analysis: Theory, Methods and Applications, Ashish K. Sen, Muni S. Srivastava (2013)

Regression Analysis: Theory, Methods, and Applications, Ashish Sen, Muni Srivastava (2012)

Regression Analysis: Theory, Methods, and Applications, Ashish Sen, M. S. Srivastava (1990)
