## Improving Efficiency by Shrinkage: The James--Stein and Ridge Regression Estimators

This book offers a treatment of various James--Stein and ridge regression estimators from both frequentist and Bayesian points of view. It explains and compares the estimators analytically as well as numerically, and includes the Mathematica and Maple programs used in the numerical comparisons.
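As a rough illustration of the two estimator families named in the title (a sketch with arbitrarily chosen data, not taken from the book): the James--Stein estimator shrinks the raw observation of a multivariate normal mean toward the origin, and the ridge estimator of Hoerl and Kennard shrinks the least-squares regression coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 10

# James-Stein: for a single observation x of a N(theta, I_p) vector, p >= 3,
# shrink x toward the origin by the factor (1 - (p - 2) / ||x||^2).
theta = np.zeros(p)                       # true mean (unknown in practice)
x = rng.normal(theta, 1.0)                # one observed vector
js = (1 - (p - 2) / np.dot(x, x)) * x     # James-Stein estimate of theta

# Ridge regression: (X'X + k I)^{-1} X'y for a ridge parameter k > 0,
# which shrinks the least-squares estimate (X'X)^{-1} X'y.
n = 50
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)
y = X @ beta + rng.normal(size=n)
k = 1.0
ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
```

For any k > 0 the ridge estimate has strictly smaller Euclidean norm than the least-squares estimate, which is the sense in which both families "shrink"; the book's subject is when and how such shrinkage yields smaller mean square error.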


### Contents

| Chapter | Page |
| --- | --- |
| Preface | 1 |
| The Stein Paradox | 71 |
| Regression Estimator | 104 |
| The Ridge Estimators of Hoerl and Kennard | 111 |
| Estimation for a Single Linear Model | 167 |
| The Positive Parts | 307 |
| Other Linear Model Setups | 371 |
| The Precision of Individual Estimators | 441 |
| The Multivariate Linear Model | 491 |
| Other Linear Model Setups | 531 |
| Summary and Conclusion | 579 |
| References | 591 |


