Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators
Offers a treatment of various James-Stein and ridge regression estimators from both frequentist and Bayesian points of view. The book explains and compares the estimators analytically as well as numerically, and includes the Mathematica and Maple programs used in the numerical comparisons.

College or university bookshops may order five or more copies at a special student rate, available on request.
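As a rough illustration of the shrinkage idea the book treats (a sketch not taken from the book's Mathematica or Maple programs; the dimension, mean, and trial count here are illustrative): for X ~ N(theta, I_p) with p >= 3, the James-Stein estimator (1 - (p - 2)/||X||^2) X has uniformly smaller total mean square error than the least squares / maximum likelihood estimator X itself.

```python
import numpy as np

def james_stein(x):
    """Shrink each observation vector toward the origin:
    JS(x) = (1 - (p - 2) / ||x||^2) x, for p >= 3."""
    p = x.shape[-1]
    shrink = 1.0 - (p - 2) / np.sum(x**2, axis=-1, keepdims=True)
    return shrink * x

def compare_mse(theta, n_trials=20_000, seed=0):
    """Monte Carlo estimate of total MSE for the MLE (X itself)
    and for the James-Stein estimator, at a fixed true mean theta."""
    rng = np.random.default_rng(seed)
    p = theta.shape[0]
    x = theta + rng.standard_normal((n_trials, p))  # X ~ N(theta, I_p)
    mse_mle = np.mean(np.sum((x - theta) ** 2, axis=1))
    mse_js = np.mean(np.sum((james_stein(x) - theta) ** 2, axis=1))
    return mse_mle, mse_js

# Shrinkage gains are largest when the true mean is near the origin.
theta = np.zeros(10)
mse_mle, mse_js = compare_mse(theta)
print(f"MLE MSE: {mse_mle:.2f}, James-Stein MSE: {mse_js:.2f}")
```

The MLE's total MSE equals p (here 10), while at theta = 0 the James-Stein risk drops to 2; away from the origin the gap shrinks but never reverses, which is the Stein paradox.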
Contents

The Stein Paradox
The Ridge Estimators of Hoerl and Kennard
Estimation for a Single Linear Model
The Positive Parts
Other Linear Model Setups
The Precision of Individual Estimators
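The ridge estimators of Hoerl and Kennard listed in the contents replace the least squares solution (X'X)^{-1}X'y with (X'X + kI)^{-1}X'y for some k > 0, trading a small bias for a large variance reduction when the columns of X are nearly collinear. A minimal sketch (not from the book; the simulated data and the choice k = 1 are illustrative):

```python
import numpy as np

def ridge(X, y, k):
    """Ordinary ridge estimator of Hoerl and Kennard:
    solve (X'X + kI) b = X'y; k = 0 recovers least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

rng = np.random.default_rng(1)
n = 50
z = rng.standard_normal(n)
# Two nearly collinear columns make X'X ill-conditioned,
# so the least squares estimate is unstable (large variance).
X = np.column_stack([z,
                     z + 0.01 * rng.standard_normal(n),
                     rng.standard_normal(n)])
beta = np.array([1.0, 1.0, 1.0])
y = X @ beta + rng.standard_normal(n)

b_ls = ridge(X, y, k=0.0)     # least squares (k = 0)
b_ridge = ridge(X, y, k=1.0)  # a small positive k stabilizes the estimate
print("LS error:   ", np.sum((b_ls - beta) ** 2))
print("Ridge error:", np.sum((b_ridge - beta) ** 2))
```

Adding kI bounds the smallest eigenvalue of the matrix being inverted away from zero, which is why the ridge estimate stays stable under multicollinearity.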
Common terms and phrases

approximate MMSE, average MSE, Bayes estimator, Bayes risk, Bayesian, C.R. Rao, Chapter, compared, components, computation, conditional MSE, considered, contraction estimator, Corollary, Define, derived, diagonal matrix, dimensional ellipsoid, empirical Bayes estimators, Equation, estimable parametric functions, estimator of C.R. Rao, estimator of Hoerl and Kennard, Example, Exercise, formulated, frequentist, full rank, given, Hoerl and Kennard, IMSE, inequality, James-Stein estimator, Kalman filter, least square estimator, linear model, linear regression, loss function, LS estimators, mean square error, minimax, minimax estimator, minimizing, mixed estimator, MSEP, multicollinearity, multivariate normal distribution, non-full rank, observations, obtained, optimal, optimum, ordinary ridge estimator, positive definite, positive part estimator, prior assumptions, prior distribution, prior information, prior mean, problem, Proof, quadratic loss function, random variables, result, ridge regression estimator, ridge type estimators, sample, Section, Show, shrinkage estimators, smaller MSE, smaller risk, Statistics, sufficient condition, Table, Theorem, unbiased estimator, uniformly smaller, values, variance, vector, Wind estimator, zero