Bayesian estimation and experimental design in linear regression models
Presents a clear treatment of the design and analysis of linear regression experiments in the presence of prior knowledge about the model parameters. Develops a unified approach to estimation and design; provides a Bayesian alternative to the least squares estimator; and indicates methods for constructing optimal designs for the Bayes estimator. The material also applies to several well-known estimators that use prior knowledge not available in the form of a prior distribution for the model parameters, such as mixed linear, minimax linear and ridge-type estimators.
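To illustrate the kind of estimator the book treats, here is a minimal sketch of a Bayes linear estimator (the posterior mean under a normal prior) compared with ordinary least squares. The simulated data, the prior mean and covariance, and the variable names are illustrative assumptions, not taken from the book; with a zero prior mean and a spherical prior covariance the Bayes estimator reduces to ridge regression, which is the connection the blurb alludes to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated linear model y = X @ beta + noise (illustrative data only)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
sigma2 = 1.0  # error variance, assumed known here
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

# Ordinary least squares estimator
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Bayes linear estimator: posterior mean under the prior beta ~ N(mu0, Sigma0).
# With mu0 = 0 and Sigma0 = (sigma2 / k) * I it equals ridge regression
# with penalty parameter k.
mu0 = np.zeros(p)            # prior mean (assumed for illustration)
Sigma0 = np.eye(p)           # prior covariance (assumed for illustration)
Sigma0_inv = np.linalg.inv(Sigma0)
A = X.T @ X / sigma2 + Sigma0_inv
b = X.T @ y / sigma2 + Sigma0_inv @ mu0
beta_bayes = np.linalg.solve(A, b)
```

The prior acts as a shrinkage term: the Bayes estimate is pulled from the least squares solution toward the prior mean, with the amount of shrinkage governed by the prior covariance.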
Bayesian regression and prior distributions
Bayesian experimental design