Pitman's measure of closeness: a comparison of statistical estimators
Society for Industrial and Applied Mathematics, 1993 - Mathematics - 226 pages
Pitman's Measure of Closeness (PMC) is an idea whose time has come. There are many different ways to estimate unknown parameters, but which method should you use? Posed as an alternative to the concept of mean squared error, PMC is based on the probability that one competing estimator is closer to an unknown parameter than another. Renewed interest in PMC over the last 20 years motivated the authors to produce this book, which explores this method of comparison and its usefulness. Written with research-oriented statisticians and mathematicians in mind, but also considering the needs of graduate students in statistics courses, this book provides a thorough introduction to the methods and known results associated with PMC. Following a foreword by C. R. Rao, the first three chapters focus on basic concepts, history, controversy, paradoxes, and examples associated with the PMC criterion.
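The criterion described above can be sketched numerically: for two competing estimators, PMC is the probability that the first lands closer to the true parameter than the second. A minimal Monte Carlo illustration follows; the choice of estimators (sample mean vs. sample median), the normal model, and the sample size are illustrative assumptions, not taken from the book.

```python
import random
import statistics

def pitman_closeness(est1, est2, theta, n, trials=20000, seed=1):
    """Monte Carlo estimate of P(|est1(X) - theta| < |est2(X) - theta|),
    drawing samples of size n from a Normal(theta, 1) model (an
    illustrative assumption, not the book's general setting)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        sample = [rng.gauss(theta, 1.0) for _ in range(n)]
        # Count the trials in which est1 is strictly closer to theta.
        if abs(est1(sample) - theta) < abs(est2(sample) - theta):
            wins += 1
    return wins / trials

# Compare the sample mean and sample median as estimators of a normal mean.
pmc = pitman_closeness(statistics.mean, statistics.median, theta=0.0, n=15)
```

For normal data the sample mean tends to win this pairwise comparison, so the estimated probability comes out above one half; a value above 0.5 is what it means for one estimator to be Pitman-closer than another.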
Contents (partial):
Development of Pitman's Measure of Closeness
Anomalies with PMC