Introduction to the Theory of Statistics

Probability; random variables, distribution functions, and expectation; special parametric families of univariate distributions; joint and conditional distributions, stochastic independence, more expectation; distributions of functions of random variables; sampling and sampling distributions; parametric interval estimation; tests of hypotheses; linear models; nonparametric methods.
Contents
- Introduction and Summary
- Probability: Axiomatic
- Density Functions
- Copyright
(30 other sections not shown)
Other editions: Introduction to the Theory of Statistics, by Alexander MacFarlane Mood, Franklin A. Graybill, and Duane C. Boes, 1974.