Probability and Statistics

The revision of this well-respected text presents a balanced approach to classical and Bayesian methods and now includes a new chapter on simulation (including Markov chain Monte Carlo and the bootstrap), expanded coverage of residual analysis in linear models, and more examples using real data. Probability and Statistics was written for a one- or two-semester probability and statistics course offered primarily at four-year institutions and taken mostly by sophomore- and junior-level students majoring in mathematics or statistics. Calculus is a prerequisite, and familiarity with the concepts and elementary properties of vectors and matrices is a plus.

Contents: Introduction to Probability; Conditional Probability; Random Variables and Distributions; Expectation; Special Distributions; Estimation; Sampling Distributions of Estimators; Testing Hypotheses; Categorical Data and Nonparametric Methods; Linear Statistical Models; Simulation.

For all readers interested in probability and statistics.
From inside the book
Results 1-3 of 30
Page 276
... statistician can observe the value x of the random vector X before estimating θ, and let ξ(θ | x) denote the posterior p.d.f. of θ on the interval Ω. For any estimate a that the statistician might use, his expected loss will now be E ...
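The truncated snippet sets up the Bayes estimate as the value a minimizing the posterior expected loss. A minimal numerical sketch of that idea, assuming squared-error loss and an illustrative Beta(3, 5) posterior for θ (both assumptions, not taken from the snippet):

```python
import numpy as np

# Sketch: under squared-error loss L(theta, a) = (theta - a)**2, the Bayes
# estimate is the value a that minimizes the posterior expected loss
# E[L(theta, a) | x], which turns out to be the posterior mean.  We verify
# this numerically on a grid, using an assumed Beta(3, 5) posterior.

theta = np.linspace(1e-4, 1 - 1e-4, 2001)   # grid over the interval (0, 1)
weights = theta**2 * (1 - theta)**4          # Beta(3, 5) density kernel
post = weights / weights.sum()               # discrete posterior probabilities

def expected_loss(a):
    """Posterior expected squared-error loss of the estimate a."""
    return float(((theta - a)**2 * post).sum())

a_star = theta[np.argmin([expected_loss(a) for a in theta])]
posterior_mean = float((theta * post).sum())
print(a_star, posterior_mean)   # both close to 3 / (3 + 5) = 0.375
```

The grid minimizer and the posterior mean agree (up to grid spacing), illustrating why the posterior mean is the Bayes estimate under squared-error loss.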
Page 282
Morris H. DeGroot. distribution that each statistician assigns to θ. (b) Find the Bayes estimate for each statistician. (c) Show that after the opinions of the 1000 registered voters in the random sample had been obtained, the Bayes ...
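The exercise has two statisticians with different priors on a voter proportion θ converging once the sample of 1000 is observed. A sketch of the beta-binomial updating behind this; the specific priors and the count of 540 favorable responses are illustrative assumptions, not the exercise's actual numbers:

```python
# Sketch of conjugate beta-binomial updating.  With a Beta(alpha, beta)
# prior on the proportion theta and y successes observed in n Bernoulli
# trials, the posterior is Beta(alpha + y, beta + n - y), so the Bayes
# estimate under squared-error loss is the posterior mean.

def bayes_estimate(alpha, beta, y, n):
    """Posterior mean of theta given y successes in n trials."""
    return (alpha + y) / (alpha + beta + n)

n, y = 1000, 540                        # hypothetical sample of 1000 voters
est_a = bayes_estimate(2.0, 2.0, y, n)  # statistician A's prior: Beta(2, 2)
est_b = bayes_estimate(1.0, 1.0, y, n)  # statistician B's prior: Beta(1, 1)
print(est_a, est_b)   # nearly identical: the data swamp the priors
```

With n = 1000 the two posterior means differ only in the third decimal place, which is the point of part (c): large samples make the Bayes estimates of statisticians with different priors nearly coincide.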
Page 300
... statistician B who can learn only the value of the statistic T and cannot observe the individual values of X1, ..., Xn. If T is a sufficient statistic, then the conditional joint distribution of X1, ..., Xn, given that T = t, is ...
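The snippet's point is that when T is sufficient, the conditional distribution of the sample given T = t does not involve the parameter, so statistician B loses nothing by seeing only T. A small sketch for the standard Bernoulli case (the choice of Bernoulli(p) with T = ΣXi is an assumed example, not taken from the snippet):

```python
import math

# Sketch: for X1, ..., Xn i.i.d. Bernoulli(p), T = X1 + ... + Xn is
# sufficient.  The conditional probability of any particular sequence x
# with sum t, given T = t, is
#   p**t * (1-p)**(n-t) / (C(n, t) * p**t * (1-p)**(n-t)) = 1 / C(n, t),
# which does not depend on p.

def conditional_prob(x, p):
    """P(X = x | T = sum(x)) for i.i.d. Bernoulli(p) components."""
    n, t = len(x), sum(x)
    joint = p**t * (1 - p)**(n - t)      # P(X1 = x1, ..., Xn = xn)
    marginal = math.comb(n, t) * joint   # P(T = t)
    return joint / marginal              # = 1 / C(n, t), free of p

x = (1, 0, 1, 1, 0)
print(conditional_prob(x, 0.2), conditional_prob(x, 0.9))
# both equal 1 / C(5, 3) = 0.1, whatever the value of p
```

Because the conditional distribution is free of p, statistician B could regenerate a sample with the same joint distribution from T alone, which is the operational meaning of sufficiency.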
Common terms and phrases
a₁ assume B₁ Bayes estimator Bernoulli distribution beta distribution binomial distribution c₁ c₂ conditions of Exercise confidence interval Consider constant continuous distribution defined by Eq degrees of freedom Determine the value distribution with mean distribution with parameters drug Example exponential distribution following hypotheses follows from Eq form a random Furthermore gamma distribution given in Eq given value H₁ Hence joint distribution joint p.d.f. level of significance likelihood function linear loss function mean µ median null hypothesis observed values obtained P₁ Poisson distribution possible values posterior distribution Pr(X prior distribution problem random sample random variables X1 regression rejected sample mean selected at random specified standard deviation standard normal distribution statistician sufficient statistic Suppose that X1 Table tail area test procedure test the following Theorem UMP test unbiased estimator uniform distribution Var(X variance o² X₁ x² distribution Y₁ σ²