A First Course in Linear Model Theory
This intermediate-level statistics text fills an important gap by presenting the theory of linear statistical models at a level appropriate for senior undergraduate or first-year graduate students. With an innovative approach, the authors introduce students to the mathematical and statistical concepts and tools that form a foundation for studying the theory and applications of both univariate and multivariate linear models.
A First Course in Linear Model Theory systematically presents the basic theory behind linear statistical models with motivation from an algebraic as well as a geometric perspective. Through the concepts and tools of matrix and linear algebra and distribution theory, it provides a framework for understanding classical and contemporary linear model theory. It does not merely introduce formulas, but develops in students the art of statistical thinking and inspires learning at an intuitive level by emphasizing conceptual understanding.
The authors' fresh approach, methodical presentation, wealth of examples, and introduction to topics beyond the classical theory set this book apart from other texts on linear models. It forms a refreshing and invaluable first step in students' study of advanced linear models, generalized linear models, nonlinear models, and dynamic models.
Properties of Special Matrices
Generalized Inverses and Solutions to Linear Systems
The General Linear Model
Multivariate Normal and Related Distributions
Sampling from the Multivariate Normal Distribution
Inference for the General Linear Model
Multiple Regression Models
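As a minimal illustration (not drawn from the book itself) of the central object in the chapters listed above, the general linear model y = Xβ + ε can be fit by ordinary least squares through the normal equations X′Xβ̂ = X′y, assuming X has full rank; the data and dimensions below are invented for the sketch.

```python
import numpy as np

# Sketch only: fit the general linear model y = X beta + eps by
# ordinary least squares, solving the normal equations X'X b = X'y.
# Assumes X has full column rank; the simulated data are hypothetical.
rng = np.random.default_rng(0)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
beta = np.array([2.0, -1.0, 0.5])
y = X @ beta + 0.1 * rng.standard_normal(n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)  # least squares estimate
residuals = y - X @ beta_hat                  # residual vector
sse = residuals @ residuals                   # residual sum of squares
```

A geometric check in the spirit of the text: the residual vector is orthogonal to the column space of X, so X′(y − Xβ̂) = 0.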