## Applied linear statistical models: regression, analysis of variance, and experimental designs

Some basic results in probability and statistics. Basic regression analysis. Linear regression with one independent variable. Inferences in regression analysis. Aptness of model and remedial measures. Topics in regression analysis - I. General regression and correlation analysis. Matrix approach to simple regression analysis. Multiple regression. Polynomial regression. Indicator variables. Topics in regression analysis - II. Search for "best" set of independent variables. Normal correlation models. Basic analysis of variance. Single-factor analysis of variance. Analysis of factor effects. Implementation of ANOVA model. Topics in analysis of variance - I. Multifactor analysis of variance. Two-factor analysis of variance. Analysis of two-factor studies. Topics in analysis of variance - II. Multifactor studies. Experimental designs. Completely randomized designs. Analysis of covariance for completely randomized designs. Randomized block designs. Latin square designs.

### From inside the book

Results 1-3 of 37

Page 439

**SSTR**: A measure of the extent of differences between factor level means, based on the deviations of the factor level sample means $\bar{Y}_{j.}$ around the overall mean $\bar{Y}_{..}$. If all factor level sample means $\bar{Y}_{j.}$ are the same, **SSTR** = 0. The more the factor ...

Page 572
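The snippet's definition of SSTR can be illustrated numerically. A minimal sketch with hypothetical data (the helper `sstr` and the sample values are not from the book):

```python
# SSTR measures how far the factor level sample means fall from the
# overall mean: SSTR = sum over levels j of n_j * (Ybar_j - Ybar)^2.

def sstr(groups):
    """groups: one list of observations per factor level."""
    all_obs = [y for g in groups for y in g]
    grand_mean = sum(all_obs) / len(all_obs)
    return sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)

# When every factor level mean equals the overall mean, SSTR = 0:
print(sstr([[5, 7], [6, 6], [4, 8]]))   # all level means are 6 -> 0.0
# Unequal level means give SSTR > 0:
print(sstr([[1, 2], [5, 6], [9, 10]]))  # level means 1.5, 5.5, 9.5 -> 64.0
```

The larger the spread of the level means around the grand mean, the larger SSTR, which is exactly the behavior the snippet describes.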

When we square (17.29) and sum over all observations, the cross-product term drops out and we obtain:

$$\sum_i \sum_j \sum_k (Y_{ijk} - \bar{Y}_{...})^2 = n \sum_i \sum_j (\bar{Y}_{ij.} - \bar{Y}_{...})^2 + \sum_i \sum_j \sum_k (Y_{ijk} - \bar{Y}_{ij.})^2$$

that is, SSTO = **SSTR** + SSE. **SSTR** reflects the variability between the ab treatment means and is the ordinary treatment sum ...
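The identity SSTO = SSTR + SSE can be checked directly on a small two-factor layout. A sketch with made-up data (the cell values and the helper `decompose` are illustrative, not from the book):

```python
# Verify SSTO = SSTR + SSE for a balanced two-factor study:
# Y[i][j] holds the n replicate observations for treatment cell (i, j).

def decompose(Y):
    cells = [cell for row in Y for cell in row]
    all_obs = [y for cell in cells for y in cell]
    grand = sum(all_obs) / len(all_obs)
    sstr = sum(len(c) * (sum(c) / len(c) - grand) ** 2 for c in cells)
    sse = sum((y - sum(c) / len(c)) ** 2 for c in cells for y in c)
    ssto = sum((y - grand) ** 2 for y in all_obs)
    return ssto, sstr, sse

Y = [[[3, 5], [8, 10]],   # factor A level 1
     [[6, 6], [11, 13]]]  # factor A level 2
ssto, sstr, sse = decompose(Y)
print(ssto, sstr, sse)                   # 79.5 73.5 6.0
assert abs(ssto - (sstr + sse)) < 1e-9   # cross-product term vanished
```

The exact equality is what "the cross-product term drops out" buys: deviations around the grand mean split cleanly into between-treatment and within-treatment parts.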

Page 574

The breakdown of **SSTR** into the components SSA, SSB, and SSAB is an orthogonal decomposition. While many such decompositions are possible, this one is of interest because the three components provide information about the factor A ...
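The orthogonality of the SSA + SSB + SSAB breakdown can likewise be verified numerically. A sketch with hypothetical balanced data (the cell means and helper names are assumptions for illustration):

```python
# For a balanced a x b study with n replicates per cell, SSTR splits
# orthogonally into SSA (factor A), SSB (factor B), and SSAB (interaction).
# Ybar[i][j] is the sample mean of cell (i, j).

def sstr_components(Ybar, n):
    a, b = len(Ybar), len(Ybar[0])
    row = [sum(r) / b for r in Ybar]                                 # Ybar_i..
    col = [sum(Ybar[i][j] for i in range(a)) / a for j in range(b)]  # Ybar_.j.
    grand = sum(row) / a
    ssa = n * b * sum((ri - grand) ** 2 for ri in row)
    ssb = n * a * sum((cj - grand) ** 2 for cj in col)
    ssab = n * sum((Ybar[i][j] - row[i] - col[j] + grand) ** 2
                   for i in range(a) for j in range(b))
    sstr = n * sum((Ybar[i][j] - grand) ** 2
                   for i in range(a) for j in range(b))
    return sstr, ssa, ssb, ssab

sstr, ssa, ssb, ssab = sstr_components([[4, 9], [6, 12]], n=2)
print(sstr, ssa, ssb, ssab)              # 73.5 12.5 60.5 0.5
assert abs(sstr - (ssa + ssb + ssab)) < 1e-9
```

Because the three sets of deviations (row effects, column effects, interaction residuals) are mutually orthogonal in a balanced layout, their sums of squares add exactly to SSTR, with no cross-product remainder.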


### Contents

| Some Basic Results in Probability and Statistics | 1 |
| Linear Regression with One Independent Variable | 21 |
| Inferences in Regression Analysis | 53 |
| Copyright | |

22 other sections not shown


### Common terms and phrases

95 percent analysis of variance ANOVA appropriate blocking variable Bonferroni column Company example completely randomized design conclude C2 confidence interval correlation covariance analysis decision rule degrees of freedom denoted equal error sum error terms error variance experimental units factor effects factor level means family confidence coefficient Figure follows Hence illustration independent variables indicator variables interval estimate latin square latin square design level of significance linear regression main effects matrix mean response method normally distributed Note observations obtain parameters percent confidence prediction prediction interval probability distribution procedure random variables Refer to Problem regression analysis regression approach regression coefficients regression function regression line residual plots response function sample sizes shown significance of 05 Source of Variation SSAB SSE(F SSE(R SSTO SSTR sum of squares test statistic three-factor transformation treatment effects treatment means two-factor study Type I error variance model vector Westwood Company zero