## Elements of Computational Statistics

Computationally intensive methods have become widely used both for statistical inference and for exploratory analyses of data. The methods of computational statistics involve resampling, partitioning, and multiple transformations of a dataset; they may also make use of randomly generated artificial data. Implementation of these methods often requires advanced techniques in numerical analysis, so there is a close connection between computational statistics and statistical computing.

This book describes techniques used in computational statistics and addresses some areas of application of computationally intensive methods, such as density estimation, identification of structure in data, and model building. Although methods of statistical computing are not emphasized, numerical techniques for transformations, for function approximation, and for optimization are explained in the context of the statistical methods. The book includes exercises, some with solutions. It can be used as a text or supplementary text for various courses in modern statistics at the advanced undergraduate or graduate level, and as a reference for statisticians who use computationally intensive methods of analysis. Although some familiarity with probability and statistics is assumed, the book reviews basic methods of inference and so is largely self-contained.

James Gentle is University Professor of Computational Statistics at George Mason University. He is a Fellow of the American Statistical Association and a member of the International Statistical Institute. He has held several national offices in the American Statistical Association and has served as associate editor for journals of the ASA as well as for other journals in statistics and computing. He is the author of *Random Number Generation and Monte Carlo Methods* and *Numerical Linear Algebra for Statistical Applications*.


### Contents

- Randomization and Data Partitioning
- Exercises
- Nonparametric Estimation of Probability Density Functions
- Structure in Data

