## A Distribution-Free Theory of Nonparametric Regression

This book provides a systematic, in-depth analysis of nonparametric regression with random design. It covers almost all known estimates, including classical local averaging estimates (kernel, partitioning, and nearest neighbor estimates), least squares estimates using splines, neural networks, and radial basis function networks, penalized least squares estimates, local polynomial kernel estimates, and orthogonal series estimates. The emphasis is on distribution-free properties of the estimates: most consistency results are valid for all distributions of the data. Whenever distribution-free results are not possible, as in the case of rates of convergence, the emphasis is on results that require as few constraints on the distribution as possible, on distribution-free inequalities, and on adaptation. The relevant mathematical theory is developed systematically and requires only a basic knowledge of probability theory. The book is a valuable reference for anyone interested in nonparametric regression and a rich source of useful mathematical techniques otherwise widely scattered in the literature. In particular, it introduces the reader to empirical process theory, martingales, and approximation properties of neural networks.
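To make the "local averaging" idea from the description concrete, here is a minimal sketch of a kernel regression estimate (Nadaraya–Watson form) in Python. This is an illustrative implementation, not code from the book; the function name, the Gaussian kernel choice, and the bandwidth value are assumptions for the example.

```python
import numpy as np

def kernel_estimate(x_train, y_train, x_query, bandwidth=0.1):
    """Local averaging kernel regression estimate (Nadaraya-Watson):

        m_n(x) = sum_i K((x - X_i)/h) * Y_i / sum_i K((x - X_i)/h),

    here with a Gaussian kernel K and bandwidth h.
    """
    # Pairwise kernel weights between query points and training points.
    diffs = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * diffs ** 2)
    denom = weights.sum(axis=1)
    # Guard against an effectively empty neighborhood (all weights ~ 0).
    denom = np.where(denom == 0.0, 1.0, denom)
    return (weights * y_train[None, :]).sum(axis=1) / denom

# Noisy samples from the regression function m(x) = sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=200)

# Estimate m at x = 0.25, where the true value is sin(pi/2) = 1.
estimate = kernel_estimate(x, y, np.array([0.25]))
```

As with any local averaging estimate, the bandwidth trades bias against variance: a small `h` averages over few nearby points (low bias, high variance), a large `h` smooths heavily (high bias, low variance).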


### Contents

Why Is Nonparametric Regression Important? | 1 |
Parametric versus Nonparametric Estimation | 9 |
Rate of Convergence | 13 |
Fixed versus Random Design Regression | 15 |
Curse of Dimensionality | 23 |
Lower Bounds | 31 |
Partitioning Estimates | 52 |
Consistency | 60 |
Bibliographic Notes | 67 |
Rate of Convergence | 77 |
Rate of Convergence | 93 |
Splitting the Sample | 100 |
Bibliographic Notes | 108 |
Proof of Theorem 8.1 | 115 |
Nearest Neighbor Estimates | 126 |
Exponential Inequalities | 131 |
Covering and Packing Numbers | 140 |
Uniform Law of Large Numbers | 153 |
Consistency from Bounded to Unbounded Y | 165 |
Rate of Convergence | 183 |
Complexity Regularization | 222 |
Consistency of Data-Dependent Partitioning Estimates | 235 |
Partitions with Data-Dependent Grid Size | 241 |
Bibliographic Notes | 250 |
Bibliographic Notes | 281 |
Rate of Convergence | 294 |
Radial Basis Function Networks | 329 |
Orthogonal Series Estimates | 353 |
Consistency | 366 |
Advanced Techniques from Empirical Process Theory | 380 |
Piecewise Polynomial Partitioning Estimates | 397 |
Univariate Penalized Least Squares Estimates | 408 |
Proof of Lemma 20.1 | 414 |
Multivariate Penalized Least Squares Estimates | 425 |
Rate of Convergence | 433 |
Application of Complexity Regularization | 440 |
Bibliographic Notes | 446 |
Index Models | 456 |
Recursive Estimates | 493 |
Recursive Partitioning Estimate | 507 |
Recursive Kernel Estimate | 517 |
Dependent Observations | 540 |
Regression Estimation for Model B | 555 |
Autoregression | 569 |
Estimating Smooth Regression Functions | 582 |
Inequalities for Martingales | 598 |

### Other editions

- *A Distribution-Free Theory of Nonparametric Regression*, László Györfi, Michael Kohler, Adam Krzyzak, Harro Walk (2002)
- *A Distribution-Free Theory of Nonparametric Regression*, László Györfi, Michael Kohler, Adam Krzyzak, Harro Walk (2010)
