## Regression Analysis Under A Priori Parameter Restrictions

This monograph focuses on the construction of regression models with linear and nonlinear inequality constraints from a theoretical point of view. Unlike previous publications, this volume analyses the properties of regression with inequality constraints, investigating the flexibility of such constraints and their ability to adapt in the presence of additional a priori information. The use of inequality constraints improves the accuracy of models and decreases the likelihood of errors. Based on the theoretical results obtained, a computational technique for estimation and prediction problems is suggested. This approach lends itself to numerous applications in practical problems, several of which are discussed in detail. The book is a useful resource for graduate students, PhD students, and researchers who specialize in applied statistics and optimization. It may also be useful to specialists in other branches of applied mathematics, technology, econometrics, and finance.
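The blurb describes regression in which a priori information enters as inequality constraints on the parameters. A minimal sketch of that idea (not the book's own algorithm), assuming SciPy is available and using non-negativity bounds as the hypothetical a priori restriction:

```python
# Illustrative only: least squares with a priori inequality constraints
# (here, beta_i >= 0), compared against unconstrained OLS.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_beta = np.array([1.5, 0.7, 2.0])      # non-negative by assumption
y = X @ true_beta + 0.1 * rng.normal(size=50)

# Unconstrained ordinary least squares
ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Inequality-constrained least squares: beta_i >= 0
res = lsq_linear(X, y, bounds=(0.0, np.inf))

print("OLS:        ", ols)
print("Constrained:", res.x)
```

When the constraints encode correct prior knowledge, the constrained estimate cannot leave the feasible region, which is the intuition behind the accuracy gains the monograph analyses formally.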


### Contents

- Asymptotic Properties of Parameters in Nonlinear Regression Models
- Method of Empirical Means in Nonlinear Regression and Stochastic Optimization Models
- Determination of Accuracy of Estimation of Regression Parameters Under Inequality Constraints
- Asymptotic Properties of Recurrent Estimates of Parameters of Nonlinear Regression with Constraints

### Other editions

- Regression Analysis Under A Priori Parameter Restrictions, Pavel S. Knopov, Arnold S. Korkhin (2011)
- Regression Analysis Under A Priori Parameter Restrictions, Pavel S. Knopov, Arnold S. Korkhin (2013)
