Mathematical Theory of Control Systems Design
Springer Netherlands, Jan 31, 1996 - Mathematics - 672 pages
Give, and it shall be given unto you. ST. LUKE, VI, 38.

The book is based on several courses of lectures on control theory and applications which the authors delivered over a number of years at Moscow Electronics and Mathematics University. The book, originally written in Russian, was first published by Vysshaya Shkola (Higher School) Publishing House in Moscow in 1989. In preparing a new edition we planned to make only minor changes in the text. However, we soon realized that we, like many scholars working in control theory, had learned many new things and gained many new insights into control theory and its applications since the book was first published. We therefore rewrote the book especially for the English edition, so this is substantially a new book with many new topics.

The book consists of an introduction and four parts. Part One deals with the fundamentals of modern stability theory: general results concerning stability and instability, sufficient conditions for the stability of linear systems, methods for determining the stability or instability of systems of various types, and theorems on stability under random disturbances.
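As a small illustration of the kind of result Part One covers (not code from the book itself), here is a sketch of a standard sufficient-and-necessary stability test for a planar linear system dx/dt = Ax: the origin is asymptotically stable if and only if trace(A) < 0 and det(A) > 0, a special case of the Routh-Hurwitz criterion. The function name and examples are illustrative assumptions, not taken from the text.

```python
def is_asymptotically_stable_2x2(a11, a12, a21, a22):
    """Routh-Hurwitz test for dx/dt = A x with A = [[a11, a12], [a21, a22]].

    For a real 2x2 matrix, all eigenvalues lie in the open left half-plane
    iff trace(A) < 0 and det(A) > 0.
    """
    trace = a11 + a22
    det = a11 * a22 - a12 * a21
    return trace < 0 and det > 0

# Damped oscillator x'' + 0.6 x' + x = 0, written as a first-order system:
print(is_asymptotically_stable_2x2(0.0, 1.0, -1.0, -0.6))  # → True
# Saddle point (eigenvalues ±1), hence unstable:
print(is_asymptotically_stable_2x2(0.0, 1.0, 1.0, 0.0))    # → False
```

For higher-dimensional systems the same question is settled by the full Routh-Hurwitz conditions or by a Lyapunov function, the central tools of the stability theory the book develops.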
Selected sections:
- Continuous and Discrete Deterministic Systems
- and First Approximation Stability
- Use of Degenerate Lyapunov Functions