Optimal Control

A new edition of the classic text on optimal control theory. As a superb introductory text and an indispensable reference, this new edition of Optimal Control will serve the needs of both the professional engineer and the advanced student in mechanical, electrical, and aerospace engineering. Its coverage encompasses all the fundamental topics as well as the major changes that have occurred in recent years. An abundance of computer simulations using MATLAB and the relevant toolboxes gives the reader hands-on experience in applying the theory to real-world situations. The major topics covered are listed in the contents below.
Contents
Optimal Control of Discrete-Time Systems | 19 |
Optimal Control of Continuous-Time Systems | 110 |
4 | 177 |
5 | 213 |
Dynamic Programming | 260 |
Optimal Control for Polynomial Systems | 287 |
Output Feedback and Structured Control | 297 |
Robustness and Multivariable Frequency-Domain Techniques | 355 |
Differential Games | 438 |
Reinforcement Learning and Optimal Adaptive Control | 461 |
Appendix A: Review of Matrix Algebra | 518 |
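As a small taste of the material in the discrete-time chapter, the finite-horizon linear quadratic regulator gain can be computed with a backward Riccati recursion. The sketch below is illustrative only, not code from the book; the matrices `A`, `B`, `Q`, `R` and the horizon `N` are made-up example values.

```python
import numpy as np

def lqr_gains(A, B, Q, R, N):
    """Backward Riccati recursion for the finite-horizon discrete-time LQR.

    Returns the gain sequence K_0, ..., K_{N-1} so that u_k = -K_k x_k
    minimizes sum_k (x_k' Q x_k + u_k' R u_k) with terminal weight Q.
    """
    S = Q.copy()                 # terminal condition S_N = Q
    gains = []
    for _ in range(N):
        # Kalman gain: K = (R + B' S B)^{-1} B' S A
        K = np.linalg.solve(R + B.T @ S @ B, B.T @ S @ A)
        # Riccati update: S <- A' S (A - B K) + Q
        S = A.T @ S @ (A - B @ K) + Q
        gains.append(K)
    return gains[::-1]           # reorder so index 0 is k = 0

# Illustrative 2-state, 1-input system (arbitrary values)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])

# For a long horizon the first gain approximates the steady-state gain
K = lqr_gains(A, B, Q, R, 50)[0]
```

For a long horizon the recursion converges, so `K` approximates the steady-state gain obtained from the algebraic Riccati equation, and the closed-loop matrix `A - B K` has all eigenvalues inside the unit circle.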