Optimal Control

This new, updated edition of Optimal Control reflects major changes that have occurred in the field in recent years and presents, in a clear and direct way, the fundamentals of optimal control theory. It covers the major topics involving measurement, principles of optimality, dynamic programming, variational methods, Kalman filtering, and other solution techniques. To give the reader a sense of the problems that can arise in a hands-on project, the authors have included new material on optimal output feedback control, a technique used in the aerospace industry. Also included are two new chapters on robust control, which provide background in this rapidly growing area of interest.

Relations to classical control theory are emphasized throughout the text, and a root-locus approach to steady-state controller design is included. A chapter on optimal control of polynomial systems gives the reader sufficient background for further study in the field of adaptive control. The authors demonstrate through numerous examples that computer simulations of optimal controllers are easy to implement and help give the reader an intuitive feel for the equations. To build the reader's confidence in understanding the theory and its practical applications, the book provides many opportunities for writing simple programs.

Optimal Control will also serve as an invaluable reference for control engineers in industry. It offers numerous tables that make it easy to find the equations needed to implement optimal controllers for practical applications. All simulations have been performed using MATLAB and the relevant Toolboxes. The book assumes a background in the state-variable representation of systems; because matrix manipulations are the basic mathematical vehicle of the book, a short review is included in the appendix.
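The blurb's claim that simulations of optimal controllers are short and easy to write can be illustrated with a minimal sketch. The example below is not from the book: it solves a scalar discrete-time LQR problem by iterating the discrete algebraic Riccati equation and then simulates the closed loop. The plant parameters a, b and the weights q, r are made-up illustration values.

```python
# Hypothetical scalar LQR sketch (illustration only, not the book's code).
# Plant: x[k+1] = a*x[k] + b*u[k]; cost: sum of q*x^2 + r*u^2.
a, b = 1.05, 0.1      # unstable open-loop plant (|a| > 1)
q, r = 1.0, 0.1       # state and control weights

# Solve the scalar discrete algebraic Riccati equation by fixed-point iteration:
#   p = a^2 * p * r / (r + b^2 * p) + q
p = q
for _ in range(200):
    p = a * a * p * r / (r + b * b * p) + q

# Optimal state-feedback gain u = -K x and resulting closed-loop pole.
k_gain = a * b * p / (r + b * b * p)
a_cl = a - b * k_gain          # stable: |a_cl| < 1

# Simulate the regulated closed loop from x0 = 1.
x = 1.0
for _ in range(50):
    x = a_cl * x               # state decays toward zero
```

A few lines like these are enough to see the Riccati solution converge and the regulated state decay, which is the kind of intuition-building exercise the text encourages.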
A lucid introductory text and an indispensable reference, this new edition of Optimal Control will serve the needs of both the professional engineer and the advanced student in mechanical, electrical, and aerospace engineering. Its coverage encompasses all the fundamental topics as well as the major changes of recent years, including output-feedback design and robust design. An abundance of computer simulations using MATLAB and the relevant Toolboxes is included to give the reader the actual experience of applying the theory to real-world situations. Major topics covered include:

Contents
Static Optimization  1 
Optimal Control of Discrete-Time Systems  27 
Optimal Control of Continuous-Time Systems  129 
The Tracking Problem and Other LQR Extensions  215 
Final-Time-Free and Constrained Input Control  259 
Dynamic Programming  315 
Optimal Control for Polynomial Systems  347 
Output Feedback and Structured Control  359 
Control  409 
Robustness and Multivariable Frequency-Domain Techniques  421 
Appendix A  509 
Appendix B  521 