Optimal Control with Engineering Applications
Springer Science & Business Media, Mar 23, 2007 - Technology & Engineering - 134 pages
Because the theoretical part of the book is based on the calculus of variations, the exposition is very transparent and requires only a modest mathematical background. In the case of open-loop optimal control, this approach leads to Pontryagin’s Minimum Principle; in the case of closed-loop optimal control, it leads to the Hamilton-Jacobi-Bellman theory, which exploits the principle of optimality.
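For orientation, the two central results mentioned here can be stated in their standard textbook form (a sketch in generic notation, not necessarily the book's own symbols): for dynamics $\dot{x}=f(x,u,t)$ and a cost $J=K(x(t_b))+\int_{t_a}^{t_b}L(x,u,t)\,dt$, define the Hamiltonian $H$ and obtain the open-loop necessary conditions and the closed-loop HJB equation.

```latex
\[
H(x,u,\lambda,t) \;=\; L(x,u,t) \;+\; \lambda^{\top} f(x,u,t)
\]
% Pontryagin's Minimum Principle (open-loop): along an optimal trajectory,
% the costate satisfies the adjoint equation and u minimizes H pointwise.
\[
\dot{\lambda} \;=\; -\,\frac{\partial H}{\partial x}, \qquad
\lambda(t_b) \;=\; \frac{\partial K}{\partial x}\Big|_{x(t_b)}, \qquad
u^{*}(t) \;=\; \arg\min_{u\,\in\,\Omega} H\bigl(x^{*}(t),u,\lambda(t),t\bigr)
\]
% Hamilton-Jacobi-Bellman equation (closed-loop): the value function V(x,t)
% satisfies a partial differential equation with terminal condition V(x,t_b)=K(x).
\[
-\,\frac{\partial V}{\partial t}
 \;=\; \min_{u\,\in\,\Omega}\Bigl[\,L(x,u,t) \;+\; \frac{\partial V}{\partial x}\,f(x,u,t)\Bigr],
\qquad V(x,t_b) \;=\; K(x)
\]
```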
Many optimal control problems are solved completely in the body of the text. Furthermore, solutions to all of the exercise problems that appear at the ends of the chapters are sketched in the appendix.
The book also covers material that is not usually found in optimal control textbooks, namely optimal control problems with non-scalar-valued performance criteria (with applications to optimal filtering) and Lukes’ method of approximately optimal control design.
Furthermore, a short introduction to differential game theory is given, leading to the Nash-Pontryagin Minimax Principle and to the Hamilton-Jacobi-Nash theory. This topic is included because of the important connection between differential game theory and H-infinity control theory for the design of robust controllers.
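In the simplest two-player zero-sum setting, the game-theoretic counterpart of the minimum principle can be sketched as a saddle-point condition on the Hamiltonian (standard form, generic notation): with dynamics $\dot{x}=f(x,u,v,t)$, where player $u$ minimizes and player $v$ maximizes a common criterion, the optimal strategies satisfy

```latex
% Saddle-point (minimax) condition on the Hamiltonian for a zero-sum
% differential game: u* minimizes and v* maximizes H along the optimal
% trajectory, for all admissible u and v.
\[
H\bigl(x^{*},u^{*},v,\lambda,t\bigr)
\;\le\;
H\bigl(x^{*},u^{*},v^{*},\lambda,t\bigr)
\;\le\;
H\bigl(x^{*},u,v^{*},\lambda,t\bigr)
\]
```

The connection to robust control noted above arises because H-infinity design can be posed as such a game, with the controller as the minimizing player and the disturbance as the maximizing player.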