Optimal Control with Engineering Applications

Springer Science & Business Media, Mar 23, 2007 - Technology & Engineering - 134 pages

Because the theoretical part of the book is based on the calculus of variations, the exposition is very transparent and requires only an elementary mathematical background. In the case of open-loop optimal control, this approach leads to Pontryagin's Minimum Principle; in the case of closed-loop optimal control, it leads to the Hamilton-Jacobi-Bellman theory, which exploits the principle of optimality.
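As a brief illustration of these two viewpoints, in standard textbook notation rather than the book's own (the symbols f, L, K, H, λ, Ω and the cost-to-go J are generic placeholders chosen here for illustration): for dynamics ẋ = f(x,u,t) on [t_a, t_b] with cost J = K(x(t_b)) + ∫ L(x,u,t) dt and Hamiltonian H = L + λᵀf, Pontryagin's Minimum Principle requires the optimal open-loop control to minimize the Hamiltonian pointwise along the optimal trajectory, while the Hamilton-Jacobi-Bellman equation characterizes the optimal cost-to-go of the closed-loop problem:

\[
u^{*}(t) \in \arg\min_{u \in \Omega} H\bigl(x^{*}(t), u, \lambda(t), t\bigr),
\qquad
\dot{\lambda}(t) = -\nabla_{x} H, \quad \lambda(t_b) = \nabla_{x} K\bigl(x^{*}(t_b)\bigr),
\]
\[
-\frac{\partial \mathcal{J}}{\partial t}(x,t)
= \min_{u \in \Omega}\Bigl[ L(x,u,t) + \frac{\partial \mathcal{J}}{\partial x}(x,t)\, f(x,u,t) \Bigr],
\qquad
\mathcal{J}(x,t_b) = K(x).
\]

The minimizing argument in the Hamilton-Jacobi-Bellman equation, viewed as a function of x and t, is what furnishes the optimal feedback law in the closed-loop setting.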

Many optimal control problems are solved completely in the body of the text. Furthermore, solutions to all of the exercise problems that appear at the ends of the chapters are sketched in the appendix.

The book also covers material that is not usually found in optimal control textbooks, namely optimal control problems with non-scalar-valued performance criteria (with applications to optimal filtering) and Lukes' method of approximately optimal control design.

Furthermore, a short introduction to differential game theory is given. This leads to the Nash-Pontryagin Minimax Principle and to the Hamilton-Jacobi-Nash theory. The reason for including this topic is the important connection between differential game theory and H-infinity control theory for the design of robust controllers.
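For the zero-sum special case that underlies the H-infinity connection (again in standard textbook notation, not necessarily the book's), the value function of a two-player differential game with dynamics ẋ = f(x,u,v,t), a minimizing player u and a maximizing player v, formally satisfies a Hamilton-Jacobi equation of minimax type:

\[
-\frac{\partial \mathcal{J}}{\partial t}(x,t)
= \min_{u}\,\max_{v}\Bigl[ L(x,u,v,t) + \frac{\partial \mathcal{J}}{\partial x}(x,t)\, f(x,u,v,t) \Bigr],
\qquad
\mathcal{J}(x,t_b) = K(x).
\]

In the H-infinity interpretation, the maximizing player v models the worst-case disturbance, so the resulting minimax feedback for u acts as a robust controller.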
