Optimal Control Theory: An Introduction
Optimal control theory is the science of maximizing the returns from and minimizing the costs of the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization.
What people are saying
A great introduction to optimal control theory. With lots of examples, it helps the reader build intuition for the underlying mathematical expressions. The target readers are clearly newcomers to the field, but no computation scripts (e.g., in MATLAB) are provided, so other sources should be used for programming tasks.
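Since the book covers dynamic programming but, as the reviewer notes, ships no scripts, here is a minimal sketch (in Python with NumPy, not from the book) of the kind of computation the dynamic-programming chapters lead to: a finite-horizon discrete-time LQR problem solved by the backward Riccati recursion. The system matrices and weights below are illustrative assumptions, not an example from the text.

```python
import numpy as np

# Illustrative discrete-time double integrator (dt = 0.1), not from the book:
# minimize sum_k (x_k' Q x_k + u_k' R u_k) subject to x_{k+1} = A x_k + B u_k.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)            # state cost weight (assumed)
R = np.array([[0.1]])    # control cost weight (assumed)
N = 50                   # horizon length

# Backward Riccati recursion: dynamic programming on the quadratic cost-to-go.
P = Q.copy()             # terminal cost-to-go matrix
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # stage feedback gain
    P = Q + A.T @ P @ A - A.T @ P @ B @ K              # updated cost-to-go
    gains.append(K)
gains.reverse()          # gains[k] is the optimal gain at stage k

# Simulate the closed loop u_k = -K_k x_k from an initial state.
x = np.array([[1.0], [0.0]])
for K in gains:
    x = (A - B @ K) @ x
final_norm = float(np.linalg.norm(x))
print(final_norm)        # state is driven toward the origin
```

The recursion computes the cost-to-go backward in time, exactly the structure of the dynamic-programming principle the book develops; applying the stored gains forward in time then yields the optimal trajectory.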
Describing the System and Evaluating Its Performance
The Performance Measure
The Calculus of Variations and Pontryagin's Minimum Principle
The Calculus of Variations
The Variational Approach to Optimal Control Problems
Iterative Numerical Techniques for Finding Optimal Controls and Trajectories
Numerical Determination of Optimal Trajectories