Calculus of Variations and Optimal Control Theory: A Concise Introduction

This textbook offers a concise yet rigorous introduction to calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control. Calculus of Variations and Optimal Control Theory also traces the historical development of the subject and features numerous exercises, notes and references at the end of each chapter, and suggestions for further study.
Contents
Chapter 1  Introduction  1
Chapter 2  Calculus of Variations  26
Chapter 3  From Calculus of Variations to Optimal Control  71
Chapter 4  The Maximum Principle  102
Chapter 5  The Hamilton-Jacobi-Bellman Equation  156
Chapter 6  The Linear Quadratic Regulator  180
Chapter 7  Advanced Topics  200
Calculus of Variations and Optimal Control Theory: A Concise Introduction, by Daniel Liberzon, 2011.