Dynamic Optimization and Differential Games
DYNAMIC OPTIMIZATION AND DIFFERENTIAL GAMES has been written to address the increasing number of Operations Research and Management Science problems (that is, applications) that involve the explicit consideration of time and of gaming among multiple agents. The book will serve both as a textbook and as a reference and guide for engineers, operations researchers, applied mathematicians, and social scientists whose work involves the theoretical aspects of dynamic optimization and differential games. Included throughout the text are detailed explanations of several original dynamic and game-theoretic mathematical models of particular relevance in today's technologically driven global economy: revenue management, supply chain management, electric power systems, urban freight systems, dynamic congestion pricing, dynamic traffic assignment, electronic commerce, and the Internet. In addition, some more traditional applications with useful pedagogical content are included in Chapter 1.
The book combines an emphasis on deterministic models and methods with an introduction to stochastic optimal control and stochastic differential games. Most important, it covers both theory and applications. It develops the key results of deterministic, continuous-time optimal control theory from both the classical calculus-of-variations perspective and the more modern approach of infinite dimensional mathematical programming, which provides greater utility for solving continuous-time differential game problems.
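As a point of reference, the deterministic, continuous-time optimal control problem at the core of this material takes the following familiar form (the notation here is generic and not necessarily the book's own):

```latex
\min_{u(\cdot)} \; J(u) = K\bigl(x(t_f), t_f\bigr)
  + \int_{t_0}^{t_f} f_0\bigl(x(t), u(t), t\bigr)\, dt
\quad \text{subject to} \quad
\frac{dx}{dt} = f\bigl(x(t), u(t), t\bigr), \quad
x(t_0) = x_0, \quad u(t) \in U,
```

where $x(t)$ is the state trajectory, $u(t)$ the control, $f_0$ the running cost, and $K$ the terminal cost. Both the calculus-of-variations and the infinite dimensional mathematical programming treatments yield necessary conditions for this problem; the latter viewpoint extends naturally to the game-theoretic settings of the later chapters.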
Chapter 2 Nonlinear Programming and Discrete-Time Optimal Control
Chapter 3 Foundations of the Calculus of Variations and Optimal Control
Chapter 4 Infinite Dimensional Mathematical Programming
Chapter 5 Finite Dimensional Variational Inequalities and Nash Equilibria
Chapter 6 Differential Variational Inequalities and Differential Nash Games
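To illustrate the kind of problem treated in the chapter on finite dimensional variational inequalities and Nash equilibria, the sketch below computes a two-player Nash equilibrium by the standard fixed-point (projection) method for the variational inequality VI(F, K). The game, cost functions, and step size are hypothetical choices made for illustration, not an example from the book.

```python
import numpy as np

# Hypothetical two-player game: player i chooses x_i in [0, 10] to minimize
# c_i(x) = 0.5*x_i**2 - b_i*x_i + g*x_i*x_j. The Nash equilibrium solves
# VI(F, K): find x* in K with F(x*)^T (x - x*) >= 0 for all x in K, where
# F stacks each player's own-strategy cost gradient and K = [0, 10]^2.

def F(x, b=np.array([4.0, 6.0]), g=0.5):
    """Gradient of each player's cost with respect to its own strategy."""
    other = x[::-1]          # the opponent's strategy, taken as fixed
    return x - b + g * other

def project(x, lo=0.0, hi=10.0):
    """Euclidean projection onto the box K = [lo, hi]^n."""
    return np.clip(x, lo, hi)

def solve_vi(x0, step=0.2, tol=1e-10, max_iter=10_000):
    """Projection iteration x <- Proj_K(x - step*F(x)); converges for
    monotone F when the step size is sufficiently small."""
    x = x0.copy()
    for _ in range(max_iter):
        x_new = project(x - step * F(x))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

x_star = solve_vi(np.array([0.0, 0.0]))
# For this game the interior equilibrium solves the linear system
# x1 - 4 + 0.5*x2 = 0, x2 - 6 + 0.5*x1 = 0, giving x* = (4/3, 16/3).
```

The same projection idea, lifted to function spaces, underlies the differential variational inequality formulation of differential Nash games in Chapter 6.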