Introduction to Dynamic Systems: Theory, Models, and Applications

Integrates the traditional approach to differential equations with the modern systems and control-theoretic approach to dynamic systems, emphasizing theoretical principles and classic models in a wide variety of areas. Provides a particularly comprehensive theoretical development that includes chapters on positive dynamic systems and optimal control theory. Contains numerous problems.
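To give a flavor of the material, here is a minimal illustrative sketch (not taken from the book) of a linear discrete-time system x(k+1) = A x(k) and of estimating the dominant eigenvalue of a positive matrix by power iteration; the function names and the example matrix are hypothetical choices for this sketch, and the real-positive dominant eigenvalue is the setting treated by the Frobenius-Perron theory of positive dynamic systems.

```python
# Illustrative sketch, not the book's code: simulate x(k+1) = A x(k)
# and estimate the dominant eigenvalue of a positive matrix by power
# iteration. Matrix and names here are invented for the example.

def step(A, x):
    """One step of x(k+1) = A x(k) for a small dense matrix."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def dominant_eigenvalue(A, x0, iterations=50):
    """Approximate the dominant eigenvalue of a positive matrix A.

    For a positive matrix this eigenvalue is real and positive
    (Frobenius-Perron), so the trajectory's long-run growth factor
    per step converges to it.
    """
    x = x0
    for _ in range(iterations):
        norm = sum(abs(v) for v in x)
        x = [v / norm for v in x]   # normalize to unit 1-norm
        x = step(A, x)              # one system step; new norm ~ eigenvalue
    return sum(abs(v) for v in x)

A = [[2.0, 1.0],
     [1.0, 2.0]]   # positive matrix; eigenvalues are 3 and 1
print(dominant_eigenvalue(A, [1.0, 0.0]))  # close to 3.0
```

Normalizing before each step keeps the state bounded, so the 1-norm after a step reads off the growth factor directly; the estimate converges geometrically at the rate of the second-to-first eigenvalue ratio.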
Contents
INTRODUCTION  1
DIFFERENCE AND DIFFERENTIAL EQUATIONS  14
LINEAR ALGEBRA  55