This book is written in such a way that the level of mathematical sophistication builds up from chapter to chapter. It has been reorganized into four parts: basic analysis, analysis of feedback systems, advanced analysis, and nonlinear feedback control. Updated content includes subjects that have proven useful in nonlinear control design in recent years. New in the third edition are an expanded treatment of passivity and passivity-based control, together with integral control, high-gain feedback, recursive methods, optimal stabilizing control, control Lyapunov functions, and observers. The book is suitable for self-study or as a reference guide for engineers and applied mathematicians.