ECE 515
CONTROL SYSTEM THEORY AND DESIGN
This is a fundamental graduate-level course on the modern theory of dynamical systems and control. It builds on an introductory undergraduate course in control (such as ECE 486), and emphasizes state-space techniques for the analysis of dynamical systems and the synthesis of control laws meeting given design specifications.
To follow the course, some familiarity with linear algebra as well as ordinary differential equations is strongly recommended, although the necessary material will be reviewed at appropriate junctures throughout the semester.
SPRING 2014 OFFERING
Instructor: Professor Seth Hutchinson
Office: 158 Coordinated Science Laboratory (CSL); Phone: 244-5570
Email: seth@illinois.edu
Lecture Schedule: Daily lecture topics and reading assignments are posted on the course web page.
Required Text: Tamer Başar, Sean Meyn, and William R. Perkins, Lecture Notes on Control System Theory and Design (Available for purchase in Everitt Lab for one month starting January 21, 2014)
Recommended Text: Joao P. Hespanha, Linear Systems Theory, Princeton University Press, 2009
Supplementary Text 1: Chi-Tsong Chen, Linear System Theory and Design, 3rd edition, Oxford University Press, 1999
Supplementary Text 2: William L. Brogan, Modern Control Theory, 3rd edition, Prentice Hall, 1991
Other useful textbooks have been put on reserve in the Grainger Engineering Library.
Meeting times: Tuesdays and Thursdays, 12:30 p.m. to 1:50 p.m. in 243 MEB
COURSE OUTLINE
I. Modeling and Analysis of Control Systems

Introduction and classification

State space models in both discrete and continuous time

Linear and nonlinear systems

Discretization and linearization

Transfer function description of linear systems. Relationship with state space models.

Minimal realizations. Controllable and observable forms.

Vector spaces and linear transformations

Review of linear algebra; the Cayley-Hamilton theorem

State transition matrix and solutions of linear state equations
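As a quick illustration of the last topic above (not part of the official course notes), the state transition matrix of a time-invariant linear system is the matrix exponential e^{At}, which can be approximated by truncating its power series. A minimal sketch in pure Python, using the double integrator as a worked example:

```python
def expm_series(A, t, terms=20):
    """Truncated power series e^{At} = I + At + (At)^2/2! + ...
    for a small matrix A given as a list of lists (illustrative only;
    the course covers exact methods, e.g. via the Cayley-Hamilton theorem)."""
    n = len(A)
    # Start from the identity matrix.
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        # term <- term @ (A t) / k, the next series term
        term = [[sum(term[i][m] * A[m][j] * t for m in range(n)) / k
                 for j in range(n)] for i in range(n)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

# Double integrator: A = [[0, 1], [0, 0]] is nilpotent, so the series
# terminates exactly: e^{At} = [[1, t], [0, 1]].
Phi = expm_series([[0.0, 1.0], [0.0, 0.0]], 2.0)
# Phi is [[1.0, 2.0], [0.0, 1.0]]
```

The function name `expm_series` is hypothetical; in practice one would use a library routine such as `scipy.linalg.expm`.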
II. Structural Properties of Control Systems

Stability (Lyapunov, input-output)

Stability tests for linear systems; stability subspaces

Stability tests for nonlinear systems

Controllability; controllable subspaces

Observability; unobservable subspaces
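To give a concrete flavor of the controllability topic above (an illustrative sketch, not course material), the standard rank test checks whether the controllability matrix [B, AB, ..., A^{n-1}B] has full rank. A minimal version for a two-state, single-input system, where the rank condition reduces to a nonzero determinant:

```python
def is_controllable_2x2(A, B, tol=1e-12):
    """Rank test for a 2-state single-input system (A 2x2, B 2x1):
    the pair (A, B) is controllable iff det([B, AB]) != 0.
    Function name is illustrative only."""
    AB = [sum(A[i][m] * B[m][0] for m in range(2)) for i in range(2)]
    # Controllability matrix C = [B, AB]
    C = [[B[0][0], AB[0]], [B[1][0], AB[1]]]
    det = C[0][0] * C[1][1] - C[0][1] * C[1][0]
    return abs(det) > tol

# Double integrator with force input: controllable.
ctrb1 = is_controllable_2x2([[0.0, 1.0], [0.0, 0.0]], [[0.0], [1.0]])
# Two decoupled modes with input entering only the first: not controllable.
ctrb2 = is_controllable_2x2([[1.0, 0.0], [0.0, 2.0]], [[1.0], [0.0]])
```

The observability test is dual: replace (A, B) with (Aᵀ, Cᵀ).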
III. Feedback Controller Design

Pole placement with state feedback

Observers and observer-based designs

Tracking and disturbance rejection

Performance issues; robustness and sensitivity
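As a sketch of the pole-placement topic above (illustrative only, with hypothetical function names), the calculation is simplest in controllable canonical form, where the feedback gains shift the characteristic-polynomial coefficients directly:

```python
def place_poles_companion(a0, a1, p0, p1):
    """For A = [[0, 1], [-a0, -a1]], B = [[0], [1]] (controllable
    canonical form), the feedback u = -K x with K = [p0 - a0, p1 - a1]
    makes the closed-loop characteristic polynomial s^2 + p1 s + p0."""
    return [p0 - a0, p1 - a1]

def closed_loop_charpoly(a0, a1, K):
    # A - BK is again companion form, with shifted coefficients.
    return (a0 + K[0], a1 + K[1])

# Double integrator (a0 = a1 = 0); place poles at -1 and -2,
# i.e. desired polynomial (s + 1)(s + 2) = s^2 + 3s + 2.
K = place_poles_companion(0.0, 0.0, 2.0, 3.0)
coeffs = closed_loop_charpoly(0.0, 0.0, K)
```

For general state-space coordinates one first transforms to this form (possible exactly when the pair (A, B) is controllable), which is the content of the pole-placement theorem covered in Part III.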
IV. Optimal Control

Dynamic programming for both discrete-time and continuous-time systems; the Hamilton-Jacobi-Bellman (HJB) equation; relationship between open-loop and closed-loop controllers

Linear-quadratic (LQ) optimal control problem and design of optimum regulators

The matrix Riccati differential equation and some of its properties

The infinite-horizon case: time-invariant optimal controllers and the algebraic Riccati equation

The minimum principle

Time-optimal control of continuous-time linear systems
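To make the infinite-horizon LQ topic above concrete (a minimal sketch, not course material), consider the scalar case, where the algebraic Riccati equation is a quadratic that can be solved in closed form:

```python
import math

def scalar_lqr(a, b, q, r):
    """For the scalar system xdot = a x + b u with cost
    integral of (q x^2 + r u^2), the algebraic Riccati equation is
        2 a P - (b^2 / r) P^2 + q = 0.
    Take the positive root P; the optimal gain is K = b P / r.
    Function name is illustrative only."""
    P = r * (a + math.sqrt(a * a + q * b * b / r)) / (b * b)
    K = b * P / r
    return P, K

# Simple integrator, unit weights: P = 1, K = 1, and the closed loop
# xdot = (a - b K) x = -x is stable.
P, K = scalar_lqr(a=0.0, b=1.0, q=1.0, r=1.0)
```

The matrix version of this equation, and the time-varying Riccati differential equation it limits from, are developed in Part IV.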