ECE 515

CONTROL SYSTEM THEORY AND DESIGN


This is a fundamental graduate-level course on the modern theory of dynamical systems and control. It builds on an introductory undergraduate course in control (such as ECE 486), and emphasizes state space techniques for the analysis of dynamical systems and the synthesis of control laws meeting given design specifications.

To follow the course, some familiarity with linear algebra and ordinary differential equations is strongly recommended, although the necessary material will be reviewed at appropriate junctures throughout the semester.



SPRING 2014 OFFERING

Instructor : Professor Seth Hutchinson

Office : 158 Coordinated Science Laboratory (CSL); Phone: 244-5570

Email : seth@illinois.edu

Lecture Schedule: Daily lecture topics and reading assignments can be found here.

Required Text: Tamer Başar, Sean Meyn, and William R. Perkins, Lecture Notes on Control System Theory and Design (Available for purchase in Everitt Lab for one month starting January 21, 2014)

Recommended Text: João P. Hespanha, Linear Systems Theory, Princeton University Press, 2009

Supplementary Text 1: Chi-Tsong Chen, Linear System Theory and Design, 3rd edition, Oxford University Press, 1999

Supplementary Text 2: William L. Brogan, Modern Control Theory, 3rd edition, Prentice Hall, 1991

Other useful textbooks have been put on reserve in the Grainger Engineering Library.

Meeting times : Tuesdays and Thursdays, 12:30 p.m. - 1:50 p.m. in 243 MEB


COURSE OUTLINE

I. Modeling and Analysis of Control Systems

  1. Introduction and classification
  2. State space models in both discrete and continuous time
  3. Linear and nonlinear systems
  4. Discretization and linearization
  5. Transfer function description of linear systems; relationship with state space models
  6. Minimal realizations; controllable and observable forms
  7. Vector spaces and linear transformations
  8. Review of linear algebra; the Cayley-Hamilton theorem
  9. State transition matrix and solutions of linear state equations
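
As a small taste of item 9 (not part of the course materials): for a linear time-invariant system x' = Ax, the state transition matrix is the matrix exponential e^{At}, and the solution is x(t) = e^{At} x(0). The illustrative sketch below computes it for a harmonic oscillator using SciPy's matrix exponential; the example system is my own choice.

```python
import numpy as np
from scipy.linalg import expm

# Harmonic oscillator: x1' = x2, x2' = -x1
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
x0 = np.array([1.0, 0.0])

t = np.pi / 2
Phi = expm(A * t)   # state transition matrix e^{At}
x_t = Phi @ x0      # solution x(t) of x' = Ax, x(0) = x0
```

Here e^{At} is a rotation matrix, so the trajectory stays on the unit circle: x(t) = (cos t, -sin t).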

II. Structural Properties of Control Systems

  1. Stability (Lyapunov, Input-Output)
  2. Stability tests for linear systems; stability subspaces
  3. Stability tests for nonlinear systems
  4. Controllability; controllable subspaces
  5. Observability; unobservable subspaces
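
To illustrate item 4 (an illustrative sketch, not course material): the Kalman rank condition says an n-state pair (A, B) is controllable iff the matrix [B, AB, ..., A^{n-1}B] has rank n. The double-integrator example below is my own choice.

```python
import numpy as np

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

n = A.shape[0]
# Kalman controllability matrix [B, AB, ..., A^{n-1}B]
C = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
controllable = np.linalg.matrix_rank(C) == n
```

Here C = [[0, 1], [1, 0]] has full rank, so the double integrator is controllable from the single input u.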

III. Feedback Controller Design

  1. Pole placement with state feedback
  2. Observers and observer-based designs
  3. Tracking and disturbance rejection
  4. Performance issues; robustness and sensitivity
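
A quick numerical sketch of item 1 (illustrative only; the example system and use of SciPy are my own choices): for a controllable pair (A, B), state feedback u = -Kx can place the eigenvalues of A - BK anywhere (in conjugate pairs).

```python
import numpy as np
from scipy.signal import place_poles

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Place the closed-loop poles of A - B K at -2 and -3
res = place_poles(A, B, [-2.0, -3.0])
K = res.gain_matrix
cl_poles = np.linalg.eigvals(A - B @ K)
```

For this system the answer can be checked by hand: A - BK has characteristic polynomial s^2 + k2 s + k1, so matching (s+2)(s+3) = s^2 + 5s + 6 gives K = [6, 5].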

IV. Optimal Control

  1. Dynamic programming for both discrete-time and continuous-time systems; the Hamilton-Jacobi-Bellman (HJB) equation; relationship between open-loop and closed-loop controllers
  2. Linear-quadratic (LQ) optimal control problem and design of optimum regulators
  3. The matrix Riccati differential equation and some of its properties
  4. The infinite-horizon case: Time-invariant optimal controllers and the algebraic Riccati equation
  5. The minimum principle
  6. Time-optimal control of continuous-time linear systems
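
As a preview of items 2-4 (an illustrative sketch, not course material; the example system is my own choice): in the infinite-horizon LQ problem, the optimal feedback is u = -Kx with K = R^{-1} B'P, where P solves the algebraic Riccati equation A'P + PA - P B R^{-1} B'P + Q = 0.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Double integrator: x1' = x2, x2' = u
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)           # state cost
R = np.array([[1.0]])   # control cost

# P solves the algebraic Riccati equation A'P + PA - P B R^{-1} B'P + Q = 0
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)   # optimal gain, u = -Kx
```

For this example the Riccati equation can be solved by hand, giving K = [1, sqrt(3)]; the resulting closed loop A - BK is asymptotically stable, as LQ theory guarantees.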

For further information click on:

Useful Information

Reserve Books

Announcements

Homework Assignments

Solutions to Homework Problems

Supplementary Notes