ECE 566: COMPUTATIONAL INFERENCE AND LEARNING, FALL 2017

**Course Description:** Computational inference and machine learning have seen a surge of interest in the last 15 years, motivated by applications as diverse as computer vision, speech recognition, analysis of networks and distributed systems, big-data analytics, large-scale computer simulations, and indexing and searching of very large databases. This new course will introduce the mathematical and computational methods that enable such applications. Topics include computational methods for statistical inference, information theory, sparsity analysis, approximate inference and search, and fast optimization.

The course will complement ECE561 (Detection and Estimation), ECE544NA (Pattern Recognition and Machine Learning), and ECE543 (Statistical Learning Theory), which introduce core theory for statistical inference and machine learning but do not focus on computational methods. Teaching materials include notes from the instructor and articles from scientific journals.

**Prerequisites:** ECE490 and ECE534.

**Class time and place:** 1400-1520 TR, 3081 Electrical Engineering Building.

**Instructor:** Prof. Pierre Moulin, Room 310 CSL. Email: moulin at ifp dot uiuc dot edu

Office Hours: Wednesdays 10-11:30 am, Room 310 CSL

**TA:** Amish Goel, Room 309 CSL. Email: agoel10 at illinois dot edu

Office Hours: Mondays 2-3:30 pm, Room 309 CSL

**Grading:** Homework (20%), midterm exam (40%), and a final project (40%).

**Project:** List of Topics

**ECE561 book by P. Moulin and V. Veeravalli:**

Chapters 1 and 2: Introduction, Hypothesis Testing

- Homework 1, Due Date: Sep 28, 2017, Solutions
- Homework 2, Due Date: Oct 26, 2017, Problem 5 dataset, Solutions
- Homework 3, Due Date: Nov 27, 2017, Solutions
- Homework 4, Due Date: Dec 7, 2017, Solutions

- Midterm on October 30, 7-9 PM, Midterm Solutions

- Lectures 1-4
- Lectures 5-6
- Lectures 7-9
- Lectures 10-12
- Lectures 13-14
- Lectures 15-17
- Lectures 18-21

**Lecture 1:** Introduction, review of optimization concepts.

**Lecture 2:** Bayes inference, maximum likelihood principle, Maximum A Posteriori (MAP) estimation, Minimum Mean Squared Error (MMSE) estimation.
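As a quick illustration of these estimators: for a scalar Gaussian prior and additive Gaussian noise, the MAP and MMSE estimates coincide with the posterior mean, which has a closed form. A minimal sketch (the prior and noise variances below are illustrative, not from the course):

```python
import numpy as np

# Model: x ~ N(0, s0^2), y = x + v with v ~ N(0, s^2).
# The posterior p(x|y) is Gaussian, so MAP = MMSE = posterior mean:
#     E[x|y] = y * s0^2 / (s0^2 + s^2).
def gaussian_posterior_mean(y, s0=2.0, s=1.0):
    return y * s0**2 / (s0**2 + s**2)

# Sanity check against a brute-force posterior mean on a dense grid.
y_obs = 1.5
x = np.linspace(-10, 10, 20001)
post = np.exp(-x**2 / (2 * 2.0**2)) * np.exp(-(y_obs - x)**2 / 2)  # unnormalized
post /= post.sum()
mmse_numeric = np.sum(x * post)
```

The grid computation approximates the posterior mean numerically and agrees with the closed form, which is the standard conjugate-Gaussian result.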

**Lecture 3:** Empirical Risk Minimization.

**Lectures 4, 5:** Stochastic Approximation and Stochastic Gradient Descent.
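A minimal SGD sketch on a least-squares problem, processing one sample per step; the data, step size, and epoch count below are illustrative assumptions:

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=100, seed=0):
    """Plain SGD on the per-sample squared loss (x_i.w - y_i)^2 / 2."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):          # shuffle each epoch
            grad = (X[i] @ w - y[i]) * X[i]   # gradient of the per-sample loss
            w -= lr * grad
    return w

# Noiseless synthetic problem with known weights (illustrative values).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w
w_hat = sgd_linear_regression(X, y)
```

Because the data are noiseless and realizable, the stochastic gradients vanish at the optimum and the iterates converge to the true weights even with a constant step size.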

**Lecture 6:** Statistical performance analysis via Monte Carlo methods and importance sampling.
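Importance sampling can be illustrated by estimating a Gaussian tail probability, where naive Monte Carlo would almost never see the rare event. A sketch using a mean-shifted proposal (the threshold and sample size are arbitrary choices):

```python
import numpy as np

def importance_sampling_tail(threshold=4.0, n=100_000, seed=0):
    """Estimate P(Z > threshold) for Z ~ N(0,1) by sampling from N(threshold, 1)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(loc=threshold, size=n)            # proposal samples
    # Likelihood ratio N(0,1)/N(threshold,1) at x: exp(threshold^2/2 - threshold*x).
    w = np.exp(threshold**2 / 2 - threshold * x)
    return np.mean((x > threshold) * w)

est = importance_sampling_tail()
```

Shifting the proposal mean to the threshold places roughly half the samples in the rare region, so the weighted estimator has small relative error where the naive one would typically return 0.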

**Lecture 7:** Bootstrap.
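A percentile-bootstrap sketch for a confidence interval on the median; the data-generating distribution and replicate count below are illustrative:

```python
import numpy as np

def bootstrap_ci(data, stat=np.median, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI: resample with replacement, recompute the
    statistic, and take empirical quantiles of the replicates."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    lo, hi = np.quantile(reps, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Exponential sample; its true median is 2*ln 2 (illustrative choice).
rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=200)
lo, hi = bootstrap_ci(sample)
```

The appeal of the bootstrap is that the same code works for statistics (like the median) whose sampling distribution has no convenient closed form.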

**Lecture 8:** Bayesian recursive estimation using Particle Filtering.
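A bootstrap particle filter sketch for a scalar linear-Gaussian state-space model (all model parameters are assumed for illustration; for this particular model an exact Kalman filter exists, which makes it a convenient sanity check):

```python
import numpy as np

def bootstrap_pf(ys, n_particles=1000, a=0.9, q=0.1, r=0.1, seed=0):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r):
    propagate particles from the prior, weight by the likelihood, resample."""
    rng = np.random.default_rng(seed)
    parts = rng.normal(0, 1, n_particles)
    means = []
    for y in ys:
        parts = a * parts + rng.normal(0, np.sqrt(q), n_particles)  # propagate
        logw = -(y - parts) ** 2 / (2 * r)                          # log-likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * parts))                             # posterior mean
        idx = rng.choice(n_particles, size=n_particles, p=w)        # resample
        parts = parts[idx]
    return np.array(means)

# Simulate a short trajectory and filter the noisy observations.
rng = np.random.default_rng(3)
xs, x = [], 0.0
for _ in range(50):
    x = 0.9 * x + rng.normal(0, np.sqrt(0.1))
    xs.append(x)
xs = np.array(xs)
ys = xs + rng.normal(0, np.sqrt(0.1), 50)
est = bootstrap_pf(ys)
```

Resampling at every step is the simplest scheme; in practice one often resamples only when the effective sample size drops, to reduce Monte Carlo variance.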

**Lectures 9-11:** Parameter estimation via the Expectation-Maximization (EM) algorithm.
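A compact EM sketch for a one-dimensional two-component Gaussian mixture; the mixture parameters and data below are illustrative:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=100, seed=0):
    """EM for a 1-D Gaussian mixture: E-step computes responsibilities,
    M-step does weighted ML re-estimation of weights, means, variances."""
    rng = np.random.default_rng(seed)
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)   # init means at random data points
    var = np.full(k, np.var(x))
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        logp = (-0.5 * np.log(2 * np.pi * var)
                - (x[:, None] - mu) ** 2 / (2 * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)  # stabilize before exponentiating
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3, 1, 400), rng.normal(3, 1, 600)])
pi, mu, var = em_gmm_1d(x)
```

Each iteration provably does not decrease the data log-likelihood, which is the defining property of EM.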

**Lectures 12, 13:** Hidden Markov Models, Viterbi algorithm, Baum-Welch learning.
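A log-domain Viterbi sketch for a toy two-state HMM; the transition and emission probabilities are hypothetical numbers chosen so the MAP path is easy to see:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path via max-product dynamic programming in
    the log domain. pi: initial probs, A: transition matrix, B: emission probs."""
    T, S = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])
    back = np.zeros((T, S), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)   # scores[i, j]: best path ending i -> j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):            # backtrack through the pointers
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Sticky 2-state chain with near-deterministic emissions (illustrative numbers).
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1], [0.1, 0.9]])
B = np.array([[0.9, 0.1], [0.1, 0.9]])
path = viterbi([0, 0, 1, 1, 1], pi, A, B)
```

With sticky transitions and reliable emissions, the decoded path simply follows the observations, switching state once.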

**Lectures 14, 15:** Linear Dynamical Systems, the Rauch-Tung-Striebel (RTS) smoother.
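One possible sketch of a scalar Kalman filter followed by the RTS backward pass (model parameters and observations below are assumed purely for illustration):

```python
import numpy as np

def kalman_rts(ys, a=0.9, q=0.1, r=0.1, m0=0.0, p0=1.0):
    """Scalar Kalman filter for x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r),
    followed by the RTS (forward-backward) smoother."""
    T = len(ys)
    mf = np.zeros(T); pf = np.zeros(T)   # filtered means / variances
    mp = np.zeros(T); pp = np.zeros(T)   # predicted means / variances
    m, p = m0, p0
    for t, y in enumerate(ys):
        m_pred, p_pred = a * m, a * a * p + q      # time update
        k = p_pred / (p_pred + r)                  # Kalman gain
        m = m_pred + k * (y - m_pred)              # measurement update
        p = (1 - k) * p_pred
        mp[t], pp[t], mf[t], pf[t] = m_pred, p_pred, m, p
    ms, ps = mf.copy(), pf.copy()
    for t in range(T - 2, -1, -1):                 # backward RTS pass
        g = a * pf[t] / pp[t + 1]                  # smoother gain
        ms[t] = mf[t] + g * (ms[t + 1] - mp[t + 1])
        ps[t] = pf[t] + g * g * (ps[t + 1] - pp[t + 1])
    return mf, pf, ms, ps

rng = np.random.default_rng(4)
ys = rng.normal(size=100)                          # dummy observation sequence
mf, pf, ms, ps = kalman_rts(ys)
```

Note that the variance recursions do not depend on the data: the filtered variance converges to the fixed point of the Riccati recursion, and smoothing can only reduce posterior variance.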

**Lectures 16-18:** Graphical models.

**Lectures 19-22:** Variational inference, mean-field techniques.
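A naive mean-field sketch for a small Ising model, where each variable's marginal is updated given the current mean of its neighbors; the couplings and fields below are illustrative:

```python
import numpy as np

def mean_field_ising(J, h, iters=200):
    """Naive mean-field for an Ising model p(s) proportional to
    exp(0.5*s'Js + h's), s_i in {-1,+1}, with zero diagonal J.
    Coordinate updates: m_i = tanh(h_i + sum_j J_ij m_j)."""
    m = np.zeros(len(h))
    for _ in range(iters):
        for i in range(len(h)):
            m[i] = np.tanh(h[i] + J[i] @ m)
    return m

# Small ferromagnetic chain with a weak positive field (illustrative values).
n = 5
J = np.zeros((n, n))
for i in range(n - 1):
    J[i, i + 1] = J[i + 1, i] = 0.5
h = np.full(n, 0.2)
m = mean_field_ising(J, h)
```

Each sweep is a coordinate-ascent step on the variational lower bound, so the updates converge to a self-consistent set of mean parameters.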

**Lectures 23-25:** L1-penalized least-squares minimization.
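ISTA (iterative shrinkage-thresholding) is one commonly used solver for this problem: a gradient step on the quadratic term followed by soft-thresholding. A sketch under assumed problem sizes and an illustrative regularization weight:

```python
import numpy as np

def ista(A, y, lam, iters=3000):
    """ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - (A.T @ (A @ x - y)) / L    # gradient step on the smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

# Sparse-recovery toy: 3-sparse signal in 100 dims, 50 random measurements.
rng = np.random.default_rng(5)
A = rng.normal(size=(50, 100)) / np.sqrt(50)
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = ista(A, y, lam=0.02)
```

The soft-threshold step is the proximal operator of the L1 norm; it is what sets small coefficients exactly to zero and produces sparse iterates.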

**Lectures 26-28:** Reconstruction of sparse signals using Compressive Sensing.
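Orthogonal Matching Pursuit (OMP) is one simple greedy reconstruction method for compressive sensing; a sketch (the measurement matrix, sparsity level, and coefficients are illustrative assumptions):

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedily pick the column most correlated
    with the residual, then re-fit by least squares on the chosen support."""
    resid, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ resid))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        resid = y - Phi[:, support] @ coef       # orthogonalize against support
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

# 3-sparse signal in 100 dims from 80 random measurements (underdetermined).
rng = np.random.default_rng(5)
Phi = rng.normal(size=(80, 100)) / np.sqrt(80)
x0 = np.zeros(100)
x0[[5, 37, 80]] = [2.0, -2.0, 2.0]
y_cs = Phi @ x0
x_rec = omp(Phi, y_cs, k=3)
```

With noiseless measurements and an incoherent random matrix, OMP typically identifies the true support, after which the least-squares re-fit recovers the signal exactly.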

**Lecture 29:** Dimensionality reduction using random projections; hashing.
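A Johnson-Lindenstrauss-style random projection sketch: multiplying by a scaled Gaussian matrix approximately preserves pairwise distances (the dimensions below are arbitrary):

```python
import numpy as np

def random_projection(X, k, seed=0):
    """Project the d-dimensional rows of X down to k dimensions with a
    Gaussian matrix scaled by 1/sqrt(k), so squared norms are preserved
    in expectation (Johnson-Lindenstrauss)."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[1], k)) / np.sqrt(k)
    return X @ R

rng = np.random.default_rng(6)
X = rng.normal(size=(20, 1000))      # 20 points in 1000 dimensions
Y = random_projection(X, k=200)

# Compare one pairwise distance before and after projection.
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Y[0] - Y[1])
```

The key point is that the target dimension needed for a given distortion grows only logarithmically in the number of points, independent of the original dimension.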