ECE 563: Information Theory, Fall 2016
Course Outline
The course is taught in three parts. The first two parts, (a) Data Representation and Compression and (b) Data Transmission and Reception, are standard topics in information theory, and we will stay true to the textbook for the most part. The last part covers the basic principles of information theory as used in machine learning and statistics: estimation of information-theoretic quantities from samples, the Maximum Entropy principle that underlies most supervised learning algorithms (including SVMs and Lasso), and the InfoMax principle behind unsupervised learning. We will also briefly summarize each lecture, with material covered and resources/references, below. One significant feature of this course is the focus on discrete mathematics throughout.
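As a small taste of the estimation topic mentioned above, here is a minimal sketch (not course material, and the function name is illustrative) of the plug-in estimator: estimate a distribution by empirical frequencies and compute the entropy of that estimate.

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in (empirical) estimate of Shannon entropy, in bits."""
    n = len(samples)
    counts = Counter(samples)
    # H_hat = -sum over observed symbols of p_hat * log2(p_hat)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin has entropy 1 bit; with balanced samples the estimate is exact.
print(plugin_entropy(["H", "T"] * 500))  # -> 1.0
```

Note that the plug-in estimator is biased downward for small sample sizes; correcting this bias is one of the issues studied in the estimation literature.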
Prerequisites
The basic prerequisites are an undergraduate probability course and the (mathematical) maturity to handle abstract thinking.
Requirements
Weekly or biweekly homework: 70%
Final: 30%
The final will be open notes. Students are expected to work on the homework and final problems individually.
Final
The final exam will be during Finals week at the assigned date and time.
Lecture Material
There is a textbook for this course.
Summary of each lecture:
Teaching Staff
Instructor: Professor Pramod Viswanath
Office Hours: Tuesdays 2-3 pm in 118 CSL or by appointment.
Teaching Assistants: Shaileshh Venkatakrishnan
Office Hours: Wednesdays 3-5 pm in 114 CSL
Homeworks
Homework 1 Due date: September 1 in class.
Homework 2 Due date: September 16 in class.
Homework 3 Due date: October 4 in class.
Homework 4 Due date: October 13 in class.
Homework 5 Due date: November 17 in class.
Final Exam Due date: December 12, 5pm to TA.