| LEC # | TOPICS |
| --- | --- |
| 1 | Introduction, Review of Random Variables, Entropy, Mutual Information, Chain Rules (PDF) |
| 2 | Jensen's Inequality, Data Processing Theorem, Fano's Inequality (PDF) |
| 3 | Markov Chains, Entropy Rate of Random Processes (PDF) |
| 4 | Different Types of Convergence, Asymptotic Equipartition Property (AEP), Typical Set, Joint Typicality (PDF) |
| 5 | Data Compression, Kraft Inequality, Optimal Codes (PDF) |
| 6 | Huffman Codes, Sensitivity of Distribution, Elias Code (PDF) |
| 7 | Gambling (PDF) |
| 8 | Channel Capacity, Symmetric and Erasure Channels (PDF) |
| 9 | Coding Theorem (PDF) |
| 10 | Strong Coding Theorem (PDF) |
| 11 | Strong Coding Theorem (cont.) (PDF) |
| 12 | Feedback Capacity (PDF) |
| 13 | Joint Source Channel Coding (PDF) |
| 14 | Differential Entropy (PDF) |
| | Recitation: Background Materials Review (PDF) |
| 15 | Gaussian Channel (PDF) |
| 16 | Gaussian Channels: Parallel, Colored Noise, Inter-symbol Interference (PDF) |
| 17 | Maximizing Entropy (PDF) |
| 18 | Gaussian Channels with Feedback (PDF) |
| 19 | Fading Channels (PDF) |
| 20 | Types, Universal Source Coding, Sanov's Theorem (PDF) |
| 21 | Multiple Access Channels (PDF) |
| 22 | Slepian-Wolf Coding (PDF) |
| 23 | Broadcast Channels (PDF) |
| 24 | Channel Side Information, Wide-band Channels (PDF) |