ACCQ202: Information Theory
1. What is information? How do we store information? Source coding theorem (Cover & Thomas (CT) 5.1-5.5, 2.1, 2.3, 2.6); supplementary notes on prefix-free codes: info. HW1, HW1sol. (See sketch 1 below.)
2. Huffman coding, entropy (CT 5.6, 2.2): HW2, HW2sol (old notes 1-2). (See sketch 2 below.)
3. Entropy, mutual information, channel coding theorem (CT 2.3-2.5, 2.10): HW3, HW3sol. (See sketch 3 below.)
4. Data-processing inequality (DPI), Fano's inequality, proof of the converse of the channel coding theorem, typicality (CT 2.8, 2.10, 7.9, 3-3.2): HW4, HW4sol. (See sketch 4 below.)
5. Typicality (cont.), proof of the direct part of the channel coding theorem, zero-error capacity (CT 7.6, 7.7): HW5, HW5sol. (See sketch 5 below.)
6. Zero-error capacity, joint source-channel coding, compression beyond entropy (rate-distortion) (CT 7.13, 10; Shannon's paper): HW6, HW6sol. (See sketch 6 below.)
7. Gambling (beautiful minds at play): Samuelson's paper, Thorp and Shannon (and Kelly). An old exam serves as HW7. (See sketch 7 below.)
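Illustrative sketches (Python), referenced from the entries above. Each is a minimal, self-contained demo with made-up parameters, a pointer into the material rather than course code.

Sketch 1: a check of the Kraft inequality, which characterizes the codeword lengths achievable by a binary prefix-free code (CT 5.2); the length profiles below are arbitrary examples.

```python
# Kraft inequality: a binary prefix-free code with word lengths
# l_1..l_m exists iff sum(2**-l_i) <= 1 (CT 5.2).

def kraft_sum(lengths):
    """Return the Kraft sum of 2**-l over the codeword lengths."""
    return sum(2.0 ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> realizable, e.g. {0, 10, 110, 111}
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix-free code has these lengths
```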
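Sketch 2: a small Huffman construction showing the optimal expected length landing between H(X) and H(X)+1 (CT 5.4, 5.8); the distribution is an arbitrary dyadic example, so Huffman meets the entropy exactly.

```python
import heapq, itertools, math

def huffman(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> binary codeword."""
    counter = itertools.count()  # unique tie-breaker so dicts are never compared
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}   # toy dyadic source
code = huffman(probs)
H = -sum(p * math.log2(p) for p in probs.values())
L = sum(probs[s] * len(w) for s, w in code.items())
print(code)   # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(H, L)   # entropy H = 1.75 bits; Huffman achieves L = 1.75 here
```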
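Sketch 3: mutual information through a binary symmetric channel, maximized over input distributions to recover the capacity C = 1 - H(p); the crossover probability 0.1 is an arbitrary choice.

```python
import math

def H2(x):
    """Binary entropy in bits."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mutual_info_bsc(pi, p):
    """I(X;Y) for input P(X=1)=pi through a BSC with crossover probability p."""
    py1 = pi * (1 - p) + (1 - pi) * p   # P(Y=1)
    return H2(py1) - H2(p)              # I(X;Y) = H(Y) - H(Y|X)

p = 0.1
print(max(mutual_info_bsc(pi / 100, p) for pi in range(101)))  # ~0.531, at uniform input
print(1 - H2(p))                                               # closed form: C = 1 - H(0.1)
```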
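Sketch 4: Fano's inequality, H(X | X_hat) <= H2(Pe) + Pe log2(|X|-1) (CT 2.10), inverted numerically into a lower bound on the error probability of any estimator; the conditional entropy and alphabet size are toy values.

```python
import math

def fano_bound(H_cond, alphabet_size):
    """Smallest Pe consistent with H(X|X_hat) = H_cond, found by bisection."""
    def rhs(pe):
        h = 0.0 if pe in (0.0, 1.0) else -pe * math.log2(pe) - (1 - pe) * math.log2(1 - pe)
        return h + pe * math.log2(alphabet_size - 1)
    lo, hi = 0.0, 1.0 - 1.0 / alphabet_size   # rhs is increasing on this interval
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if rhs(mid) < H_cond else (lo, mid)
    return lo

# If H(X | X_hat) = 1 bit over a 4-letter alphabet, any estimator errs
# with probability at least about 0.19.
print(fano_bound(1.0, 4))
```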
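Sketch 5: a Monte-Carlo look at typicality (the AEP, CT 3.1-3.2): for X^n i.i.d. Bernoulli(p), the normalized log-probability -log p(x^n)/n concentrates at H(p), so drawn sequences are eps-typical with high probability. Parameters p, n, eps, trials are arbitrary.

```python
import math, random

p, n, eps, trials = 0.3, 500, 0.1, 5000
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
random.seed(0)

hits = 0
for _ in range(trials):
    k = sum(random.random() < p for _ in range(n))        # number of ones
    logprob = k * math.log2(p) + (n - k) * math.log2(1 - p)
    if abs(-logprob / n - H) <= eps:                      # weak typicality test
        hits += 1
print(hits / trials)   # close to 1: almost all probability sits on ~2**(n*H) sequences
```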
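Sketch 6: the rate-distortion function of a Bernoulli(p) source under Hamming distortion, R(D) = H(p) - H(D) for 0 <= D <= min(p, 1-p) (CT 10.3.1), quantifying compression beyond entropy once errors are tolerated; p and the D grid are demo values.

```python
import math

def H2(x):
    return 0.0 if x in (0.0, 1.0) else -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def R(D, p=0.5):
    """Rate-distortion function of Bernoulli(p) under Hamming distortion."""
    if D >= min(p, 1 - p):
        return 0.0                 # distortion budget large enough: zero rate
    return H2(p) - H2(D)

for D in (0.0, 0.05, 0.11, 0.25):
    print(f"D = {D:.2f}  ->  R(D) = {R(D):.3f} bits/symbol")
# D = 0 needs the full 1 bit/symbol; tolerating D = 0.11 already halves the rate.
```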
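Sketch 7: the Kelly criterion for repeated even-odds bets won with probability p: the stake fraction f* = 2p - 1 maximizes the expected log-growth E[log(1 + fX)], and overbetting destroys wealth even with an edge. The win probability is a toy value.

```python
import math

def growth_rate(f, p):
    """Expected log2 wealth growth per even-odds bet at stake fraction f."""
    return p * math.log2(1 + f) + (1 - p) * math.log2(1 - f)

p = 0.6
f_star = 2 * p - 1                 # Kelly fraction, here 0.2
for f in (0.1, f_star, 0.5):
    print(f"f = {f:.1f}: growth = {growth_rate(f, p):+.4f} bits/bet")
# Overbetting (f = 0.5) gives negative growth: wealth -> 0 despite p > 1/2.
```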