Course Archives Theoretical Statistics and Mathematics Unit
Course: Information Theory
Level: Undergraduate
Time: Currently not offered


Syllabus: Introduction to Markov chains (if necessary). Shannon's entropy, Gibbs' inequality, typical sequences of random vectors, Shannon's theorem. Capacity-cost function and channel coding theorem. Rate-distortion function and source coding theorem. Stein's lemma and properties of information measures. Uniquely decipherable codes, codes on trees, Kraft's inequality, Kraft's code, Huffman's code, Shannon-Fano-Elias code. Parsing codes and trees, Tunstall's code. Universal source coding, empirical distributions, Kullback-Leibler divergence. Parsing entropy, Lempel-Ziv algorithm, entropy equivalence. Mutual information and capacity of noisy channels.
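As a small illustration of two of the topics listed above (Shannon's entropy and Kraft's inequality), the following is a minimal Python sketch; the function names and examples are illustrative choices, not part of the course material:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kraft_sum(lengths):
    """Kraft sum sum_i 2^(-l_i); a binary prefix code with codeword
    lengths l_i exists iff this sum is at most 1."""
    return sum(2.0 ** -l for l in lengths)

# A fair coin carries exactly one bit of entropy.
print(entropy([0.5, 0.5]))   # 1.0
# The prefix code {0, 10, 11} has lengths (1, 2, 2) and meets
# Kraft's inequality with equality.
print(kraft_sum([1, 2, 2]))  # 1.0
```

A deterministic source (all mass on one symbol) gives `entropy([1.0]) == 0.0`, matching the intuition that it conveys no information.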

ADDITIONAL TOPICS FROM: Stationary coding of finite alphabets, Ergodic theorem for binary alphabets and examples. Frequencies of finite blocks and Entropy theorem.

Reference Texts:

(a) T. M. Cover and J. A. Thomas. Elements of Information Theory.
(b) P. Bremaud. Discrete Probability Models and Methods.
(c) D. J. C. Mackay. Information Theory, Inference and Learning Algorithms.
(d) R. J. McEliece. The Theory of Information and Coding.
(e) P. C. Shields. The Ergodic Theory of Discrete Sample Paths.





Past Exams
Midterm: 25.pdf
Semestral: 25.pdf
Supplementary and Back Paper: 25.pdf

