CENG 382
Information Theory
An introduction to Shannon’s information theory and elementary binary coding schemes with and without noise. The concept of information, entropy, simple sources, Markov sources, continuous sources, information channels, average error, ambiguity, transformation, capacity, noiseless coding, the Kraft-McMillan theorem, Shannon-Fano and Huffman coding schemes, error-correcting codes, linear codes, cyclic codes, and data compression.
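As a purely illustrative sketch (not part of the course materials), the snippet below computes the Shannon entropy of a memoryless source and builds a Huffman code for it, two of the topics named in the description above. The symbol probabilities are arbitrary example values.

```python
"""Illustrative sketch: Shannon entropy and Huffman coding for a toy source."""
import heapq
import math


def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} for the given
    symbol -> probability mapping, built by the standard Huffman merge."""
    # Heap items: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prepend '0' to one subtree's codewords and '1' to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]


if __name__ == "__main__":
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # example distribution
    code = huffman_code(source)
    avg_len = sum(p * len(code[s]) for s, p in source.items())
    print("H(X) =", entropy(source.values()), "bits/symbol")   # 1.75
    print("Huffman code:", code)
    print("Average codeword length:", avg_len, "bits/symbol")  # 1.75
```

For this dyadic distribution the average codeword length equals the entropy (1.75 bits/symbol), the best case allowed by the noiseless coding theorem.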
Topics
Introduction; related concepts
Entropy, relative entropy, and mutual information
Asymptotic equipartition property
Entropy rates of stochastic processes
Data compression I
Data compression II
Midterm
Channel capacity
Differential entropy
The Gaussian channel
Network information theory I
Network information theory II
Relevant topics in information theory today
Rate-distortion |