CENG 382
Information Theory
An introduction to Shannon’s information theory and elementary binary coding schemes with and without noise. Topics include the concept of information, entropy, simple sources, Markov sources, continuous sources, information channels, average error, ambiguity, transformation, capacity, noiseless coding, the Kraft-McMillan theorem, Shannon-Fano and Huffman coding schemes, error-correcting codes, linear codes, cyclic codes, and data compression.
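For orientation, the two results at the heart of the noiseless-coding material are the entropy of a discrete source and the Kraft-McMillan inequality. The statements below follow the standard textbook definitions (e.g. Cover and Thomas) and are included only as an informal reminder, not as the course’s official notation:

H(X) = -\sum_{x} p(x) \log_2 p(x) \quad \text{(bits per source symbol)}

\sum_{i=1}^{n} 2^{-l_i} \le 1 \quad \text{(Kraft-McMillan: necessary and sufficient for a uniquely decodable binary code with codeword lengths } l_1, \dots, l_n\text{)}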
Course Objectives:
To teach the fundamental concepts of information theory; to teach how to define and formulate problems in communication and signal processing; and to teach how to solve those problems.
Recommended or Required Reading:
T. M. Cover and J. A. Thomas, ‘Elements of Information Theory’, 2nd Edition, Wiley, 2006
P. S. Nuggehalli, ‘Information Theory and Coding’, CEDT, IISc, Bangalore, http://nptel.iitm.ac.in/courses/Webcourse-contents/IISc-BANG/Information%20Theory%20and%20Coding/Learning%20Material%20-%20ITC.pdf
R. G. Gallager, ‘Principles of Digital Communication’, Chapters 1 to 3, Cambridge Univ. Press, 2008
R. G. Gallager, ‘MIT 6.450 Principles of Digital Communications I’, Lectures 1 to 7, YouTube: MIT OCW, 2006
R. W. Yeung, ‘A First Course in Information Theory’, NY: Kluwer Academic/Plenum Publishers, 2002
A. El Gamal and Y.-H. Kim, ‘Lecture Notes on Network Information Theory’, online, 2010
Learning Outcomes:
1. To analyze and model signal processing and communication problems using the fundamental concepts of information theory.
2. To define, analyze, and research new requirements of information theory arising from technological developments.
3. To analyze, develop, and apply information theory in interdisciplinary projects.
4. To measure and analyze the quality and efficiency of signal processing and communication systems.
Topics:
Introduction; related concepts
Entropy, relative entropy, and mutual information
Asymptotic equipartition property
Entropy rates of stochastic processes
Data compression I (see the coding sketch after this list)
Data compression II
Midterm
Channel capacity
Differential entropy
The Gaussian channel
Network information theory I
Network information theory II
Relevant topics in information theory today
Rate-distortion
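Since the data compression weeks build directly on the Shannon-Fano and Huffman schemes named in the course description, a minimal Huffman encoder is sketched below as an informal illustration. This is not course-provided code; the function name huffman_code and the heap-based greedy construction are simply the standard textbook approach.

```python
# Minimal Huffman coding sketch (illustrative only; not the course's reference code).
# Builds a binary prefix code from symbol frequencies using a min-heap.
import heapq
from collections import Counter

def huffman_code(text):
    """Return a dict mapping each symbol of `text` to its Huffman codeword."""
    freq = Counter(text)
    # Heap entries: (frequency, tie-breaker, {symbol: partial codeword}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:
        # Degenerate one-symbol source: assign a single bit.
        return {s: "0" for s in heap[0][2]}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        # Prepend a bit as the two subtrees are merged under a new parent.
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        tiebreak += 1
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
    return heap[0][2]

if __name__ == "__main__":
    code = huffman_code("abracadabra")
    print(code)
    print(sum(len(code[s]) for s in "abracadabra"), "bits for the whole string")
```

Running it on a short string such as "abracadabra" shows the characteristic property of the code: frequent symbols receive shorter codewords, so the average codeword length approaches the source entropy discussed in the first weeks of the course.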
Grading:
Midterm: 25%
Homework: 20%
Research Presentation: 20%
Final: 35%
