Engineering Tripos Part IIA, 3F7: Information Theory and Coding, 2017-18

Lecturer

Dr R Venkataramanan

Lab Leader

Dr J Sayir

Timing and Structure

Michaelmas Term. 16 lectures. Assessment: 100% exam


The aims of the course are to:

  • Introduce students to the principles of information theory, data compression, and error correction, which form the foundations of modern communication and information-processing systems.


As specific objectives, by the end of the course students should be able to:

  • Explain why entropy and channel capacity arise as fundamental limits for data compression and transmission, respectively
  • Understand and implement basic compression algorithms such as Huffman coding and Arithmetic coding
  • Encode and decode information using simple linear block codes
  • Implement decoders for error-correcting codes such as LDPC codes
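As a quick illustration of the first objective (a minimal sketch; the function name is mine, not from the course), the entropy of a discrete source can be computed directly from its probability mass function:

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: exactly 1 bit per flip
print(entropy([0.9, 0.1]))  # a biased coin: about 0.469 bits per flip
```

No lossless compressor can use fewer bits per symbol than this on average, which is the content of the noiseless source coding theorem covered in the lectures.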



Information Theory and Data Compression (11L)

  1. Probability fundamentals; Definitions of entropy, joint entropy, conditional entropy: interpretations as measures of uncertainty 

  2. Noiseless source coding theorem; Significance of entropy as the fundamental limit of compression 

  3. Bounds on code length for lossless data compression

  4. Practical compression algorithms: Huffman coding, Arithmetic coding

  5. Relative Entropy, Mutual Information: Properties and some applications

  6. Discrete Memoryless Channels and Channel Capacity

  7. The channel coding theorem: Random Coding and the direct coding theorem; Fano's inequality and the converse theorem

  8. The additive white Gaussian noise (AWGN) channel and its capacity
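For concreteness, the Huffman procedure from item 4 (repeatedly merging the two least probable symbols and prefixing a bit to each side) can be sketched in a few lines of Python. This is an illustrative implementation, not the one used in the course:

```python
import heapq

def huffman_code(freqs):
    """Build a prefix-free Huffman code from a {symbol: probability} dict."""
    # Heap entries: (probability, tiebreak, {symbol: codeword-suffix-so-far}).
    # The unique integer tiebreak keeps tuple comparison away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two least probable subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# Codeword lengths are 1, 2, 3, 3, so the average length is 1.75 bits.
```

Because these probabilities are dyadic, the average codeword length (1.75 bits) equals the source entropy exactly; in general Huffman coding gets within one bit of the entropy per symbol.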


Channel Coding (Error-correcting codes) (5L)

  1. Introduction to block codes; Linear block codes

  2. Representing a linear code using a factor graph; Sparse-graph codes

  3. Message passing decoding of sparse-graph codes for binary erasure channels

  4. The Belief-Propagation (BP) algorithm; BP decoding of sparse-graph codes for general binary input channels
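As a sketch of item 3, here is a minimal peeling decoder for the binary erasure channel: any parity check involving exactly one erased bit determines that bit, and the decoder repeats until no such check remains. The matrix H below is one valid parity-check matrix for the (7,4) Hamming code, chosen for brevity; the sparse-graph (LDPC) matrices in the course would be larger and sparser, but the decoding rule is the same:

```python
def peel_decode(H, y):
    """Message-passing (peeling) decoder for the binary erasure channel.
    H: parity-check matrix as a list of 0/1 rows; y: received bits, None = erased."""
    y = list(y)
    progress = True
    while progress:
        progress = False
        for row in H:
            erased = [j for j, h in enumerate(row) if h and y[j] is None]
            if len(erased) == 1:
                # Exactly one unknown bit: the parity constraint fixes it.
                j = erased[0]
                y[j] = sum(y[k] for k, h in enumerate(row) if h and k != j) % 2
                progress = True
    return y

# A parity-check matrix for the (7,4) Hamming code (illustrative choice).
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
```

For example, peel_decode(H, [None, 1, 1, None, 0, 0, 0]) recovers the codeword [1, 1, 1, 0, 0, 0, 0]. Decoding stalls only when every unsatisfied check involves two or more erasures, i.e. when the erasures contain a stopping set.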



Further notes

This module will be of interest to anyone who wishes to understand how information can be mathematically modelled, measured, compressed, and transmitted. Though not a prerequisite for 3F4, 3F7 provides a good foundation for further study of communication.



The following are useful references:

  • T. Cover and J. Thomas, Elements of Information Theory, Wiley-Blackwell, 2006.
  • D. MacKay, Information Theory, Inference and Learning Algorithms, Cambridge University Press, 2003 (free electronic copy available for download).
  • T. Richardson and R. Urbanke, Modern Coding Theory, Cambridge University Press, 2008.
  • R. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2012. 

Examination Guidelines

Please refer to Form & conduct of the examinations.

Last modified: 03/08/2017 15:42