Undergraduate Teaching 2023-24

Engineering Tripos Part IIA, 3F7: Information Theory and Coding, 2023-24



Prof Ramji Venkataramanan


Lab Leader

Dr Jossy Sayir

Timing and Structure

Michaelmas Term. 16 lectures. Assessment: 100% exam. Lectures will be recorded.


The aims of the course are to:

  • Introduce students to the principles of information theory, data compression, and error correction, which form the foundations of modern communication and information-processing systems.


As specific objectives, by the end of the course students should be able to:

  • Explain why entropy and channel capacity arise as fundamental limits for data compression and transmission, respectively
  • Understand and implement basic compression algorithms such as Huffman coding and Arithmetic coding
  • Encode and decode information using simple linear block codes
  • Implement decoding algorithms for modern error-correcting codes such as LDPC codes



Information Theory and Data Compression (11L)

  1. Probability fundamentals; Definitions of entropy, joint entropy, conditional entropy: interpretations as measures of uncertainty 
  2. Noiseless source coding theorem; Significance of entropy as the fundamental limit of compression 
  3. Bounds on code length for lossless data compression
  4. Practical compression algorithms: Huffman coding, Arithmetic coding
  5. Relative Entropy, Mutual Information: Properties and some applications
  6. Discrete Memoryless Channels and Channel Capacity
  7. The channel coding theorem: Random coding and the direct coding theorem; Fano's inequality and the converse theorem
  8. The additive white Gaussian noise (AWGN) channel and its capacity

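The quantities in items 1 and 8 are easy to experiment with numerically. The sketch below (plain Python; the function names are my own, not from the course) computes the entropy of a discrete source and the capacity of the real AWGN channel:

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a distribution p = (p_1, ..., p_n)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def awgn_capacity(snr):
    """Capacity of the real AWGN channel: 0.5 * log2(1 + SNR) bits per use."""
    return 0.5 * math.log2(1 + snr)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# which is exactly why its outcomes are compressible.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # ~0.47 bits
print(awgn_capacity(3.0))    # 1.0 bit per channel use
```

The noiseless source coding theorem (item 2) says the first two numbers are also the best achievable compression rates, in bits per symbol, for the corresponding memoryless sources.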

Channel Coding (Error-correcting codes) (5L)

  1. Introduction to block codes; Linear block codes
  2. Representing a linear code using a factor graph; Sparse-graph codes
  3. Message passing decoding of sparse-graph codes for binary erasure channels
  4. The Belief Propagation (BP) algorithm; BP decoding of sparse-graph codes for general binary input channels

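Message-passing decoding over the binary erasure channel (item 3) can be sketched in a few lines: any parity check with exactly one erased neighbour determines that bit, and repeating this "peeling" step recovers the erasures. The code below is an illustrative sketch, not the lecture implementation; the function name is my own, and a (7,4) Hamming code stands in for a sparse-graph code.

```python
def peel_decode(H, y):
    """Erasure decoding by peeling: repeatedly find a parity check with
    exactly one erased bit and solve for it.
    H: parity-check matrix as a list of 0/1 rows.
    y: received word as a list of 0/1 values, with None marking erasures."""
    y = list(y)
    changed = True
    while changed:
        changed = False
        for row in H:
            erased = [i for i, h in enumerate(row) if h and y[i] is None]
            if len(erased) == 1:
                # The single erased bit must make the check's parity even.
                i = erased[0]
                y[i] = sum(y[j] for j, h in enumerate(row) if h and j != i) % 2
                changed = True
    return y

# Example: a (7,4) Hamming code; erase two bits of a codeword and recover them.
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]
print(peel_decode(H, [1, None, 1, 1, 0, None, 0]))  # [1, 0, 1, 1, 0, 1, 0]
```

For sparse-graph codes this peeling decoder is exactly BP on the erasure channel; the general BP algorithm of item 4 replaces the hard 0/1/erased values with probabilities.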
Further notes

This module will be of interest to anyone who wishes to understand how information can be mathematically modelled, measured, compressed, and transmitted. Though not a prerequisite for 3F4, 3F7 provides a good foundation for further study of digital communications.


Data Compression: Build your own CamZIP


This lab can be done remotely in your own time, and there is no need to sign up for a lab slot. The lab is based on material that is covered in Week 3 of the lectures, so you can do the lab any time after that. If you need help, you can sign up for an online support session. Please see the course Moodle for details.

Learning objectives

  • To implement various data compression algorithms in Python/Matlab/Octave
  • To compare the compression performance of different techniques on text files
  • To understand the effects of finite precision implementation on the compression performance of arithmetic coding

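As a starting point for the first objective, a Huffman coder fits in one short function. The sketch below (plain Python, standard library only; it is my own illustration, not the lab's reference code) builds the code tree with a heap of weighted subtrees:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the symbols in `text`.
    Returns a dict mapping each symbol to its bit string."""
    freq = Counter(text)
    # Heap entries are (weight, tiebreak, tree); a tree is either a
    # symbol (leaf) or a (left, right) pair of subtrees.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol source
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    code = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            code[tree] = prefix
    walk(heap[0][2], "")
    return code

code = huffman_code("abracadabra")
print(code["a"])  # the most frequent symbol gets a 1-bit codeword
```

Encoding is then a table lookup per symbol; comparing the total encoded length with the source entropy, and with arithmetic coding, is the kind of experiment the lab asks for.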
Practical information:

  • Students can do the lab in their own time. Scheduled 'helpdesk' sessions will be held during Michaelmas term (times will be announced on Moodle)

Full Technical Report:

Students will have the option to submit a Full Technical Report.


The following are useful references:

  • T. Cover and J. Thomas, Elements of Information Theory, Wiley-Blackwell, 2006.
  • D. MacKay, Information Theory, Inference and Learning Algorithms, Cambridge University Press, 2003 (free electronic copy available for download)
  • T. Richardson and R. Urbanke, Modern Coding Theory, Cambridge University Press, 2008.
  • R. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2012. 

Examination Guidelines

Please refer to Form & conduct of the examinations.

Last modified: 30/05/2023 15:21