
Module Leader
Lecturer
Timing and Structure
Lent term. 16 lectures (including examples classes). Assessment: 100% exam
Prerequisites
3F3; 3F1 and 3F8 are useful
Aims
The aims of the course are to:
- Continue the study of statistical signal processing techniques from the basics studied in 3F3.
- Introduce time-series models, in particular state-space models and hidden Markov models; understand their role in applications of signal processing.
- Develop techniques for fitting statistical models to data and estimating hidden signals from noisy observations.
Objectives
As specific objectives, by the end of the course students should be able to:
- Understand state-space models and hidden Markov models including their mathematical characterisation, strengths and limitations.
- Understand how to execute all the necessary computational tasks involved in fitting the models to data, estimating unobserved quantities and making future predictions.
- Understand the computational methods employed, their mathematical derivation, their strengths and weaknesses, how to execute them, and their use more generally in statistical and data-centric engineering problems.
Content
This course is about fitting statistical models to data that arrives sequentially over time. Once an appropriate model has been fitted, tasks such as predicting future trends or estimating quantities that are not directly observed can be performed. The statistical modelling and computational methodology covered by this course is widely used in many applied areas: data that arrives sequentially over time is a common occurrence in signal processing (engineering), finance, machine learning, environmental statistics, and so on.
The model that most appropriately describes data arriving sequentially over time is a time-series model, an example of which is the ARMA model (studied in 3F3). However, this course will look at more versatile models that incorporate hidden or latent state variables, as these are able to account for richer behaviour. Moreover, models that aim to describe how real physical processes evolve over time often necessarily incorporate unobserved hidden states that form a Markov process.
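To make this concrete, the sketch below simulates the simplest example of such a model: a scalar linear-Gaussian state-space model in which a hidden state evolves as a first-order autoregression (a Markov process) and is only seen through noisy observations. It is an illustrative sketch only; the scalar model, parameter values and variable names are assumptions, not part of the course material.

```python
import numpy as np

# Illustrative parameter values (assumptions, not taken from the course material)
a, sigma_v, sigma_w = 0.9, 1.0, 0.5   # state coefficient, state noise, observation noise
T = 100                               # number of time steps

rng = np.random.default_rng(0)
x = np.zeros(T)                       # hidden Markov state x_t (never observed directly)
y = np.zeros(T)                       # noisy observations y_t
x[0] = rng.normal(0.0, sigma_v)
y[0] = x[0] + rng.normal(0.0, sigma_w)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0.0, sigma_v)   # state transition: x_t = a x_{t-1} + v_t
    y[t] = x[t] + rng.normal(0.0, sigma_w)           # observation:      y_t = x_t + w_t
```

Given only the observations y, the estimation tasks studied in the course amount to recovering information about the hidden sequence x and the model parameters.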
- Introduction to state-space models and optimal linear filtering; the Kalman filter; exemplar problems in signal processing.
- Introduction to hidden Markov models: definition; inference/estimation aims; exact computation of the conditional probability distributions.
- Importance sampling: introduction; weight degeneracy; statistical properties.
- Sequential importance sampling and resampling (also known as the particle filter): application to hidden Markov models; filtering; smoothing (an illustrative sketch follows this list).
- Calibrating hidden Markov models: maximum likelihood estimation and its implementation.
- Exemplar problems in Signal Processing.
- Examples Papers.
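As a rough indication of what the filtering computations listed above involve, the sketch below implements, for the simple scalar model simulated earlier, both the exact Kalman filter and a bootstrap particle filter (sequential importance sampling with multinomial resampling). All function names, parameter values and design choices are illustrative assumptions, not the course's reference implementations.

```python
import numpy as np
from scipy.stats import norm

def kalman_filter(y, a, sigma_v, sigma_w):
    """Exact filtering means/variances for x_t = a x_{t-1} + v_t, y_t = x_t + w_t."""
    T = len(y)
    m, P = np.zeros(T), np.zeros(T)
    m_pred, P_pred = 0.0, sigma_v**2              # prior x_0 ~ N(0, sigma_v^2)
    for t in range(T):
        if t > 0:
            m_pred = a * m[t - 1]                 # predict
            P_pred = a**2 * P[t - 1] + sigma_v**2
        K = P_pred / (P_pred + sigma_w**2)        # Kalman gain
        m[t] = m_pred + K * (y[t] - m_pred)       # update with observation y_t
        P[t] = (1.0 - K) * P_pred
    return m, P

def bootstrap_particle_filter(y, a, sigma_v, sigma_w, N=2000, seed=0):
    """Sequential importance sampling with resampling for the same model."""
    rng = np.random.default_rng(seed)
    T = len(y)
    means = np.zeros(T)
    particles = rng.normal(0.0, sigma_v, size=N)  # sample from the prior on x_0
    for t in range(T):
        if t > 0:
            # Propagate through the state transition (the 'bootstrap' proposal)
            particles = a * particles + rng.normal(0.0, sigma_v, size=N)
        logw = norm.logpdf(y[t], loc=particles, scale=sigma_w)   # importance weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)          # estimate of E[x_t | y_1:t]
        # Multinomial resampling to combat weight degeneracy
        particles = rng.choice(particles, size=N, replace=True, p=w)
    return means

if __name__ == "__main__":
    # Simulate data from the model above and compare the two filters
    a, sigma_v, sigma_w, T = 0.9, 1.0, 0.5, 100
    rng = np.random.default_rng(1)
    x, y = np.zeros(T), np.zeros(T)
    x[0] = rng.normal(0.0, sigma_v)
    y[0] = x[0] + rng.normal(0.0, sigma_w)
    for t in range(1, T):
        x[t] = a * x[t - 1] + rng.normal(0.0, sigma_v)
        y[t] = x[t] + rng.normal(0.0, sigma_w)
    m, _ = kalman_filter(y, a, sigma_v, sigma_w)
    mc = bootstrap_particle_filter(y, a, sigma_v, sigma_w)
    print(np.max(np.abs(m - mc)))   # small for large N: the Monte Carlo filter tracks the exact one
```

For this linear-Gaussian model the Kalman filter gives the exact filtering distribution, so comparing the two outputs is a simple sanity check on the Monte Carlo approximation; in the nonlinear or non-Gaussian hidden Markov models studied in the course, only the particle filter remains applicable.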
Booklists
Please see the Booklist for Group F Courses for references for this module.
Examination Guidelines
Please refer to Form & conduct of the examinations.
UK-SPEC
This syllabus contributes to areas of the UK-SPEC standard.
Last modified: 13/09/2018 15:29