Each taught module lasts two weeks and is led jointly by academics from Imperial and Oxford. There are 3 core modules and 10 optional modules. Students follow the core modules in Year 1. Optional modules are shared across all cohorts to reinforce cross-cohort cohesion. Students follow 1 optional module in each of Years 1 and 4, and 2 optional modules in each of Years 2 and 3.

Our teaching is geared towards training students to become researchers. Typically, the module leaders give a few introductory lectures, and students then work in groups to explore additional aspects of the material; this may take the form of a literature review, a computational or applied data problem, or a combination of these.

Indicative module leaders are given in brackets.

  1. Bayesian Modelling and Computation (Holmes, Kantas, Nicholls). Modern applications often involve high-dimensional data with different modalities (data types). Bayesian modelling is highly valuable in this context, as it provides a framework for incorporating all aspects of uncertainty. The requirement for intensive computation means that the methodology cannot be applied naively; as a result, advanced simulation algorithms have been developed that combine unbiased simulation with classical techniques, for example Markov chain Monte Carlo (MCMC) or sequential Monte Carlo (SMC). The course will review the foundations of Bayesian modelling and present the mainstream simulation methods (MCMC, SMC, etc.), with theoretical as well as practical aspects such as computational cost, efficiency and reproducibility. It will discuss contemporary extensions related to approximate Bayesian computation, intractable likelihoods and scalable inference. The group projects will apply these methods to challenging, realistic applications (see the MCMC sketch after this list).
  2. Statistical Machine Learning (Teh, C. Archambeau – Amazon). Machine learning techniques enable us to automatically extract features from data so as to solve predictive tasks, and they are now used in increasingly varied contexts. There are strong interactions between machine learning and statistics, and statistical machine learning is the symbiosis of these two branches of data science. This course will cover the fundamentals of statistical machine learning, starting from empirical risk minimization, which is at the core of most machine learning procedures, and moving on to key recent advances around kernel methods, deep learning and generative models (see the empirical risk minimization sketch after this list). One of the crucial aspects of successful machine learning procedures is their implementation via efficient algorithms. We will thus present typical optimization algorithms in high dimensions and various ideas around approximate inference.
  3. Modern Statistical Theory (Young, Deligiannidis). Classical statistical theory focuses on a context where the number n of experimental units is large compared to the number p of unknown features or parameters: most statistical theory provides results on estimation and testing in an asymptotic regime where p is fixed and n tends to infinity. This theory is powerful, but it must be adapted to the modern setting of large-scale inference and high-dimensional data, where p is large and/or grows with n. The module will discuss key underlying concepts of modern theory, including sparsity, oracle estimation and inequalities, concentration inequalities, Stein’s method and other mathematical tools. These concepts will be discussed in the context of important components of theory and methodology, for instance large-scale (multiple) testing and estimation, model selection, high-dimensional regression, empirical Bayes strategies and inference after model selection (see the multiple-testing sketch after this list).
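
To make the first module's subject matter concrete, here is a minimal sketch of a random-walk Metropolis sampler (one of the simplest MCMC algorithms) targeting the posterior of a Gaussian mean. The data, prior and step size are illustrative assumptions, not course material.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 50 Gaussian observations with unknown mean and known unit variance.
y = rng.normal(loc=2.0, scale=1.0, size=50)

def log_posterior(mu):
    # N(0, 10^2) prior on mu plus the Gaussian log-likelihood (up to a constant).
    log_prior = -0.5 * (mu / 10.0) ** 2
    log_lik = -0.5 * np.sum((y - mu) ** 2)
    return log_prior + log_lik

def metropolis(n_iter=5000, step=0.5, mu0=0.0):
    # Random-walk Metropolis: propose mu' ~ N(mu, step^2) and accept with
    # probability min(1, posterior ratio).
    mu = mu0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        proposal = mu + step * rng.normal()
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
            mu = proposal
        samples[i] = mu
    return samples

samples = metropolis()
print("posterior mean estimate:", samples[1000:].mean())  # discard burn-in
```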
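For the second module, the sketch below illustrates empirical risk minimization in its simplest form: logistic regression fitted by gradient descent on the average logistic loss. All data and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary classification data (all values illustrative).
X = rng.normal(size=(200, 2))
true_w = np.array([1.5, -2.0])
y = (X @ true_w + 0.5 * rng.normal(size=200) > 0).astype(float)

def empirical_risk(w):
    # Average logistic loss over the training sample (labels mapped to +/-1).
    margins = (X @ w) * (2 * y - 1)
    return np.mean(np.log1p(np.exp(-margins)))

# Gradient descent on the empirical risk.
w = np.zeros(2)
learning_rate = 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))  # predicted probabilities
    grad = X.T @ (p - y) / len(y)       # gradient of the average logistic loss
    w -= learning_rate * grad

print("fitted weights:", w, "empirical risk:", empirical_risk(w))
```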
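Finally, for the third module, here is a sketch of the Benjamini–Hochberg procedure, a standard tool for the large-scale multiple testing mentioned above. The simulated p-values and the FDR level alpha = 0.1 are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def benjamini_hochberg(pvals, alpha=0.1):
    """Return a boolean mask of hypotheses rejected at FDR level alpha."""
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    # Find the largest k with p_(k) <= k * alpha / m; reject the k smallest p-values.
    thresholds = np.arange(1, m + 1) * alpha / m
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()
        rejected[order[: k + 1]] = True
    return rejected

# Toy example: 900 null z-scores and 100 from a shifted alternative.
rng = np.random.default_rng(2)
z = np.concatenate([rng.normal(size=900), rng.normal(loc=3.0, size=100)])
pvals = 1.0 - norm.cdf(z)  # one-sided p-values
print("number of rejections:", benjamini_hochberg(pvals).sum())
```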

Optional modules allow students to explore specific aspects in more depth. Optional courses run every second or third year and are open to all cohorts. The following is a list of optional courses we have offered to our students.

  1. Causality and Graphical Models
  2. Modelling Events
  3. Bayesian Nonparametrics
  4. Conformal Inference
  5. Optimisation
  6. (Deep) Learning Theory and Practice
  7. Reinforcement Learning and Multi-Armed Bandits
  8. Applied Statistics
  9. Selective Inference
  10. Time Series