
Machine Learning

PhD course, 2013


General Information

Data is becoming ever more widely available, and we now have more data than we can handle. This calls for new technology, and the challenge has driven the rapid growth of the machine learning area over the past decade. This course provides an introduction to machine learning, with a focus on dynamical systems. To a large extent this involves probabilistic modeling, which allows a wide range of problems to be addressed within a common framework.


Contents

  • Linear regression
  • Linear classification
  • Neural networks
  • Support vector machines
  • Expectation Maximization (EM)
  • Clustering
  • Approximate inference (variational Bayes and expectation propagation)
  • Graphical models
  • Boosting
  • Sampling methods and MCMC
  • Bayesian nonparametric (BNP) models

Organization and Examination

The course gives 9 hp (you can receive an additional 3 hp by carrying out a project).
  • Lectures: 11
The examination consists of a standard written three-day (72 h) exam. The exam period is March 17 - April 26, 2013.

Course Literature

The main book used during the course is:
[B] Christopher M. Bishop. Pattern Recognition and Machine Learning, Springer, 2006.

We will also make use of:
[HTF] Trevor Hastie, Robert Tibshirani and Jerome Friedman. The Elements of Statistical Learning: Data Mining, Inference and Prediction, Second edition, Springer, 2009.

Recommended supplementary reading

Periodicity

Every 2 years.

Prerequisites

Basic undergraduate courses in linear algebra, statistics, and signals and systems.

Related Courses

Computational inference in dynamical systems, System identification.

Contact Person

Dr Thomas Schön, tel 013 - 281373, email: schon_at_isy.liu.se.

