Bayesian Signal Processing

Short Course at Automatic Control, LiTH, 2008-03-10 -- 2008-03-13.

Speaker: Dr Anthony Quinn, Department of Electronic and Electrical Engineering, University of Dublin, Trinity College, Ireland

Abstract: The vast majority of tasks we seek to address in statistical signal processing are inductive inference tasks. In other words, we are seeking knowledge in conditions of uncertainty. Probability quantifies degrees of belief amid uncertainty, and the calculus of probability is available to us as a consistent framework for manipulating degrees of belief in order to arrive at answers to problems of interest. This is the Bayesian paradigm that I will advocate in this course. It presents some challenges, but greater rewards, and the aim of the course is to examine these challenges and rewards critically in the context of signal processing.
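
As a concrete anchor for this calculus of beliefs (in generic notation, not the course's own), Bayes' rule prescribes how a prior degree of belief in the unknowns \theta is updated by observed data D:

    p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)},
    \qquad p(D) = \int p(D \mid \theta)\, p(\theta)\, d\theta .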

In terms of challenges, perhaps the greatest is to overcome the frequentist mindset that still dominates the signal processing field. Thereafter, we must elicit probability functions for all unknowns, most notoriously expressed in the need for priors. Finally, we must develop tractable procedures for computing and manipulating probability functions. A main aim of the course will be to present the Variational Bayes (VB) method for approximating distributions, and to examine how it contrasts with, and complements, the stochastic approximations that dominate Bayesian signal processing at present.
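
To make the VB idea concrete: in its standard free-form (mean-field) setting, a joint posterior over unknowns \theta_1 and \theta_2 is approximated by a factorized q(\theta_1)\, q(\theta_2) chosen to minimize the Kullback-Leibler divergence KL(q \| p). The optimal factors satisfy the coupled updates (a generic sketch, with illustrative notation rather than the course's own):

    \ln q^*(\theta_1) = \mathrm{E}_{q(\theta_2)}\!\left[ \ln p(D, \theta_1, \theta_2) \right] + \text{const},
    \qquad
    \ln q^*(\theta_2) = \mathrm{E}_{q(\theta_1)}\!\left[ \ln p(D, \theta_1, \theta_2) \right] + \text{const},

which are iterated to convergence, yielding a deterministic alternative to sampling-based approximation.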

The reward of such effort is, first and foremost, that the Bayesian approach is a principled and prescriptive pathway to solving signal processing problems properly. Where non-Bayesian solutions are consistent, they can always be characterized as special cases of Bayesian solutions. The unique armoury of the Bayesian includes, of course, the prior, which can be used to regularize an inference and to exploit external information. This is well known. Less well known, but perhaps more powerful, is the availability of the marginalization operator, conferred uniquely because probability functions are measures. Among the compelling advantages of marginalization are the automatic embrace of Ockham's Razor and the ability to compare composite (set) hypotheses, including competing model structures.
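
As a generic illustration of marginalization at work (a sketch, not one of the course's worked problems): to compare two model structures M_1 and M_2, each model's parameters \theta_i are integrated out, and the posterior odds follow from the marginal likelihoods:

    \frac{p(M_1 \mid D)}{p(M_2 \mid D)} = \frac{p(D \mid M_1)}{p(D \mid M_2)} \cdot \frac{p(M_1)}{p(M_2)},
    \qquad p(D \mid M_i) = \int p(D \mid \theta_i, M_i)\, p(\theta_i \mid M_i)\, d\theta_i .

An over-flexible model must spread its prior mass over many possible datasets, so its marginal likelihood is automatically penalized: this is the embrace of Ockham's Razor referred to above.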

All these ideas will be explored in this course, and illustrated via important representative problems, such as sinusoidal identification, principal component analysis and nonlinear filtering. Radiotherapy, functional medical imaging and speech processing will be among the applications to be considered.

The main sections of the course will be:

  • How to be a Bayesian (Bayesian Ways)
  • Why to be a Bayesian (Bayesian Gains)
  • A Question of Priors
  • The Need for Approximation: The Variational Bayes Method
  • Going On-Line: Nonlinear Filtering and Variational Bayesian Filtering