Speaker: Tim Hoheisel
Speaker Affiliation: McGill University, Department of Mathematics and Statistics

IAM Distinguished Colloquium


Abstract: 

The principle of maximum entropy states that the probability distribution that best represents the current state of knowledge about a system is the one with the largest entropy with respect to a given prior (data) distribution. It was first formulated in the context of statistical physics in two seminal papers by E. T. Jaynes (Physical Review, Series II, 1957), and thus constitutes an information-theoretic manifestation of Occam’s razor. We bring the idea of maximum entropy to bear in the context of linear inverse problems: we solve for the probability measure that is close to the (learned or chosen) prior and whose expectation has a small residual with respect to the observation. Duality leads to tractable, finite-dimensional (dual) problems. A core tool, which we then show to be useful beyond the linear inverse problem setting, is the “MEMM functional”: it is an infimal projection of the Kullback-Leibler divergence and a linear equation, which coincides with Cramér’s function (ubiquitous in the theory of large deviations) in most cases, and is paired in duality with the cumulant generating function of the prior measure. Numerical examples underline the efficacy of the presented framework.
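Schematically, with a prior measure μ, forward operator A, data b, and fidelity weight α > 0 (notation and the quadratic fidelity term chosen here only for illustration; the talk's precise formulation may differ), the functional described above and its dual pairing can be sketched as

\[
  \kappa_\mu(y) \;=\; \inf_{P}\bigl\{\, \mathrm{KL}(P \,\|\, \mu) \;:\; \mathbb{E}_P[X] = y \,\bigr\},
  \qquad
  \kappa_\mu \;=\; (\log M_\mu)^{*}, \quad M_\mu(z) \;=\; \mathbb{E}_\mu\bigl[e^{\langle z, X\rangle}\bigr],
\]

that is, an infimal projection of the Kullback-Leibler divergence over a linear (mean) constraint, which under standard assumptions is the convex conjugate of the cumulant generating function \(\log M_\mu\) of the prior. A regularized linear inverse problem and its finite-dimensional Fenchel dual then read, schematically,

\[
  \min_{y}\; \tfrac{\alpha}{2}\,\|Ay - b\|^{2} + \kappa_\mu(y)
  \quad\longleftrightarrow\quad
  \max_{\lambda}\; \langle b, \lambda\rangle \;-\; \tfrac{1}{2\alpha}\,\|\lambda\|^{2} \;-\; \log M_\mu\bigl(A^{\top}\lambda\bigr).
\]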

This is joint work with Rustum Choksi (McGill), Ariel Goodwin (McGill), and Carola-Bibiane Schönlieb (Cambridge).

Event Details

September 26, 2022

3:00pm

LSK 306

Vancouver, BC, CA
V6T 1Z2


Categories

  • IAM seminars and colloquia