Math 605D Tensor decompositions and their applications

Elina Robeva, The University of British Columbia, Fall 2020

Course Information

Class time: TTh 9:30am - 11:00am Pacific Time; Sept 10 - Dec 8, 2020

Location: Zoom; please fill out this form in order to be added to the class mailing list

Instructor: Elina Robeva; erobeva@math.ubc.ca

Prerequisites: Linear algebra (e.g., one of Math 221, 223, 307), Probability theory (e.g., one of Math 302, 318)

Grading: Final project: 50%; Homework: 40%; Scribing: 5%; Participation: 5%.

Overview

This is a graduate course designed to introduce tensors (or multi-dimensional arrays) and their uses in statistics and machine learning. In particular, we will study fundamental theoretical properties of several types of tensor decompositions, including the CP decomposition, nonnegative matrix and tensor decompositions, the Tucker decomposition, as well as tensor network decompositions arising from physics. We will see how these naturally come up in hidden variable models, Gaussian mixture models, directed and undirected graphical models, blind source separation, independent component analysis, and quantum physics. We will discuss algorithms for computing such decompositions and highlight open problems along the way.
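To give a flavor of the first topic: a rank-r CP decomposition writes a 3-way tensor T as a sum of r rank-1 terms, T[i, j, k] = Σ_s A[i, s] B[j, s] C[k, s], for factor matrices A, B, C. The following NumPy sketch (not part of the course materials; variable names are illustrative) constructs such a tensor and checks a basic consequence, namely that its mode-1 unfolding has rank at most r:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 2                                  # CP rank of the constructed tensor
A = rng.standard_normal((3, r))        # factor matrix for mode 1
B = rng.standard_normal((4, r))        # factor matrix for mode 2
C = rng.standard_normal((5, r))        # factor matrix for mode 3

# Sum of r rank-1 terms a_s (outer) b_s (outer) c_s, written via einsum:
# T[i, j, k] = sum_s A[i, s] * B[j, s] * C[k, s].
T = np.einsum('is,js,ks->ijk', A, B, C)

# Flattening mode 1 against the other two modes gives a 3 x 20 matrix
# whose rank is at most r (and equals r for generic factors).
unfolding = T.reshape(3, 4 * 5)
print(T.shape, np.linalg.matrix_rank(unfolding))
```

Recovering the factor matrices A, B, C from T alone is the CP decomposition problem, and its uniqueness and algorithmic aspects occupy the first block of lectures.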

Specific topics include:

  • CP decomposition - algorithms, applications, properties (about 10 lectures)

  • Tensor network decompositions (2 lectures)

  • Undirected graphical models (2 lectures)

  • Nonnegative matrix and tensor decompositions (4 lectures)

  • Total positivity (1 lecture)

  • Directed graphical models (2 lectures)

  • Linear structural equation models and independent component analysis (2 lectures)

For a more detailed list of topics, please refer to the syllabus page or the PDF.