Instructor: Yaniv Plan
Office: 1219 Math Annex
Email:  yaniv (at) math (dot) ubc (dot) ca

Lectures: TuTh, 3:30 – 5:00pm

Office hours: TBD.

Prerequisites: The course assumes knowledge of linear algebra (and some functional analysis), as well as strong probabilistic intuition.  For example, I will assume familiarity with stochastic processes, norms, singular values, and Lipschitz functions.

Overview:  We study the tools and concepts of high-dimensional probability that support the theoretical foundations of compressed sensing; these tools also apply to many other problems in machine learning and data science.
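To give a concrete flavor of the kind of phenomenon the course studies: an s-sparse signal in d dimensions can typically be recovered from only about s log(d) random Gaussian measurements. The sketch below illustrates this numerically using orthogonal matching pursuit, a standard greedy recovery algorithm. This is purely illustrative (the dimensions and the choice of algorithm are my own, not part of the course outline):

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions: n measurements of a d-dimensional, s-sparse signal.
n, d, s = 80, 200, 5

# Random Gaussian measurement matrix with normalized columns.
A = rng.standard_normal((n, d)) / np.sqrt(n)

# Random s-sparse signal.
x = np.zeros(d)
true_support = rng.choice(d, size=s, replace=False)
x[true_support] = rng.standard_normal(s)

y = A @ x  # noiseless linear measurements

def omp(A, y, s):
    """Orthogonal matching pursuit: greedily build the support set."""
    residual = y.copy()
    support = []
    for _ in range(s):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the columns chosen so far.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, s)
print("recovery error:", np.linalg.norm(x_hat - x))
```

With these parameters, recovery is exact (up to numerical precision) with high probability over the draw of A; quantifying when and why such random matrices allow sparse recovery is one of the central questions the course's probabilistic tools address.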

Detailed course outline: See here. (We probably won’t cover all of those topics.)

Textbook:  There is no required textbook.  The following references cover some of the material, and they are available online:

  1. R. Vershynin, High-dimensional probability.  This book has the most overlap with our course. (Our course begins by following an earlier course of Vershynin’s on high-dimensional probability.)
  2. T. Tao, Topics in random matrix theory.
  3. D. Chafaï, O. Guédon, G. Lecué, A. Pajor, Interactions between compressed sensing, random matrices and high dimensional geometry, preprint.
  4. R. Vershynin, Lectures in geometric functional analysis.
  5. R. Adler, J. Taylor, Random fields and geometry.
  6. S. Foucart, H. Rauhut, A mathematical introduction to compressive sensing.
  7. J. Lee, a nicely presented proof of the majorizing measures theorem, which gives the lower bound in generic chaining.
  8. Theory of deep learning class at UdeM.
  9. Berkeley 2-month program on theory of deep learning, summer 2019.
  10. Earlier version of this course, which contains a series of notes.  For the beginning of the course, we will roughly follow the same notes.

Grading: Students will complete a class project (ideally in teams of 3 to 4). Instructions:

  1. Determine a “mini-research problem” related to the class material that you wish to investigate. This should have a theory component, but may also include numerical simulations. It may also consist largely of literature review if your problem is (mostly) solved in prior literature. In the latter case, focus on clarifying what open questions remain. Please run your idea(s) by me by Feb 27.
  2. Make what progress you can towards solving it.
  3. Write up your results in about 3-5 pages, plus references (and pictures). Write-ups are due on April 20. Here is an example.
  4. Give a 20 minute presentation of your results in class (this will happen in late March and early April). All members of the group should contribute to the presentation and discuss the parts of the project that they worked on most.