Probability in high dimensions

Instructor: Yaniv Plan
Office: 1219 Math Annex
Email:  yaniv (at) math (dot) ubc (dot) ca

Lectures: MWF 10:00 am – 11:00 am.  MATX 1118.

Office hours: By request.

Prerequisites: Graduate probability, undergraduate linear algebra, undergraduate functional analysis.  For example, I will assume you have familiarity with stochastic processes, probability spaces, norms, and Lipschitz functions.

Overview:  See here.

Detailed course outline: See here.

Textbook:  This course is based on a similar course by Roman Vershynin.  There is no required textbook.  The following references cover some of the material, and they are available online:

  1. R. Vershynin, Introduction to the non-asymptotic analysis of random matrices. In: Compressed Sensing: Theory and Applications, eds. Yonina Eldar and Gitta Kutyniok. Cambridge University Press, to appear.
  2. R. Vershynin, A short course on the non-asymptotic analysis of random matrices.
  3. T. Tao, Topics in random matrix theory.
  4. D. Chafaï, O. Guédon, G. Lecué, A. Pajor, Interactions between compressed sensing, random matrices and high dimensional geometry, preprint.
  5. R. Vershynin, Lectures in geometric functional analysis.
  6. R. Adler, J. Taylor, Random fields and geometry.
  7. S. Foucart, H. Rauhut, A mathematical introduction to compressive sensing.

Homework:  Homework will be assigned from the lecture notes.  Depending on how the class develops, we may substitute a project for some homework assignments.

Note 1:  If you think there is something interesting that could be proven or explored using tools we have learned in class, and that has not been covered in class or on homework, then tell me!  Perhaps we can make it a bonus assignment, and/or discuss it as a class.

Note 2:  Do extra stuff!  As mentioned in class, completing all of the assigned work perfectly is not sufficient for a perfect mark.  There are many possibilities for extra work.  You may, for example, find a project related to your own research and to what we are learning in class.  (Also, see Note 1.)

Due Friday 9/25:  All homework problems (not exercises) from Lectures 1-4.

Due Friday 10/9:  All homework problems (not exercises) from Lectures 5-7.

Due Monday 11/2:  All homework problems (not exercises) from Lectures 8-12.

Due Monday 11/23:  All homework problems from Lecture 14.

Lecture notes:

Compressed Sensing presentation.

Concentration of measure:

Lecture 1.  Central limit theorem.  Hoeffding's inequality.  (1 homework question.)
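
For reference, one standard form of Hoeffding's inequality (the normalization of constants may differ slightly from the notes): if X_1, ..., X_n are independent with a_i <= X_i <= b_i almost surely, then

    \[ \mathbb{P}\left( \left| \sum_{i=1}^n (X_i - \mathbb{E}X_i) \right| \ge t \right) \le 2 \exp\left( - \frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right), \qquad t \ge 0. \]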

Lecture 2.  Sub-Gaussian random variables.  (2 homework questions.)
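
One convenient definition among several equivalent ones (the equivalences hold up to absolute constants, as in reference 2): X is sub-Gaussian when its sub-Gaussian norm

    \[ \|X\|_{\psi_2} = \sup_{p \ge 1} p^{-1/2} \left( \mathbb{E}|X|^p \right)^{1/p} \]

is finite; equivalently, \mathbb{P}(|X| \ge t) \le 2\exp(-c t^2 / \|X\|_{\psi_2}^2) for all t \ge 0, with c an absolute constant.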

Lecture 3.  Hoeffding's inequality and Khintchine's inequality for sub-Gaussian random variables.  (0 homework questions.)

Lecture 4. Sub-exponential random variables.  Bernstein’s inequality.  (2 homework questions.)
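
One common form of Bernstein's inequality (constants vary by source): if X_1, ..., X_n are independent, mean-zero, sub-exponential random variables with K = max_i \|X_i\|_{\psi_1}, then

    \[ \mathbb{P}\left( \left| \sum_{i=1}^n X_i \right| \ge t \right) \le 2 \exp\left( -c \min\left( \frac{t^2}{K^2 n}, \frac{t}{K} \right) \right), \qquad t \ge 0, \]

where c > 0 is an absolute constant.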

Lecture 5.  Concentration of measure on the sphere.  (0 homework questions.)

Lecture 6.  Isoperimetric inequality.  Concentration of Lipschitz functions.  (1 homework question.)
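
The shape of the statement (the constant c, and the choice of mean versus median, vary by source): if f : S^{n-1} -> R is L-Lipschitz and X is uniform on S^{n-1}, then

    \[ \mathbb{P}\left( |f(X) - \mathbb{E}f(X)| \ge t \right) \le 2 \exp\left( - \frac{c\, n\, t^2}{L^2} \right), \qquad t \ge 0. \]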

Lecture 7.  Consequences of concentration.  Johnson-Lindenstrauss lemma.  (2 homework questions.)
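
A quick numerical illustration of the Johnson-Lindenstrauss lemma, as a minimal NumPy sketch (the target dimension m ~ C log(N)/eps^2 and the 1/sqrt(m) scaling are the standard choices; the constant 8 below is an arbitrary safe pick, not one from the notes):

    import numpy as np

    rng = np.random.default_rng(0)
    n, N, eps = 1000, 50, 0.5                      # ambient dimension, number of points, distortion
    m = int(np.ceil(8 * np.log(N) / eps**2))       # projection dimension

    X = rng.standard_normal((N, n))                # an arbitrary point cloud
    A = rng.standard_normal((m, n)) / np.sqrt(m)   # scaled Gaussian random projection
    Y = X @ A.T

    def pairwise_dists(Z):
        # all pairwise Euclidean distances between rows of Z
        D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
        return D[np.triu_indices(len(Z), k=1)]

    ratios = pairwise_dists(Y) / pairwise_dists(X)
    print(f"m = {m}; distance ratios lie in [{ratios.min():.3f}, {ratios.max():.3f}]")
    # With high probability every ratio lies in [1 - eps, 1 + eps].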

Non-asymptotic random matrix theory:

Lecture 8.  Simple covering argument.  Norm of a random matrix.  (2 open-ended homework questions.)
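
The flavor of bound the covering argument yields (C an absolute constant; the notes may track it explicitly): if A is an m x n matrix with independent, mean-zero, sub-Gaussian entries satisfying \|A_{ij}\|_{\psi_2} \le K, then

    \[ \mathbb{E}\, \|A\| \le C K \left( \sqrt{m} + \sqrt{n} \right). \]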

Lecture 9.  Chaining argument:  Dudley’s inequality.  Norm of a random matrix.  (0 homework questions.)
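
For reference, the entropy-integral form of Dudley's inequality (this is also the statement the erratum below concerns, so treat constants with care): for a mean-zero Gaussian process (X_t)_{t \in T} with canonical metric d(s,t) = (\mathbb{E}(X_s - X_t)^2)^{1/2},

    \[ \mathbb{E} \sup_{t \in T} X_t \le C \int_0^\infty \sqrt{\log N(T, d, \varepsilon)} \, d\varepsilon, \]

where N(T, d, \varepsilon) is the covering number of T at scale \varepsilon.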

Lecture 10.  Slepian’s inequality.  Sharp bounds on singular values of a Gaussian matrix.  (1 homework question.)  [For a nice proof of Slepian’s inequality, see Random fields and geometry.]

Lecture 11.  Application of Dudley’s inequality:  RIP.  (1 homework question.)
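
The property in question (standard across the references above): an m x n matrix A satisfies the restricted isometry property of order s with constant \delta_s if

    \[ (1 - \delta_s) \|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta_s) \|x\|_2^2 \quad \text{for every } s\text{-sparse } x \in \mathbb{R}^n. \]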

Lecture 12.  Generic chaining.  (0 homework questions.)
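
The key quantity here is Talagrand's \gamma_2 functional (the definition below is the standard one): for a metric space (T, d),

    \[ \gamma_2(T, d) = \inf \sup_{t \in T} \sum_{n \ge 0} 2^{n/2} \, d(t, T_n), \]

where the infimum runs over all sequences of subsets T_n \subseteq T with |T_0| = 1 and |T_n| \le 2^{2^n}.  The generic chaining bound then reads \mathbb{E} \sup_{t \in T} X_t \le C \gamma_2(T, d) for a mean-zero Gaussian process with canonical metric d.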

Lecture 13.  Majorizing measures theorem (part of the proof).  (0 homework questions.)

Lecture 14.  Application:  Generalized compressed sensing.  (2 homework questions.)

Lecture 15.  Generalized compressed sensing continued.  (0 homework questions.)

Lecture 16.  Review + more covering numbers.  (0 homework questions.)
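
A covering-number estimate worth keeping in mind (the standard volumetric argument): for the Euclidean unit ball B_2^n and any \varepsilon \in (0, 1],

    \[ N(B_2^n, \varepsilon) \le \left( 1 + \frac{2}{\varepsilon} \right)^n \le \left( \frac{3}{\varepsilon} \right)^n. \]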

Lecture 17.  Matrix Bernstein inequality.  (0 homework questions.)
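
One common form (constants vary by source): if X_1, ..., X_N are independent, mean-zero, symmetric n x n random matrices with \|X_i\| \le K almost surely, and \sigma^2 = \left\| \sum_i \mathbb{E} X_i^2 \right\|, then

    \[ \mathbb{P}\left( \left\| \sum_{i=1}^N X_i \right\| \ge t \right) \le 2n \exp\left( - \frac{t^2/2}{\sigma^2 + Kt/3} \right), \qquad t \ge 0. \]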

Lecture 18.  Application: Covariance estimation.  (0 homework questions.)
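
A small numerical sketch of the phenomenon (the Gaussian data and the diagonal covariance below are arbitrary hypothetical choices): the operator-norm error of the sample covariance should scale roughly like sqrt(n/N) once N is at least of order n.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 50
    Sigma = np.diag(np.linspace(1.0, 2.0, n))      # true covariance (hypothetical choice)
    for N in (100, 1000, 10000):
        X = rng.multivariate_normal(np.zeros(n), Sigma, size=N)
        Sigma_N = X.T @ X / N                      # sample covariance (mean known to be zero)
        err = np.linalg.norm(Sigma_N - Sigma, 2)   # operator-norm error
        print(f"N = {N:6d}: error = {err:.3f}, sqrt(n/N) = {np.sqrt(n / N):.3f}")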

Lecture 19.  Application:  Matrix completion.  (0 homework questions.)

Lecture 20.  Dvoretzky-Milman Theorem.  (0 homework questions.)

Lecture 21.  Compressed sensing with noise.  Precise analysis of ℓ1 minimization.  (0 homework questions.)
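
The noiseless case makes a quick experiment, as a minimal sketch via linear programming (the reformulation x = u - v with u, v >= 0 is the standard one; the dimensions below are arbitrary):

    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(1)
    n, m, s = 200, 80, 5                              # signal length, measurements, sparsity
    x0 = np.zeros(n)
    x0[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
    A = rng.standard_normal((m, n)) / np.sqrt(m)      # Gaussian measurement matrix
    y = A @ x0

    # min ||x||_1 subject to Ax = y, written as an LP in (u, v) with x = u - v:
    c = np.ones(2 * n)
    res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    x_hat = res.x[:n] - res.x[n:]
    print(f"recovery error: {np.linalg.norm(x_hat - x0):.2e}")  # exact recovery expected here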

Lecture 22.  Fano’s inequality, proven using “probability in high dimensions” tools, rather than the standard information theoretic approach.  Minimax error in compressed sensing model.  (0 homework questions.)
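
The inequality in its usual form (standard in minimax lower bounds): if \theta is uniform on a set of M \ge 2 hypotheses and \hat{\theta} is any estimator based on an observation Y, then

    \[ \mathbb{P}(\hat{\theta} \ne \theta) \ge 1 - \frac{I(\theta; Y) + \log 2}{\log M}, \]

where I(\theta; Y) is the mutual information.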

Some errata are noted below:

Lecture 9:  Statement of Dudley’s inequality.
Lectures 5-6:  Definitions used in the proof of concentration on the sphere.
All lectures:  Some numerical constants are missing or inaccurate.

Please ask if you would like to see these fixed.