Probability in high dimensions

Instructor: Yaniv Plan
Office: 1219 Math Annex
Email: yaniv (at) math (dot) ubc (dot) ca

Lectures: MWF 10:00 am – 11:00 am, MATX 1118.

Office hours: By request.

Prerequisites: Graduate probability, undergraduate linear algebra, and undergraduate functional analysis. For example, I will assume familiarity with stochastic processes, probability spaces, norms, and Lipschitz functions.

Overview: See here.

Detailed course outline: See here.

Textbook: This course is based on a similar course by Roman Vershynin. There is no required textbook. The following references cover some of the material and are available online:

1. R. Vershynin, Introduction to the non-asymptotic analysis of random matrices. In: Compressed Sensing: Theory and Applications, eds. Yonina Eldar and Gitta Kutyniok. Cambridge University Press, to appear.
2. R. Vershynin, A short course on the non-asymptotic analysis of random matrices.
3. T. Tao, Topics in random matrix theory.
4. D. Chafai, O. Guedon, G. Lecue, A. Pajor, Interactions between compressed sensing, random matrices and high dimensional geometry, preprint.
5. R. Vershynin, Lectures in geometric functional analysis.
6. R. Adler, J. Taylor, Random fields and geometry.
7. S. Foucart, H. Rauhut, A mathematical introduction to compressive sensing.

Homework: Homework will be assigned from the lecture notes. Depending on how the class develops, we may substitute a project for some homework assignments.

Note 1: If you think that there is something interesting which can be proven or explored based on tools we have learned in class, and which has not been covered in class or on homework, then tell me! Perhaps we can make that a bonus assignment and/or discuss it as a class.

Note 2: Do extra stuff! As mentioned in class, completing all of the assigned work perfectly is not sufficient for a perfect mark. There are many possibilities for extra work. You may, for example, find a project related to your own research and to what we are learning in class. (Also, see Note 1.)

Due Friday 9/25: All homework problems (not exercises) from Lectures 1-4.

Due Friday 10/9: All homework problems (not exercises) from Lectures 5-7.

Due Monday 11/2: All homework problems (not exercises) from Lectures 8-12.

Due Monday 11/23: All homework problems from Lecture 14.

Lecture notes:

Compressed Sensing presentation.

Concentration of measure:

Lecture 1. Central limit theorem. Hoeffding's inequality. (1 homework question.)
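As a quick numerical illustration (a sketch, not part of the course notes): for a sum S_n of n independent Rademacher (±1) signs, Hoeffding's inequality in its standard two-sided form gives P(|S_n| ≥ tn) ≤ 2 exp(-n t²/2). A small simulation can compare the empirical tail to the bound; the parameter choices below are only for illustration.

```python
import math
import random

random.seed(0)

n = 100        # number of +/-1 (Rademacher) summands
t = 0.3        # deviation threshold for |S_n| / n
trials = 10000

# Empirical frequency of the large-deviation event {|S_n| >= t*n}.
hits = 0
for _ in range(trials):
    s = sum(random.choice((-1, 1)) for _ in range(n))
    if abs(s) >= t * n:
        hits += 1
empirical = hits / trials

# Hoeffding's two-sided bound for summands in [-1, 1]: 2*exp(-n*t^2/2).
bound = 2 * math.exp(-n * t * t / 2)

print(f"empirical tail: {empirical:.4f}, Hoeffding bound: {bound:.4f}")
```

The empirical tail (roughly the two-sided 3-sigma probability here) should sit well below the bound of about 0.022, as the inequality guarantees.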

Lecture 2. Sub-Gaussian random variables. (2 homework questions.)

Lecture 3. Hoeffding's inequality and Khintchine's inequality for sub-Gaussian random variables. (0 homework questions.)

Lecture 4. Sub-exponential random variables. Bernstein's inequality. (2 homework questions.)

Lecture 5. Concentration of measure on the sphere. (0 homework questions.)

Lecture 6. Isoperimetric inequality. Concentration of Lipschitz functions. (1 homework question.)

Lecture 7. Consequences of concentration. Johnson-Lindenstrauss lemma. (2 homework questions.)
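A brief illustration of the Johnson-Lindenstrauss lemma (an informal sketch, not taken from the lecture notes; the dimensions and point count below are arbitrary choices): projecting points through a Gaussian matrix with i.i.d. N(0, 1/k) entries approximately preserves pairwise Euclidean distances.

```python
import math
import random

random.seed(1)

d, k = 600, 300     # ambient and target dimensions (illustrative only)
num_points = 5

# A few random points in R^d.
points = [[random.gauss(0, 1) for _ in range(d)] for _ in range(num_points)]

# Random projection: k x d matrix of i.i.d. N(0, 1/k) entries, so that
# squared Euclidean lengths are preserved in expectation.
proj = [[random.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]

def project(x):
    return [sum(r[i] * x[i] for i in range(d)) for r in proj]

def dist(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

projected = [project(p) for p in points]

# Pairwise distance ratios should concentrate near 1.
for i in range(num_points):
    for j in range(i + 1, num_points):
        ratio = dist(projected[i], projected[j]) / dist(points[i], points[j])
        print(f"pair ({i},{j}): ratio = {ratio:.3f}")
```

With k = 300 the typical distortion is on the order of sqrt(2/k), a few percent, so all ratios land close to 1.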

Non-asymptotic random matrix theory:

Lecture 8. Simple covering argument. Norm of a random matrix. (2 open-ended homework questions.)
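A small numerical sanity check (a sketch assuming NumPy is available; not part of the notes): for an n×n matrix with i.i.d. standard Gaussian entries, the operator norm is of order √n, and in fact concentrates near 2√n, consistent with the covering-argument bound up to the constant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Operator (spectral) norm of an n x n standard Gaussian matrix,
# rescaled by sqrt(n); the ratio settles near 2 as n grows.
for n in (100, 200, 400):
    A = rng.standard_normal((n, n))
    ratio = np.linalg.norm(A, 2) / np.sqrt(n)
    print(f"n = {n}: ||A|| / sqrt(n) = {ratio:.3f}")
```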

Lecture 9. Chaining argument: Dudley's inequality. Norm of a random matrix. (0 homework questions.)

Lecture 10. Slepian's inequality. Sharp bounds on singular values of a Gaussian matrix. (1 homework question.) [For a nice proof of Slepian's inequality, see Random fields and geometry.]

Lecture 11. Application of Dudley's inequality: RIP. (1 homework question.)

Lecture 12. Generic chaining. (0 homework questions.)

Lecture 13. Majorizing measures theorem (part of the proof). (0 homework questions.)

Lecture 14. Application: Generalized compressed sensing. (2 homework questions.)

Lecture 15. Generalized compressed sensing continued. (0 homework questions.)

Lecture 16. Review + more covering numbers. (0 homework questions.)

Lecture 17. Matrix Bernstein inequality. (0 homework questions.)

Lecture 18. Application: Covariance estimation. (0 homework questions.)

Lecture 19. Application: Matrix completion. (0 homework questions.)

Lecture 20. Dvoretzky-Milman theorem. (0 homework questions.)

Lecture 21. Compressed sensing with noise. Precise analysis of ℓ_1 minimization. (0 homework questions.)

Lecture 22. Fano's inequality, proven using "probability in high dimensions" tools rather than the standard information-theoretic approach. Minimax error in the compressed sensing model. (0 homework questions.)

Some errata are listed below:

Lecture 9: Statement of Dudley's inequality.
Lectures 5-6: Definitions used in the proof of concentration on the sphere.
All lectures: Numerical constants may be missing or inaccurate.