Remarkable phenomena of deep learning through the prism of interpolation.

Talk
Mikhail Belkin
Time: 09.06.2022, 15:30 to 16:30

Zoom link: https://umd.zoom.us/j/95197245230?pwd=cDRlVWRVeXBHcURGQkptSHpIS0VGdz09
Password: 828w
In the past decade, the mathematical theory of machine learning has lagged far behind the successes of deep neural networks on practical challenges. In this lecture I will outline why the practice of neural networks has precipitated a crisis in the theory of machine learning and a rethinking of certain basic assumptions. I will discuss how the concept of interpolation (fitting the training data exactly) clarifies many of the underlying issues, leading to new theoretical analyses. Finally, I will briefly mention some new results showing how interpolating predictors may relate to practical settings where the training loss, while small, is not usually driven to zero.
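As a minimal illustration of what interpolation means here (a sketch for intuition, not taken from the talk or the referenced papers; the toy data, Gaussian kernel, and kernel width are assumptions chosen for the example), the following Python snippet fits noisy one-dimensional data exactly using ridgeless kernel regression, so the training loss is driven to zero:

```python
import numpy as np

# Illustrative sketch (assumed example, not from the talk): a kernel
# predictor with no ridge penalty interpolates noisy training data,
# i.e., achieves zero training error.

rng = np.random.default_rng(0)
n = 20
X = np.sort(rng.uniform(-1, 1, n))             # 1-D training inputs
y = np.sin(3 * X) + 0.1 * rng.normal(size=n)   # noisy targets

def gauss_kernel(a, b, width=0.2):
    # Gaussian (RBF) kernel matrix between two sets of 1-D points.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * width ** 2))

K = gauss_kernel(X, X)
alpha = np.linalg.solve(K, y)   # no ridge term: the solution interpolates

def predict(x_new):
    return gauss_kernel(x_new, X) @ alpha

# Training error is zero up to numerical precision: predict(X) == y.
print("max train error:", np.max(np.abs(predict(X) - y)))
```

Despite fitting the noise exactly, such interpolating predictors can still behave sensibly between the training points, which is one of the phenomena the theory discussed in the talk seeks to explain.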
The main reference paper is the review: https://arxiv.org/abs/2105.14368
I will also briefly discuss some of the results in https://arxiv.org/abs/2207.11621 and https://arxiv.org/abs/2010.01851.