Maths & Stats Colloquium with Professor Alexandre Tsybakov

Please register your place here.
Title: Gradient-free stochastic optimization
Abstract: This talk will deal with optimization problems in a statistical learning setup where the learner has no access to unbiased estimators of the gradient of the objective function. This setting includes stochastic optimization with a zero-order oracle, as well as continuum bandit and contextual continuum bandit problems. I will give an overview of recent results on minimax optimal algorithms and fundamental limits for these problems.
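To give a concrete flavour of the zero-order setting described in the abstract, the sketch below shows one standard gradient-free scheme: the gradient is estimated from two noisy function evaluations along a random direction, and a plain stochastic gradient step is taken. This is only a minimal illustration under assumed step and smoothing schedules; the function and parameter names are hypothetical, and it is not the specific estimator or the minimax-optimal algorithm analysed in the talk.

```python
import numpy as np

def zero_order_sgd(f, x0, n_iter=1000, step=0.1, smoothing=0.05, seed=None):
    """Gradient-free stochastic optimization with a two-point zero-order oracle.

    At each iteration the gradient of f is approximated from two (possibly
    noisy) function values along a random direction on the unit sphere,
    and a gradient step with decreasing step size is taken.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    for t in range(1, n_iter + 1):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)                    # random direction on the sphere
        h = smoothing / np.sqrt(t)                # shrinking smoothing parameter
        # Two-point gradient estimate from function values only.
        g = d * (f(x + h * u) - f(x - h * u)) / (2 * h) * u
        x -= (step / np.sqrt(t)) * g              # decreasing step size
    return x

# Example: a noisy quadratic objective where only function values are observable.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy_f = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * rng.standard_normal()
    print(zero_order_sgd(noisy_f, x0=np.zeros(5), n_iter=5000, seed=1))
```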
Biography: Alexandre Tsybakov is currently a Professor and Head of the Statistics Department at CREST-ENSAE Paris, where he has been since 2007. He is also a Professor at Sorbonne University, Paris. From 1993 to 2017 he was a Professor at University Pierre and Marie Curie (Paris 6), and from 2009 to 2015 a Professor at Ecole Polytechnique. He was a member of the Institute for Information Transmission Problems, Moscow, until 2007. He was Miller Professor at the University of California, Berkeley in 2006, and Distinguished Visiting Professor at MIT in 2017. Prof. Tsybakov is the author of 3 books and more than 150 journal papers. He is an elected Fellow of the Institute of Mathematical Statistics, and his honours include the Lucien Le Cam Lecture of the French Statistical Society (2005), a Medallion Lecture of the Institute of Mathematical Statistics (2012), the Gay-Lussac-Humboldt Prize (2013), and an Invited Lecture at the International Congress of Mathematicians (2014). He is a member of the editorial boards of several journals.