## Krishna Balasubramanian (UC Davis), Feb 12, 2020

## Title and Abstract
The first result of this talk establishes non-asymptotic normal approximation rates for stochastic gradient descent (SGD), arguably the most widely used stochastic iterative estimator, for locally strongly convex (but globally potentially nonconvex) M-estimation problems. This result can be combined with existing bootstrap techniques to obtain non-asymptotically valid confidence sets for parameter estimation via the SGD estimator. The second result establishes non-asymptotic normal approximation rates for the Euler discretization of Itô diffusions (a special case of which is stochastic gradient Langevin Monte Carlo, widely used in the Bayesian community), a stochastic iterative estimator used for posterior expectation computation and numerical integration. This result could potentially be combined with (yet-to-be-developed) bootstrap techniques to obtain non-asymptotically valid frequentist-style confidence intervals for prediction within the Bayesian framework, or non-asymptotically valid confidence intervals for numerical integration in general.

## Bio

Krishna Balasubramanian is an assistant professor in the Department of Statistics, University of California, Davis. His recent research interests include inference for stochastic iterative methods, theory and computation with tensors, hypothesis testing with kernel methods, and zeroth-order optimization.
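As background for the second result, a minimal sketch of the Euler discretization of a Langevin diffusion (the simplest Itô diffusion targeting a posterior) is shown below. This is purely illustrative and not taken from the talk: the target, step size, and function names are assumptions, and the standard Gaussian target stands in for a generic posterior with potential U.

```python
import numpy as np

# Illustrative sketch (not from the talk): Euler discretization of the
# Langevin diffusion dX_t = -grad U(X_t) dt + sqrt(2) dW_t, whose stationary
# distribution is proportional to exp(-U). Replacing grad_U with a stochastic
# gradient yields stochastic gradient Langevin Monte Carlo.
# Assumed toy target: standard 1-D Gaussian, U(x) = x^2 / 2.

def grad_U(x):
    # Gradient of U(x) = x^2 / 2 for the toy Gaussian target.
    return x

def langevin_euler(n_steps=100_000, step=0.05, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        # Euler step: drift toward low potential plus scaled Gaussian noise.
        x = x - step * grad_U(x) + np.sqrt(2.0 * step) * rng.standard_normal()
        samples[k] = x
    return samples

samples = langevin_euler()
burn = samples[10_000:]  # discard burn-in before averaging
# The empirical mean and std should be close to those of N(0, 1).
print(burn.mean(), burn.std())
```

Time averages of such iterates approximate posterior expectations (here, the mean of N(0, 1)); the talk's second result quantifies, non-asymptotically, how close such averages are to normal.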