Fanny Yang (Berkeley)
May 4, 2-3PM, 400 Cory

Title and Abstract
Statistics meets optimization - computational guarantees for statistical learning algorithms

I will start by discussing the regularization effect of stopping gradient-type methods before convergence in non-parametric estimation. In particular, we show how both early stopping and penalty regularization can be explained by localized complexities. In the second part of the talk, I will elaborate on the advantages of efficient data collection and demonstrate how adaptive algorithms can reduce the number of samples required for simultaneous false discovery rate control and best-arm detection in multiple testing.

Bio
Fanny Yang is a PhD student in EECS at UC Berkeley, advised by Martin Wainwright. Her research focuses on the computational and statistical aspects of machine learning procedures aimed at achieving good generalization. She has explored these questions in the context of non-convex optimization, non-parametric estimation, and active learning. She received her B.A. degree in Electrical Engineering from the Karlsruhe Institute of Technology and her M.Sc. degree from the Technical University of Munich.
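As illustrative context for the abstract's first theme (a minimal sketch, not material from the talk itself): gradient descent on least squares, stopped after t steps with step size eta, behaves roughly like ridge regression with penalty lambda on the order of 1/(eta*t). All names, constants, and the problem setup below are assumptions chosen for the demo.

```python
# Sketch: early stopping of gradient descent on least squares acts like
# an implicit ridge penalty. Illustrative only; setup is assumed.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.standard_normal((n, d))
theta_true = rng.standard_normal(d)
y = X @ theta_true + 0.5 * rng.standard_normal(n)


def gd_early_stopped(X, y, steps, lr=1e-3):
    """Run `steps` iterations of gradient descent on 0.5*||y - X@theta||^2,
    starting from zero, and return the (early-stopped) iterate."""
    theta = np.zeros(X.shape[1])
    for _ in range(steps):
        theta -= lr * X.T @ (X @ theta - y)
    return theta


def ridge(X, y, lam):
    """Closed-form ridge estimate minimizing 0.5*||y - X@theta||^2 + 0.5*lam*||theta||^2."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)


# Heuristic correspondence: stopping after t steps with step size lr
# matches ridge with lam ~ 1/(lr*t). Fewer steps -> stronger shrinkage.
lr = 1e-3
for steps in (10, 100, 1000):
    lam = 1.0 / (lr * steps)
    t_gd = gd_early_stopped(X, y, steps, lr=lr)
    t_ridge = ridge(X, y, lam)
    print(f"steps={steps:5d}  ||theta_gd||={np.linalg.norm(t_gd):.3f}  "
          f"||theta_ridge(lam={lam:.0f})||={np.linalg.norm(t_ridge):.3f}")
```

Running the sketch shows the norms of the early-stopped and ridge estimates tracking each other as the stopping time grows and the matched penalty shrinks, which is the qualitative point; the talk's contribution concerns the sharper, localized-complexity characterization of this effect in the non-parametric setting.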