Jerry Li (Microsoft Research)
Oct 19, 2020

Title and Abstract

Faster and Simpler Algorithms for List Learning

In this work, we make two main contributions. First, we give a very simple polynomial-time algorithm for this problem which achieves optimal error. Second, we build on these insights to develop a somewhat more sophisticated algorithm, based on lazy mirror descent, which runs in time \(\widetilde{O}(nd/\alpha + 1/\alpha^6)\) and also achieves optimal error. Our algorithm improves upon the runtime of previous work for all \(1/\alpha = O(\sqrt{d})\). The goal of this talk is to give a more or less self-contained proof of the first result, and then to explain at a high level how to use these ideas to develop our faster algorithm.

Joint work with Ilias Diakonikolas, Daniel Kane, Daniel Kongsgaard, and Kevin Tian.

Bio

Jerry Li is a Senior Researcher at Microsoft Research. He completed his PhD at MIT under the supervision of Ankur Moitra, after completing his Master's at MIT and his undergraduate degree at UW. Subsequently, he did a postdoc at the Simons Institute at UC Berkeley. His work focuses largely on high-dimensional statistical learning theory and machine learning, with a particular focus on understanding how and why learned models are robust. For his work, he has been awarded the George Sprowls Award for Outstanding Thesis in EECS at MIT, various special issue invitations, and an invitation to appear in the Communications of the ACM. Outside of work, he enjoys being killed first in Among Us, and being really bad at playing the impostor.
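As background for the setting the abstract refers to, here is a minimal sketch assuming "list learning" means list-decodable mean estimation: only an \(\alpha\) fraction of the samples are genuine draws around an unknown mean, the rest are adversarial, so no single estimate can be reliable, but a short list of \(O(1/\alpha)\) candidates can contain an accurate one. The greedy ball clustering below is not the algorithm from the talk, only a toy baseline to make the problem statement concrete; the parameters (`alpha`, the ball radius `15.0`, the planted outlier clusters) are all illustrative assumptions.

```python
import numpy as np

# Toy list-decodable mean estimation setup (an assumed reading of "list
# learning"; NOT the talk's algorithm). An alpha fraction of points are
# inliers around true_mean; the rest sit in planted adversarial clusters.

rng = np.random.default_rng(0)
d, n, alpha = 20, 1000, 0.2
true_mean = np.full(d, 5.0)

n_in = int(alpha * n)  # 200 genuine samples
inliers = rng.normal(true_mean, 1.0, size=(n_in, d))
# Adversarial samples: fake clusters far from the true mean.
outliers = np.concatenate([
    rng.normal(np.full(d, c), 1.0, size=((n - n_in) // 4, d))
    for c in (-50.0, -30.0, -15.0, 20.0)
])
X = np.concatenate([inliers, outliers])

# A single empirical mean is dragged far from the truth by the outliers:
err_naive = np.linalg.norm(X.mean(axis=0) - true_mean)

# A list of O(1/alpha) candidates via greedy ball clustering: any ball
# capturing at least ~alpha*n/2 points contributes its mean to the list.
candidates, remaining = [], X
while len(remaining) >= alpha * n / 2:
    close = np.linalg.norm(remaining - remaining[0], axis=1) < 15.0
    if close.sum() >= alpha * n / 2:
        candidates.append(remaining[close].mean(axis=0))
    remaining = remaining[~close]

# The list's best candidate is close to true_mean even though most are not.
err_list = min(np.linalg.norm(c - true_mean) for c in candidates)
```

On this toy instance the naive mean is off by a large distance while the best of the few list candidates lands within constant distance of the true mean, which is exactly the guarantee list-learning algorithms aim for at optimal error.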