Jingbo Liu (MIT)
Aug 27, 4-5pm, 293 Cory

Title and Abstract
Reverse hypercontractivity beats measure concentration for information theoretic converses

After some 40 years of progress based on measure concentration, we have found that, amusingly, measure concentration is not the right hammer for many of these information-theoretic applications. We introduce new machinery based on functional inequalities and reverse hypercontractivity, which yields strict improvements in the sharpness of the bounds, the generality of the source/channel distributions, and the simplicity of the proofs. Examples covered in the talk include: 1. optimal second-order converses for distributed source-type problems (hypothesis testing, common randomness generation, and source coding); 2. sharpening the recent relay channel converse bounds of Wu and Ozgur with much simpler proofs. This work benefited from collaborations with Thomas Courtade, Paul Cuff, Ayfer Ozgur, Ramon van Handel, and Sergio Verdú.

Bio
Jingbo Liu received the B.E. degree from Tsinghua University, Beijing, China, in 2012, and the M.A. and Ph.D. degrees from Princeton University, Princeton, NJ, USA, in 2014 and 2018, respectively, all in electrical engineering. His research interests include signal processing, information theory, coding theory, high-dimensional statistics, and related fields. His undergraduate thesis received the best undergraduate thesis award at Tsinghua University (2012). He gave a semi-plenary presentation at the 2015 IEEE International Symposium on Information Theory, Hong Kong, China. He was a recipient of the Princeton University Wallace Memorial Fellowship (2016). His Ph.D. thesis received the Bede Liu Best Dissertation Award of Princeton University and the Thomas M. Cover Dissertation Award of the IEEE Information Theory Society (2018).