Michael Gastpar (EPFL)

Sep 6, 2016, 2-3PM, 400 Cory.

Title and Abstract

Towards an Algebraic Network Information Theory
The IID random coding argument, a particular instance of the probabilistic method, is arguably one of the crown jewels of classical information theory. It establishes capacity in many point-to-point communication scenarios, but fails to play a similarly fundamental role in communication networks. Inklings of this issue have been observed at least since Körner and Marton's 1979 paper, where the receiver is only interested in the modulo-sum of the transmitted messages, a rather unusual goal. More recent work has shown that this fundamental limitation continues to apply even for the usual problem of pure data transmission. That work also elucidates that one key missing ingredient is the algebraic structure of the underlying error-correcting codes. The key goal of our work is to incorporate such algebraic structure firmly into Network Information Theory. Specifically, we present a joint typicality framework for encoding and decoding nested linear codes for multi-user networks. This framework provides a new perspective on compute-forward within the context of discrete memoryless networks. In particular, it establishes an achievable rate region for computing the weighted sum of nested linear codewords over a discrete memoryless multiple-access channel (MAC). When specialized to the Gaussian MAC, this rate region recovers and improves upon the lattice-based compute-forward rate region, thus providing a unified approach for discrete memoryless and Gaussian networks. Furthermore, this framework can be used to shed light on the joint decoding rate region for compute-forward, which is considered an open problem. Specifically, this work establishes an achievable rate region for simultaneously decoding two linear combinations of nested linear codewords from K senders.
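The algebraic property that the abstract leans on can be seen in a toy setting: for any linear code, the component-wise modulo-2 sum of two codewords is itself a codeword, so a receiver can decode the sum of the messages without decoding each message individually. The sketch below illustrates this with the [7,4] Hamming code (an arbitrary choice for illustration; this is not the paper's construction, just the linearity property that Körner–Marton coding and compute-forward exploit):

```python
import numpy as np

# Generator matrix of the [7,4] Hamming code over GF(2).
# (Chosen only for illustration; any binary linear code works.)
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=int)

def encode(msg):
    """Encode a length-4 binary message: c = m G over GF(2)."""
    return (np.array(msg) @ G) % 2

m1 = [1, 0, 1, 1]
m2 = [0, 1, 1, 0]
c1, c2 = encode(m1), encode(m2)

# Linearity: the XOR of the two codewords is exactly the codeword
# encoding the XOR of the two messages.
xor_of_codewords = (c1 + c2) % 2
xor_of_messages = (np.array(m1) + np.array(m2)) % 2
assert np.array_equal(xor_of_codewords, encode(xor_of_messages))
```

Because the sum of codewords is again a codeword, a receiver observing a noisy superposition can aim its decoder at the sum directly, which is the starting point for computing weighted sums of codewords over a MAC.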

This is joint work with Sung Hoon Lim, Chen Feng, Adriano Pastore, and Bobak Nazer.

Bio

Michael Gastpar received the Dipl. El.-Ing. degree from Eidgenössische Technische Hochschule (ETH) in 1997, the Master of Science degree from the University of Illinois (Urbana) in 1999, and the Doctorat ès Sciences degree from Ecole Polytechnique Fédérale de Lausanne (EPFL) in 2002.

During the years 2003-2011, he was an Assistant and tenured Associate Professor in the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. Since 2011, he has been a Professor in the School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland. He was also a professor at Delft University of Technology, The Netherlands, and a researcher with the Mathematics of Communications Department, Bell Labs, Lucent Technologies, Murray Hill, NJ. His research interests are in network information theory and related coding and signal processing techniques, with applications to sensor networks and neuroscience.

He won the 2002 EPFL Best Thesis Award, an NSF CAREER award in 2004, an Okawa Foundation Research Grant in 2008, and an ERC Starting Grant in 2010. He is the co-recipient of the 2013 Communications Society & Information Theory Society Joint Paper Award. He was an Information Theory Society Distinguished Lecturer (2009-2011). He has served as an Associate Editor for Shannon Theory for the IEEE Transactions on Information Theory (2008-2011), and as Technical Program Committee Co-Chair for the 2010 International Symposium on Information Theory, Austin, TX.