School of Technology and Computer Science Seminars

A Tale of Two Measures

by Dr. Sudeep Kamath (Princeton University, USA)

Friday, January 2, 2015 (Asia/Kolkata)
Description
Information theory has traditionally been studied in the context of communication theory and statistical physics. However, it has also found important applications in other fields such as computer science, economics, mathematics, and statistics. This talk is very much in the spirit of discovering applications of information theory in other fields. We will discuss three such recent applications:

Statistics: The Hirschfeld-Gebelein-Rényi maximal correlation is an important tool in statistics that has found numerous applications, ranging from correspondence analysis to the detection of non-linear patterns in data. We will describe a simple information-theoretic proof of a fundamental result on maximal correlation due to Dembo, Kagan, and Shepp (2001).
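
For reference, the Hirschfeld-Gebelein-Rényi maximal correlation of a pair of random variables $(X, Y)$ is usually defined as follows (this is the standard definition; the notation here is not taken from the talk):

\[
\rho_m(X;Y) \;=\; \sup_{f,\,g}\ \mathbb{E}\,[f(X)\,g(Y)],
\]

where the supremum runs over all real-valued functions $f$ and $g$ satisfying $\mathbb{E}[f(X)] = \mathbb{E}[g(Y)] = 0$ and $\mathbb{E}[f(X)^2] = \mathbb{E}[g(Y)^2] = 1$.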

Computer Science: Boolean functions are among the most basic objects of study in theoretical computer science, and Fourier-analytic tools are central to their analysis. We show how information-theoretic tools can complement these Fourier-analytic techniques. Specifically, we will consider the problem of bounding the correlation between Boolean functions on a noisy hypercube graph.
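
As a rough sketch of this setting (our notation, not necessarily the talk's): let $X$ be uniform on the hypercube $\{-1,1\}^n$, and let $Y$ be a noisy copy of $X$ in which each coordinate is independently retained with probability $(1+\rho)/2$ and flipped otherwise. The quantity of interest is the correlation

\[
\mathbb{E}\,[f(X)\,g(Y)]
\]

between Boolean functions $f, g : \{-1,1\}^n \to \{0,1\}$, typically maximized or bounded subject to constraints on the means $\mathbb{E}[f(X)]$ and $\mathbb{E}[g(Y)]$.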

Mathematics: Hypercontractivity and reverse hypercontractivity are very useful tools for studying concentration of measure and extremal questions in the geometry of high-dimensional spaces, both discrete and continuous. In this talk, we will describe a recent result by Chandra Nair characterizing hypercontractivity using information measures, extend this result to reverse hypercontractivity, and discuss implications of these results.
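
For orientation, the classical hypercontractive inequality on the hypercube (the Bonami-Beckner inequality) can be stated as follows; the information-measure characterization mentioned above recasts inequalities of this type in terms of relative entropy, and its precise form is not reproduced here. With $T_\rho$ denoting the noise operator, $(T_\rho f)(x) = \mathbb{E}[f(Y) \mid X = x]$ for a $\rho$-correlated copy $Y$ of $x$,

\[
\| T_\rho f \|_q \;\le\; \| f \|_p \qquad \text{whenever } 1 \le p \le q \text{ and } 0 \le \rho \le \sqrt{\frac{p-1}{q-1}}.
\]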

The title of this presentation refers to two measures of correlation: the maximal correlation and the so-called strong data processing constant. Both will be key concepts used throughout the talk.
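
The strong data processing constant admits the following standard formulation (again in our notation; the talk may define it differently but equivalently). For a fixed joint distribution of $(X, Y)$,

\[
s^*(X;Y) \;=\; \sup_{\substack{U:\ U - X - Y \\ I(U;X) > 0}} \frac{I(U;Y)}{I(U;X)},
\]

so that every Markov chain $U - X - Y$ satisfies the sharpened data processing inequality $I(U;Y) \le s^*(X;Y)\, I(U;X)$, with $s^*(X;Y) \le 1$ always.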

This talk is based on joint work with Venkat Anantharam, Amin Gohari, and Chandra Nair.