Rényi entropies are a natural one-parameter generalization of Shannon entropy that were first introduced over half a century ago, but about which fundamental questions remain incompletely answered. After a (very) brief introduction to why Rényi information functionals (entropies, divergences, etc.) are of interest from an information-theoretic viewpoint, we will explain the relevance of Rényi information inequalities to several areas of mathematics. For example, they allow for the unification of several interesting inequalities — including the entropy power inequality (which plays a fundamental role in information theory), the Brunn–Minkowski inequality (which plays a fundamental role in convex geometry), and Rogozin’s convolution inequality (which is fundamental to the area of “small ball” estimates in probability theory). They also allow for the quantification of uncertainty principles in harmonic analysis. In another direction, they are relevant to the field of additive combinatorics, which has seen burgeoning activity over the last two decades due to applications in theoretical computer science as well as other parts of mathematics.
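
For readers unfamiliar with the quantity, the standard definition (not spelled out in the abstract itself) of the Rényi entropy of order α for a discrete distribution is:

```latex
% Rényi entropy of order \alpha of a probability vector p = (p_1, \dots, p_n):
H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{i=1}^{n} p_i^{\alpha},
\qquad \alpha \in (0,1) \cup (1,\infty).

% Shannon entropy is recovered in the limit \alpha \to 1:
\lim_{\alpha \to 1} H_\alpha(p) = -\sum_{i=1}^{n} p_i \log p_i = H(p).
```

Other well-known special cases include the collision entropy at α = 2 and the min-entropy in the limit α → ∞.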

### Recorded Talk

Thanks to Mokshay for allowing us to record the talk!