Sample Complexity of Distinguishing Cause from Effect

Sagnik Bhattacharya, UMD


Shared information is a measure of mutual dependence among m ≥ 2 jointly distributed discrete random variables, and has been proposed as a generalization of Shannon's mutual information. The first part of the talk will focus on properties of shared information that make it a good measure of such mutual dependence, along with some applications. In the second part, I shall discuss our recent work on explicit formulae for shared information in the special case of a Markov chain on a tree, and on how these results help in estimating shared information when the joint distribution of the underlying random variables is not known. Joint work with Prakash Narayan.
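
For readers unfamiliar with the quantity, a minimal sketch follows. It assumes the partition-based characterization of shared information, SI(X_1, ..., X_m) = min over partitions π of {1, ..., m} with |π| ≥ 2 of (Σ_{A ∈ π} H(X_A) − H(X_1, ..., X_m)) / (|π| − 1), which for m = 2 reduces to Shannon's mutual information I(X_1; X_2). The abstract does not spell out a definition, so this formula and all names in the snippet should be read as illustrative assumptions rather than the speaker's exact formulation.

import math

def entropy(p):
    """Shannon entropy (in bits) of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def marginal(joint, coords):
    """Marginal of a joint pmf {tuple: probability} onto the index list coords."""
    out = {}
    for x, q in joint.items():
        key = tuple(x[i] for i in coords)
        out[key] = out.get(key, 0.0) + q
    return out

def partitions(indices):
    """Yield all partitions of indices into nonempty parts (including the trivial one)."""
    indices = list(indices)
    if len(indices) == 1:
        yield [indices]
        return
    first, rest = indices[0], indices[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

def shared_information(joint, m):
    """Brute-force min over partitions pi with |pi| >= 2 of
    (sum_{A in pi} H(X_A) - H(X_1..m)) / (|pi| - 1)."""
    h_all = entropy(joint)
    best = float("inf")
    for pi in partitions(range(m)):
        if len(pi) < 2:
            continue  # the single-block partition is excluded
        val = (sum(entropy(marginal(joint, block)) for block in pi) - h_all) / (len(pi) - 1)
        best = min(best, val)
    return best

# Toy check: X1 a uniform bit and X2 = X1, so I(X1; X2) = 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(shared_information(joint, 2))  # -> 1.0

On the perfectly correlated pair above, the brute-force minimum over partitions returns 1.0 bit, matching I(X1; X2). Note that enumerating all partitions is exponential in m, so this only serves as a toy-scale check; estimating shared information from data, as in the second part of the talk, requires more than this direct computation.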

Recorded Talk

Coming soon!