Information geometry (IG) proposes viewing statistical models as geometric objects. This perspective allows one to reason intuitively about them and to apply modern tools from differential geometry to analyze the correspondence between their geometric features (e.g. flatness, curvature) and their statistical properties (e.g. distinguishability between distributions, convergence of estimators). The interdisciplinary field of IG was born from the work of C.R. Rao and H. Hotelling, was further developed by R.A. Fisher, B. Efron, N.N. Cencov, and I. Csiszár (among others), and its modern treatment is largely due to S.-i. Amari and H. Nagaoka.
While the above theory was primarily developed in the context of probability distributions, in this talk we will turn the spotlight on the class of Markov models, point out some of their geometric features, and discover that they enjoy a much richer structure. We will introduce mixture families and exponential families of Markov chains by following the construction of H. Nagaoka, and emphasize how it contrasts with the distribution setting. We will list a few applications of the geometric approach to inference in Markov chains (e.g. parameter estimation, hypothesis testing, large deviations, …) and will discuss some recent advances in the context of time reversibility and, if time permits, lumpability.