Category: Information geometry

Dually flat manifold
In information geometry, a dually flat manifold is a smooth manifold equipped with a Riemannian metric and a pair of torsion-free affine connections that are dual to each other with respect to the metric, both of which are flat.
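For reference, the duality of the two connections \nabla and \nabla^* with respect to the metric g is the standard condition

    X g(Y, Z) = g(\nabla_X Y, Z) + g(Y, \nabla^*_X Z),

and the manifold is dually flat when both connections have vanishing curvature and torsion. Exponential families, carrying the exponential and mixture connections, are the prototypical example.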
Kullback–Leibler divergence
In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy and I-divergence), denoted D_KL(P ∥ Q), is a type of statistical distance: a measure of how one probability distribution P differs from a second, reference probability distribution Q.
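For reference, for discrete distributions P and Q on the same sample space (with Q(x) > 0 wherever P(x) > 0), the divergence is

    D_{\mathrm{KL}}(P \parallel Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)},

which is nonnegative and equals zero exactly when P = Q.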
Fisher information metric
In information geometry, the Fisher information metric is a particular Riemannian metric which can be defined on a smooth statistical manifold, i.e., a smooth manifold whose points are probability measures on a common probability space.
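For reference, in local coordinates \theta = (\theta^1, \ldots, \theta^n) on the statistical manifold, the components of the metric are

    g_{jk}(\theta) = \int \frac{\partial \log p(x; \theta)}{\partial \theta^j} \, \frac{\partial \log p(x; \theta)}{\partial \theta^k} \, p(x; \theta) \, dx,

i.e., the expected outer product of the score, which is the Fisher information matrix read as a Riemannian metric.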
Information geometry
Information geometry is an interdisciplinary field that applies the techniques of differential geometry to study probability theory and statistics. It studies statistical manifolds, which are Riemannian manifolds whose points correspond to probability distributions.
Chentsov's theorem
In information geometry, Chentsov's theorem states that the Fisher information metric is, up to rescaling, the unique Riemannian metric on a statistical manifold that is invariant under sufficient statistics.
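Concretely, if T is a sufficient statistic for the family \{p(x; \theta)\}, the Fisher metric computed from the induced family \{p(t; \theta)\} coincides with the original one; the theorem says that, up to a positive constant factor, the Fisher metric is the only Riemannian metric with this invariance.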