Methods of Information Geometry
Information geometry provides the mathematical sciences with a new framework of analysis. It has emerged from the investigation of the natural differential-geometric structure on manifolds of probability distributions, which consists of a Riemannian metric defined by the Fisher information and a one-parameter family of affine connections called the $\alpha$-connections. The duality between the $\alpha$-connection and the $(-\alpha)$-connection, together with the metric, plays an essential role in this geometry. This kind of duality, having emerged from manifolds of probability distributions, is ubiquitous, appearing in a variety of problems that may have no explicit relation to probability theory. Through the duality, it is possible to analyze various fundamental problems from a unified perspective. The first half of this book is devoted to a comprehensive introduction to the mathematical foundations of information geometry, including preliminaries from differential geometry, the geometry of manifolds of probability distributions, and the general theory of dual affine connections. The second half of the text provides an overview of many areas of application, such as statistics, linear systems, information theory, quantum mechanics, convex analysis, neural networks, and affine differential geometry. The book can serve as a suitable text for a topics course for advanced undergraduates and graduate students.
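To make the Fisher information metric concrete, the following sketch estimates it numerically for the two-parameter family of normal distributions $N(\mu, \sigma^2)$ by Monte Carlo: the metric is the expected outer product of the score (the gradient of the log-density with respect to the parameters), and for this family it has the closed form $\mathrm{diag}(1/\sigma^2,\, 2/\sigma^2)$. The function name and sample size are illustrative choices, not anything from the book.

```python
import numpy as np

def fisher_metric_normal(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the Fisher information metric for N(mu, sigma^2).

    The metric is E[s s^T], where s is the score vector
        d/dmu    log p(x) = (x - mu) / sigma^2
        d/dsigma log p(x) = (x - mu)^2 / sigma^3 - 1 / sigma
    Closed form for this family: diag(1/sigma^2, 2/sigma^2).
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=n_samples)
    s_mu = (x - mu) / sigma**2
    s_sigma = (x - mu) ** 2 / sigma**3 - 1.0 / sigma
    scores = np.stack([s_mu, s_sigma], axis=1)   # shape (n_samples, 2)
    return scores.T @ scores / n_samples          # empirical E[s s^T]

# For sigma = 2 the exact metric is diag(1/4, 1/2); the estimate is close.
G = fisher_metric_normal(0.0, 2.0)
```

The diagonal form reflects that $\mu$ and $\sigma$ are orthogonal parameters in this family; in general the Fisher metric couples the coordinates.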
Contents
Elementary differential geometry
The geometric structure of statistical models
Statistical inference and differential geometry
The geometry of time series and linear systems
Multiterminal information theory and statistical inference