Information Distances and Divergences for the Generalized Normal Distribution
Advances in Mathematics and Computer Science Vol. 3, pp. 29-45
Abstract
The study of relative measures of information between two distributions that characterize an Input/Output system is important for investigating the informational ability and behaviour of that system. The most important measures of information distance and divergence are briefly presented and grouped. In Statistical Geometry, and for the study of statistical manifolds, relative measures of information are needed that are also distance metrics. The Hellinger distance metric is studied, providing a “compact” measure of informational “proximity” between two distributions. Certain formulations of the Hellinger distance between two generalized normal distributions are given and discussed. Some results for the Bhattacharyya distance are also given. Moreover, the symmetry of the Kullback-Leibler divergence between a generalized normal and a t-distribution is examined, as this is a key measure of information divergence.
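As a concrete illustration of the Hellinger distance mentioned above, consider the ordinary normal case (the γ = 2 member of the generalized γ-order family; the paper's general-γ formulas are not reproduced here). For two univariate normals N(μ₁, σ₁²) and N(μ₂, σ₂²), the squared Hellinger distance has the well-known closed form H² = 1 − √(2σ₁σ₂/(σ₁²+σ₂²)) · exp(−(μ₁−μ₂)²/(4(σ₁²+σ₂²))), with the Bhattacharyya coefficient BC = 1 − H². A minimal sketch, with a numerical check of the closed form against the defining integral:

```python
import math

def hellinger_normal(mu1, s1, mu2, s2):
    """Closed-form Hellinger distance between N(mu1, s1^2) and N(mu2, s2^2)."""
    h2 = 1.0 - math.sqrt(2.0 * s1 * s2 / (s1**2 + s2**2)) * \
        math.exp(-(mu1 - mu2)**2 / (4.0 * (s1**2 + s2**2)))
    return math.sqrt(h2)

def hellinger_numeric(mu1, s1, mu2, s2, lo=-40.0, hi=40.0, n=200000):
    """Numerical check via H^2 = 1 - BC, BC = integral of sqrt(p(x) q(x)) dx."""
    def pdf(x, mu, s):
        return math.exp(-(x - mu)**2 / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))
    dx = (hi - lo) / n
    bc = sum(math.sqrt(pdf(lo + (i + 0.5) * dx, mu1, s1) *
                       pdf(lo + (i + 0.5) * dx, mu2, s2)) for i in range(n)) * dx
    return math.sqrt(max(0.0, 1.0 - bc))

# Identical distributions are at distance 0; the two routes agree otherwise.
print(hellinger_normal(0.0, 1.0, 0.0, 1.0))
print(hellinger_normal(0.0, 1.0, 1.0, 2.0))
```

The Bhattacharyya distance follows from the same coefficient as D_B = −ln(BC) = −ln(1 − H²), which is why results for one measure transfer to the other.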
Keywords:
- Generalized γ-order normal distribution
- multivariate t-distribution
- Kullback-Leibler divergence
- Hellinger distance