
Paper: IPM / M / 15544
School of Mathematics
Title: Mixture models, Bayes Fisher information, and divergence measures
Author(s): Majid Asadi (joint with N. Ebrahimi, O. Kharazmi, and E. S. Soofi)
Status: Published
Journal: IEEE Transactions on Information Theory
Vol.: 65
Year: 2019
Pages: 1-6
Supported by: IPM
This paper presents the Bayes Fisher information measures, defined as the expected Fisher information under a distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, Shannon entropy, and the Jensen-Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback-Leibler, Jeffreys, Jensen-Shannon, Rényi, and Tsallis divergences. These measures indicate that the farther apart the components are from each other, the more informative the data are about the mixing parameter. We also unify three different relative-entropy derivations of the geometric mixture scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of the Tsallis divergence yields the generalized mixture as the solution.
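As an illustrative sketch (not taken from the paper), the link between the mixing-parameter Fisher information of an arithmetic mixture and the chi-square divergence can be checked numerically. For the mixture f_a(x) = a·f(x) + (1−a)·g(x), the Fisher information about a is I(a) = ∫ (f(x) − g(x))² / f_a(x) dx, which at a = 1 reduces to the chi-square divergence χ²(g‖f). The Gaussian components and grid below are arbitrary choices for demonstration.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated on the grid x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def mixture_fisher_info(a, f, g, x):
    """Fisher information about the mixing weight a of the arithmetic
    mixture a*f + (1-a)*g, computed by trapezoidal integration of
    (f - g)^2 / (a*f + (1-a)*g) over the grid x."""
    fa = a * f + (1 - a) * g
    return np.trapz((f - g) ** 2 / fa, x)

# Two illustrative Gaussian components on a wide grid.
x = np.linspace(-12.0, 12.0, 20001)
f = normal_pdf(x, 0.0, 1.0)
g = normal_pdf(x, 1.0, 1.0)

# At a = 1 the mixture density equals f, so I(1) = chi-square(g || f).
I1 = mixture_fisher_info(1.0, f, g, x)
chi2_gf = np.trapz((g - f) ** 2 / f, x)
print(I1, chi2_gf)
```

For these two unit-variance Gaussians, χ²(g‖f) has the closed form exp((μ₁ − μ₀)²) − 1 = e − 1, so both printed values should agree with that constant up to quadrature error, matching the abstract's claim that more widely separated components make the data more informative about the mixing parameter.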
