Some applications of Kullback-Leibler and Jeffreys’ divergences in multinomial populations
Abstract
Some of the entropy measures proposed in the literature are Shannon entropy (1948), Rényi entropy (1961), Havrda & Charvát entropy (1967), and Tsallis entropy (1988). The limit of Rényi divergence, as its order tends to one, is relative entropy (the Kullback-Leibler divergence), a measure of discrepancy between two statistical hypotheses or two probability distributions. Jeffreys' divergence measures the difficulty of discriminating between two probability distributions. Asymptotically, these divergence measures are related to chi-square distributions, which allows them to be used in hypothesis tests. In this study I show, through examples, that entropy-based statistics such as the Kullback-Leibler divergence and Jeffreys' divergence can be used in statistical hypothesis tests for multinomial populations.
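As a minimal illustration of the asymptotic chi-square relation mentioned above, the Python sketch below computes a Kullback-Leibler-based goodness-of-fit statistic for a multinomial sample: with observed proportions p̂ and hypothesized probabilities p₀, the statistic 2N·D(p̂‖p₀) is asymptotically chi-square with k−1 degrees of freedom under the null hypothesis. The function names, the use of SciPy, and the example data are assumptions for illustration only and do not reproduce the specific procedures of this paper.

```python
import numpy as np
from scipy.stats import chi2

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute zero by convention
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def jeffreys_divergence(p, q):
    """Jeffreys' (symmetrised) divergence J(p, q) = D(p||q) + D(q||p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

def kl_goodness_of_fit(counts, p0, alpha=0.05):
    """Test H0: the multinomial cell probabilities equal p0.

    Uses the likelihood-ratio form 2 * N * D(p_hat || p0), which is
    asymptotically chi-square with k - 1 degrees of freedom under H0.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p_hat = counts / n
    stat = 2.0 * n * kl_divergence(p_hat, p0)
    df = len(counts) - 1
    p_value = chi2.sf(stat, df)
    return stat, df, p_value, p_value < alpha

# Hypothetical example: test whether a six-sided die is fair from 120 rolls.
observed = [25, 17, 15, 23, 24, 16]
p_fair = np.full(6, 1 / 6)
stat, df, p_value, reject = kl_goodness_of_fit(observed, p_fair)
print(f"2N*D_KL = {stat:.3f}, df = {df}, p-value = {p_value:.3f}, reject H0: {reject}")
```

Note that 2N·D(p̂‖p₀) coincides with the familiar likelihood-ratio statistic G²; an analogous statistic can be built from Jeffreys' divergence, which is the point the paper develops for multinomial populations.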