Do some relative entropy measures coincide in determining correlations or associations for metric data?
Abstract
Entropy measures the uncertainty of a statistical experiment or, equivalently, the information provided by performing it. Several measures of entropy are used to quantify uncertainty for nominal, ordinal, and metric data, and in particular in qualitative variation calculations. In addition, relative entropy concepts (e.g. mutual information) are used in goodness-of-fit tests and, more generally, in checking the adequacy of a statistical model. In particular, relative entropy measures are used to estimate correlations or associations. In this study, starting from a specific definition of mutual information, we employ several different relative entropy measures. We then compare these measures under three different situations through a number of applications.
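As an illustration of the general idea behind mutual-information-based correlation estimation (not the specific measures compared in the paper), the following sketch estimates mutual information from binned metric data and converts it into a correlation estimate using the bivariate Gaussian identity I(X;Y) = -½ ln(1 - ρ²). The bin count, sample size, and simulated correlation are illustrative choices.

```python
import math
import random

def mutual_information(xs, ys, bins=8):
    """Plug-in estimate of mutual information (in nats) from binned data."""
    n = len(xs)

    def bin_index(v, lo, hi):
        # Map v into one of `bins` equal-width cells; clamp the maximum value.
        return min(bins - 1, int((v - lo) / (hi - lo) * bins))

    lox, hix = min(xs), max(xs)
    loy, hiy = min(ys), max(ys)
    joint = {}                      # empirical joint distribution p(i, j)
    px = [0.0] * bins               # marginal of X
    py = [0.0] * bins               # marginal of Y
    for x, y in zip(xs, ys):
        i, j = bin_index(x, lox, hix), bin_index(y, loy, hiy)
        joint[(i, j)] = joint.get((i, j), 0.0) + 1.0 / n
        px[i] += 1.0 / n
        py[j] += 1.0 / n
    # I(X;Y) = sum_{i,j} p(i,j) * log( p(i,j) / (p(i) * p(j)) )
    return sum(p * math.log(p / (px[i] * py[j])) for (i, j), p in joint.items())

random.seed(0)
# Simulate a correlated bivariate Gaussian pair with true correlation rho = 0.8.
rho = 0.8
xs, ys = [], []
for _ in range(20000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    xs.append(z1)
    ys.append(rho * z1 + math.sqrt(1 - rho ** 2) * z2)

mi = mutual_information(xs, ys)
# For a bivariate Gaussian, I(X;Y) = -0.5 * ln(1 - rho^2), hence an
# MI-based correlation estimate (biased low by the discretization):
r_mi = math.sqrt(1 - math.exp(-2 * mi))
print(round(r_mi, 2))  # close to, but somewhat below, the true rho = 0.8
```

The binned plug-in estimator loses some information to discretization, so the recovered correlation slightly underestimates ρ; the relative entropy measures compared in the paper address this estimation problem in different ways.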