SOME APPLICATIONS OF ENTROPY-BASED STATISTICS IN LINEAR REGRESSION ANALYSIS
Abstract
Statistical entropy is a measure of the variation of a distribution, particularly when the random variable is qualitative. Entropy-based statistics are also used to measure the degree of association between qualitative variables. Two measures of divergence, the Kullback-Leibler divergence and Jeffreys' divergence, are closely related to the log-likelihood function; these two entropy-based measures can therefore be used in hypothesis-testing procedures as well. In this study, we discuss how relative entropy measures are applied in testing certain hypotheses and how useful they are in regression analysis, especially for identifying influential observations.
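For reference, the two divergences named in the abstract are conventionally defined as follows (these are the standard textbook definitions, not quoted from the paper itself): for densities f and g,

```latex
\[
  \mathrm{KL}(f \,\|\, g)
    = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx
    = \mathbb{E}_f\!\left[\log\frac{f(X)}{g(X)}\right],
\qquad
  J(f, g) = \mathrm{KL}(f \,\|\, g) + \mathrm{KL}(g \,\|\, f).
\]
```

The Kullback-Leibler divergence is thus the expected log-likelihood ratio under f, which is what links these measures to likelihood-based tests; Jeffreys' divergence is its symmetrized version.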
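To make the influence-detection idea concrete, here is a minimal Python sketch of one plausible approach, not necessarily the authors' method: score each observation by the KL divergence between the fitted normal regression model on the full data and the fit with that observation deleted. The helper names (`ols_fit`, `kl_mvn_isotropic`, `kl_influence`) are hypothetical, introduced only for this illustration.

```python
import numpy as np

def ols_fit(X, y):
    """OLS fit: coefficient vector and ML estimate of error variance."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, resid @ resid / len(y)

def kl_mvn_isotropic(m1, s1, m2, s2):
    """KL( N(m1, s1*I) || N(m2, s2*I) ) for n-dimensional isotropic normals."""
    n = len(m1)
    return 0.5 * (n * np.log(s2 / s1) - n + n * s1 / s2
                  + np.sum((m2 - m1) ** 2) / s2)

def kl_influence(X, y):
    """Hypothetical KL-based influence score: divergence between the
    full-data fit and each leave-one-out fit, both evaluated on the
    full design matrix."""
    beta_full, s2_full = ols_fit(X, y)
    mu_full = X @ beta_full
    scores = np.empty(len(y))
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        beta_i, s2_i = ols_fit(X[mask], y[mask])
        scores[i] = kl_mvn_isotropic(mu_full, s2_full, X @ beta_i, s2_i)
    return scores

# Example: an injected outlier should receive a visibly larger score.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(30), rng.normal(size=30)])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=30)
y[5] += 8.0                            # make observation 5 influential
print(np.argmax(kl_influence(X, y)))   # expected: 5
```

Observations whose deletion barely changes the fitted distribution get scores near zero, while an influential case produces a large divergence, here driven by both the shift in fitted means and the drop in the residual variance once the outlier is removed.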
Keywords
statistical entropy, linear regression