Calculates the Kullback-Leibler divergence (relative entropy) between unweighted theoretical component distributions. For distributions with densities f and g, the divergence is calculated as:

  KL(f || g) = int f(x) (log f(x) - log g(x)) dx
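
In practice the integral is approximated numerically over a grid of evaluation points. A minimal sketch of such an approximation (a Riemann sum with eps clamping; kl_sketch is a hypothetical helper for illustration, not the package's internal implementation):

kl_sketch <- function(f, g, dx, eps = 10^-4) {
  # clamp small probabilities so log() stays finite
  f <- pmax(f, eps)
  g <- pmax(g, eps)
  # Riemann-sum approximation of int f(x) (log f(x) - log g(x)) dx
  sum(f * (log(f) - log(g))) * dx
}

x <- seq(-3, 3, length = 200)
kl_sketch(dnorm(x), dt(x, df = 10), dx = x[2] - x[1])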

kl.divergence(object, eps = 10^-4, overlap = TRUE)

Arguments

object

A matrix or data frame with two or more columns, each holding a density or probability vector

eps

Probabilities below this threshold are replaced by this threshold for numerical stability.

overlap

Logical; if TRUE, the KL divergence is not computed for pairs in which, at each point, at least one of the two densities is smaller than eps (i.e., the densities do not overlap)
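
The effect of eps can be seen directly; the clamping below mirrors the rule described above (an illustration of the documented behavior, not the package source):

p <- c(0.5, 0, 0.3)
eps <- 10^-4
log(p)              # -Inf where the density is exactly zero
p[p < eps] <- eps   # probabilities below the threshold are replaced by it
log(p)              # now finite, so the integrand stays numerically stable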

Value

A matrix of pairwise Kullback-Leibler divergences

References

Kullback, S., and R. A. Leibler (1951) On information and sufficiency. The Annals of Mathematical Statistics 22(1):79-86.

Author

Jeffrey S. Evans <jeffrey_evans@tnc.org>

Examples

x <- seq(-3, 3, length = 200)
y <- cbind(n = dnorm(x), t = dt(x, df = 10))
matplot(x, y, type = "l")
kl.divergence(y)
#>             n           t
#> n 0.000000000 0.004314432
#> t 0.005138101 0.000000000
# extract a single pairwise value (matrix element [1,2], linear index 3)
kl.divergence(y[,1:2])[3]
#> [1] 0.004314432
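
Note from the matrix above that the divergence is asymmetric: the divergence of n relative to t differs from that of t relative to n. If a symmetric measure is needed, a common choice is the Jeffreys divergence, which can be formed from the returned matrix (a usage sketch, not part of the package API):

d <- kl.divergence(y)
d + t(d)   # Jeffreys divergence: KL(f||g) + KL(g||f)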