
KL.divergence {FNN}    R Documentation

Kullback-Leibler Divergence

Description

Compute the Kullback-Leibler divergence between two samples.

Usage

  KL.divergence(X, Y, k = 10, algorithm=c("kd_tree", "cover_tree", "brute"))
  KLx.divergence(X, Y, k = 10, algorithm="kd_tree")

Arguments

X

An input data matrix (the sample taken as drawn from p).

Y

An input data matrix (the sample taken as drawn from q).

k

The maximum number of nearest neighbors to search. The default value is 10.

algorithm

Nearest neighbor search algorithm; one of "kd_tree", "cover_tree", or "brute".

Details

If p(x) and q(x) are two continuous probability density functions, then the Kullback-Leibler divergence of q from p is defined as E_p[log(p(x)/q(x))].
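
For a quick numerical check of this definition, the expectation can be approximated by Monte Carlo. A minimal sketch, assuming p = Exp(rate = 0.2) and q = Exp(rate = 0.4) as in the Examples below:

    x <- rexp(1e5, rate = 0.2)                            # sample from p
    mean(log(dexp(x, rate = 0.2) / dexp(x, rate = 0.4)))  # approx 1 - log(2) = 0.307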

The KL.* versions return divergences computed in C code to R; the KLx.* versions do not.
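
The divergence itself is estimated from nearest-neighbor distances (Boltz et al., 2007, 2009). As a rough illustration only, not the package's implementation, a one-nearest-neighbor estimator for univariate samples can be sketched in base R (kl.nn1 is a hypothetical helper):

    kl.nn1 <- function(X, Y) {
      # 1-NN estimate of KL(p||q) from X ~ p and Y ~ q (1-D samples):
      # mean(log(nu/rho)) + log(m/(n-1)), where rho[i] is the distance from
      # X[i] to its nearest neighbor within X, and nu[i] the distance from
      # X[i] to its nearest neighbor in Y.
      n <- length(X); m <- length(Y)
      rho <- vapply(seq_len(n), function(i) min(abs(X[i] - X[-i])), numeric(1))
      nu  <- vapply(seq_len(n), function(i) min(abs(X[i] - Y)), numeric(1))
      mean(log(nu / rho)) + log(m / (n - 1))
    }

This brute-force sketch is O(n^2); KL.divergence instead searches neighbors with a kd-tree, cover tree, or brute force, as selected by the algorithm argument.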

Value

Return the Kullback-Leibler divergence from X to Y: a vector of length k, whose i-th element is the estimate based on the i nearest neighbors.
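
For example, with X and Y as in the Examples below, the last element is the estimate that uses the full neighbor count:

    est <- KL.divergence(X, Y, k = 5)
    est[5]   # estimate based on the 5 nearest neighbors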

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: shli@stat.wvu.edu.

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In Proceedings of the Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

See Also

KL.dist

Examples

    set.seed(1000)
    X <- rexp(10000, rate=0.2)
    Y <- rexp(10000, rate=0.4)

    KL.divergence(X, Y, k=5)
    # theoretical divergence = log(0.2/0.4) + 0.4/0.2 - 1 = 1 - log(2) ≈ 0.307
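
    # KLx.divergence accepts the same arguments (see Usage above); shown
    # here only as a usage sketch:
    KLx.divergence(X, Y, k=5)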

Results



> library(FNN)
>     set.seed(1000)
>     X <- rexp(10000, rate=0.2)
>     Y <- rexp(10000, rate=0.4)
> 
>     KL.divergence(X, Y, k=5)
[1] 0.2962696 0.3173042 0.3070079 0.3034722 0.3021469
>     # theoretical divergence = log(0.2/0.4) + 0.4/0.2 - 1 = 1 - log(2) ≈ 0.307