KL_div {YEAB}    R Documentation

Computes the Kullback-Leibler divergence based on kernel density estimates

Description

Computes the Kullback-Leibler divergence based on kernel density estimates of two samples.

Usage

KL_div(x, y, from_a, to_b)

Arguments

x

numeric, a sample drawn from the distribution p

y

numeric, a sample drawn from the distribution q

from_a

numeric, the lower limit of integration

to_b

numeric, the upper limit of integration

Details

The Kullback-Leibler divergence is defined as

D_{KL}(P||Q) = \int_{-\infty}^{\infty} p(x) \log \frac{p(x)}{q(x)} dx
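
In practice the integral is evaluated numerically over [from_a, to_b], with p and q replaced by kernel density estimates built from x and y. A minimal sketch of such an estimator, assuming Gaussian kernels with stats::bw.nrd0 bandwidths and stats::integrate for the quadrature (not necessarily this package's exact internals):

kl_div_sketch <- function(x, y, from_a, to_b) {
  # Gaussian kernel density estimate of a sample, evaluated at points t
  kde_at <- function(t, sample) {
    h <- stats::bw.nrd0(sample) # Silverman's rule-of-thumb bandwidth
    sapply(t, function(ti) mean(stats::dnorm(ti, mean = sample, sd = h)))
  }
  integrand <- function(t) {
    p_t <- kde_at(t, x)
    q_t <- kde_at(t, y)
    val <- p_t * log(p_t / q_t)
    val[p_t == 0] <- 0 # p * log(p / q) -> 0 as p -> 0
    val
  }
  stats::integrate(integrand, lower = from_a, upper = to_b)$value
}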

Value

A numeric value, the estimated Kullback-Leibler divergence.

Examples

set.seed(123)
p <- rnorm(100)
q <- rnorm(100)
KL_div(p, q, -Inf, Inf) # 0.07579204
q <- rnorm(100, 10, 4)
KL_div(p, q, -Inf, Inf) # 7.769912
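# The limits need not be infinite: finite values (here, hypothetically, the
# range of the pooled samples) simply bound the numerical integration.
KL_div(p, q, min(c(p, q)), max(c(p, q)))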
