metrics
R Documentation
RMSE, Expected Log Likelihood and KL Divergence Between
Two Multivariate Normal Distributions
Description
These functions calculate the root-mean-squared-error (RMSE),
the expected log likelihood, and the Kullback-Leibler (KL) divergence
(a.k.a. distance) between two multivariate normal (MVN)
distributions, each described by its mean vector and covariance matrix.
Arguments
mu1
mean vector of first (estimated) MVN
S1
covariance matrix of first (estimated) MVN
mu2
mean vector of second (true, baseline, or comparator) MVN
S2
covariance matrix of second (true, baseline, or comparator) MVN
quiet
when FALSE (default)
symm
when TRUE a symmetrized version of the
KL divergence is used; see the note below
Details
The root-mean-squared-error is calculated between the entries of
the mean vectors, and the upper-triangular part of the covariance
matrices (including the diagonal).
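As a sketch, the RMSE described above can be reproduced in base R; the function name rmse_muS is illustrative, not necessarily the package's own implementation:

```r
# Illustrative base-R version of the RMSE described above; the name
# rmse_muS is hypothetical, not necessarily the package's exact code.
rmse_muS <- function(mu1, S1, mu2, S2) {
  # stack the mean entries with the upper triangle (including diagonal)
  d1 <- c(mu1, S1[upper.tri(S1, diag = TRUE)])
  d2 <- c(mu2, S2[upper.tri(S2, diag = TRUE)])
  sqrt(mean((d1 - d2)^2))
}

# 2-d example: 2 mean entries + 3 upper-triangular entries = 5 terms,
# and only one of them (the first mean entry) differs, by 1
rmse_muS(c(0, 0), diag(2), c(1, 0), diag(2))  # sqrt(1/5) ~= 0.447
```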
The KL divergence is given by the formula

0.5 * (log(det(S1)/det(S2)) + tr(S1^{-1} S2)
       + (mu1 - mu2)' S1^{-1} (mu1 - mu2) - N)

where N is length(mu1), and must agree with
the dimensions of the other parameters. Note that the parameterization
used involves swapped arguments compared to some other references,
e.g., as provided by Wikipedia. See note below.
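Under this parameterization the divergence can be sketched in base R; kl_norm is an illustrative stand-in and may differ from the package's kl.norm in details such as error checking:

```r
# Illustrative KL divergence between two MVNs, in the (swapped)
# parameterization described above; kl_norm is a hypothetical name.
kl_norm <- function(mu1, S1, mu2, S2) {
  N  <- length(mu1)
  Si <- solve(S1)                    # S1^{-1}
  drop(0.5 * (log(det(S1) / det(S2)) + sum(diag(Si %*% S2)) +
              t(mu1 - mu2) %*% Si %*% (mu1 - mu2) - N))
}

kl_norm(c(0, 0), diag(2), c(0, 0), diag(2))  # identical MVNs: 0
kl_norm(c(1, 0), diag(2), c(0, 0), diag(2))  # means differ: 0.5
```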
The expected log likelihood can be formulated in terms of the
KL divergence. That is, the expected log likelihood of data
simulated from the normal distribution with parameters mu2
and S2 under the estimated normal with parameters
mu1 and S1 is given by
-0.5 ln((2 pi e)^N |S2|) - kl.norm(mu1, S1, mu2, S2).
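The identity above can be checked numerically. In the sketch below, kl_norm is an illustrative re-implementation of the divergence under the assumed parameterization, and ell_direct is the standard closed form for E[log p1(x)] when x ~ N(mu2, S2):

```r
# Hypothetical stand-in for kl.norm under the assumed parameterization
kl_norm <- function(mu1, S1, mu2, S2) {
  N  <- length(mu1); Si <- solve(S1)
  drop(0.5 * (log(det(S1) / det(S2)) + sum(diag(Si %*% S2)) +
              t(mu1 - mu2) %*% Si %*% (mu1 - mu2) - N))
}

mu1 <- c(1, 2); S1 <- matrix(c(2, 0.5, 0.5, 1), 2)
mu2 <- c(0, 1); S2 <- diag(2)
N <- length(mu1)

# expected log likelihood via the KL identity above
ell_kl <- -0.5 * log((2 * pi * exp(1))^N * det(S2)) -
  kl_norm(mu1, S1, mu2, S2)

# direct closed form: E[log p1(x)] for x ~ N(mu2, S2)
Si <- solve(S1)
ell_direct <- -0.5 * (N * log(2 * pi) + log(det(S1)) +
  sum(diag(Si %*% S2)) + drop(t(mu2 - mu1) %*% Si %*% (mu2 - mu1)))

all.equal(ell_kl, ell_direct)  # TRUE
```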
Value
In the case of the expected log likelihood the result is
a real number. The RMSE is a positive real number.
The KL divergence method returns a positive
real number depicting the distance between the
two normal distributions.
Note
The KL divergence is not symmetric. A symmetrized version can be
obtained by averaging kl.norm(mu1, S1, mu2, S2) and
kl.norm(mu2, S2, mu1, S1), or by using symm = TRUE. The arguments are
reversed compared to some other references, like Wikipedia. To match
those versions use kl.norm(mu2, S2, mu1, S1).
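The symmetrization that symm is described as performing can be sketched by averaging the divergence over both argument orders; kl_norm and kl_sym are illustrative stand-ins, not the package's code:

```r
# Hypothetical stand-in for kl.norm under the assumed parameterization
kl_norm <- function(mu1, S1, mu2, S2) {
  N  <- length(mu1); Si <- solve(S1)
  drop(0.5 * (log(det(S1) / det(S2)) + sum(diag(Si %*% S2)) +
              t(mu1 - mu2) %*% Si %*% (mu1 - mu2) - N))
}

# average of the two orderings, so the result no longer depends on
# which distribution is treated as the baseline
kl_sym <- function(mu1, S1, mu2, S2) {
  0.5 * (kl_norm(mu1, S1, mu2, S2) + kl_norm(mu2, S2, mu1, S1))
}

S <- matrix(c(2, 0.5, 0.5, 1), 2)
kl_sym(c(0, 0), S, c(1, 1), diag(2)) ==
  kl_sym(c(1, 1), diag(2), c(0, 0), S)  # symmetric by construction
```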