the unit in which entropy is measured.
The default is "nats" (natural units). For
computing entropy in "bits", set unit="log2".
Details
The mutual information of two random variables X and Y
is the Kullback-Leibler divergence between the joint density (or probability
mass function) and the product of the marginal densities, i.e. the density under independence.
It can also be defined via entropies as MI = H(X) + H(Y) - H(X, Y).
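The equivalence of the two definitions can be checked numerically. The sketch below (in Python rather than R, purely for illustration; the 2x2 joint probability table is a made-up example) computes the mutual information both as the Kullback-Leibler divergence from the product of the marginals and via the entropy identity:

```python
import numpy as np

def entropy_plugin(p):
    # Plug-in entropy in nats; zero cells contribute nothing.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Hypothetical joint probability table for X (rows) and Y (columns).
pxy = np.array([[0.4, 0.1],
                [0.2, 0.3]])
px = pxy.sum(axis=1)   # marginal of X
py = pxy.sum(axis=0)   # marginal of Y

# MI as the KL divergence between the joint and the product of marginals.
prod = np.outer(px, py)
mi_kl = np.sum(pxy * np.log(pxy / prod))

# MI via the entropy identity H(X) + H(Y) - H(X, Y).
mi_ent = entropy_plugin(px) + entropy_plugin(py) - entropy_plugin(pxy.ravel())
```

Both routes give the same value; dividing the result by log(2) would convert it from nats to bits, matching the unit="log2" option above.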
Similarly, the chi-squared statistic of independence compares the joint density
with the product of the marginal densities. It is a second-order accurate
approximation of twice the mutual information.
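The approximation of twice the mutual information can likewise be verified on a small example. This Python sketch (an illustration, not the package's implementation; the joint table is hypothetical) computes the chi-squared divergence between the joint density and the product density and compares it with 2 * MI:

```python
import numpy as np

# Hypothetical joint probability table for X (rows) and Y (columns).
pxy = np.array([[0.4, 0.1],
                [0.2, 0.3]])
px = pxy.sum(axis=1)
py = pxy.sum(axis=0)
prod = np.outer(px, py)  # density under independence

# Chi-squared statistic between joint and product density.
chi2 = np.sum((pxy - prod) ** 2 / prod)

# Mutual information (KL divergence from the product density), in nats.
mi = np.sum(pxy * np.log(pxy / prod))
```

For this weakly dependent table the two quantities are close; the approximation degrades as the joint density moves further from independence.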
Value
mi.plugin returns the mutual information.
chi2indep.plugin returns the chi-squared statistic of independence.