entropy
(Package: infotheo) :
entropy computation
entropy takes a dataset as input and computes its entropy according to the entropy estimator specified by method.
● Data Source:
CranContrib
● Keywords: misc
● Alias: entropy
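As a concrete illustration of what an empirical ("emp"-style) estimator computes, here is a minimal Python sketch of plug-in entropy in nats; the name entropy_nats is illustrative, not part of infotheo's API, and real estimators in the package may apply bias corrections.

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in (empirical) entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

# A fair coin observed four times: H = ln(2) ~ 0.693 nats.
print(entropy_nats([0, 1, 0, 1]))
```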
multiinformation
(Package: infotheo) :
multiinformation computation
multiinformation takes a dataset as input and computes the multiinformation (also called total correlation) among the random variables in the dataset. The value is returned in nats using the entropy estimator specified by method.
● Data Source:
CranContrib
● Keywords: misc
● Alias: multiinformation
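The total correlation reduces to marginal and joint entropies, which a short Python sketch can make concrete (the helper names below are hypothetical, not infotheo's API):

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def multiinformation_nats(columns):
    """Total correlation: sum of marginal entropies minus the joint entropy."""
    joint = list(zip(*columns))
    return sum(entropy_nats(col) for col in columns) - entropy_nats(joint)

# Two identical binary columns share ln(2) nats; two independent ones share ~0.
x = [0, 1, 0, 1]
print(multiinformation_nats([x, x]))             # ~ 0.693 nats
print(multiinformation_nats([[0, 0, 1, 1], x]))  # ~ 0
```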
mutinformation
(Package: infotheo) :
mutual information computation
mutinformation takes two random variables as input and computes the mutual information in nats according to the entropy estimator specified by method. If Y is not supplied and X is a matrix-like argument, the function returns a matrix of mutual information between all pairs of variables in the dataset X.
● Data Source:
CranContrib
● Keywords: misc
● Alias: mutinformation
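Mutual information can be expressed through entropies as I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal Python sketch of this plug-in computation (illustrative names, not infotheo's implementation):

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def mutual_information_nats(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), in nats (plug-in estimate)."""
    return entropy_nats(x) + entropy_nats(y) - entropy_nats(list(zip(x, y)))

x = [0, 1, 0, 1]
print(mutual_information_nats(x, x))             # ~ 0.693: identical variables
print(mutual_information_nats(x, [0, 0, 1, 1]))  # ~ 0: independent variables
```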
natstobits
(Package: infotheo) :
convert nats into bits
natstobits takes a value in nats (a double) as input and returns the value in bits (a double).
● Data Source:
CranContrib
● Keywords: misc
● Alias: natstobits
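The conversion is a change of logarithm base: a value in nats divided by ln(2) gives the value in bits. Sketched in Python (the function name mirrors infotheo's but the code is illustrative):

```python
import math

def natstobits(value_nats):
    """Convert an information value from nats (base e) to bits (base 2)."""
    return value_nats / math.log(2)

print(natstobits(math.log(2)))  # 1.0 -- ln(2) nats is exactly one bit
```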
discretize
(Package: infotheo) :
Unsupervised Data Discretization
discretize discretizes data using the equal-frequencies or equal-width binning algorithm. "equalwidth" and "equalfreq" discretize each random variable (each column) of the data into nbins bins. "globalequalwidth" discretizes the range of the random vector data into nbins bins.
● Data Source:
CranContrib
● Keywords: misc
● Alias: discretize
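An equal-width pass over one column can be sketched in a few lines of Python; this is a simplified illustration of the binning idea, not infotheo's implementation (which also handles edge cases such as ties in equal-frequency mode):

```python
def equal_width_bins(values, nbins):
    """Assign each value a bin index in 0..nbins-1 over the value range."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / nbins or 1.0  # guard against a constant column
    return [min(int((v - lo) / width), nbins - 1) for v in values]

print(equal_width_bins([0.0, 2.5, 5.0, 7.5, 10.0], 2))  # [0, 0, 1, 1, 1]
```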
interinformation
(Package: infotheo) :
interaction information computation
interinformation takes a dataset as input and computes the interaction information among the random variables in the dataset using the entropy estimator specified by method. This measure is also called synergy or complementarity.
● Data Source:
CranContrib
● Keywords: misc
● Alias: interinformation
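For three variables, one common sign convention defines II(X;Y;Z) = I(X;Y|Z) - I(X;Y), which expands into joint entropies as below. Note that sign conventions for interaction information differ across references, and this Python sketch (hypothetical names) may not match infotheo's convention exactly:

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def interaction_information_nats(x, y, z):
    """II(X;Y;Z) = I(X;Y|Z) - I(X;Y), expanded into joint entropies."""
    H = entropy_nats
    xy, xz, yz = list(zip(x, y)), list(zip(x, z)), list(zip(y, z))
    xyz = list(zip(x, y, z))
    return H(xy) + H(xz) + H(yz) - H(x) - H(y) - H(z) - H(xyz)

# XOR gate: Z is determined by (X, Y) jointly but independent of each input
# alone, so the synergy is ln(2) nats under this convention.
x = [0, 0, 1, 1]
y = [0, 1, 0, 1]
z = [a ^ b for a, b in zip(x, y)]
print(interaction_information_nats(x, y, z))  # ~ 0.693
```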
condentropy
(Package: infotheo) :
conditional entropy computation
condentropy takes two random vectors, X and Y, as input and returns the conditional entropy, H(X|Y), in nats (base e), according to the entropy estimator specified by method. If Y is not supplied, the function returns the entropy of X; see entropy.
● Data Source:
CranContrib
● Keywords: misc
● Alias: condentropy
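By the chain rule, H(X|Y) = H(X,Y) - H(Y), which a short Python sketch (illustrative names, not infotheo's code) demonstrates:

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def cond_entropy_nats(x, y):
    """H(X|Y) = H(X,Y) - H(Y), in nats (chain rule, plug-in estimate)."""
    return entropy_nats(list(zip(x, y))) - entropy_nats(y)

y = [0, 1, 0, 1]
print(cond_entropy_nats(y, y))             # ~ 0: X fully determined by Y
print(cond_entropy_nats([0, 0, 1, 1], y))  # ~ 0.693: Y tells us nothing about X
```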
condinformation
(Package: infotheo) :
conditional mutual information computation
condinformation takes three random variables as input and computes the conditional mutual information, I(X;Y|S), in nats according to the entropy estimator specified by method. If S is not supplied, the function returns the mutual information between X and Y; see mutinformation.
● Data Source:
CranContrib
● Keywords: misc
● Alias: condinformation
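Conditional mutual information also expands into joint entropies: I(X;Y|S) = H(X,S) + H(Y,S) - H(S) - H(X,Y,S). A minimal Python sketch (hypothetical names, not infotheo's API):

```python
from collections import Counter
import math

def entropy_nats(xs):
    """Plug-in entropy of a discrete sample, in nats."""
    n = len(xs)
    return -sum((c / n) * math.log(c / n) for c in Counter(xs).values())

def cond_mutual_information_nats(x, y, s):
    """I(X;Y|S) = H(X,S) + H(Y,S) - H(S) - H(X,Y,S), in nats."""
    H = entropy_nats
    return (H(list(zip(x, s))) + H(list(zip(y, s)))
            - H(s) - H(list(zip(x, y, s))))

# Conditioning on an independent S leaves I(X;Y) unchanged at ln(2).
x = [0, 1, 0, 1]
s = [0, 0, 1, 1]
print(cond_mutual_information_nats(x, x, s))  # ~ 0.693
```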
infotheo
(Package: infotheo) :
Information Theory package
The package infotheo provides various estimators for computing information-theoretic measures from data.
● Data Source:
CranContrib
● Keywords: misc
● Alias: infotheo