entropy.NSB estimates the Shannon entropy H of the random variable Y
from the corresponding observed counts y using the method
of Nemenman, Shafee and Bialek (2002).
Note that this function is an R interface to the external "nsb-entropy" program,
which needs to be installed separately from http://nsb-entropy.sourceforge.net/.
Arguments
y
the vector of observed counts.
unit
the unit in which entropy is measured.
The default is "nats" (natural units). For
computing entropy in "bits" set unit="log2".
CMD
path to the "nsb-entropy" executable.
Details
The NSB estimator is due to Nemenman, Shafee and Bialek (2002).
It is a Dirichlet-multinomial entropy estimator, with a hierarchical prior
over the Dirichlet pseudocount parameters.
Note that the NSB estimator is not a plug-in estimator; hence, it does not
yield explicit estimates of the underlying bin frequencies.
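In outline (a sketch following Nemenman et al. 2002, with notation chosen here
for illustration), the NSB estimate averages the posterior mean entropy under a
symmetric Dirichlet prior with pseudocount beta over all values of beta,
weighted so that the implied prior on the entropy itself is approximately flat:

\hat{H}_{NSB} = \frac{\int d\beta \; \frac{d\xi}{d\beta} \; P(y \mid \beta) \, \langle H \rangle_\beta}{\int d\beta \; \frac{d\xi}{d\beta} \; P(y \mid \beta)}

where \langle H \rangle_\beta is the posterior mean entropy given the counts y
under the Dirichlet(beta) prior, P(y \mid \beta) is the marginal likelihood of
the counts, and \xi(\beta) is the a priori expected entropy as a function of beta.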
Value
entropy.NSB returns an estimate of the Shannon entropy.
Author(s)
Jean Hausser.
References
Nemenman, I., F. Shafee, and W. Bialek. 2002. Entropy and inference, revisited.
In: Dietterich, T., S. Becker, Z. Ghahramani, eds. Advances in Neural
Information Processing Systems 14: 471-478. Cambridge (Massachusetts):
MIT Press.
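Examples
## A minimal usage sketch; it assumes the "nsb-entropy" executable is
## installed and available on the search path. The counts are illustrative.

# observed counts in each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)

# estimate entropy in nats (the default unit)
entropy.NSB(y)

# estimate entropy in bits
entropy.NSB(y, unit="log2")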