information measure


“From Shannon, sometimes called Shannon entropy: H(X) = E[I(X)], where the self-information of an outcome x is I(x) = log(1/p(x)) = -log p(x). It is a measure of the surprise value of data. It is highest for uniformly distributed data, because no outcome is any easier to predict than another.”
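The definition above can be sketched in a few lines of Python. This is a minimal illustration, not code from the glossary: `entropy` is a hypothetical helper that evaluates H(X) = -Σ p(x) log₂ p(x) for a discrete distribution given as a list of probabilities.

```python
import math

def entropy(probs):
    # Shannon entropy H(X) = -sum p(x) * log2(p(x)), in bits.
    # Terms with p(x) = 0 contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: every value is equally
# unpredictable, so entropy is maximal (log2(4) = 2 bits).
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# Skewed distribution: one outcome dominates, so the data is
# more predictable and the entropy is lower than 2 bits.
print(entropy([0.9, 0.05, 0.03, 0.02]))
```

For any distribution over n outcomes, the uniform case gives the maximum value log₂(n); any skew toward particular outcomes reduces the entropy.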

February 1st, 2019