User:Michelebn2/One-symbol Information



In information theory, the one-symbol information of two random variables X and Y is a quantity that measures the contribution of a single symbol x of X to the mutual information I(X;Y).

As with mutual information, the most common unit of measurement of one-symbol information is the bit, obtained when logarithms are taken to base 2.

Definition of One-symbol Information

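A minimal sketch of one way the per-symbol contribution can be formalized, assuming the symbol-wise decomposition discussed in the reference below; the notation p(x), p(y | x) and the per-symbol quantity i(x) are illustrative and not taken from this draft. The mutual information can be written as an average over the symbols x of X,

\[ I(X;Y) = \sum_{x} p(x)\, i(x), \]

where one candidate for the one-symbol information i(x) is the specific surprise,

\[ i_{\mathrm{ss}}(x) = \sum_{y} p(y \mid x) \log_2 \frac{p(y \mid x)}{p(y)}, \]

and another is the specific information,

\[ i_{\mathrm{si}}(x) = H(Y) - H(Y \mid X = x) = -\sum_{y} p(y) \log_2 p(y) + \sum_{y} p(y \mid x) \log_2 p(y \mid x). \]

Both candidates average to the mutual information, \( \sum_x p(x)\, i_{\mathrm{ss}}(x) = \sum_x p(x)\, i_{\mathrm{si}}(x) = I(X;Y) \), but they generally assign different values to an individual symbol x.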

References

  • DeWeese, M. R.; Meister, M. (1999). "How to measure the information gained from one symbol". Network: Computation in Neural Systems. 10: 325–340.