Wikipedia:Reference desk/Archives/Mathematics/2013 November 3
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.
November 3
Entropy of uniform distribution
Melbourne, Australia
Looking at the entries for the uniform distribution, the entropy is given as ln(n) for the discrete case and ln(b-a) for the continuous case. For a normalised distribution on the interval [0,1], the discrete case agrees with Shannon's definition of entropy, but the continuous case produces an entropy of 0. Is this correct? Rjeges (talk) 10:06, 3 November 2013 (UTC)
- I think it would be more accurate to say that the entropy in the continuous case is undefined. Looie496 (talk) 14:30, 3 November 2013 (UTC)
- Let X be a random variable with a probability density function f whose support is a set \mathcal{X}. The differential entropy h(X) or h(f) is defined as
- h(X) = -\int_{\mathcal{X}} f(x) \log f(x) \, dx .
- One must take care in trying to apply properties of discrete entropy to differential entropy, since probability density functions can be greater than 1. For example, Uniform(0, 1/2) has negative differential entropy
- \int_0^{1/2} -2 \log(2) \, dx = -\log 2 .
- Thus, differential entropy does not share all properties of discrete entropy.
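- Applying this definition to the general continuous case Uniform(a, b), where f(x) = 1/(b-a) on [a, b], gives
- h(X) = -\int_a^b \frac{1}{b-a} \log \frac{1}{b-a} \, dx = \log(b-a) ,
- which is 0 when b - a = 1. So the figure in the question is correct: the differential entropy of Uniform(0, 1) is exactly 0, and it becomes negative for any interval of length less than 1, as the Uniform(0, 1/2) example shows.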
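- Both formulas are also easy to check numerically. Here is a minimal Python sketch (the helper function names are just for illustration) comparing the discrete formula ln(n) with the continuous formula ln(b-a):

```python
import numpy as np

def discrete_uniform_entropy(n):
    """Shannon entropy, in nats, of a uniform distribution over n outcomes."""
    p = np.full(n, 1.0 / n)
    return -np.sum(p * np.log(p))  # analytically ln(n)

def uniform_differential_entropy(a, b, num=100_001):
    """Differential entropy -integral of f*ln(f) for Uniform(a, b), via the trapezoidal rule."""
    x = np.linspace(a, b, num)
    f = np.full_like(x, 1.0 / (b - a))  # constant density on [a, b]
    return -np.trapz(f * np.log(f), x)  # analytically ln(b - a)

print(discrete_uniform_entropy(8))            # ln(8)   ~  2.0794
print(uniform_differential_entropy(0.0, 1.0)) # ln(1)   =  0.0
print(uniform_differential_entropy(0.0, 0.5)) # ln(1/2) ~ -0.6931
```

- The last line reproduces the -log 2 computed above; a discrete Shannon entropy, by contrast, is always non-negative.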