Independence (probability theory)
When we assert that two or more [[Random Variables]] are independent, we imply that probabilities of compound events involving these variables can be calculated by simply multiplying the probabilities of the individual events. This property can be expressed in several equivalent ways. The most general statement is:
- Pr[(X in A) & (Y in B)] = Pr[X in A] * Pr[Y in B], for any (measurable) subsets A and B of the respective sample spaces of X and Y.
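As a concrete illustration (not in the original article), consider two independent fair dice X and Y. The short plain-Python sketch below enumerates all 36 equally likely outcomes and checks the product rule; the events A (X is even) and B (Y is at least 5) are arbitrary illustrative choices.

 # Verify Pr[X in A and Y in B] = Pr[X in A] * Pr[Y in B]
 # for two independent fair dice by enumerating all outcomes.
 from itertools import product

 outcomes = list(product(range(1, 7), repeat=2))  # all 36 (x, y) pairs
 A = {2, 4, 6}   # event: X is even
 B = {5, 6}      # event: Y is at least 5

 p_joint = sum(1 for x, y in outcomes if x in A and y in B) / len(outcomes)
 p_A = sum(1 for x, _ in outcomes if x in A) / len(outcomes)
 p_B = sum(1 for _, y in outcomes if y in B) / len(outcomes)

 print(p_joint, p_A * p_B)  # both equal 1/6, as independence requires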
In terms of joint and marginal probability densities, we find:
- f_XY(x,y) dx dy = f_X(x) dx * f_Y(y) dy, where f represents a density and the subscripts on f indicate the random variables.
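For instance, if X and Y are independent standard normal variables, the joint density exp(-(x^2 + y^2)/2) / (2*pi) is exactly the product of the two marginal densities. A minimal Python sketch, assuming this standard-normal example (the example is not taken from the article):

 # For two independent standard normals, the joint density
 # factors as f_XY(x, y) = f_X(x) * f_Y(y).
 from math import exp, pi, sqrt

 def f(x):                      # standard normal marginal density
     return exp(-x * x / 2) / sqrt(2 * pi)

 def f_joint(x, y):             # joint density of the independent pair
     return exp(-(x * x + y * y) / 2) / (2 * pi)

 x, y = 0.3, -1.2
 print(f_joint(x, y), f(x) * f(y))  # the two values agree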
In terms of the Expectation Operator, independence gives us:
- E[X*Y] = E[X] * E[Y]
(The converse does not hold: variables whose expectations multiply this way are merely uncorrelated, a strictly weaker condition than independence.)
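A quick simulation makes this product rule visible. The plain-Python sketch below draws independent uniform samples and compares a Monte Carlo estimate of E[X*Y] with E[X] * E[Y]; the uniform distribution, sample size, and seed are illustrative assumptions, not from the article.

 # Estimate E[X*Y], E[X], and E[Y] by simulation for
 # independent Uniform(0, 1) variables.
 import random

 random.seed(0)
 n = 200_000
 xs = [random.uniform(0, 1) for _ in range(n)]
 ys = [random.uniform(0, 1) for _ in range(n)]

 e_xy = sum(x * y for x, y in zip(xs, ys)) / n
 e_x = sum(xs) / n
 e_y = sum(ys) / n
 print(e_xy, e_x * e_y)  # both close to 0.25 = E[X] * E[Y]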