Joint distribution

Given two random variables X and Y, the joint distribution of X and Y is the probability distribution of the pair (X, Y); it assigns a probability to every event defined in terms of X and Y together.

The discrete case

For discrete random variables, the joint probability mass function gives Pr(X = x and Y = y). By the definition of conditional probability, it can be factored in either order:

<math>P(X=x\ \mathrm{and}\ Y=y) = P(Y=y|X=x)P(X=x)= P(X=x|Y=y)P(Y=y).\;</math>

Since these are probabilities, we have

<math>\sum_x \sum_y P(X=x\ \mathrm{and}\ Y=y) = 1.\;</math>
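
A worked example may make this concrete. The following is a minimal sketch with made-up numbers: the joint pmf of two dependent binary variables stored as a table, together with checks of the normalization and factorization identities above.

<source lang="python">
# Hypothetical joint pmf of two dependent binary variables X and Y,
# keyed by (x, y).
joint = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.3,
}

# The probabilities over all (x, y) pairs sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Marginal P(X = x): sum the joint pmf over y.
p_x = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}

# Factorization: P(X = x and Y = y) = P(Y = y | X = x) * P(X = x).
for (x, y), p in joint.items():
    p_y_given_x = p / p_x[x]  # conditional pmf of Y given X = x
    assert abs(p_y_given_x * p_x[x] - p) < 1e-12
</source>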

The continuous case

Similarly, for continuous random variables, the joint probability density function can be written as fX,Y(x, y), and this is

<math>f_{X,Y}(x,y)=f_{Y|X}(y|x)f_X(x) = f_{X|Y}(x|y)f_Y(y) \;</math>

where fY|X(y|x) and fX|Y(x|y) are the conditional density functions of Y given X = x and of X given Y = y respectively, and fX(x) and fY(y) are the marginal density functions of X and Y respectively.

Since this is a probability density, we have

<math>\int_{-\infty}^\infty \int_{-\infty}^\infty f_{X,Y}(x,y) \; dy \; dx = 1.</math>
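
This normalization can be checked numerically. The sketch below assumes, purely for illustration, the density of two independent standard normal variables, and evaluates the double integral with SciPy:

<source lang="python">
import math
from scipy.integrate import dblquad

# Joint density of two independent standard normal variables.
# dblquad passes the inner integration variable (y) first.
def f_xy(y, x):
    return math.exp(-(x**2 + y**2) / 2) / (2 * math.pi)

# Integrate over the whole plane; the result should be ~1.
total, err = dblquad(f_xy, -math.inf, math.inf,
                     lambda x: -math.inf, lambda x: math.inf)
print(total)  # approximately 1.0
</source>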

Joint distribution of independent variables

If <math>\ P(X = x \ \mbox{and} \ Y = y ) = P( X = x) \cdot P( Y = y) </math> for all x and y in the discrete case, or <math>\ f_{X,Y}(x,y) = f_X(x) \cdot f_Y(y) </math> for all x and y in the continuous case, then X and Y are said to be independent.
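
The discrete criterion can be checked mechanically. The sketch below (assuming the same dictionary representation of a pmf as in the earlier example) compares the joint pmf against the product of its marginals:

<source lang="python">
def is_independent(joint, tol=1e-12):
    """joint maps (x, y) -> P(X = x and Y = y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    # Independent iff the joint pmf equals the product of the marginals
    # at every point (x, y).
    return all(abs(joint.get((x, y), 0.0) - p_x[x] * p_y[y]) <= tol
               for x in xs for y in ys)

# Two independent fair coin flips factorize ...
print(is_independent({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # True
# ... while the dependent pmf from the discrete example does not.
print(is_independent({(0, 0): 0.4, (0, 1): 0.1,
                      (1, 0): 0.2, (1, 1): 0.3}))  # False
</source>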

Multidimensional distributions

The joint distribution of two random variables can be extended to n random variables X1, ..., Xn; additional variables are incorporated one at a time via the identity

<math>f_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = f_{X_n | X_1, \ldots, X_{n-1}}( x_n | x_1, \ldots, x_{n-1}) f_{X_1, \ldots, X_{n-1}}( x_1, \ldots, x_{n-1} ) .</math>

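To make the recursion concrete, here is a minimal sketch assuming, purely for illustration, a Gaussian chain in which each variable given its predecessors is normal with mean equal to the previous value; applying the identity repeatedly expresses the n-variable joint density as a product of sequential conditionals.

<source lang="python">
import math

def normal_pdf(x, mean=0.0, sd=1.0):
    return math.exp(-((x - mean) / sd) ** 2 / 2) / (sd * math.sqrt(2 * math.pi))

def joint_density(xs):
    """f(x1, ..., xn) built by the chain rule:
    f(x1) * f(x2 | x1) * ... * f(xn | x1, ..., x_{n-1}).
    In this illustrative chain each conditional depends only on the
    immediately preceding value."""
    density = normal_pdf(xs[0])
    for prev, cur in zip(xs, xs[1:]):
        density *= normal_pdf(cur, mean=prev)
    return density

print(joint_density([0.0, 0.5, 0.3]))
</source>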