Probability-generating function
In probability theory, the probability-generating function of a discrete random variable is a power series representation (the generating function) of the probability mass function of the random variable. Probability-generating functions are often employed for their succinct description of the sequence of probabilities Pr(X = i), and to make available the well-developed theory of power series with non-negative coefficients.
Definition
If X is a discrete random variable taking values on some subset of the non-negative integers, {0,1, ...}, then the probability-generating function of X is defined as:
- <math>G(z) = \textrm{E}(z^X) = \sum_{i=0}^{\infty}f(i)z^i,</math>
where f is the probability mass function of X. The subscripted notation G_X is sometimes used to distinguish between the probability-generating functions of several random variables.
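For example, a probability-generating function can be assembled directly from a probability mass function. The following is a minimal sketch in Python with sympy; the pmf on {0, 1, 2} below is an arbitrary illustration, not taken from the article.

<source lang="python">
import sympy as sp

z = sp.symbols('z')
# Hypothetical pmf on {0, 1, 2}: f(0) = 1/2, f(1) = 1/3, f(2) = 1/6
f = [sp.Rational(1, 2), sp.Rational(1, 3), sp.Rational(1, 6)]
G = sum(fi * z**i for i, fi in enumerate(f))   # G(z) = E(z^X) = sum_i f(i) z^i
print(G)             # z**2/6 + z/3 + 1/2
print(G.subs(z, 1))  # 1 -- the probabilities sum to one
</source>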
Properties
Power series
Probability-generating functions obey all the rules of power series with non-negative coefficients. In particular, G(1−) = 1, where G(1−) = lim_{z→1−} G(z), since the probabilities must sum to one; by Abel's theorem for power series with non-negative coefficients, this limit equals the sum of the coefficients f(i). Because the series converges at z = 1, the radius of convergence of any probability-generating function must be at least 1.
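For instance, the closed form of a simple probability-generating function exhibits both G(1−) = 1 and a radius of convergence of at least 1. The following is a minimal sketch in Python with sympy, using the arbitrary geometric-type pmf f(i) = (1/2)^(i+1) for illustration.

<source lang="python">
import sympy as sp

z = sp.symbols('z')
# f(i) = (1/2)**(i + 1) on {0, 1, 2, ...}, so G(z) = sum_i (1/2)**(i+1) z**i = 1/(2 - z)
G = sp.Rational(1, 2) / (1 - z/2)
print(sp.series(G, z, 0, 4))   # 1/2 + z/4 + z**2/8 + z**3/16 + O(z**4)
print(G.subs(z, 1))            # 1, consistent with G(1-) = 1
# The only singularity of G is the pole at z = 2, so the radius of convergence is 2 >= 1.
</source>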
Probabilities and expectations
The following properties allow the derivation of various basic quantities related to X:
1. The probability mass function of X is recovered by taking derivatives of G
- <math> \quad f(k) = \textrm{Pr}(X = k) = \frac{G^{(k)}(0)}{k!}.</math>
2. It follows from Property 1 that if we have two random variables X and Y, and GX = GY, then fX = fY. That is, if X and Y have identical probability-generating functions, then they are identically distributed.
3. The normalization of the probability mass function can be expressed in terms of the generating function by
- <math> \textrm{E}(1^X)=G(1-)=\sum_{i=0}^\infty f(i)=1.</math>
The expectation of X is given by
- <math> \textrm{E}\left(X\right) = G'(1-).</math>
More generally, the kth factorial moment, E(X(X − 1) ... (X − k + 1)), of X is given by
- <math>\textrm{E}\left(\frac{X!}{(X-k)!}\right) = G^{(k)}(1-), \quad k \geq 0.</math>
Since var(X) = E(X(X − 1)) + E(X) − [E(X)]², the variance of X is given by
- <math>\textrm{var}(X)=G''(1-) + G'(1-) - \left [G'(1-)\right ]^2.</math>
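The properties above can be checked symbolically. The following is a minimal sketch in Python with sympy; the binomial PGF with n = 4 and p = 1/3 is an arbitrary choice for illustration.

<source lang="python">
import sympy as sp

z = sp.symbols('z')
n, p = 4, sp.Rational(1, 3)
G = ((1 - p) + p*z)**n        # PGF of a binomial(4, 1/3) random variable

# Property 1: recover the pmf from derivatives of G at z = 0
pmf = [sp.diff(G, z, k).subs(z, 0) / sp.factorial(k) for k in range(n + 1)]
print(pmf)                     # [16/81, 32/81, 8/27, 8/81, 1/81]

# Property 3 / normalization: G(1) = 1
print(G.subs(z, 1))            # 1

# Expectation and variance via derivatives at z = 1
G1, G2 = sp.diff(G, z).subs(z, 1), sp.diff(G, z, 2).subs(z, 1)
print(G1)                      # E(X) = np = 4/3
print(G2 + G1 - G1**2)         # var(X) = np(1 - p) = 8/9
</source>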
Functions of independent random variables
Probability-generating functions are particularly useful for dealing with functions of independent random variables. For example:
- If X_1, X_2, ..., X_n is a sequence of independent (and not necessarily identically distributed) random variables, and
- <math>S_n = \sum_{i=1}^n a_i X_i,</math>
- where the a_i are constants, then the probability-generating function is given by
- <math>G_{S_n}(z) = \textrm{E}(z^{S_n}) = \textrm{E}\left(z^{\sum_{i=1}^n a_i X_i}\right) = G_{X_1}(z^{a_1})G_{X_2}(z^{a_2})\ldots G_{X_n}(z^{a_n}).</math>
- For example, if
- <math>S_n = \sum_{i=1}^n X_i,</math>
- then the probability-generating function, G_{S_n}(z), is given by
- <math>G_{S_n}(z) = G_{X_1}(z)G_{X_2}(z)\ldots G_{X_n}(z).</math>
- It also follows that the probability-generating function of the difference of two independent random variables S = X_1 − X_2 is
- <math>G_S(z) = G_{X_1}(z)G_{X_2}(1/z).</math>
- Suppose that N is a discrete random variable taking values on the non-negative integers, independent of the X_i, with probability-generating function G_N. If the X_1, X_2, ..., X_N are independent and identically distributed with common probability-generating function G_X, then
- <math>G_{S_N}(z) = G_N(G_X(z)).</math>
- This can be seen as follows:
- <math> G_{S_N}(z) = E(z^{S_N}) = E(z^{\sum_{i=1}^N X_i}) = E\big(E(z^{\sum_{i=1}^N X_i}| N) \big) = E\big( (G_X(z))^N\big) =G_N(G_X(z)).</math>
- This last fact is useful in the study of Galton-Watson processes.
- Suppose again that N is a discrete random variable taking values on the non-negative integers, independent of the X_i, with probability-generating function G_N. If the X_1, X_2, ..., X_N are independent but not necessarily identically distributed, and G_{X_i} denotes the probability-generating function of X_i, then
- <math>G_{S_N}(z) = \sum_{i \ge 1} f_i \prod_{k=1}^i G_{X_k}(z),</math>
- where f_i = Pr(N = i). For identically distributed X_i this simplifies to the identity stated above. The general case is sometimes useful to obtain a decomposition of S_N by means of generating functions.
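The identity G_{S_N}(z) = G_N(G_X(z)) can be verified symbolically. The following is a minimal sketch in Python with sympy; the Bernoulli steps and the distribution of N are arbitrary choices for illustration, not from the article.

<source lang="python">
import sympy as sp

z = sp.symbols('z')
p = sp.Rational(1, 3)                       # X_i ~ Bernoulli(1/3), arbitrary choice
pmf_N = [sp.Rational(1, 4), sp.Rational(1, 4), sp.Rational(1, 2)]  # Pr(N = 0), Pr(N = 1), Pr(N = 2)

G_X = (1 - p) + p*z                         # PGF of each step
G_N = sum(pn * z**n for n, pn in enumerate(pmf_N))

# PGF of the random sum S_N = X_1 + ... + X_N by composition
G_S = sp.expand(G_N.subs(z, G_X))

# Cross-check by conditioning on N: Pr(S_N = k) = sum_n Pr(N = n) C(n, k) p^k (1 - p)^(n - k)
for k in range(3):
    direct = sum(pn * sp.binomial(n, k) * p**k * (1 - p)**(n - k)
                 for n, pn in enumerate(pmf_N))
    print(k, G_S.coeff(z, k), direct)       # the two values agree for every k
</source>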
Examples
- The probability-generating function of a constant random variable, i.e. one with Pr(X = c) = 1, is
- <math>G(z) = z^c.</math>
- The probability-generating function of a binomial random variable, the number of successes in n trials, with probability p of success in each trial, is
- <math>G(z) = \left[(1-p) + pz\right]^n.</math>
- Note that this is the n-fold product of the probability-generating function of a Bernoulli random variable with parameter p.
- The probability-generating function of a negative binomial random variable, the number of trials required to obtain the rth success when each trial succeeds with probability p, is
- <math>G(z) = \left(\frac{pz}{1 - (1-p)z}\right)^r.</math>
- Note that this is the r-fold product of the probability-generating function of a geometric random variable (the number of trials required to obtain the first success).
- The probability-generating function of a Poisson random variable with rate parameter λ is
- <math>G(z) = \textrm{e}^{\lambda(z - 1)}.</math>
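These closed forms can be checked symbolically. The following is a minimal sketch in Python with sympy; the value n = 5 and the expansion order are arbitrary illustration choices.

<source lang="python">
import sympy as sp

z, p, lam = sp.symbols('z p lambda', positive=True)

# Binomial: building the PGF from the pmf recovers [(1 - p) + pz]^n (binomial theorem)
n = 5
G_binom = sum(sp.binomial(n, k) * p**k * (1 - p)**(n - k) * z**k for k in range(n + 1))
print(sp.expand(G_binom - ((1 - p) + p*z)**n))   # -> 0

# Poisson: the power-series coefficients of e^{lambda(z - 1)} are the Poisson probabilities
G_pois = sp.exp(lam*(z - 1))
expansion = sp.series(G_pois, z, 0, 4).removeO()
for k in range(4):
    print(sp.simplify(expansion.coeff(z, k) - sp.exp(-lam)*lam**k/sp.factorial(k)))  # -> 0
</source>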
Related concepts
The probability-generating function is occasionally called the z-transform of the probability mass function. It is an example of a generating function of a sequence (see formal power series).
Other generating functions of random variables include the moment-generating function and the characteristic function.