Probability distribution

In mathematics and statistics, a probability distribution assigns to every interval of the real numbers a probability, so that the probability axioms are satisfied. In technical terms, a probability distribution is a probability measure whose domain is the Borel algebra on the reals.

A probability distribution is a special case of the more general notion of a probability measure, which is a function that assigns probabilities satisfying the Kolmogorov axioms to the measurable sets of a measurable space. Additionally, some authors define a distribution generally as the probability measure induced by a random variable X on its range: the probability of a set B is <math>P(X^{-1}(B))</math>. However, this article discusses only probability measures over the real numbers.

Every random variable gives rise to a probability distribution, and this distribution contains most of the important information about the variable. If X is a random variable, the corresponding probability distribution assigns to the interval [a, b] the probability Pr[a ≤ X ≤ b], i.e. the probability that the variable X will take a value in the interval [a, b].

The probability distribution of the variable X can be uniquely described by its cumulative distribution function F(x), which is defined by

<math> F(x) = \Pr\left[ X \le x \right] </math>

for any x in R.
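
As a simple illustration (a worked example added here), consider a Bernoulli variable X that takes the value 1 with probability p and the value 0 with probability 1 − p; its cumulative distribution function is the step function

<math> F(x) = \begin{cases} 0, & x < 0 \\ 1 - p, & 0 \le x < 1 \\ 1, & x \ge 1. \end{cases} </math>

The two jumps, at 0 and at 1, carry all of the probability; this is exactly the situation described for discrete distributions below.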

A distribution is called discrete if its cumulative distribution function consists of a sequence of finite jumps, which means that it belongs to a discrete random variable X: a variable which can only attain values from a certain finite or countable set. By one convention, a distribution is called continuous if its cumulative distribution function is continuous, which means that it belongs to a random variable X for which Pr[ X = x ] = 0 for all x in R. Another convention reserves the term continuous probability distribution for absolutely continuous distributions. These can be expressed by a probability density function: a non-negative Lebesgue integrable function f defined on the real numbers such that

<math> \Pr \left[ a \le X \le b \right] = \int_a^b f(x)\,dx </math>

for all a and b. Of course, discrete distributions do not admit such a density; there also exist some continuous distributions, like the devil's staircase (the Cantor distribution), that do not admit a density.
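
As a concrete illustration of this formula (a sketch added here, not part of the original article; the standard normal density and the helper names f, prob_interval and cdf are only examples), the following Python code approximates Pr[a ≤ X ≤ b] by numerically integrating a density and compares the result with F(b) − F(a):

    import math

    def f(x):
        """Example density: the standard normal distribution (chosen for illustration)."""
        return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

    def prob_interval(a, b, n=100000):
        """Approximate Pr[a <= X <= b] = integral of f over [a, b]
        with a midpoint Riemann sum of n slices."""
        width = (b - a) / n
        return sum(f(a + (i + 0.5) * width) for i in range(n)) * width

    def cdf(x):
        """Standard normal cumulative distribution function via the error function."""
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    a, b = -1.0, 1.0
    print(prob_interval(a, b))    # about 0.6827, the numerical integral of the density
    print(cdf(b) - cdf(a))        # same value, computed as F(b) - F(a)

The two printed numbers agree up to the error of the Riemann sum, reflecting the identity Pr[a ≤ X ≤ b] = F(b) − F(a) for continuous distributions.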

  • The support of a distribution is the smallest closed set whose complement has probability zero.
  • The probability distribution of the sum of two independent random variables is the convolution of each of their distributions (see the sketch after this list).
  • The probability distribution of the difference of two independent random variables is the cross-correlation of their distributions.
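
A minimal Python sketch of the convolution statement above, using two fair dice as an example (the dice and the convolve helper are illustrative additions, not from the original article):

    def convolve(p, q):
        """Convolve two probability mass functions given as {value: probability} dicts;
        the result is the distribution of the sum of the two independent variables."""
        result = {}
        for x, px in p.items():
            for y, qy in q.items():
                result[x + y] = result.get(x + y, 0.0) + px * qy
        return result

    die = {k: 1.0 / 6.0 for k in range(1, 7)}   # fair six-sided die
    two_dice = convolve(die, die)               # distribution of the sum of two dice

    for total in sorted(two_dice):
        print(total, round(two_dice[total], 4))  # triangular shape, peak of 1/6 at 7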

List of important probability distributions

Several probability distributions are so important in theory or applications that they have been given specific names:

Discrete distributions

With finite support

  • The Bernoulli distribution, which takes value 1 with probability p and value 0 with probability q = 1 − p.
  • The binomial distribution, which describes the number of successes in a series of n independent Yes/No experiments, each with the same probability of success.
  • The degenerate distribution at x0, where X is certain to take the value x0. This does not look random, but it satisfies the definition of random variable. This is useful because it puts deterministic variables and random variables in the same formalism.
  • The discrete uniform distribution, where all elements of a finite set are equally likely. This is supposed to be the distribution of a balanced coin, an unbiased die, a casino roulette wheel or a well-shuffled deck. Also, one can use measurements of quantum states to generate uniform random variables. All these are "physical" or "mechanical" devices, subject to design flaws or perturbations, so the uniform distribution is only an approximation of their behaviour. In digital computers, pseudo-random number generators are used to produce a statistically random discrete uniform distribution (see the sketch after this list).
  • The hypergeometric distribution, which describes the number of successes in the first m of a series of n independent Yes/No experiments, if the total number of successes is known.
  • Zipf's law or the Zipf distribution. A discrete power-law distribution, the most famous example of which is the description of the frequency of words in the English language.
  • The Zipf-Mandelbrot law is a discrete power law distribution which is a generalization of the Zipf distribution.
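
As noted in the discrete uniform entry above, digital computers approximate this distribution with pseudo-random number generators. A minimal Python sketch (added for illustration; the six-sided die and the roll_die helper are arbitrary choices) using the standard library generator:

    import random
    from collections import Counter

    def roll_die(n, sides=6):
        """Simulate n rolls of a fair die with a pseudo-random number generator."""
        return [random.randint(1, sides) for _ in range(n)]

    counts = Counter(roll_die(60000))
    for face in sorted(counts):
        # each face should appear with relative frequency close to 1/6 ≈ 0.1667
        print(face, counts[face] / 60000)

The empirical frequencies only approximate 1/6, just as the physical devices listed above only approximate the ideal uniform distribution.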

With infinite support

[Figures: probability mass function of the Poisson distribution; the Skellam distribution]

Continuous distributions

Supported on a bounded interval

[Figure: probability density function of the Beta distribution]

  • The Beta distribution on [0,1], of which the uniform distribution is a special case, and which is useful in estimating success probabilities.

[Figure: probability density function of the continuous uniform distribution]

Supported on semi-infinite intervals, usually [0,∞)

[Figures: probability density functions of the chi-square, exponential, gamma and Pareto distributions]

Supported on the whole real line

[Figures: probability density functions of the Cauchy, Laplace, Lévy and normal distributions]

Joint distributions

For any set of independent random variables, the probability density function of their joint distribution is the product of their individual density functions.
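
A minimal Python sketch of this product rule, using two independent standard normal variables as an example (the example and helper names are illustrative additions, not from the original article): the product of the two marginal densities coincides with the bivariate normal density with zero correlation.

    import math

    def normal_pdf(x):
        """Standard normal density."""
        return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

    def joint_pdf(x, y):
        """Joint density of two independent standard normals: product of the marginals."""
        return normal_pdf(x) * normal_pdf(y)

    def bivariate_pdf(x, y):
        """The same density written directly as exp(-(x^2 + y^2)/2) / (2*pi)."""
        return math.exp(-(x * x + y * y) / 2.0) / (2.0 * math.pi)

    print(joint_pdf(0.3, -1.2))      # the two expressions agree at any point
    print(bivariate_pdf(0.3, -1.2))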

Two or more random variables on the same sample space

Matrix-valued distributions

Miscellaneous distributions
