List of relationships between probability distributions

From Wikipedia, the free encyclopedia

The following table provides an overview of the relationships between probability distributions.

Description | Memory aid *)
The square of a standard normally distributed random variable is chi-square distributed with parameter 1. | [N(0,1)]² = χ²(1)
The sum of independent chi-square distributed random variables is again chi-square distributed. | χ²(m) + χ²(n) = χ²(m+n)
The sum of independent normally distributed random variables is again normally distributed. | N(μ₁, σ₁²) + N(μ₂, σ₂²) = N(μ₁+μ₂, σ₁²+σ₂²)
The sum of independent Poisson-distributed random variables is again Poisson-distributed. | Poi(λ₁) + Poi(λ₂) = Poi(λ₁+λ₂)
The sum of independent binomially distributed random variables with the same parameter p is again binomially distributed. | Bin(m, p) + Bin(n, p) = Bin(m+n, p)
The sum of independent negative binomially distributed random variables with the same parameter p is again negative binomially distributed. | NB(r, p) + NB(s, p) = NB(r+s, p)
The sum of independent Erlang-distributed random variables with the same parameter λ is again Erlang-distributed. | Erl(m, λ) + Erl(n, λ) = Erl(m+n, λ)
The sum of independent gamma-distributed random variables with the same parameter b is again gamma-distributed. | γ(b, p₁) + γ(b, p₂) = γ(b, p₁+p₂)
Relationship between the Erlang distribution and the exponential distribution. | Erl(1, λ) = Exp(λ)
Relationship between the Erlang distribution and the chi-square distribution. | Erl(n, 1/2) = χ²(2n)
Relationship between the Erlang distribution and the gamma distribution. (For an integer second parameter, the gamma distribution agrees with the Erlang distribution.) | γ(b, n) = Erl(n, b)
Relationship between the Weibull distribution and the exponential distribution. | Wei(λ, 1) = Exp(λ)
If X and Y are independent random variables, X standard normally distributed and Y χ²(n)-distributed, then X / √(Y/n) is t(n)-distributed. | N(0,1) / √(χ²(n)/n) = t(n)
If X and Y are independent random variables, X χ²(m)-distributed and Y χ²(n)-distributed, then (X/m) / (Y/n) is F(m, n)-distributed. | (χ²(m)/m) / (χ²(n)/n) = F(m, n)
The logarithm of a lognormally distributed random variable is normally distributed. | ln(LN(μ, σ²)) = N(μ, σ²)
If Z is negative binomially distributed with parameters 1 and p, then Z − 1 is geometrically distributed with parameter p. | NB(1, p) − 1 = Geo(p)
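Several of these relationships can be checked empirically. The following is a minimal Monte Carlo sketch using only the Python standard library; the sample size, seed, and the particular parameter values (λ = 2 and λ = 3 for the Poisson rows, for instance) are arbitrary choices made for illustration.

```python
import math
import random

random.seed(0)
N = 100_000

# The square of a standard normal variable is chi-square(1)-distributed,
# so its mean should be close to 1 and its variance close to 2.
sq = [random.gauss(0.0, 1.0) ** 2 for _ in range(N)]
mean = sum(sq) / N
var = sum((x - mean) ** 2 for x in sq) / N
print(mean, var)  # mean near 1, variance near 2

def poisson(lam):
    # Knuth's method: count uniform draws until their product
    # falls below e^(-lam).
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

# The sum of independent Poisson variables is Poisson:
# Poi(2) + Poi(3) behaves like Poi(5), so the sample mean should be near 5.
s = [poisson(2.0) + poisson(3.0) for _ in range(N)]
print(sum(s) / N)  # near 5

# The logarithm of a lognormal variable is normal: log(exp(N(0,1)))
# should have sample mean near 0.
logs = [math.log(math.exp(random.gauss(0.0, 1.0))) for _ in range(N)]
print(sum(logs) / N)  # near 0
```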

*) In the memory aid, a symbol such as χ²(n) does not stand for the chi-square distribution itself, but for a chi-square distributed random variable. The difference matters because the distribution of a sum of random variables (called the convolution of their distributions) is usually written with "∗" applied to distributions, rather than with "+" applied to random variables, as is done here. The advantage of the "∗" notation (distributions) is that it already indicates which operation must be applied to the distribution functions to obtain the distribution of the sum. The advantage of the "+" notation (random variables) is that it shows which operation originally acted on the random variables.

The "=" sign stands for "has the same distribution as".

The random variables on the left-hand side of the equals sign are always assumed to be mutually independent.


From the above rules it follows, for example (in "memory aid" notation), that N(0,1)² + N(0,1)² = χ²(1) + χ²(1) = χ²(2). Note that the first random variable must be independent of the second. If instead the same random variable is used both times, i.e. if one computes X² + X² = 2X², the result is a different distribution!
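The importance of independence here can be seen numerically: summing the squares of two independent standard normal variables gives a chi-square(2) variable (mean 2, variance 4), while doubling the square of a single standard normal variable gives the same mean but variance 8. A sketch using only the Python standard library, with arbitrary seed and sample size:

```python
import random

random.seed(1)
N = 100_000

# X and Y independent standard normal: X^2 + Y^2 is chi-square(2)-distributed
# (mean 2, variance 4).
indep = [random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 for _ in range(N)]

# Using the SAME variable twice: X^2 + X^2 = 2*X^2 is NOT chi-square(2)
# (mean 2 as well, but variance 8).
same = [2 * random.gauss(0, 1) ** 2 for _ in range(N)]

def mean_var(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    return m, v

print(mean_var(indep))  # mean near 2, variance near 4
print(mean_var(same))   # mean near 2, variance near 8
```

The two samples agree in mean but differ clearly in variance, which is exactly why the rule requires independent random variables.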
