# Joint distribution of random variables

In probability theory, the joint distribution of random variables is a way of constructing a multivariate distribution on a higher-dimensional space from a probability measure on a probability space. An example of this is the multinomial distribution. From a measure-theoretic perspective, it is an image measure (pushforward measure). The joint distribution of several random variables is thus a generalization of the distribution of a single random variable.

## Definition

Let $I$ be a finite index set, $(\Omega, \mathcal{A}, P)$ a probability space, and $(X_i)_{i \in I}$ a family of random variables from this probability space into the measurable spaces $(\Omega_i, \mathcal{A}_i)$. Let

$\Omega_I := \prod_{i \in I} \Omega_i$

be the Cartesian product of the underlying sets and

$\mathcal{A}_I := \bigotimes_{i \in I} \mathcal{A}_i$

the corresponding product σ-algebra. Then the probability measure on the product space $(\Omega_I, \mathcal{A}_I)$ that is defined by

$P_{(X_i)_{i \in I}}\left(\prod_{i \in I} A_i\right) := P\left(\bigcap_{i \in I} \{X_i \in A_i\}\right) = P\left(\bigcap_{i \in I} X_i^{-1}(A_i)\right)$

for $A_i \in \mathcal{A}_i$ is called the joint distribution of the random variables $X_i$.
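The defining formula can be illustrated on a small finite probability space. The following is a minimal sketch; the function and variable names are illustrative, not from any library:

```python
from fractions import Fraction

# A minimal sketch of the defining formula on a finite probability space:
# the joint distribution of a product set A_1 x ... x A_n is P applied to the
# intersection of the preimages X_i^{-1}(A_i).

def joint_probability(omega, P, variables, sets):
    """P_{(X_i)}(prod A_i) = P( intersection of the preimages X_i^{-1}(A_i) )."""
    event = [w for w in omega if all(X(w) in A for X, A in zip(variables, sets))]
    return sum(P[w] for w in event)

# Toy space: a single fair die.
omega = list(range(1, 7))
P = {w: Fraction(1, 6) for w in omega}
parity = lambda w: w % 2   # X_1: parity of the roll
large = lambda w: w > 3    # X_2: indicator of "greater than 3"

p = joint_probability(omega, P, [parity, large], [{1}, {True}])
print(p)  # 1/6 -- the only outcome that is odd and greater than 3 is w = 5
```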

## Example

We consider the probability space $(\Omega, \mathcal{A}, P)$ with

$\Omega = \{1, \dots, 6\}^2 \quad \text{and} \quad \mathcal{A} = \mathcal{P}(\Omega)$

and the discrete uniform distribution on this set. This models rolling a fair die twice.

The first random variable is defined as

$X_1(\omega) = \omega_1 + \omega_2,$

it formalizes the sum of the two dice and maps into $(\Omega_1, \mathcal{A}_1)$ with $\Omega_1 = \{2, \dots, 12\}$ and $\mathcal{A}_1 = \mathcal{P}(\Omega_1)$.

The second random variable is defined as

$X_2(\omega) = \begin{cases} 1 & \text{if } \omega_1 \text{ is even} \\ 0 & \text{otherwise} \end{cases}$

and indicates whether the first number rolled is even. It maps into $(\Omega_2, \mathcal{A}_2)$ with $\Omega_2 = \{0, 1\}$ and $\mathcal{A}_2 = \mathcal{P}(\Omega_2)$.

The joint distribution is now a probability measure on $\{2, \dots, 12\} \times \{0, 1\}$, equipped with the product σ-algebra (here the power set). The probability measure is completely determined by its values on a generator of the σ-algebra, in this case by its values on the $|\Omega_1| \cdot |\Omega_2| = 11 \cdot 2 = 22$ elementary events $\{(\omega_1, \omega_2)\} \subset \Omega_1 \times \Omega_2$. For simplicity, we give only some of the probabilities of the joint distribution here.

$P_{X_1, X_2}(\{(3,1)\}) = P(X_1^{-1}(\{3\}) \cap X_2^{-1}(\{1\}))$
$= P(\{(1,2), (2,1)\} \cap \{2,4,6\} \times \{1, \dots, 6\}) = P(\{(2,1)\}) = \tfrac{1}{36}$
$P_{X_1, X_2}(\{(2,1)\}) = P(X_1^{-1}(\{2\}) \cap X_2^{-1}(\{1\}))$
$= P(\{(1,1)\} \cap \{2,4,6\} \times \{1, \dots, 6\}) = P(\emptyset) = 0$
$P_{X_1, X_2}(\{(4,0)\}) = P(X_1^{-1}(\{4\}) \cap X_2^{-1}(\{0\}))$
$= P(\{(1,3), (3,1), (2,2)\} \cap \{1,3,5\} \times \{1, \dots, 6\}) = P(\{(1,3), (3,1)\}) = \tfrac{2}{36}.$
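These probabilities can be checked by exhaustively enumerating the 36 outcomes. The following Python sketch (variable names are illustrative) tabulates the full joint distribution:

```python
from fractions import Fraction
from collections import defaultdict

# Exhaustive enumeration of the 36 equally likely outcomes of two dice.
# X1 is the sum of the rolls, X2 indicates whether the first roll is even.
joint = defaultdict(Fraction)  # missing keys default to Fraction() == 0
for w1 in range(1, 7):
    for w2 in range(1, 7):
        x1 = w1 + w2
        x2 = 1 if w1 % 2 == 0 else 0
        joint[(x1, x2)] += Fraction(1, 36)

print(joint[(3, 1)])  # 1/36  -- only the outcome (2, 1) qualifies
print(joint[(2, 1)])  # 0     -- the sum 2 forces (1, 1), whose first roll is odd
print(joint[(4, 0)])  # 1/18  -- i.e. 2/36, from (1, 3) and (3, 1)
```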

## Derived terms

### Joint distribution function

Analogously to the distribution function of a probability distribution, a joint distribution function can be defined for joint distributions of real-valued random variables. It is a function

$F_{(X_i)_{i \in I}}: \mathbb{R}^{|I|} \to [0,1]$

defined by

$F_{(X_i)_{i \in I}}(x) = P(X_i \leq x_i \text{ for all } i \in I) = P\left(\bigcap_{i \in I} \{X_i \leq x_i\}\right).$

It is sometimes simply denoted by $F_I$.
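For the two-dice example above, the joint distribution function can be evaluated by direct enumeration. This is an illustrative sketch, not a general-purpose routine:

```python
from fractions import Fraction

# Joint distribution function F(x1, x2) = P(X1 <= x1, X2 <= x2) for the
# two-dice example: X1 is the sum of the rolls, X2 the "first roll even" indicator.
def F(x1, x2):
    hits = [(a, b) for a in range(1, 7) for b in range(1, 7)
            if a + b <= x1 and (1 if a % 2 == 0 else 0) <= x2]
    return Fraction(len(hits), 36)

print(F(12, 1))  # 1 -- the whole sample space
print(F(3, 0))   # 1/18 -- only (1, 1) and (1, 2) have sum <= 3 and an odd first roll
```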

### Joint density

As with probability distributions that have probability densities, a joint density can be defined for joint distributions of random variables. It is the (not necessarily existing) continuous function $f$ that satisfies

$F_I(x_{i_1}, \dots, x_{i_n}) = \int_{-\infty}^{x_{i_1}} \dots \int_{-\infty}^{x_{i_n}} f(t_1, \dots, t_n) \, \mathrm{d}t_n \dots \mathrm{d}t_1.$

The index set here is taken, without loss of generality, to be $I = \{i_1, \dots, i_n\}$.
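This identity can be checked numerically for a simple assumed density. A hypothetical example: two independent uniform variables on $[0,1]$ have the joint density $f(t_1, t_2) = 1$ on the unit square, so $F(x, y) = x \cdot y$ there:

```python
# Numeric sanity check of F(x, y) as an iterated integral of the joint density
# f(t1, t2) = 1 on [0, 1]^2 (two independent uniform variables, an assumed example).
def F_numeric(x, y, n=400):
    # midpoint Riemann sum approximating the iterated integral over [0, x] x [0, y]
    hx, hy = x / n, y / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            t1, t2 = (i + 0.5) * hx, (j + 0.5) * hy
            f = 1.0 if 0.0 <= t1 <= 1.0 and 0.0 <= t2 <= 1.0 else 0.0
            total += f * hx * hy
    return total

print(round(F_numeric(0.5, 0.8), 6))  # 0.4 = 0.5 * 0.8
```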

### Marginal distribution

The image measures under the projections onto the individual components of the product space are referred to as marginal distributions. Formally, the $j$-th marginal distribution of the joint distribution is defined for $A_j \in \mathcal{A}_j$ as

$P^j(A_j) = P_{(X_i)_{i \in I}}\left(A_j \times \prod_{i \in I, i \neq j} \Omega_i\right).$

The distribution function of a marginal distribution is accordingly called a marginal distribution function, and its density a marginal density.
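On a finite space, a marginal distribution is obtained from the joint distribution by summing over the other component, as this sketch of the two-dice example shows (names are illustrative):

```python
from fractions import Fraction
from collections import defaultdict

# Joint distribution of the two-dice example: X1 is the sum of the rolls,
# X2 the "first roll even" indicator.
joint = defaultdict(Fraction)
for a in range(1, 7):
    for b in range(1, 7):
        joint[(a + b, 1 if a % 2 == 0 else 0)] += Fraction(1, 36)

# Marginal distributions: sum the joint distribution over the other component
# (the projection onto one factor of the product space).
marginal_X1 = defaultdict(Fraction)
marginal_X2 = defaultdict(Fraction)
for (x1, x2), p in joint.items():
    marginal_X1[x1] += p
    marginal_X2[x2] += p

print(marginal_X1[7])  # 1/6 -- the familiar peak of the dice-sum distribution
print(marginal_X2[1])  # 1/2 -- the first roll is even with probability 1/2
```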

## Uniqueness

The joint distribution of random variables is initially not defined on the entire product σ-algebra, but only on the products of sets from the individual σ-algebras. However, since these products form an intersection-stable generator of the product σ-algebra, the above definition extends uniquely to a probability measure on the entire product σ-algebra.

## Relationship to independence

The independence of finite families of random variables can easily be checked by means of their joint distribution. The following holds: the random variables $(X_i)_{i \in I}$ are independent if and only if their joint distribution is the product of their individual distributions, that is, if and only if

$P_{(X_i)_{i \in I}} = \bigotimes_{i \in I} P_{X_i}.$

It follows directly that the random variables are independent if and only if their joint distribution function (or joint density, if it exists) is exactly the product of the distribution functions (densities) of their individual distributions.

By the definition of stochastic independence of random variables, arbitrary families of random variables are independent if and only if one of the above statements holds for every finite subfamily.
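The product criterion can be tested directly on a finite space. The following sketch checks it for the two-dice example; the names are illustrative, not from any library:

```python
from fractions import Fraction

# Independence check via the joint distribution: X and Y are independent iff
# P(X = x, Y = y) = P(X = x) * P(Y = y) for every pair of values (x, y).
omega = [(a, b) for a in range(1, 7) for b in range(1, 7)]
p = Fraction(1, 36)  # uniform weight of each outcome

def is_independent(X, Y):
    return all(
        sum(p for w in omega if X(w) == x and Y(w) == y)
        == sum(p for w in omega if X(w) == x) * sum(p for w in omega if Y(w) == y)
        for x in {X(w) for w in omega}
        for y in {Y(w) for w in omega}
    )

sum_of_dice = lambda w: w[0] + w[1]
first_even  = lambda w: w[0] % 2 == 0
second_even = lambda w: w[1] % 2 == 0

print(is_independent(sum_of_dice, first_even))  # False: the sum depends on the first roll
print(is_independent(first_even, second_even))  # True: the two rolls are independent
```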

## Use

Joint distributions of random variables are used not only to define multivariate distributions but also, together with the marginal distributions, to determine conditional distributions. Conditional distributions model existing knowledge about the value of a random variable.