Stochastically independent random variables

The stochastic independence of random variables is a central concept of probability theory and statistics, which generalizes the stochastic independence of events and the independence of set systems. Stochastic independence of random variables is required, for example, in the formulation of the central limit theorem.

Definition for two random variables

Let a probability space $(\Omega, \mathcal{A}, P)$ and two measurable spaces $(\Omega_1, \mathcal{A}_1)$ and $(\Omega_2, \mathcal{A}_2)$ be given, as well as two random variables

$X \colon \Omega \to \Omega_1$

and

$Y \colon \Omega \to \Omega_2$.

The two random variables are called stochastically independent or, more simply, independent if for every $A_1 \in \mathcal{A}_1$ and every $A_2 \in \mathcal{A}_2$,

$P\bigl(X^{-1}(A_1) \cap Y^{-1}(A_2)\bigr) = P\bigl(X^{-1}(A_1)\bigr) \cdot P\bigl(Y^{-1}(A_2)\bigr)$.

Usually the sets are written more compactly: instead of $X^{-1}(A_1)$ one simply writes $\{X \in A_1\}$. The definition then reads

$P(X \in A_1, Y \in A_2) = P(X \in A_1) \cdot P(Y \in A_2)$

for all $A_1 \in \mathcal{A}_1$ and $A_2 \in \mathcal{A}_2$.

An alternative definition is made possible by the stochastic independence of events. One defines

$A := \{X \in A_1\}$ and $B := \{Y \in A_2\}$.

The random variables $X$ and $Y$ are then called stochastically independent if for all $A_1 \in \mathcal{A}_1$ and $A_2 \in \mathcal{A}_2$ the events $A$ and $B$ are stochastically independent, i.e. if

$P(A \cap B) = P(A) \cdot P(B)$

holds.

Example

We consider the probability space $(\Omega, \mathcal{A}, P)$ with base set $\Omega = \{1, 2, 3, 4\}$, σ-algebra $\mathcal{A} = \mathcal{P}(\Omega)$ and, as probability measure, the uniform distribution on the base set, i.e. $P(\{\omega\}) = \tfrac{1}{4}$ for all $\omega \in \Omega$. Let $\Omega_1 = \Omega_2 = \{0, 1\}$ and $\mathcal{A}_1 = \mathcal{A}_2 = \mathcal{P}(\{0, 1\})$. The random variables are defined as

$X(\omega) = \begin{cases} 0 & \text{if } \omega \in \{1, 2\}, \\ 1 & \text{if } \omega \in \{3, 4\}, \end{cases} \qquad Y(\omega) = \begin{cases} 0 & \text{if } \omega \in \{1, 3\}, \\ 1 & \text{if } \omega \in \{2, 4\}. \end{cases}$

Each of the σ-algebras has four elements: $\emptyset, \{0\}, \{1\}, \{0, 1\}$. Accordingly, 16 combinations would have to be checked. However, the cases in which one of the sets involved is the full set $\{0, 1\}$ or the empty set can be excluded, since every set is independent of these two. This leaves only 4 cases: $A_1 = \{0\}$ or $A_1 = \{1\}$ combined with $A_2 = \{0\}$ or $A_2 = \{1\}$.

  1. Let $A_1 = \{0\}$ and $A_2 = \{0\}$. Then $\{X \in A_1\} = \{1, 2\}$ and $\{Y \in A_2\} = \{1, 3\}$, as well as $\{X \in A_1\} \cap \{Y \in A_2\} = \{1\}$. These events are independent, since $P(\{1\}) = \tfrac{1}{4} = \tfrac{1}{2} \cdot \tfrac{1}{2} = P(\{1, 2\}) \cdot P(\{1, 3\})$.
  2. Let $A_1 = \{0\}$ and $A_2 = \{1\}$. Then $\{X \in A_1\} = \{1, 2\}$ and $\{Y \in A_2\} = \{2, 4\}$, as well as $\{X \in A_1\} \cap \{Y \in A_2\} = \{2\}$. These events are independent, since $P(\{2\}) = \tfrac{1}{4} = \tfrac{1}{2} \cdot \tfrac{1}{2} = P(\{1, 2\}) \cdot P(\{2, 4\})$.
  3. Let $A_1 = \{1\}$ and $A_2 = \{0\}$. Then $\{X \in A_1\} = \{3, 4\}$ and $\{Y \in A_2\} = \{1, 3\}$, as well as $\{X \in A_1\} \cap \{Y \in A_2\} = \{3\}$. These events are independent, since $P(\{3\}) = \tfrac{1}{4} = \tfrac{1}{2} \cdot \tfrac{1}{2} = P(\{3, 4\}) \cdot P(\{1, 3\})$.
  4. Let $A_1 = \{1\}$ and $A_2 = \{1\}$. Then $\{X \in A_1\} = \{3, 4\}$ and $\{Y \in A_2\} = \{2, 4\}$, as well as $\{X \in A_1\} \cap \{Y \in A_2\} = \{4\}$. These events are independent, since $P(\{4\}) = \tfrac{1}{4} = \tfrac{1}{2} \cdot \tfrac{1}{2} = P(\{3, 4\}) \cdot P(\{2, 4\})$.

Thus all the events in question are independent, and hence so are the random variables $X$ and $Y$.
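This case check can also be carried out mechanically. The following Python sketch (our illustration; the names omega, X and Y are not part of the article) enumerates all 16 combinations of events for the example above:

```python
from itertools import chain, combinations
from fractions import Fraction

# Probability space: uniform distribution on Omega = {1, 2, 3, 4}
omega = {1, 2, 3, 4}
P = lambda A: Fraction(len(A), len(omega))  # uniform probability measure

# The random variables X and Y from the example
X = {1: 0, 2: 0, 3: 1, 4: 1}
Y = {1: 0, 2: 1, 3: 0, 4: 1}

def powerset(s):
    """All subsets of s, i.e. the sigma-algebra P({0, 1})."""
    s = list(s)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Check P({X in A1} and {Y in A2}) = P({X in A1}) * P({Y in A2})
# for all 16 combinations of A1 and A2.
for A1 in map(set, powerset({0, 1})):
    for A2 in map(set, powerset({0, 1})):
        ev_x = {w for w in omega if X[w] in A1}  # the event {X in A1}
        ev_y = {w for w in omega if Y[w] in A2}  # the event {Y in A2}
        assert P(ev_x & ev_y) == P(ev_x) * P(ev_y)

print("X and Y are stochastically independent")
```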

General definition

The family of random variables $(X_i)_{i \in I}$, for an arbitrary index set $I$, is called stochastically independent if for every finite subset $J \subseteq I$

$P\left( \bigcap_{j \in J} \{X_j \in A_j\} \right) = \prod_{j \in J} P(X_j \in A_j)$

holds for all $A_j \in \mathcal{A}_j$.

Using the independence of set systems, the stochastic independence of random variables can also be defined as follows: a family of random variables $(X_i)_{i \in I}$ is stochastically independent if and only if their initial σ-algebras $X_i^{-1}(\mathcal{A}_i)$ are mutually independent.

This definition can be applied equivalently to random vectors, i.e. to $\mathbb{R}^n$-valued random variables. No additional requirements are imposed on the independence of their component mappings.

Criteria for independence

Generating systems

The number of sets to be checked for independence can be reduced if a generating system is known. If for each σ-algebra $\mathcal{A}_i$ an intersection-stable generating system $\mathcal{E}_i$ is known, i.e. one with $\sigma(\mathcal{E}_i) = \mathcal{A}_i$, then it suffices to test independence on the generators. The criterion then reduces to

$P\left( \bigcap_{j \in J} \{X_j \in E_j\} \right) = \prod_{j \in J} P(X_j \in E_j)$

for all $E_j \in \mathcal{E}_j$ and all finite subsets $J \subseteq I$. For discrete probability spaces one usually chooses the point sets $\{x\}$ as generating system; for real random variables, the half-open intervals $(-\infty, b]$ with $b \in \mathbb{R}$ serve as generating system of the Borel σ-algebra $\mathcal{B}(\mathbb{R})$.

Finite families

If the family of random variables, and hence the index set, is finite, say with index set $I = \{1, \dots, n\}$, then it suffices to check

$P(X_1 \in A_1, \dots, X_n \in A_n) = \prod_{i=1}^{n} P(X_i \in A_i)$

for all $A_i \in \mathcal{A}_i$. The proper subsets $J \subset \{1, \dots, n\}$ do not need to be checked separately. This follows from the fact that $\{X_i \in \Omega_i\} = \Omega$ and hence $P(X_i \in \Omega_i) = 1$. The case of a smaller index set $J$ then follows automatically from the case above: one sets $A_i = \Omega_i$ for all $i \notin J$ and obtains the statement for the smaller index set.
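As a worked instance of this reduction (in our notation), take $n = 3$ and $J = \{1, 2\}$; setting $A_3 = \Omega_3$ yields:

```latex
\begin{align*}
P(X_1 \in A_1,\, X_2 \in A_2)
  &= P(X_1 \in A_1,\, X_2 \in A_2,\, X_3 \in \Omega_3)
     && \text{since } \{X_3 \in \Omega_3\} = \Omega \\
  &= P(X_1 \in A_1)\, P(X_2 \in A_2)\, P(X_3 \in \Omega_3)
     && \text{independence for } I = \{1, 2, 3\} \\
  &= P(X_1 \in A_1)\, P(X_2 \in A_2)
     && \text{since } P(X_3 \in \Omega_3) = 1.
\end{align*}
```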

For finite families of discrete random variables

Both of the above criteria can be combined for a finite family of random variables that take values in a discrete measurable space. Let $X_1, \dots, X_n$ be random variables from $(\Omega, \mathcal{A}, P)$ to $(E, \mathcal{P}(E))$, where $E$ is discrete, i.e. finite or countably infinite. Then the random variables are independent if and only if

$P(X_1 = x_1, \dots, X_n = x_n) = \prod_{i=1}^{n} P(X_i = x_i)$

holds for all $x_1, \dots, x_n \in E$.
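As an illustration of this criterion (our own example, not from the article), consider two fair dice on a product space; the joint probability mass function is compared with the product of the marginals:

```python
from fractions import Fraction
from itertools import product

# Joint distribution of two fair dice rolled independently:
# every pair (x1, x2) has probability 1/36.
joint = {(x1, x2): Fraction(1, 36) for x1, x2 in product(range(1, 7), repeat=2)}

# Marginal pmfs, obtained by summing out the other coordinate
p1 = {x: sum(p for (a, _), p in joint.items() if a == x) for x in range(1, 7)}
p2 = {y: sum(p for (_, b), p in joint.items() if b == y) for y in range(1, 7)}

# Discrete criterion: P(X1 = x1, X2 = x2) = P(X1 = x1) * P(X2 = x2) for all pairs
independent = all(joint[(x1, x2)] == p1[x1] * p2[x2] for (x1, x2) in joint)
print(independent)  # True
```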

For finite families of real random variables

For finite families of real-valued random variables, the following criterion results: the random variables $X_1, \dots, X_n$ are stochastically independent if and only if

$P(X_1 \leq x_1, \dots, X_n \leq x_n) = \prod_{i=1}^{n} P(X_i \leq x_i)$

holds for all $x_1, \dots, x_n \in \mathbb{R}$. If $F_{X_1}, \dots, F_{X_n}$ are the distribution functions of $X_1, \dots, X_n$ and $F_{X_1, \dots, X_n}$ is the joint distribution function, then $X_1, \dots, X_n$ are stochastically independent if and only if

$F_{X_1, \dots, X_n}(x_1, \dots, x_n) = \prod_{i=1}^{n} F_{X_i}(x_i)$

holds. If they have a joint density function $f_{X_1, \dots, X_n}$, then they are stochastically independent if and only if

$f_{X_1, \dots, X_n}(x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i)$

holds. Here $f_{X_i}$ denotes the marginal density of $X_i$.
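A standard worked example (ours, not from the article): a pair of independent standard normal random variables has the joint density $f_{X,Y}(x, y) = \frac{1}{2\pi} e^{-(x^2 + y^2)/2}$, and the density criterion is visibly satisfied, since

```latex
\[
f_{X,Y}(x, y)
  = \frac{1}{2\pi}\, e^{-(x^2 + y^2)/2}
  = \underbrace{\frac{1}{\sqrt{2\pi}}\, e^{-x^2/2}}_{f_X(x)}
    \cdot
    \underbrace{\frac{1}{\sqrt{2\pi}}\, e^{-y^2/2}}_{f_Y(y)} .
\]
```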

Existence of independent random variables

For countably infinite families of random variables, the question arises whether a "sufficiently large" probability space exists at all on which the entire family is independent. It is not obvious that this is possible; independence might just as well be too strong a requirement, since the initial σ-algebras of many random variables could inevitably turn out to be mutually dependent.

In fact, the question can be answered positively by means of the product measure. Considering the infinite product model

$(\Omega, \mathcal{A}, P) = \left( \prod_{i \in I} \Omega_i,\; \bigotimes_{i \in I} \mathcal{A}_i,\; \bigotimes_{i \in I} P_i \right)$

and defining as the family of random variables precisely the projections $X_i := \pi_i$ onto the $i$-th components, this family is independent by the definition of the product model and the product measure, and the projection $X_i$ has exactly the distribution $P_i$ on the event space $(\Omega_i, \mathcal{A}_i)$. The product model is therefore large enough to contain an independent family of random variables. Conversely, the problem of the existence of infinitely many independent random variables reduces to the existence of an infinite product measure, which is not self-evident. This question of existence is answered positively for arbitrary index sets by the theorem of Andersen and Jessen, but for countable index sets it can also be settled with the Ionescu-Tulcea theorem, and for Borel spaces with Kolmogorov's extension theorem.
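In simulation terms, the product model corresponds to drawing every coordinate independently. A minimal NumPy sketch of a finite truncation (our illustration; the distributions $P_i$ are taken to be Bernoulli for concreteness):

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite truncation of the product model: n coordinates, the i-th one
# carrying a Bernoulli distribution P_i with parameter params[i].
params = [0.1, 0.3, 0.5, 0.7, 0.9]

# One point of the product space: each coordinate is drawn independently,
# so the projections X_i(omega) = omega[i] are independent by construction
# and X_i has exactly the distribution P_i.
omega = np.array([rng.binomial(1, p) for p in params])
print(omega)  # one realization, e.g. [0 0 1 1 1]
```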

Uncorrelatedness and independence

Two random variables $X$ and $Y$ are called uncorrelated if their covariance satisfies $\operatorname{Cov}(X, Y) = 0$.

Independence of random variables always implies their uncorrelatedness. Namely, if the random variables $X$ and $Y$ are independent, then $E(XY) = E(X) E(Y)$ holds for the expected value, and accordingly

$\operatorname{Cov}(X, Y) = E(XY) - E(X)E(Y) = E(X)E(Y) - E(X)E(Y) = 0$.

The first equality follows from the shift theorem for the covariance, the second from the independence of the random variables and the above identity for the expected value.

Conversely, uncorrelatedness does not imply stochastic independence. An example is the random variable $X$, uniformly distributed on $\{-1, 0, 1\}$, together with $Y := X^2$. Then

$\operatorname{Cov}(X, Y) = E(XY) - E(X)E(Y) = E(X^3) - E(X)E(X^2) = 0$,

so the random variables are uncorrelated. But they are not independent, since, for example,

$P(\{X = 1\} \cap \{Y = 0\}) = P(\emptyset) = 0$

and

$P(X = 1) \cdot P(Y = 0) = \frac{1}{3} \cdot \frac{1}{3} = \frac{1}{9}$.

The dependence then follows from $0 = P(\{X = 1\} \cap \{Y = 0\}) \neq P(X = 1) \cdot P(Y = 0) = \frac{1}{9}$.
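The counterexample can be verified exactly with a few lines of Python (our sketch):

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2
support = [-1, 0, 1]
p = Fraction(1, 3)
E = lambda f: sum(p * f(x) for x in support)  # expectation with respect to X

# Covariance via the shift theorem: Cov(X, Y) = E(XY) - E(X)E(Y)
cov = E(lambda x: x * x**2) - E(lambda x: x) * E(lambda x: x**2)
print(cov)  # 0, so X and Y are uncorrelated

# Independence would require P(X = 1, Y = 0) = P(X = 1) * P(Y = 0)
p_joint = sum(p for x in support if x == 1 and x**2 == 0)  # = 0
p_prod = sum(p for x in support if x == 1) * sum(p for x in support if x**2 == 0)
print(p_joint, p_prod)  # 0 and 1/9, so X and Y are not independent
```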

Testing for dependence

To analyze the dependence of two random variables, one can also test whether the correlation coefficient is zero. If this hypothesis is rejected, the variables are assumed to be stochastically dependent. The converse, however, is not valid, since there can be dependence structures that the correlation coefficient does not capture. On the other hand, uncorrelated random variables that are jointly normally distributed, for example, are also stochastically independent.
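This limitation can be seen in simulation. The following sketch (our illustration, using scipy.stats.pearsonr) applies a correlation test to samples of the dependent but uncorrelated pair $X$ and $Y = X^2$ from the previous section:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

# Dependent but uncorrelated: X uniform on {-1, 0, 1}, Y = X^2
x = rng.choice([-1, 0, 1], size=10_000)
y = x**2

r, p_value = pearsonr(x, y)
print(f"r = {r:.4f}, p = {p_value:.3f}")
# r is close to 0 and the test does not reject zero correlation,
# even though Y is a deterministic function of X.
```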

Independence of random variables and set systems

In the context of the conditional expected value, the independence of a random variable and a set system is sometimes also needed. A random variable $X$ and a set system $\mathcal{M}$ are called independent if the set system $\mathcal{M}$ and the initial σ-algebra $\sigma(X)$ of the random variable are independent set systems.

Generalizations

By means of the conditional expected value, both the independence of set systems and the independence of random variables can be extended to conditional independence.

Literature

  • Achim Klenke: Probability Theory. 3rd edition. Springer-Verlag, Berlin Heidelberg 2013, ISBN 978-3-642-36017-6.
  • Ulrich Krengel: Introduction to Probability Theory and Statistics. For studies, professional practice and teaching. 8th edition. Vieweg, Wiesbaden 2005, ISBN 3-8348-0063-5.
  • Hans-Otto Georgii: Stochastics. Introduction to Probability Theory and Statistics. 4th edition. Walter de Gruyter, Berlin 2009, ISBN 978-3-11-021526-7.
  • Christian Hesse: Applied Probability Theory. 1st edition. Vieweg, Wiesbaden 2003, ISBN 3-528-03183-2.
  • A. M. Prokhorov: Independence. In: Michiel Hazewinkel (ed.): Encyclopaedia of Mathematics. Kluwer Academic Publishers, 2001, ISBN 978-1-55608-010-4.

Web links

Wikibooks: Introduction to Stochastic Independence - learning and teaching materials
