# Student's t-distribution

*Densities of $t$-distributed random variables for several degrees of freedom*

The Student's t-distribution (also Student t-distribution or simply t-distribution) is a probability distribution that was developed in 1908 by William Sealy Gosset and named after his pseudonym Student.

Gosset had found that the standardized estimator of the sample mean of normally distributed data is no longer normally distributed but $t$-distributed when the variance of the characteristic, which is needed to standardize the mean, is unknown and must be estimated from the sample variance. The $t$-distribution makes it possible, especially for small sample sizes, to describe the distribution of the difference between the sample mean and the true mean of the population.

The values of the $t$-distribution depend on the significance level and the sample size $n$ and determine the confidence interval and thus the significance of the estimate of the mean. The distribution becomes narrower with increasing $n$ and converges to the normal distribution as $n \to \infty$ (see the graphic on the right). Hypothesis tests that use the $t$-distribution are called t-tests.

The derivation was first published in 1908, while Gosset was working at the Guinness brewery in Dublin. Since his employer did not permit publication, Gosset published it under the pseudonym Student. The t-factor and the related theory were first put on a rigorous footing by the work of R. A. Fisher, who called the distribution Student's distribution.

The $t$-distribution also appears in earlier publications by other authors. It was first derived in 1876 by Jacob Lüroth as a posterior distribution in a least-squares adjustment problem, and in 1883 by Edgeworth in a similar context.

## Definition

A continuous random variable $X$ follows the Student's $t$-distribution with $n > 0$ degrees of freedom if it has the probability density

$$f_n(x) = \frac{\Gamma\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\left(\frac{n}{2}\right)}\left(1 + \frac{x^2}{n}\right)^{-\frac{n+1}{2}} \quad \text{for} \quad -\infty < x < \infty.$$

Here

$$\Gamma(x) = \int\limits_0^{+\infty} t^{x-1} e^{-t}\,\mathrm{d}t$$

denotes the gamma function. For natural numbers $n$ the following holds in particular ($n!$ is the factorial of $n$):

$$\Gamma(n+1) = n!, \qquad \Gamma\left(n + \tfrac{1}{2}\right) = \frac{(2n)!}{n!\,4^n}\,\sqrt{\pi}.$$

Alternatively, the $t$-distribution with $n$ degrees of freedom can also be defined as the distribution of the quantity

$$t_n \equiv \frac{Z}{\sqrt{\chi_n^2/n}},$$

where $Z$ is a standard normally distributed random variable and $\chi_n^2$ is an independent, chi-square distributed random variable with $n$ degrees of freedom.
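This alternative definition translates directly into a sampling scheme. The following is a minimal sketch in standard-library Python (the degrees of freedom, sample size, and seed are arbitrary illustrative choices, not from the source): a $t_5$ variate is drawn as a standard normal divided by the square root of an independent chi-square, itself built as a sum of five squared standard normals.

```python
import math
import random

def sample_t(n, rng):
    """Draw one t-variate with n degrees of freedom as Z / sqrt(chi2_n / n)."""
    z = rng.gauss(0.0, 1.0)                                 # standard normal Z
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n))  # chi-square with n df
    return z / math.sqrt(chi2 / n)

rng = random.Random(42)
draws = sorted(sample_t(5, rng) for _ in range(100_000))
median = draws[len(draws) // 2]
var = sum(x * x for x in draws) / len(draws)  # mean is ~0, so E[X^2] ~ Var
print(f"sample median   ~ {median:.3f} (theory: 0)")
print(f"sample variance ~ {var:.3f} (theory: n/(n-2) = {5/3:.3f})")
```

The sample median and variance land near the theoretical values $0$ and $n/(n-2) = 5/3$ given in the properties section below.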

## Distribution function

The distribution function can be expressed in closed form as

$$F_n(t) = I\left(\frac{t + \sqrt{t^2+n}}{2\sqrt{t^2+n}},\ \frac{n}{2},\ \frac{n}{2}\right)$$

or as

$$F_n(t) = \frac{1}{2}\left(1 + \frac{t}{|t|}\, I\left(\frac{t^2}{t^2+n},\ \frac{1}{2},\ \frac{n}{2}\right)\right)$$

with

$$I(z, a, b) = \frac{1}{B(a,b)} \int_0^z t^{a-1}(1-t)^{b-1}\,\mathrm{d}t,$$

where $B$ denotes the beta function.

$F_n(t)$ gives the probability that a random variable $X$ distributed with density $f_n(x)$ takes a value less than or equal to $t$.
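$F_n(t)$ has no elementary closed form, but it can be checked numerically by integrating the density from the definition above. A minimal sketch in standard-library Python (the integration cutoff $-40$ and the step count are arbitrary numerical choices):

```python
import math

def t_pdf(x, n):
    """Density f_n(x) of the Student's t-distribution."""
    return (math.gamma((n + 1) / 2)
            / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + x * x / n) ** (-(n + 1) / 2))

def t_cdf(t, n, lower=-40.0, steps=4000):
    """F_n(t) via composite Simpson integration of the density."""
    h = (t - lower) / steps
    total = t_pdf(lower, n) + t_pdf(t, n)
    for i in range(1, steps):
        total += (4 if i % 2 else 2) * t_pdf(lower + i * h, n)
    return total * h / 3

print(f"F_4(0)     ~ {t_cdf(0.0, 4):.4f}   (exactly 0.5 by symmetry)")
print(f"F_4(2.776) ~ {t_cdf(2.776, 4):.4f}   (2.776 is the tabulated 0.975 quantile)")
```

The second line ties back to the quantile table further down: for $n = 4$ degrees of freedom, $t = 2.776$ is the one-sided 0.975 quantile.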

## Properties

Let $X$ be a $t$-distributed random variable with $n$ degrees of freedom and density $f_n(x)$.

### Inflection points

The density has inflection points at

$$x = \pm\,\sqrt{\frac{n}{n+2}}.$$

### Median

The median is

$$\tilde{x} = 0.$$

### Mode

The mode is

$$x_D = 0.$$

### Symmetry

The Student's $t$-distribution is symmetric about 0.

### Expected value

For $n > 1$ the expected value is

$$\operatorname{E}(X) = 0.$$

The expected value does not exist for $n = 1$.

### Variance

For $n > 2$ the variance is

$$\operatorname{Var}(X) = \frac{n}{n-2}.$$

### Skewness

For $n > 3$ the skewness is

$$\operatorname{v}(X) = 0.$$

### Kurtosis

For $n > 4$, the kurtosis $\beta_2$ and the excess kurtosis $\gamma_2$ are

$$\beta_2(X) = \frac{\mu_4}{\mu_2^2} = \frac{3n-6}{n-4}, \qquad \gamma_2(X) = \frac{\mu_4}{\mu_2^2} - 3 = \frac{6}{n-4}.$$

### Moments

For the $k$-th moments $m_k = \operatorname{E}(X^k)$ and the $k$-th central moments $\mu_k = \operatorname{E}\left([X - \operatorname{E}(X)]^k\right)$, the following holds:

$$m_k = \mu_k = 0 \quad \text{if } n > k \text{ and } k \text{ odd},$$

$$m_k = \mu_k = n^{k/2}\cdot\frac{1\cdot 3\cdot 5\cdot 7\dotsm(k-1)}{(n-2)\cdot(n-4)\cdot(n-6)\dotsm(n-k)} \quad \text{if } n > k \text{ and } k \text{ even}.$$
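These moment formulas can be verified numerically. The following sketch in standard-library Python ($n = 7$ is an arbitrary choice, large enough for the fourth moment to exist) compares the closed form for even $k$ with direct numerical integration of $x^k f_n(x)$:

```python
import math

def t_pdf(x, n):
    """Density f_n(x) of the Student's t-distribution."""
    return (math.gamma((n + 1) / 2)
            / (math.sqrt(n * math.pi) * math.gamma(n / 2))
            * (1 + x * x / n) ** (-(n + 1) / 2))

def moment_numeric(k, n, lim=200.0, steps=200_000):
    """E[X^k] by the midpoint rule over [-lim, lim]."""
    h = 2 * lim / steps
    return sum((-lim + (i + 0.5) * h) ** k * t_pdf(-lim + (i + 0.5) * h, n)
               for i in range(steps)) * h

def moment_formula(k, n):
    """Closed form for even k < n: n^{k/2} * 1*3*...*(k-1) / ((n-2)(n-4)...(n-k))."""
    num = math.prod(range(1, k, 2))               # 1 * 3 * ... * (k-1)
    den = math.prod(range(n - 2, n - k - 1, -2))  # (n-2) * (n-4) * ... * (n-k)
    return n ** (k / 2) * num / den

for k in (2, 4):
    print(f"k={k}: numeric {moment_numeric(k, 7):.4f}, formula {moment_formula(k, 7):.4f}")
```

For $k = 2$ this reproduces the variance $n/(n-2) = 7/5$, and for $k = 4$ one obtains $\mu_4 = 9.8$, from which the kurtosis $(3n-6)/(n-4)$ follows as $\mu_4/\mu_2^2$.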

### Relationship to beta distribution

The integral

$$\int_0^z t^{a-1}(1-t)^{b-1}\,\mathrm{d}t$$

is the incomplete beta function

$$B(z; a, b),$$

where

$$B(a,b) = B(1; a, b)$$

establishes the connection to the complete beta function. For $t > 0$ one then has

$$F_n(t) = \tfrac{1}{2} + \tfrac{1}{2}\, I\left(z_t, \tfrac{1}{2}, \tfrac{n}{2}\right) = \tfrac{1}{2} + \tfrac{1}{2}\,\frac{B\left(z_t; \tfrac{1}{2}, \tfrac{n}{2}\right)}{B\left(1; \tfrac{1}{2}, \tfrac{n}{2}\right)}$$

with

$$z_t = \frac{t^2}{t^2+n}.$$

As $t$ approaches infinity, $z_t$ tends to 1. In this limiting case the numerator and denominator of the above fraction become equal, so one obtains

$$F_n(t) = \tfrac{1}{2} + \tfrac{1}{2}\, I\left(z_t, \tfrac{1}{2}, \tfrac{n}{2}\right) \rightarrow \tfrac{1}{2} + \tfrac{1}{2} = 1.$$

## Non-central t-distribution

The quantity

$$\frac{Z + \delta}{\sqrt{\chi_n^2/n}}$$

with $Z \sim \mathcal{N}(0,1)$ and non-centrality parameter $\delta$ follows the so-called non-central $t$-distribution. It is used mainly to determine the β-error (type II error) of hypothesis tests with a $t$-distributed test statistic. Its probability density is:

$$f(x) = \frac{n^{n/2}\,n!\,e^{-\delta^2/2}}{2^n\,\Gamma\left(n/2\right)\left(x^2+n\right)^{(n+1)/2}}\left(\frac{\sqrt{2}\,\delta x}{\sqrt{x^2+n}}\,\frac{{}_1\mathcal{F}_1\left(n/2+1,\,3/2,\,\frac{(\delta x)^2}{2(x^2+n)}\right)}{\Gamma\left((n+1)/2\right)} + \frac{{}_1\mathcal{F}_1\left((n+1)/2,\,1/2,\,\frac{(\delta x)^2}{2(x^2+n)}\right)}{\Gamma\left(n/2+1\right)}\right)$$

*Some densities of non-central $t$-distributions*

The bracketed sum of hypergeometric functions can be simplified somewhat, yielding a shorter alternative expression for the density:

$$f(x) = \frac{2^n\,n^{n/2+1}\,\Gamma\left((n+1)/2\right)}{\pi\left(x^2+n\right)^{(n+1)/2}}\,e^{-\delta^2/2}\,H_{-n-1}\left(-\frac{\delta x}{\sqrt{2}\,\sqrt{x^2+n}}\right),$$

where $H_{-n-1}(z)$ denotes a Hermite polynomial of negative index, with $H_{-n-1}(0) = \frac{\sqrt{\pi}}{2^{n+1}\,\Gamma\left(n/2+1\right)}$.

For $n > 1$ the expected value is

$$\frac{\delta\,\sqrt{n}\,\Gamma\left((n-1)/2\right)}{\sqrt{2}\,\Gamma\left(n/2\right)}$$

and for $n > 2$ the variance is

$$\frac{(1+\delta^2)\,n}{n-2} - \frac{\delta^2\,n\,\Gamma\left((n-1)/2\right)^2}{2\,\Gamma\left(n/2\right)^2}.$$

For $\delta = 0$ one recovers the characteristic values of the central $t$-distribution.

## Relationship to other distributions

### Relationship to the Cauchy distribution

For $n = 1$, using $\Gamma(1/2) = \sqrt{\pi}$, the Cauchy distribution results as a special case of the Student's $t$-distribution.

### Relationship to the chi-square distribution and standard normal distribution

The $t$-distribution describes the distribution of an expression

$$t_n \equiv \frac{\mathcal{N}(0,1)}{\sqrt{\frac{\chi_n^2}{n}}},$$

where $\mathcal{N}(0,1)$ denotes a standard normally distributed random variable and $\chi_n^2$ a chi-square distributed random variable with $n$ degrees of freedom. The numerator variable must be independent of the denominator variable. The density function of the $t$-distribution is then symmetric with respect to its expected value $0$. The values of the distribution function are usually tabulated.

### Heavy-tailed distribution

The $t$-distribution belongs to the class of heavy-tailed distributions.

### Approximation by the normal distribution

With an increasing number of degrees of freedom, the values of the $t$-distribution can be approximated by the normal distribution. A rule of thumb is that from about 30 degrees of freedom onward, the $t$-distribution function can be approximated by the normal distribution.
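The rule of thumb can be made concrete by comparing the $t$-density with the standard normal density $\varphi$ at the origin, where their gap is largest. A small sketch in standard-library Python (the chosen values of $n$ are illustrative):

```python
import math

def t_pdf(x, n):
    """Density f_n(x) of the Student's t-distribution (lgamma avoids overflow)."""
    log_c = math.lgamma((n + 1) / 2) - math.lgamma(n / 2) - 0.5 * math.log(n * math.pi)
    return math.exp(log_c) * (1 + x * x / n) ** (-(n + 1) / 2)

phi0 = 1 / math.sqrt(2 * math.pi)  # standard normal density at 0
for n in (5, 30, 100):
    rel = abs(t_pdf(0.0, n) - phi0) / phi0
    print(f"n={n:3d}: f_n(0) = {t_pdf(0.0, n):.4f}, relative gap to phi(0) = {100*rel:.2f}%")
```

At $n = 5$ the relative gap is still about 5%, while at $n = 30$ it has dropped below 1%, consistent with the rule of thumb.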

## Use in mathematical statistics

Various estimators are $t$-distributed.

If the independent random variables $X_1, X_2, \dotsc, X_n$ are identically normally distributed with mean $\mu$ and standard deviation $\sigma$, it can be shown that the sample mean

$$\overline{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$$

and the sample variance

$$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}\left(X_i - \overline{X}\right)^2$$

are stochastically independent.

Since the random variable $\tfrac{\overline{X}-\mu}{\sigma/\sqrt{n}}$ has a standard normal distribution and $(n-1)\,S^2/\sigma^2$ follows a chi-square distribution with $n-1$ degrees of freedom, it follows that the quantity

$$t_{n-1} = \frac{\overline{X}-\mu}{S/\sqrt{n}} = \frac{\overline{X}-\mu}{S/\sqrt{n}}\cdot\frac{\sigma}{\sigma} = \frac{\overline{X}-\mu}{\sigma/\sqrt{n}}\cdot\frac{\sigma}{S} = \left.\frac{\overline{X}-\mu}{\sigma/\sqrt{n}}\right/\left(\frac{S}{\sigma}\right) = \left.\frac{\overline{X}-\mu}{\sigma/\sqrt{n}}\right/\sqrt{\chi_{n-1}^2/(n-1)}$$

is by definition $t$-distributed with $n-1$ degrees of freedom.

The distance between the measured mean and the population mean is therefore distributed as $t_{n-1}\,S/\sqrt{n}$. The 95% confidence interval for the mean $\mu$ is then

$$\overline{x} - t \cdot S/\sqrt{n} \;\leq\; \mu \;\leq\; \overline{x} + t \cdot S/\sqrt{n},$$

where $t$ is determined by $F_{n-1}(t) = 0.975$. For $n < \infty$ this interval is somewhat wider than the interval $\mu \in \left[\overline{x} \pm 1.96 \cdot \tfrac{\sigma}{\sqrt{n}}\right]$ that the distribution function of the normal distribution with the same confidence level would yield for known $\sigma$.
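A worked example with made-up data: for $N = 5$ observations there are 4 degrees of freedom, and the quantile $t_4 = 2.776$ satisfying $F_4(t) = 0.975$ can be read off the quantile table below. A minimal sketch in standard-library Python (the measurement values are purely illustrative):

```python
import math
import statistics

data = [10.2, 9.8, 10.5, 10.1, 9.9]  # hypothetical measurements
n = len(data)
xbar = statistics.mean(data)          # sample mean
s = statistics.stdev(data)            # sample standard deviation S (n-1 in the denominator)

t = 2.776                             # quantile t_{4,0.975}, i.e. two-sided 95% for 4 df
half_width = t * s / math.sqrt(n)
print(f"mean = {xbar:.3f}, 95% CI = [{xbar - half_width:.3f}, {xbar + half_width:.3f}]")
```

With known $\sigma$, the normal quantile 1.96 would give a slightly narrower interval, as noted above.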

## Derivation of the density

The probability density of the $t$-distribution can be derived from the joint density of the two independent random variables $Z$ and $\chi_n^2$, which are standard normally and chi-square distributed, respectively:

$$f_{Z,\chi_n^2}(z,y) = \frac{e^{-\frac{1}{2}z^2}}{\sqrt{2\pi}}\cdot\frac{y^{\frac{n}{2}-1}\,e^{-\frac{1}{2}y}}{2^{\frac{n}{2}}\,\Gamma\left(\frac{n}{2}\right)}$$

With the transformation

$$t = z/\sqrt{y/n}, \qquad v = y$$

we obtain the joint density of $T = Z/\sqrt{\chi_n^2/n}$ and $\chi_n^2$, where $-\infty < t < \infty$ and $0 \leq v < \infty$.

The Jacobian determinant of this transformation is:

$$\det\frac{\partial(z,y)}{\partial(t,v)} = \begin{vmatrix} \sqrt{\frac{v}{n}} & 0 \\ \Diamond & 1 \end{vmatrix} = \sqrt{\frac{v}{n}}$$

The value $\Diamond$ does not matter, because it is multiplied by 0 in the computation of the determinant. The new density function is thus

$$f_{T,\chi_n^2}(t,v) = \frac{e^{-\frac{1}{2}v\frac{t^2}{n}}}{\sqrt{2\pi}}\cdot\frac{1}{2^{\frac{n}{2}}\,\Gamma\left(\frac{n}{2}\right)}\,v^{\frac{n}{2}-1}\,e^{-\frac{1}{2}v}\cdot\sqrt{\frac{v}{n}}.$$

We now seek the marginal density $f_n(t)$ as the integral over the variable $v$, which is no longer of interest:

$$f_n(t) = \int\limits_0^\infty f_{T,\chi_n^2}(t,v)\,\mathrm{d}v = \frac{1}{\sqrt{n\pi}\,2^{(n+1)/2}\,\Gamma(n/2)} \int\limits_0^\infty v^{(n-1)/2}\, e^{-v(1+t^2/n)/2}\,\mathrm{d}v = \frac{\Gamma\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\left(\frac{n}{2}\right)}\left(1+\frac{t^2}{n}\right)^{-\frac{n+1}{2}}$$
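The last equality uses the standard gamma integral (this intermediate step is added here for clarity): with $c = \tfrac{1}{2}\left(1 + t^2/n\right)$ and the substitution $u = cv$,

$$\int\limits_0^\infty v^{(n-1)/2}\, e^{-cv}\,\mathrm{d}v = \frac{1}{c^{(n+1)/2}}\int\limits_0^\infty u^{(n-1)/2}\, e^{-u}\,\mathrm{d}u = \Gamma\left(\frac{n+1}{2}\right) 2^{(n+1)/2}\left(1 + \frac{t^2}{n}\right)^{-\frac{n+1}{2}},$$

so the factor $2^{(n+1)/2}$ cancels against the prefactor and the closed-form density follows.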

## Selected quantiles of the t-distribution

Values of the $t$-distribution for various degrees of freedom $n$ and common probabilities $P$ (0.75 to 0.999) are tabulated; for these,

$$P_{\text{one-sided}} = F_n(t) = P(T_n \leq t).$$

Because of the mirror symmetry of the density, for an interval bounded symmetrically on both sides one only needs to adjust the probability scale. For the same $t$ the probability decreases, because the integration interval is reduced by cutting away the range from $-\infty$ to $-t$:

$$P_{\text{two-sided}} = F_n(t) - F_n(-t) = P(-t < T_n \leq t)$$

If $N$ observations are made on a sample and $m$ parameters are estimated from the sample, the number of degrees of freedom is $n = N - m$.

For the number of degrees of freedom $n$ in the first column and the significance level $\alpha$ (shown as $1-\alpha$ in the second row), each cell of the following table gives, in accordance with DIN 1319-3, the value of the (one-sided) quantile $t_{n,\alpha}$. This satisfies the following equations for the density $f_n$ of the $t_n$-distribution:

One-sided: $$\int_{-\infty}^{t_{n,\alpha}} f_n(x)\,\mathrm{d}x = 1-\alpha$$

Two-sided: $$\int_{-t_{n,\alpha/2}}^{t_{n,\alpha/2}} f_n(x)\,\mathrm{d}x = 1-\alpha$$

For example, for $n = 4$ and $\alpha = 0.05$ one finds the $t$-values 2.776 (two-sided) and 2.132 (one-sided).

The quantile function of the $t$-distribution is the solution $x_p$ of the equation $p = F(x_p \mid n)$ and can thus in principle be computed via the inverse function. Specifically,

$$x_p = \frac{\sqrt{n}\left(2\,I^{-1}\left(p, \frac{n}{2}, \frac{n}{2}\right) - 1\right)}{2\sqrt{\left(1 - I^{-1}\left(p, \frac{n}{2}, \frac{n}{2}\right)\right) \cdot I^{-1}\left(p, \frac{n}{2}, \frac{n}{2}\right)}}$$

with $I^{-1}$ the inverse of the regularized incomplete beta function. The value $x_p$ appears in the quantile table under the coordinates $p$ and $n$.

For a few values of $n$ (1, 2, 4) the quantile function simplifies:

$$n=1:\quad x_p = \tan\left(\pi(p - 1/2)\right)$$

$$n=2:\quad x_p = (p - 1/2)\,\sqrt{\frac{2}{p(1-p)}}$$

$$n=4:\quad x_p = \sqrt{\frac{2\cos\left(\frac{1}{3}\arccos\left(2\sqrt{p(1-p)}\right)\right)}{\sqrt{p(1-p)}} - 4}$$
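These special cases can be checked directly against the quantile table in the next subsection. A minimal sketch in standard-library Python ($p = 0.975$ corresponds to the one-sided column 0.975, i.e. the two-sided 95% level):

```python
import math

def t_quantile(p, n):
    """Closed-form t-quantiles for n = 1, 2, 4 degrees of freedom."""
    q = math.sqrt(p * (1 - p))
    if n == 1:
        return math.tan(math.pi * (p - 0.5))
    if n == 2:
        return (p - 0.5) * math.sqrt(2 / (p * (1 - p)))
    if n == 4:
        return math.sqrt(2 * math.cos(math.acos(2 * q) / 3) / q - 4)
    raise ValueError("closed form only available for n in (1, 2, 4)")

for n, expected in ((1, 12.706), (2, 4.303), (4, 2.776)):
    print(f"n={n}: x_0.975 = {t_quantile(0.975, n):.3f} (table: {expected})")
```

The three results reproduce the tabulated quantiles 12.706, 4.303, and 2.776.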

### Table of some t-quantiles

The column headings give $P$ for the one-sided confidence interval; the corresponding two-sided levels are 0.5, 0.75, 0.8, 0.9, 0.95, 0.98, 0.99 and 0.998.

| Degrees of freedom n | 0.75 | 0.875 | 0.90 | 0.95 | 0.975 | 0.99 | 0.995 | 0.999 |
|---:|---:|---:|---:|---:|---:|---:|---:|---:|
| 1 | 1.000 | 2.414 | 3.078 | 6.314 | 12.706 | 31.821 | 63.657 | 318.309 |
| 2 | 0.816 | 1.604 | 1.886 | 2.920 | 4.303 | 6.965 | 9.925 | 22.327 |
| 3 | 0.765 | 1.423 | 1.638 | 2.353 | 3.182 | 4.541 | 5.841 | 10.215 |
| 4 | 0.741 | 1.344 | 1.533 | 2.132 | 2.776 | 3.747 | 4.604 | 7.173 |
| 5 | 0.727 | 1.301 | 1.476 | 2.015 | 2.571 | 3.365 | 4.032 | 5.893 |
| 6 | 0.718 | 1.273 | 1.440 | 1.943 | 2.447 | 3.143 | 3.707 | 5.208 |
| 7 | 0.711 | 1.254 | 1.415 | 1.895 | 2.365 | 2.998 | 3.499 | 4.785 |
| 8 | 0.706 | 1.240 | 1.397 | 1.860 | 2.306 | 2.896 | 3.355 | 4.501 |
| 9 | 0.703 | 1.230 | 1.383 | 1.833 | 2.262 | 2.821 | 3.250 | 4.297 |
| 10 | 0.700 | 1.221 | 1.372 | 1.812 | 2.228 | 2.764 | 3.169 | 4.144 |
| 11 | 0.697 | 1.214 | 1.363 | 1.796 | 2.201 | 2.718 | 3.106 | 4.025 |
| 12 | 0.695 | 1.209 | 1.356 | 1.782 | 2.179 | 2.681 | 3.055 | 3.930 |
| 13 | 0.694 | 1.204 | 1.350 | 1.771 | 2.160 | 2.650 | 3.012 | 3.852 |
| 14 | 0.692 | 1.200 | 1.345 | 1.761 | 2.145 | 2.624 | 2.977 | 3.787 |
| 15 | 0.691 | 1.197 | 1.341 | 1.753 | 2.131 | 2.602 | 2.947 | 3.733 |
| 16 | 0.690 | 1.194 | 1.337 | 1.746 | 2.120 | 2.583 | 2.921 | 3.686 |
| 17 | 0.689 | 1.191 | 1.333 | 1.740 | 2.110 | 2.567 | 2.898 | 3.646 |
| 18 | 0.688 | 1.189 | 1.330 | 1.734 | 2.101 | 2.552 | 2.878 | 3.610 |
| 19 | 0.688 | 1.187 | 1.328 | 1.729 | 2.093 | 2.539 | 2.861 | 3.579 |
| 20 | 0.687 | 1.185 | 1.325 | 1.725 | 2.086 | 2.528 | 2.845 | 3.552 |
| 21 | 0.686 | 1.183 | 1.323 | 1.721 | 2.080 | 2.518 | 2.831 | 3.527 |
| 22 | 0.686 | 1.182 | 1.321 | 1.717 | 2.074 | 2.508 | 2.819 | 3.505 |
| 23 | 0.685 | 1.180 | 1.319 | 1.714 | 2.069 | 2.500 | 2.807 | 3.485 |
| 24 | 0.685 | 1.179 | 1.318 | 1.711 | 2.064 | 2.492 | 2.797 | 3.467 |
| 25 | 0.684 | 1.178 | 1.316 | 1.708 | 2.060 | 2.485 | 2.787 | 3.450 |
| 26 | 0.684 | 1.177 | 1.315 | 1.706 | 2.056 | 2.479 | 2.779 | 3.435 |
| 27 | 0.684 | 1.176 | 1.314 | 1.703 | 2.052 | 2.473 | 2.771 | 3.421 |
| 28 | 0.683 | 1.175 | 1.313 | 1.701 | 2.048 | 2.467 | 2.763 | 3.408 |
| 29 | 0.683 | 1.174 | 1.311 | 1.699 | 2.045 | 2.462 | 2.756 | 3.396 |
| 30 | 0.683 | 1.173 | 1.310 | 1.697 | 2.042 | 2.457 | 2.750 | 3.385 |
| 40 | 0.681 | 1.167 | 1.303 | 1.684 | 2.021 | 2.423 | 2.704 | 3.307 |
| 50 | 0.679 | 1.164 | 1.299 | 1.676 | 2.009 | 2.403 | 2.678 | 3.261 |
| 60 | 0.679 | 1.162 | 1.296 | 1.671 | 2.000 | 2.390 | 2.660 | 3.232 |
| 70 | 0.678 | 1.160 | 1.294 | 1.667 | 1.994 | 2.381 | 2.648 | 3.211 |
| 80 | 0.678 | 1.159 | 1.292 | 1.664 | 1.990 | 2.374 | 2.639 | 3.195 |
| 90 | 0.677 | 1.158 | 1.291 | 1.662 | 1.987 | 2.368 | 2.632 | 3.183 |
| 100 | 0.677 | 1.157 | 1.290 | 1.660 | 1.984 | 2.364 | 2.626 | 3.174 |
| 200 | 0.676 | 1.154 | 1.286 | 1.653 | 1.972 | 2.345 | 2.601 | 3.131 |
| 300 | 0.675 | 1.153 | 1.284 | 1.650 | 1.968 | 2.339 | 2.592 | 3.118 |
| 400 | 0.675 | 1.152 | 1.284 | 1.649 | 1.966 | 2.336 | 2.588 | 3.111 |
| 500 | 0.675 | 1.152 | 1.283 | 1.648 | 1.965 | 2.334 | 2.586 | 3.107 |
| $\infty$ | 0.674 | 1.150 | 1.282 | 1.645 | 1.960 | 2.326 | 2.576 | 3.090 |

## References

1. Student: The Probable Error of a Mean. In: Biometrika 6, No. 1, 1908, pp. 1–25. JSTOR 2331554. doi:10.1093/biomet/6.1.1.
2. Josef Bleymüller, Günther Gehlert, Herbert Gülicher: Statistics for economists. 14th edition. Vahlen, 2004, ISBN 978-3-8006-3115-5, p. 16.
3. J. Pfanzagl, O. Sheynin: A forerunner of the t-distribution (Studies in the history of probability and statistics XLIV). In: Biometrika 83, No. 4, 1996, pp. 891–898. doi:10.1093/biomet/83.4.891.
4. P. Gorroochurn: Classic Topics on the History of Modern Mathematical Statistics from Laplace to More Recent Times. Wiley, 2016. doi:10.1002/9781119127963.
5. N. L. Johnson, B. L. Welch: Applications of the Non-Central t-Distribution. In: Biometrika 31, No. 3/4, 1940, pp. 362–389. JSTOR 2332616. doi:10.1093/biomet/31.3-4.362.
6. HermiteH. At: functions.wolfram.com.
7. Frodesen, Skjeggestad, Tofte: Probability and Statistics in Particle Physics. Universitetsforlaget, Bergen/Oslo/Tromsø, p. 141.
8. W. T. Shaw: Sampling Student's T distribution – use of the inverse cumulative distribution function. In: Journal of Computational Finance 9, No. 4, 2006, pp. 37–73. doi:10.21314/JCF.2006.150.