A priori probability
The a priori probability (also called initial, pre-test, or original probability) is a probability value in the natural sciences that is obtained from general prior knowledge about the properties of the system (e.g. the symmetry properties of a die). Assumptions about a priori probabilities are prerequisites for calculating the conditional probability of a composite event and for the Bayesian concept of probability.
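The role of the a priori (pre-test) probability in a Bayesian update can be illustrated with a short sketch. The scenario and the numbers below (a diagnostic test with assumed sensitivity and specificity) are purely illustrative and not taken from the article:

```python
def posterior_after_positive(prior, sensitivity, specificity):
    """Bayes' theorem: update an a priori (pre-test) probability
    into a posterior probability after a positive test result."""
    # Total probability of a positive result (law of total probability).
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

# Hypothetical example: a priori probability 1%, sensitivity 99%,
# specificity 95%. Despite the positive result, the posterior stays
# modest, because the a priori probability was small.
p = posterior_after_positive(0.01, 0.99, 0.95)
```

Here the choice of the prior drives the result: the same test evidence yields very different posteriors for different a priori probabilities.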
The oldest method for determining a priori probabilities goes back to Laplace: unless there is an explicit reason to assume otherwise, all elementary events are assigned the same probability (principle of indifference). For example, when tossing a coin, the elementary events "heads" and "tails" are a priori equally likely: as long as there is no reason to assume that the coin has been manipulated, both events are assigned the probability 1/2.
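The principle of indifference, together with the composite events mentioned above, can be sketched in a few lines. The die example is an assumption for illustration, not part of the source text:

```python
from fractions import Fraction

# Principle of indifference: with no reason to favour any face of a
# fair die, each of the six elementary events gets probability 1/6.
faces = range(1, 7)
prior = {f: Fraction(1, 6) for f in faces}

# A priori probability of the composite event "an even number is rolled":
p_even = sum(prior[f] for f in faces if f % 2 == 0)
# p_even == Fraction(1, 2)
```

Using exact fractions makes it visible that the composite probability is just the count of favourable elementary events over the total count, as in Laplace's classical definition.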
An extension of this principle is the principle of maximum entropy. Here it is assumed that something, but not everything, is already known about the system. Since the (information) entropy is a measure of the uncertainty of knowledge, it is argued that the a priori probability must be the distribution whose entropy is maximal among all probability distributions compatible with that knowledge, because any other distribution would imply additional knowledge. In the case where no additional information is available, this principle reduces to the principle of indifference.
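The maximum-entropy principle can be made concrete with a small numerical sketch. Assume (hypothetically, following Jaynes' well-known dice example, which is not part of the source text) that only the mean value of a die is known; the maximum-entropy distribution then has the exponential form p_i ∝ exp(λ·i), and λ can be found by bisection on the implied mean:

```python
import math

def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    """Maximum-entropy distribution over die faces given only the mean.
    The constrained maximization has the exponential-family solution
    p_i proportional to exp(lam * i); the implied mean is monotone in
    lam, so a simple bisection finds the right lam."""
    def mean_for(lam):
        weights = [math.exp(lam * f) for f in faces]
        z = sum(weights)
        return sum(f * w for f, w in zip(faces, weights)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * f) for f in faces]
    z = sum(weights)
    return [w / z for w in weights]

# With the uninformative mean 3.5, maximum entropy recovers the
# uniform die, i.e. the principle of indifference.
p_uniform = maxent_die(3.5)
# With the constraint mean = 4.5, the distribution tilts toward
# the high faces while staying as "spread out" as possible.
p_tilted = maxent_die(4.5)
```

The first call illustrates the reduction stated above: when the constraint carries no extra information, the maximum-entropy prior is exactly the uniform one.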