Zipf's law (named after George Kingsley Zipf, who formulated it in the 1930s) is a model by which the value of certain quantities arranged in a hierarchy can be estimated from their rank. The law is frequently applied in linguistics, especially in corpus linguistics and quantitative linguistics, where it relates the frequency of words in a text to their rank. Zipf's law marked the beginning of quantitative linguistics.
Simple Zipf distribution
The simplified statement of Zipf's law reads: if the elements of a set - for example, the words of a text - are ordered by frequency, the probability of an element's occurrence is inversely proportional to its rank n:

p(n) ∝ 1/n
The normalization factor for N elements is given by the harmonic series

H_N = 1 + 1/2 + 1/3 + … + 1/N

and exists only for finite sets. It follows that

p(n) = 1 / (n · H_N).
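The normalized probabilities above can be computed directly for a finite set; a minimal Python sketch (all function names are illustrative):

```python
# Zipf probabilities for a finite set of N ranked elements:
# p(n) = (1/n) / H_N, where H_N is the N-th harmonic number.

def harmonic(N):
    """N-th harmonic number H_N = 1 + 1/2 + ... + 1/N."""
    return sum(1.0 / k for k in range(1, N + 1))

def zipf_probability(n, N):
    """Probability of the element at rank n among N elements."""
    return (1.0 / n) / harmonic(N)

# Probabilities for a set of 10 elements; they sum to 1.
probs = [zipf_probability(n, 10) for n in range(1, 11)]
print(probs)
```

Because of the harmonic normalization, the probabilities decrease strictly with rank and sum to exactly one.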
Zipf's law has its origin in linguistics. It states that certain words occur much more frequently than others and that the distribution resembles a hyperbola. For example, in most languages the longer words are, the less often they occur. The rank n can be interpreted as a cumulative quantity: the rank equals the number of elements that are greater than or equal to the element in question. Rank 1 corresponds to exactly one element, namely the largest; rank 2 to two elements, the first and the second; rank 3 to three, and so on.
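The ranking just described can be produced mechanically from any text; a short sketch using Python's collections.Counter (the sample sentence is made up):

```python
from collections import Counter

text = "the cat and the dog and the bird"
counts = Counter(text.split())

# Sort by descending frequency; the position in this list is the rank.
ranked = counts.most_common()
for rank, (word, freq) in enumerate(ranked, start=1):
    print(rank, word, freq)
```

For real corpora one would tokenize more carefully, but the rank/frequency pairs obtained this way are exactly the data to which Zipf's law is fitted.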
Zipf assumes a simple inverse proportionality to the rank: p(n) ∝ 1/n^q. In its original form, Zipf's law is free of parameters; it holds that q = 1.
It is the inverse of the Pareto distribution. Like the latter, it is a cumulative distribution function that obeys a power law. The exponent of the corresponding distribution density function is accordingly

k = -(q + 1)

and, for the simple case q = 1:

k = -2.
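The relation between the two exponents follows by differentiating the power-law tail; as a sketch:

```latex
P(X \ge x) \propto x^{-q}
\quad\Longrightarrow\quad
f(x) = -\frac{\mathrm{d}}{\mathrm{d}x}\, P(X \ge x) \propto q\, x^{-(q+1)}
```

so the density exponent is k = -(q + 1), and q = 1 gives f(x) ∝ x^{-2}.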
The distribution of word frequencies in a text (left graphic) roughly corresponds qualitatively to a simple Zipf distribution.
Zipf's law prescribes the exponent of the cumulative distribution function: q = 1.
The value obtained by fitting the word frequencies, however, deviates from this; it is equivalent to the exponent of a Pareto distribution, and the exponent of the corresponding power-law distribution density function is then -(q + 1).
The distribution of letter frequencies also resembles a Zipf distribution. However, statistics based on only 20-30 letters are not sufficient to fit the curve with a power function.
Another example, from the article on the Pareto distribution, concerns the size distribution of cities. Here, too, individual countries (e.g. Germany) show a relationship that appears to obey a power law, but with striking deviations. The graphic on the right compares the Zipf approximation with the measured values. The linear course in the double-logarithmic plot supports the assumption of a power law. Contrary to Zipf's conjecture, the exponent is not 1 but 0.77, corresponding to an exponent of the power-law density function of -1.77. This theory, according to which the populations of cities that develop independently of one another nevertheless follow a common overarching law, is also used to estimate the expected size of towns.
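The exponent reported above is the slope of a straight line fitted in the double-logarithmic plot of size against rank. A sketch of that procedure on synthetic data (the data are generated, not the German city figures; recovering 0.77 here only checks the fitting step):

```python
import math

# Synthetic city sizes following size(n) = C * n^(-0.77).
q_true = 0.77
ranks = range(1, 51)
sizes = [1_000_000 * n ** (-q_true) for n in ranks]

# Least-squares fit of log(size) = log(C) - q * log(rank).
xs = [math.log(n) for n in ranks]
ys = [math.log(s) for s in sizes]
mean_x = sum(xs) / len(xs)
mean_y = sum(ys) / len(ys)
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))

q_fit = -slope
print(q_fit)  # exact power-law input, so the fit recovers 0.77
```

On real city data the points scatter around the line, and the fitted slope is what the text calls the measured exponent.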
The value of the Zipf distribution lies in the quick qualitative description of distributions from the most varied fields, whereas the Pareto distribution refines the exponent of the distribution.
If, for example, only the populations of seven cities are given, the data set is too small for a fit. Zipf's law nevertheless provides an approximation:
| Rank n | City | Residents | 1/rank | p(n) | p(n) · total residents | Deviation in % |
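The columns of the table can be reproduced with a few lines of Python; the populations below are made-up round numbers for illustration, not the values from the table:

```python
# Zipf approximation for a short, rank-ordered list of city populations.
# The figures are hypothetical round numbers chosen for illustration.
populations = [3_600_000, 1_800_000, 1_200_000, 900_000,
               720_000, 600_000, 510_000]
total = sum(populations)
N = len(populations)
H_N = sum(1.0 / k for k in range(1, N + 1))  # harmonic normalization

for rank, actual in enumerate(populations, start=1):
    p = (1.0 / rank) / H_N        # Zipf probability for this rank
    predicted = p * total          # column "p(n) · total residents"
    deviation = 100.0 * (predicted - actual) / actual
    print(rank, actual, round(predicted), round(deviation, 1))
```

Each printed row corresponds to one row of the table: rank, actual residents, the Zipf prediction, and the percentage deviation between the two.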
- Helmut Birkhan : The "Zipfsche Law", the weak past tense and the Germanic sound shift (= Austrian Academy of Sciences. Philosophical-Historical Class. Meeting reports. 348). Publishing house of the Austrian Academy of Sciences, Vienna 1979, ISBN 3-700-10285-2 .
- David Crystal : The Cambridge Encyclopedia of Language . Campus-Verlag, Frankfurt am Main et al. 1993, ISBN 3-593-34824-1 .
- Xavier Gabaix: Zipf's law for cities: An explanation. In: The Quarterly Journal of Economics. Vol. 114, No. 3, 1999, pp. 739-767, doi:10.1162/003355399556133.
- Henri Guiter , Michail V. Arapov (Ed.): Studies on Zipf's Law (= Quantitative Linguistics. Vol. 16). Studienverlag Brockmeyer, Bochum 1982, ISBN 3-88339-244-8 .
- Matteo Marsili, Yi-Cheng Zhang: Interacting Individuals Leading to Zipf's Law. In: Physical Review Letters. Vol. 80, No. 12, 1998, pp. 2741-2744, doi:10.1103/PhysRevLett.80.2741.
- George Kingsley Zipf: The Psycho-Biology of Language. An Introduction to Dynamic Philology. Mifflin, Boston MA 1935, (The MIT Press, Cambridge MA 1968).
- George Kingsley Zipf: Human Behavior and the Principle of Least Effort. An Introduction to Human Ecology. Addison-Wesley, Cambridge MA 1949.
- Extensive bibliography ( Memento from November 1, 2012 in the Internet Archive )
- Zipf's law and the creation of musical context
- Zipf's law using the example of German vocabulary
- Zipf, Power-laws and Pareto
- Use of Hermetic Word Frequency Counter to Illustrate Zipf's Law
- B. McCowan et al.: The appropriate use of Zipf's law in animal communication studies. Animal Behaviour, 2005, 69, F1-F7 (PDF; 167 kB)
- Zipf's law in the prime factors of the Fibonacci numbers
- Zipf's law in the logistic equation
- Tobias Just and Patrick Stephan: Zipf's Law and its implications for urban regions (PDF; 283 kB)
- Christian Schluter, Mark Trede, September 12, 2013: Gibrat, Zipf, Fisher and Tippett: City Size and Growth Distributions Reconsidered (PDF; 494 kB; 29 pages), also in the Internet Archive (Memento from June 10, 2016), accessed July 29, 2018.