Gustafson's Law

From Wikipedia, the free encyclopedia

Gustafson's law (also known as the Gustafson–Barsis law) is a law in theoretical computer science established by John Gustafson in 1988. It states that a sufficiently large problem can be parallelized efficiently.

Description

Gustafson's law is based on Amdahl's law, which starts from a fixed problem size and tries to solve the task in a shorter time by parallelizing it across N processors. Amdahl's law assumes that the best possible result is a linear improvement in the time required, i.e. a speedup of S(N) = N: with twice as many processors, the task takes at best half the time.

Gustafson changed the law to use a fixed time window in which the size of the problem grows with the number of processors: with twice as many processors you can, at best, solve a problem twice as large. Although the two statements look the same at first glance, they differ significantly.

A program can never be executed entirely in parallel, because some parts, such as process initialization or memory allocation, run only once on a single processor, or their execution order depends on intermediate results. The program run is therefore broken down into sections that run either completely sequentially or completely in parallel, and all sections of the same kind are grouped together. Let τ be the fraction of the runtime taken by the parallel sections, measured on the parallel machine; then σ = 1 − τ is the sequential fraction, and the total runtime, normalized to 1, is the sum of the two parts:

T(N) = σ + τ = 1

On a serial computer the two parts are executed one after the other and their times add up; the work of the parallel sections, which the N processors perform simultaneously, then takes N times as long, so T(1) = σ + N·τ. If the overhead for communication, synchronization and the like is neglected, the parallel portion runs on all N processors at once, and the following applies to the acceleration factor (speedup):

S(N) = T(1) / T(N) = (σ + N·τ) / (σ + τ) = σ + N·τ = N + (1 − N)·σ
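As a quick numerical check, the scaled speedup can be computed in a few lines of Python (a minimal sketch; the function name and the 95% parallel fraction are illustrative choices, not from the source):

```python
def gustafson_speedup(n, tau):
    """Scaled speedup S(n) = sigma + n * tau, where tau is the parallel
    fraction of the runtime on the n-processor machine and
    sigma = 1 - tau is the sequential fraction."""
    sigma = 1.0 - tau
    return sigma + n * tau

# With a 95% parallel fraction, 100 processors give a speedup close to 95.
print(gustafson_speedup(100, 0.95))
```

With τ = 1 (no sequential part) the speedup is exactly N; with τ = 0 it stays at 1, regardless of the processor count.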

The key difference from Amdahl is that here the parallel share of the work increases with the number of processors. The sequential part has no limiting effect, since its weight decreases as N grows: as N approaches infinity, the speedup S(N) = σ + N·τ grows linearly with the number of processors.

Application

Gustafson's law applies well to problems that are processed in real time: the available time is fixed, and with more processors the problem solved within it can become more complex.

For example, in 3D rendering at least 30 images per second must be computed; this is a fixed time window. Within it, the picture can become more realistic if the amount of data is increased, e.g. with more polygons or more detailed textures. According to Gustafson, this can be achieved with more processors.
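The fixed-time-window idea can be put into numbers. In the sketch below, the 30 frames per second come from the text, while the per-processor polygon throughput is an invented figure for illustration; ideal Gustafson scaling without overhead is assumed:

```python
FPS = 30                                  # fixed real-time budget (from the text)
POLYGONS_PER_PROC_PER_SECOND = 3_000_000  # assumed throughput, illustrative only

def max_polygons_per_frame(num_procs):
    """Largest scene that still fits into the fixed frame time: under
    ideal Gustafson scaling it grows linearly with the processor count."""
    return num_procs * POLYGONS_PER_PROC_PER_SECOND // FPS

print(max_polygons_per_frame(1))   # 100000
print(max_polygons_per_frame(8))   # 800000
```

The frame time never changes; only the amount of detail processed within it scales with the hardware, which is exactly the fixed-time, growing-problem view of Gustafson's law.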

The same applies to logic calculations in games, physics simulation, and artificial intelligence that interacts with humans. In theory it also applies to robotics, machine control and monitoring, and similar fields.

Gustafson himself is currently working on Radeon graphics cards at AMD .

Criticism

There are a number of problems that cannot reasonably be enlarged at will, as well as problems that must be computed in the shortest possible time.

Often, non-linear algorithms cannot fully benefit from the parallelization described by Gustafson. Lawrence Snyder points out that, because of such non-linear complexity, an algorithm can achieve a speedup of only 9% even when the concurrency is doubled.
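Snyder's point can be illustrated with a small calculation. For an algorithm whose work grows as n^k, a fixed time budget on c times as many processors lets the problem size n grow only by a factor of c**(1/k). The cubic exponent below is an assumed example, not necessarily the exponent Snyder used:

```python
import math

def problem_growth(proc_factor, work_exponent):
    """Factor by which the problem size n can grow within a fixed time
    budget when the processor count is multiplied by proc_factor, for
    work proportional to n ** work_exponent (ideal scaling assumed)."""
    return proc_factor ** (1.0 / work_exponent)

# For an O(n^3) algorithm, doubling the processors grows n by only ~26%:
print(problem_growth(2, 3))

# Conversely, a growth of only 9% when doubling the processors would
# correspond to a work exponent of about log(2) / log(1.09), i.e. ~8:
print(math.log(2) / math.log(1.09))
```

The higher the exponent, the less the scalable workload benefits from extra processors, which limits the practical reach of Gustafson-style scaling.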

References

  1. John L. Gustafson: Reevaluating Amdahl's Law. In: Communications of the ACM, vol. 31, no. 5, 1988, pp. 532–533, doi:10.1145/42411.42415 (PDF).
  2. AMD recruits Intel guru. August 29, 2012.
  3. Lawrence Snyder: Type Architectures, Shared Memory, and the Corollary of Modest Potential. In: Annual Review of Computer Science, vol. 1, 1986, pp. 289–317, doi:10.1146/annurev.cs.01.060186.001445 (PDF).