Concurrency
Concurrency, sometimes also called parallelism, is in computer science the property of a system to execute several computations, instructions, or commands simultaneously. These can range from completely independent instructions to instructions that jointly work on a single task; they can also interact with one another (e.g. to exchange intermediate results).
The operations may only appear to be carried out concurrently (in parallel), as in so-called preemptive multitasking on a single processor, or they may actually be carried out concurrently on
- a multi-core processor or
- physically separate processors, such as in a multiprocessor system , or
- several separate computers connected via a network .
In these approaches to real parallelism, the boundary to a parallel computer in the strict sense is fluid.
Concurrent processes can be described and analyzed using, among other formalisms, Petri nets or parallel random access machines.
Narrower conception
A more precise distinction is sometimes drawn between "concurrent processing" (concurrency) and "parallel processing" (parallelism): concurrent processing is then understood primarily as a concept for modeling real-world events, one that is useful even if only a single CPU core is available for execution; parallel processing, in this understanding of the terms, instead emphasizes genuinely simultaneous computation on multiple cores, usually of a single problem.
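This distinction can be made concrete in Go, whose runtime separates the two notions. The following is a minimal sketch, not taken from the cited sources; the worker function and its output are purely illustrative. With runtime.GOMAXPROCS(1) the two goroutines are merely interleaved on one core (concurrency); with a higher value they may actually run at the same time (parallelism).

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// count stands in for an arbitrary unit of work.
func count(name string, wg *sync.WaitGroup) {
	defer wg.Done()
	for i := 1; i <= 3; i++ {
		fmt.Println(name, i)
	}
}

func main() {
	// With 1, the goroutines are only interleaved on a single core
	// (concurrency); with >1 they may truly execute in parallel.
	runtime.GOMAXPROCS(1)

	var wg sync.WaitGroup
	wg.Add(2)
	go count("A", &wg)
	go count("B", &wg)
	wg.Wait()
}
```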
Parallelizability and serializability
If process steps depend on one another in such a way that they must be carried out one after the other in a fixed order, so that carrying them out simultaneously is impossible, then they cannot be parallelized. If processes must take place genuinely simultaneously, so that there is no possibility at all of executing them one after the other, they cannot be serialized.
The parallelizability of two or more sections or iterations of a process, or of events, is the question of whether they can be executed or computed concurrently without leading to a different result.
"Actions can be carried out concurrently if neither needs the other's result."
The program sections must therefore not be causally dependent on one another. In other words, several transactions or processes/threads can be parallelized exactly when executing them in parallel, interleaved, or in reverse order leads to the same result as sequential execution.
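A small Go sketch of this criterion (the two computations and their inputs are invented for illustration): neither goroutine needs the other's result, so any interleaving, including reverse order, produces the same output as sequential execution.

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	a, b := 3, 4
	var square, cube int

	// The two sections read different inputs and write different
	// variables, so they are not causally dependent and can be
	// parallelized: every execution order yields the same result.
	var wg sync.WaitGroup
	wg.Add(2)
	go func() { defer wg.Done(); square = a * a }()
	go func() { defer wg.Done(); cube = b * b * b }()
	wg.Wait()

	fmt.Println(square, cube) // always prints: 9 64
}
```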
If the program sections are (partially) dependent on one another, they must be synchronized with respect to these dependencies, which enforces sequential execution (with respect to those dependencies).
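A hedged sketch of such a synchronization point in Go (the concrete computation is made up): the second step needs the first step's intermediate result, so a channel sequences exactly this dependency, while any unrelated work could still proceed concurrently.

```go
package main

import "fmt"

func main() {
	partial := make(chan int)

	// Step 1 produces an intermediate result.
	go func() {
		partial <- 2 * 21
	}()

	// Step 2 depends on step 1: receiving from the channel
	// synchronizes the two goroutines and sequences only this
	// dependency; independent sections could still run in parallel.
	result := <-partial + 1
	fmt.Println(result) // prints: 43
}
```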
"Many practical problems have solution methods that can be implemented directly in a parallel algorithm. In this case, it is said that the problem has natural parallelism."
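A classic instance of natural parallelism is an element-wise operation on an array, since each element can be processed independently of all others. A minimal Go sketch (the doubling operation is arbitrary):

```go
package main

import (
	"fmt"
	"sync"
)

func main() {
	in := []int{1, 2, 3, 4, 5, 6, 7, 8}
	out := make([]int, len(in))

	// Each element is processed independently of all others:
	// the problem is naturally parallel.
	var wg sync.WaitGroup
	for i, v := range in {
		wg.Add(1)
		go func(i, v int) {
			defer wg.Done()
			out[i] = 2 * v
		}(i, v)
	}
	wg.Wait()

	fmt.Println(out) // [2 4 6 8 10 12 14 16]
}
```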
Procedures for parallelization include the following (a sketch of parallel prefix is given after the list):
- binary tree method,
- list ranking,
- pointer doubling,
- parallel prefix,
- symmetry breaking.
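As an illustration of one of these techniques, here is a sketch of parallel prefix (prefix sums) in Go, using the Hillis/Steele-style doubling scheme. The input values are arbitrary, and a real implementation would batch the work rather than spawn one goroutine per element.

```go
package main

import (
	"fmt"
	"sync"
)

// prefixSums computes inclusive prefix sums with the doubling scheme:
// in each of the log2(n) rounds, every element adds the value that
// lies 'step' positions to its left, doubling 'step' each round.
func prefixSums(in []int) []int {
	cur := append([]int(nil), in...)
	for step := 1; step < len(cur); step *= 2 {
		next := make([]int, len(cur))
		var wg sync.WaitGroup
		for i := range cur {
			wg.Add(1)
			go func(i int) {
				defer wg.Done()
				next[i] = cur[i]
				if i >= step {
					next[i] += cur[i-step]
				}
			}(i)
		}
		wg.Wait() // all elements of this round are independent
		cur = next
	}
	return cur
}

func main() {
	fmt.Println(prefixSums([]int{1, 2, 3, 4, 5, 6, 7, 8}))
	// Output: [1 3 6 10 15 21 28 36]
}
```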
Literature
- Peter Ziesche: Concurrent & distributed programming. W3L, 2004, ISBN 3-937137-04-1 .
- Douglas Schmidt, Michael Stal, Hans Rohnert, Frank Buschmann: Pattern-oriented software architecture, patterns for concurrent and networked objects. dpunkt, 2002, ISBN 3-89864-142-2.
References
- Rob Pike: "Concurrency Is Not Parallelism", YouTube, October 20, 2013.
- Wolfgang Schröder-Preikschat: System programming, process synchronization: non-sequentiality, Chair of Computer Science 4, November 6, 2013.
- B. Monien, J. Schulze, S. Grothklags: "Parallel Algorithms", lecture script, University of Paderborn, 2001 (with extensive bibliography).
- Special lecture "Parallel Algorithms", University of Potsdam, Didactics of Computer Science.