Runtime as the duration of program execution
The length of the time span needed to solve a task can often only be determined by trial and error. Every statement of a program in a high-level programming language is translated by the compiler into a number of machine instructions that is not necessarily known beforehand. Depending on the hardware, executing an instruction can cause further delays, for example when data has to be exchanged between main memory and cache, or when pages have to be loaded from the hard disk into main memory (paging). Further dependencies arise from the operating system, the CPU clock rate, the size of the main memory, the transfer rate of the internal bus system, and so on. One would also like to estimate how the examined program behaves when the sizes of the input variables, the instances, vary.
Asymptotic running time of algorithms
In computer science, therefore, the running times of algorithms are not given in units of time. Instead, one looks for an upper bound on the number of simple operations (elementary steps) as a function of the size of the instance, and uses the Landau notation.
Some examples using a program that sorts n numbers:
- O(n) describes linear growth. Such a program performs only a constant number of computation steps per input number. If, for example, twice as many numbers are entered, the execution time doubles.
- O(n²) means quadratic growth. The sorting program makes a constant number of passes through the entire list for each number entered. If the input is twice as large, the execution time roughly quadruples.
- O(2ⁿ) would mean exponential growth. In the example of the sorting program, the running time would (approximately) double with every additional number, which leads to extremely long running times even for relatively small input sizes. A sorting program exhibits such time consumption, for example, by testing all possible orderings of the numbers to see whether one is sorted (strictly, this requires up to n! tests, which grows even faster than 2ⁿ).
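The contrast between these growth rates can be made concrete by counting elementary steps. The sketch below (illustrative code, not from the original article) compares an insertion sort, which makes roughly one pass over the already-sorted part per element, with a sort that tests every possible ordering:

```python
import itertools

def insertion_sort_steps(numbers):
    """Insertion sort; returns (sorted list, comparison count).
    One pass over the sorted prefix per element -> O(n^2) in the worst case."""
    a = list(numbers)
    steps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            steps += 1                      # one comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, steps

def permutation_sort_steps(numbers):
    """Test every possible ordering until a sorted one appears:
    up to n! candidates, i.e. worse than exponential growth."""
    steps = 0
    for perm in itertools.permutations(numbers):
        steps += 1                          # one candidate ordering tested
        if all(perm[k] <= perm[k + 1] for k in range(len(perm) - 1)):
            return list(perm), steps

_, s4 = insertion_sort_steps([4, 3, 2, 1])
_, s8 = insertion_sort_steps([8, 7, 6, 5, 4, 3, 2, 1])
print(s4, s8)   # on reversed input, doubling n roughly quadruples the steps
```

On the worst-case (reversed) input, doubling the input size from 4 to 8 raises the comparison count from 6 to 28, close to the predicted factor of four, while the permutation-based sort quickly becomes unusable as n grows.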
Therefore, one tries to avoid procedures with exponential running time whenever possible; whether this is possible at all is one of the questions asked in theoretical computer science (cf. complexity theory and NP-completeness). The aim is for methods with polynomial running time, O(nᵏ) for a suitable natural number k, or, even better, logarithmic running time. Sorting procedures commonly used today usually achieve a worst-case running time of O(n log n) or O(n²). Note that a program is basically divided into three parts, input, processing, and output, and that only the middle part can be optimized in this regard: input and output usually exhibit linear time behavior, since every single value must be read in or written out.
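Merge sort is a standard example of a sorting procedure with O(n log n) worst-case running time: the list is halved about log₂(n) times, and each level of recursion merges all n elements once. A minimal sketch:

```python
def merge_sort(numbers):
    """Divide and conquer: split the list in half, sort both halves
    recursively, then merge them in linear time -> O(n log n) overall."""
    if len(numbers) <= 1:
        return list(numbers)
    mid = len(numbers) // 2
    left = merge_sort(numbers[:mid])
    right = merge_sort(numbers[mid:])
    # Merge the two sorted halves in a single linear pass.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]
```

Unlike the quadratic procedure above, the number of comparisons here grows only slightly faster than linearly with the input size.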
Concrete runtime determination (profiling)
The concrete runtime determination of programs, and especially of program parts, is called profiling in software development. Software that supports profiling by supplementing the program under examination with code for runtime measurement (instrumenting) and by preparing the results of the runtime determination is called a profiler. Profilers are often implemented as part of an integrated development environment.
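As an illustration, Python's standard-library cProfile module instruments a program run and reports per-function call counts and timings (the function slow_sum below is a made-up example, not part of the original text):

```python
import cProfile
import io
import pstats

def slow_sum(n):
    """Example workload: sum of squares computed in a plain loop."""
    total = 0
    for i in range(n):
        total += i * i
    return total

# Instrument only the region between enable() and disable().
profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Prepare the measured results: top 5 entries by cumulative time.
out = io.StringIO()
stats = pstats.Stats(profiler, stream=out)
stats.sort_stats("cumulative").print_stats(5)
print(out.getvalue())
```

The report lists, for each function, how often it was called and how much time was spent in it, which is exactly the information needed to decide where optimization is worthwhile.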
Runtime as a phase in the life of a program
Runtime also refers to the phase of a program's execution in a specific run-time context: varying hardware characteristics, input parameters, and user interaction. At runtime, the program is typically used in a context which, in this exact constellation, the developer could not foresee (or only approximately, via dynamic code analysis).
Since certain program peculiarities, in particular errors, can appear for the first time when the program runs in these varying contexts, the developer often only obtains information on necessary changes to the program in this way. In a broader sense, the regular execution of a program can therefore also be viewed as part of the development process.
Further phases are, for example, compile time, the phase during the automatic translation of the source text, and link time, the point at which the program is assembled from its binary components into an executable unit. The programming and modeling phase is sometimes called precompile time.