Error quotient

from Wikipedia, the free encyclopedia

Error quotient (also error density, error rate or error frequency) in quality management is the relative proportion of defective elements in relation to the totality, i.e. the relative frequency with which an error occurs in a product, a service, a production process or the quality of work.


Flawless product quality and service quality contribute to both product safety and customer satisfaction. Defective production, on the other hand, leads to increased costs (defect costs, repairs, product liability) and to possible image problems for the manufacturer. Quality management should therefore ensure that defective production is avoided as far as possible and that the error quotient is kept as low as possible.

Measures for the error quotient, expressed as a ratio, are, for example, pieces per lot, percent, per mille, ppm or the so-called sigma level as a measure of dispersion.



The error quotient could be used in school, for example:

When writing a text of 400 words, a student made 11 misspellings. To obtain the error quotient, divide the number of errors by the total number of words:

11 / 400 = 0.0275 = 2.75%

In general:

error quotient = number of errors / total number of elements

Using the error quotient - in this example 2.75% - the teacher can assign a grade for the spelling.
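The spelling example above can be sketched in a few lines of Python; the function name is illustrative and not taken from any particular library:

```python
# Minimal sketch of the error-quotient calculation: the relative
# frequency of errors, i.e. errors divided by the total.

def error_quotient(errors: int, total: int) -> float:
    """Return the relative proportion of defective elements."""
    if total <= 0:
        raise ValueError("total must be positive")
    return errors / total

# 11 misspellings in a 400-word text, as in the example above:
q = error_quotient(11, 400)
print(f"{q:.4f}")   # 0.0275
print(f"{q:.2%}")   # 2.75%
```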


An employee fulfills his contractual obligation under the employment contract if he works with adequate use of his personal capabilities. A negative deviation from the work owed constitutes poor performance, which can entitle the employer to take action under labor law, up to and including termination. According to a judgment of the German Federal Labor Court (BAG) from January 2008, a termination on grounds of poor work quality is only legally permissible if the employee has performed below average over a longer period, either producing less or making significantly more mistakes than the average employee in the company, or if he is capable of performing better according to his personal abilities. The case concerned a shipping worker in a mail-order department store whose error rate lay between 4.01 ‰ and 5.44 ‰, while the average error rate of the 209 comparable employees was only 1.34 ‰. In its judgment, the BAG made clear that a sustained, clear exceedance of the average error rate can, depending on the actual number of errors and the type, severity and consequences of the faulty work, be an indication that the employee is culpably violating his contractual obligations.

Error quotients in data communication and processing

The error quotient when storing and transmitting binary data is called the bit error rate .

In data communication, the error frequency is understood as the ratio of incorrectly received symbols, words or blocks to the total number of symbols, words or blocks received. It is a relative frequency determined within a certain finite measuring time.
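A minimal sketch of this ratio for symbols, comparing a received sequence against the transmitted one position by position (the function name and the bit strings are illustrative assumptions, not from the article):

```python
# Error frequency in data communication: incorrectly received symbols
# divided by the total number of symbols received in one measurement
# window.

def symbol_error_ratio(sent: str, received: str) -> float:
    """Compare two equally long symbol sequences position by position."""
    if len(sent) != len(received):
        raise ValueError("sequences must have equal length")
    wrong = sum(1 for s, r in zip(sent, received) if s != r)
    return wrong / len(received)

# 2 of 10 symbols were corrupted in transit -> error quotient 0.2
print(symbol_error_ratio("0110100110", "0111100010"))
```

For binary data this is exactly the bit error rate mentioned above; for words or blocks, the same ratio is taken over word or block boundaries instead of single symbols.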

In data processing, the causes of data loss are distributed as follows: 44% hardware errors, followed by 32% user errors, 7% computer viruses, 4% software errors, 3% natural events and 10% other causes.

Error density in computer science

In computer science, the error density (defect density) describes the number of errors per 1,000 lines of code (KLOC) or per function point. For economic and technical reasons, it is impossible in practice to create error-free software (see Correctness (computer science)). The aim is therefore to achieve, before the software goes into production, the lowest error density that matches the requirements of the software. The target defect density must therefore be defined during the analysis phase. For software whose failure could cost human lives (such as military or hospital systems), an error density of <0.5 errors per 1,000 lines of code is usually aimed for.

For going live, an error density of <2 errors per 1,000 lines of code should normally be aimed for; for safety-critical software <1, if not <0.5. A density of 2-6 is normal; 6-10 is acceptable, but only on the web. An error density of more than 10 is considered malpractice and can lead to compensation payments in court. Commercial software has an average error density of 15 to 50 errors per 1,000 lines of code, or 0.75 errors per function point. Successful projects have a lower error density (0.2 errors per function point), larger projects a higher one. In Microsoft applications, 10 to 20 errors per 1,000 lines of code were found in testing; after release, the error density was 0.5. Linux has an error density of 0.47, PostgreSQL below 0.1.
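The per-KLOC metric used throughout this section can be sketched as follows (the function name is illustrative; the example figures are hypothetical):

```python
# Defect density: errors per 1,000 lines of code (KLOC).

def defect_density(defects: int, lines_of_code: int) -> float:
    """Return defects per 1,000 lines of code."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects / (lines_of_code / 1000)

# e.g. 24 defects found in a hypothetical 60,000-line code base:
print(defect_density(24, 60_000))   # 0.4 errors per KLOC
```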

There are different techniques for achieving low defect densities. Using the "cleanroom development" technique proposed by Harlan Mills, error densities of 3 errors per 1,000 lines of code could be achieved during development, which in-house testing before release can reduce to 0.1. Combining the best error prevention and correction techniques yields 0.08 errors per function point. Only a few projects, for example the software for the Space Shuttle (the Primary Avionics Software System), achieve error densities of 0 errors in 500,000 lines of code. This is achieved, for example, through a system of formal development methods, peer reviews and statistical testing.

The defect density can also be used to classify the product maturity of software:

Defect density   Classification of the programs
< 0.5            stable programs
0.5 … 3          maturing programs
3 … 6            unstable programs
6 … 10           error-prone programs
> 10             unusable programs
