Inspection (software development)

from Wikipedia, the free encyclopedia

Inspections in software development are a formal quality-assurance method whose aim is to find and correct errors early and inexpensively during software development. The method was developed in 1972 at the IBM laboratory in Kingston, NY, and published by M. E. Fagan in 1976.

Description

While software tests can only begin once an executable program is available (dynamic testing), inspections can be carried out at a very early stage of the development process (static testing). The documents that already exist are checked for deviations. The idea behind this: the earlier an error is found, the more easily and cheaply it can be corrected.

The following terms are used here for the development documents to be inspected (company-specific terminology varies):

  • Objectives: Description of the general function of the proposed product and its target user groups.
  • Specification: Exact description of the planned functions and external interfaces as well as the quality objectives of the product.
  • Structural design (high-level design): Description of the function of the individual modules as well as their dependencies and interfaces.
  • Logic design (low-level design): Detailed description of the logic and the algorithms used for each individual module to be programmed. A formal design language is often used here.
  • Code: The executable code in a programming language.

The idea of the inspection process is to compare two documents that have already been created and to determine any differences between them. This gives rise to the following types of inspection:

  • Interface inspection: comparison of the objectives with the specification
  • High-level design inspection (I0): comparison of the specification with the structural design
  • Low-level design inspection (I1): comparison of the structural design with the logic design
  • Code inspection (I2): comparison of the logic design with the code

In addition to code development, other areas can also be subjected to the inspection process, e.g. test objectives vs. test plan (IT1), test plan vs. test cases (IT2), or documentation plan vs. user documentation. The abbreviations (I0 etc.) designating the inspection types were introduced by Fagan in 1976.

Every discrepancy between the two inspected documents is recorded in the minutes. The following types of deviation are documented (the English terms are used here, as they have become commonplace):

  • Major Error: A design or code problem which, if implemented this way, would result in a malfunction in the running program or a deviation from the objectives or specification. A major error must be corrected by the author.
  • Minor Error: A violation of programming standards, (naming) conventions, or rules that would not cause the program to malfunction but would generally lead to maintenance difficulties. A minor error must be corrected by the author.
  • Suggestion: Not an error, but a proposal for improvement which, when implemented, leads to better understandability or a clearer design. The suggestion can be accepted or rejected by the author.
  • Open Problem: A situation that would potentially lead to an error, but on which the participants in the inspection meeting cannot reach a decision. It must be followed up outside the inspection session. This often occurs in interface or design inspections when not all of the necessary details have yet been agreed.

Note: Fagan only described "major" and "minor" errors in his first publication. The other two types of deviations were added later.
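The four deviation types and the fields logged for each finding can be sketched as a small record structure. This is an illustrative data model only; the field names, enum values, and example are assumptions, not part of Fagan's published method:

```python
from dataclasses import dataclass
from enum import Enum

class DeviationType(Enum):
    MAJOR = "major"            # must be corrected by the author
    MINOR = "minor"            # must be corrected by the author
    SUGGESTION = "suggestion"  # author may accept or reject
    OPEN = "open"              # followed up outside the meeting

@dataclass
class Deviation:
    """One logged discrepancy between the two inspected documents."""
    location: str          # e.g. a section or line in the inspected document
    description: str
    dtype: DeviationType
    category: str = ""     # e.g. "interface", "logic", "performance"

# Example: a major error found during a code inspection (I2)
d = Deviation("module parser, line 120",
              "loop bound deviates from the logic design",
              DeviationType.MAJOR, "logic")
print(d.dtype.value)  # -> major
```

A structure like this is enough to feed the error statistics described below.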

Attendees

Every participant in an inspection has received the two comparison documents in advance and has prepared accordingly. Checklists can support the preparatory work.

For reasons of efficiency, an inspection meeting should have between three and five participants. Each participant has a specific role:

  • The moderator: A person not involved in the program's development but trained in the inspection process. The moderator leads the inspection meeting and takes the minutes; every discrepancy found is recorded.
  • The author: The person who created the document to be inspected and who is responsible for correcting it.
  • The reader: A project member (not the author!) who walks through the document to be inspected in his or her own words. The material is not read out; rather, the reader describes how he or she has understood the document. Comprehension problems thus become immediately apparent.
  • The inspector: A project member with no special role, whose main task is to find errors. An inspector can come from the development team or from another project area (e.g. test, documentation, maintenance). An inspector can also swap roles with the reader during the inspection meeting.

Important: to ensure an open atmosphere, no member of project management takes part in the inspection meeting.

Data collection from inspections

The execution of each inspection is documented, including the number of participants, the duration of the preparation and of the inspection meeting, and the amount of material inspected.

Every deviation found is logged with its type (major, minor, suggestion, open). Errors of type "major" can be compared directly with errors found in the later test phases. This makes it possible to compile error statistics across the entire development process and to identify error-prone areas at an early stage. Statistics are usually normalized to the number of errors found per 1000 lines of code (kLoC).
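The normalization to errors per kLoC is a simple division. A minimal sketch (the function name and sample numbers are illustrative):

```python
def errors_per_kloc(major_errors: int, lines_of_code: int) -> float:
    """Defect density: major errors per 1000 lines of inspected code."""
    return major_errors / (lines_of_code / 1000.0)

# Example: 14 major errors found while inspecting 3,500 lines of code
density = errors_per_kloc(14, 3500)
print(round(density, 1))  # -> 4.0 major errors per kLoC
```

Because the same measure can be computed for test-phase defects, inspection yield and test yield become directly comparable.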

Furthermore, an error category (e.g. interface problem, logic problem, performance problem, usability problem) can be recorded for each error in order to identify frequently occurring error areas and to be able to define countermeasures.
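Tallying the recorded categories across all logged deviations is enough to surface frequently occurring error areas. A sketch using Python's standard library (the category names are examples, as above):

```python
from collections import Counter

# Categories recorded for each logged error during one or more inspections
categories = ["interface", "logic", "logic", "performance",
              "logic", "usability", "interface"]

# The most common categories point to areas needing countermeasures
print(Counter(categories).most_common(2))
# -> [('logic', 3), ('interface', 2)]
```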

An organization that carries out software inspections intensively can, over time, use the collected data to derive target values, e.g. for the number of errors to be found per inspection type or for the optimal duration of inspection meetings. If these targets are missed significantly, the inspection should be repeated.
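The decision rule "repeat the inspection if the target is significantly missed" can be sketched as follows. The 50% tolerance band and the target of 8 major errors per kLoC are purely illustrative assumptions; real organizations derive such values from their own history:

```python
def should_reinspect(found_per_kloc: float, target_per_kloc: float,
                     tolerance: float = 0.5) -> bool:
    """Flag an inspection for repetition if its defect yield falls
    significantly below the organization's target value.
    The 50% tolerance band is an illustrative assumption."""
    return found_per_kloc < target_per_kloc * tolerance

# Hypothetical target: 8 major errors per kLoC for code inspections (I2)
print(should_reinspect(3.0, 8.0))  # -> True  (well below target: repeat)
print(should_reinspect(7.0, 8.0))  # -> False (within tolerance)
```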

The collected data should be used to learn from mistakes made and to avoid them in the future. For this purpose, a separate process, defect prevention (with root cause analysis), was developed.

Success factors

To achieve effective inspections, the following factors should be considered:

  • Management stands behind the process
  • Sufficient time is planned for the preparation and conduct of inspection meetings
  • The number of participants is limited to three to five people
  • The inspection is a targeted search for errors, not a discussion of questions of style or of solutions
  • All participants know their roles, and the moderator is trained accordingly
  • The inspection meeting is not significantly shorter than planned and finds a typical number of deviations

Fagan shows that development projects with a good inspection process require fewer resources overall, and usually also less time, than projects without inspections. The somewhat higher effort in the specification and design phases is more than made up for by the lower effort in the later test phases.

Recent developments

With the advent of graphical development environments, code generators, and new programming techniques (e.g. pair programming), the role of code inspection is diminishing. Interface and design inspections, however, retain their value.

Agile development processes use different development documents than the conventional waterfall model (for example, there is usually no complete specification), so the inspection process is rarely used there.

References

  1. M. E. Fagan: Design and Code Inspections to Reduce Errors in Program Development. In: IBM Systems Journal, Vol. 15, No. 3, pp. 182-211 (1976).
  2. Michael E. Fagan: Advances in Software Inspections. In: IEEE Transactions on Software Engineering, Vol. 12, No. 7, pp. 744-751 (July 1986).

Literature

  • Inspections in Application Development - Introduction and Implementation Guidelines. IBM Form GC20-2000, July 1977
  • Code Reading, Structured Walk-Throughs and Inspections. IBM Form GE19-5200, March 1978
  • Improved Programming Technologies - An Overview. IBM Form GC20-1850, October 1978
  • Gerald M. Weinberg, Daniel P. Friedman: Reviews, Walkthroughs and Inspections. In: IEEE Transactions on Software Engineering, Vol. 10, No. 1, January 1984
  • C. L. Jones: A process-integrated approach to defect prevention. In: IBM Systems Journal, Vol. 24, No. 2 (1985)
  • V. R. Basili, R. W. Selby: Comparing the Effectiveness of Software Testing Strategies. In: IEEE Transactions on Software Engineering, Vol. 13, No. 12, December 1987
  • K. E. Schnurer: Program inspections, experiences and problems. In: Informatik Spektrum, Volume 11, Issue 6, December 1988
  • D. B. Bisant, J. R. Lyle: A Two-Person Inspection Method to Improve Programming Productivity. In: IEEE Transactions on Software Engineering, Vol. 15, No. 10, October 1989
  • Mario Winter: Reviews in object-oriented software development. In: Softwaretechnik-Trends, Volume 17, Issue 2, May 1997
  • Ronald A. Radice: High Quality Low Cost Software Inspections. Paradoxicon Publishing, Andover, MA, ISBN 0-9645913-1-6 (2002)