Critical Incident Reporting System

from Wikipedia, the free encyclopedia

A Critical Incident Reporting System (CIRS) is a system for the anonymous reporting of critical incidents and near misses in healthcare and aviation institutions.

Concept

The safety concept behind CIRS was largely shaped by the English psychologist James Reason (1990). Reason examined the accident reports of disasters such as Bhopal, Challenger, King's Cross, Chernobyl, and Zeebrugge and proposed a distinction between active and latent failure. In healthcare, Reason distinguishes between a person approach and a system approach to investigating errors. His research became known as the Swiss cheese model.

In the Swiss cheese model, safety precautions are compared to slices of Swiss cheese arranged one behind the other on several levels. In everyday practice, errors occur that pass through a hole in one slice but are caught by the next slice if the safety system is functioning.

However, particularly because of planning errors or a lack of resources, the holes in the cheese can line up one behind the other and thus offer the "opportunity for a trajectory" (Reason 1990) through these multiple defensive layers. The safety system then fails, and critical events with adverse consequences occur.

Reason identified a reporting culture as an essential part of safety systems, comprising:

  • an adequate reporting system,
  • appropriate reactions to the reports received,
  • flexibility,
  • and finally the ability to learn from experience.

Functionality

CIRS is an instrument for improving patient safety and aviation safety. These systems operate on the Internet on an anonymous basis, or by post where stricter data protection is required (cf. JavaScript and cookies).

The introduction of an error reporting system in German health care was stipulated by the Fifth Book of the Social Code (SGB V) in 2014. Reporting systems for critical incidents in health care are now referred to by the legislator as error reporting systems (Section 137 of the Fifth Book of the Social Code). Hospitals must have a risk management officer and a complaint management system for patients. Health workers should be able to report critical incidents anonymously and without sanctions in easily accessible and manageable error reporting systems. For aviation personnel, the European Data Protection Supervisor promotes anonymous and sanction-free reporting (see whistleblowing).

The reporter (doctor, nurse, paramedic, etc.) fills out an anonymized online form about the incident and can add suggested solutions to prevent the critical event from recurring. CIRS experts (e.g. medical specialists, paramedics) then evaluate the incident and propose further solutions. The incident is then published on the CIRS portal to give others the opportunity to learn from mistakes.
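The reporting workflow described above (anonymous submission, expert review, publication) can be sketched as a minimal data model. This is an illustrative assumption only, not the implementation of any real CIRS portal; all class and function names here are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Status(Enum):
    SUBMITTED = auto()   # reporter has filed the anonymized form
    REVIEWED = auto()    # CIRS experts have evaluated the incident
    PUBLISHED = auto()   # visible on the portal for others to learn from

@dataclass
class IncidentReport:
    # Deliberately stores no reporter identity: the report is anonymous by design.
    description: str
    suggested_solutions: List[str] = field(default_factory=list)
    expert_comments: List[str] = field(default_factory=list)
    status: Status = Status.SUBMITTED

def review(report: IncidentReport, comment: str) -> None:
    """An expert evaluates the incident and adds a proposed solution."""
    report.expert_comments.append(comment)
    report.status = Status.REVIEWED

def publish(report: IncidentReport) -> IncidentReport:
    """Only reviewed reports reach the portal."""
    if report.status is not Status.REVIEWED:
        raise ValueError("only reviewed reports can be published")
    report.status = Status.PUBLISHED
    return report
```

A typical pass through the lifecycle: a report is submitted with a suggested solution, an expert adds a comment via `review()`, and `publish()` releases it; publishing an unreviewed report is rejected, mirroring the expert-evaluation step described above.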

Neither an identification of the behavior as a treatment or care error nor a lawsuit is a prerequisite for an event to be considered. Reports in a CIRS neither replace a possibly necessary criminal complaint nor initiate the legal safeguarding of claims for damages or the like (compare the article on medical liability). Rather, the aim is a learning process among the professionals involved under a high level of mutual trust and protection.

Such online portals are intended to improve the hitherto underdeveloped error culture in the medical professions. The key to avoiding potentially fatal mistakes is: "Learn to talk about it."

History

Safety systems of this kind were originally applied in engineering, for example in aviation for pilots.

An early mention of CIRS in medicine dates back to the field of anesthesiology and medical informatics in 1997.

The oldest German medical system of this type, aimed at general practitioners, is www.jeder-fehler-zaehlt.de. This "error reporting and learning system" has existed since autumn 2004. The number of users is growing: around 895 error reports had been submitted, with regular use of the comment function (as of October 2017); thousands of doctors and practice assistants view the reports every month.

CIRSmedical, a joint project of the German Medical Association and the National Association of Statutory Health Insurance Physicians, has existed since 2005 and is organized by the Medical Center for Quality in Medicine.

In 2007, the patient safety alliance published recommendations for the general introduction of CIRS in hospitals. This recommendation is based on the 4-step method according to M. Cartes, who led the CIRS working group.

Various scandals in elderly care led, among other things, to the Kuratorium Deutsche Altershilfe in Cologne developing a corresponding system in 2007. This was the world's first national reporting system for nursing care.

A further development of CIRS is the 3Be system according to Cartes, which was awarded the 5th quality prize of the AOK Lower Saxony in 2008. With the 3Be system (report, edit, remedy), identified risks can be processed in order to develop and implement strategies for avoiding and handling critical situations. Since the improvement measures developed from the 3Be system are based on the real conditions on site, an individual risk management can be created that takes into account the needs of the organization in question and is cost-effective and resource-efficient.

In 2012, the statutory accident insurance set up a CIRS for fire brigades to report and analyze near misses in service operations.

Literature

  • Wolfgang Althof (Ed.): Worlds of Errors. About making mistakes and learning from mistakes. Opladen 1999.
  • Peter Helling, Bernhard Spengler, Thomas Springer: Errors planned correctly. Vbt Verlag Bau und Technik, 1987, ISBN 3-7640-0232-8 . (Engineering)
  • Elke M. Schüttelkopf: Success strategy error culture. How organizations increase their performance through professional handling of errors. In: Gabriele Ebner, Peter Heimerl, Elke M. Schüttelkopf: Fehler.Lernen.Unternehmen. How you perceive and shape the error culture and learning maturity of your organization. Frankfurt am Main / Berlin / Bern / Bruxelles / New York / Oxford / Vienna 2008, ISBN 978-3-631-57744-8 .
  • Elke M. Schüttelkopf: Learning from mistakes: How to learn from damage. Freiburg 2013, ISBN 978-3-648-04595-4 .
  • WHO "World Alliance for Patient Safety": WHO Draft Guidelines for Adverse Event Reporting and Learning Systems - From Information to Action (PDF; English).
  • S. Staender, J. Davies, B. Helmreich, B. Sexton, M. Kaufmann: The anesthesia critical incident reporting system: an experience based database. In: Int J Med Inform. 1997; 47 (1-2), pp. 87-90.
  • M. Rall, J. Martin, G. Geldner et al.: Characteristics of effective incident reporting systems to increase patient safety. In: Anaesthesiology and Intensive Care Medicine. 2006; 47, pp. S9-19.
  • Advisory Council on Concerted Action in Health Care. Report 2003: Printed matter of the German Bundestag 15/530 (PDF).
  • Holger Sorgatz, T. Dichtjar: Critical Incident Reporting System Anesthesiology: Joint model project by DGAI / BDA and the Medical Center for Quality in Medicine. In: Anästh Intensivmed. 2010; 51, pp. 193–194 ( PDF ).


Sources and quotations

  1. Medical Association of North Rhine-Westphalia: A typo and the consequences. In: Rheinisches Ärzteblatt, p. 16. (PDF)
  2. James Reason: The Contribution of Latent Human Failures to the Breakdown of Complex Systems. In: Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences. 327, No. 1241, April 12, 1990, pp. 475-484. doi:10.1098/rstb.1990.0090. (read online: JSTOR 55319)
  3. James Reason: Human error: models and management. In: British Medical Journal. 320, No. 7237, March 18, 2000, pp. 768-770. doi:10.1136/bmj.320.7237.768. PMID 10720363. PMC 1117770 (free full text).
  4. James Reason: Managing the risks of organizational accidents. Ashgate, Aldershot 1997, ISBN 1-84014-104-2 ; quoted from James Reason: Human error: models and management. In: British Medical Journal. 320, No. 7237, March 18, 2000, pp. 768-770 (English).
  5. Aviation Safety Reporting System (ASRS): "Confidential. Voluntary. Non-punitive." NASA, accessed July 18, 2014 (ASRS strongly recommends mailing the field report): "ASRS collects confidential reports, analyzes the resulting aeronautical safety data, and disseminates critical information to the aviation community."
  6. Jürgen Brenn: CIRS-NRW error reporting system on the right track (PDF; 149 kB). In: Rheinisches Ärzteblatt. North Rhine Medical Association and North Rhine National Association of Statutory Health Insurance Physicians, pp. 21-22, January 2014. Retrieved January 6, 2014: "§ 137 Abs. 1d SGB V"
  7. Thomas Gerst: Hospitals: Error reporting systems are mandatory in the future. In: Dtsch Arztebl 2014; 111 (6). Federal Medical Association (Association of German Medical Associations) and National Association of Statutory Health Insurance Physicians, February 7, 2014, p. A-191 / B-167 / C-163 , accessed on February 12, 2014 .
  8. The European Data Protection Supervisor: Summary of the opinion of the European Data Protection Supervisor on the Commission's proposal for a regulation on the reporting of incidents in civil aviation, repealing Directive 2003/42/EC, Commission Regulation (EC) No 1321/2007, Commission Regulation (EC) No 1330/2007 and Article 19 of Regulation (EU) No 996/2010. In: Official Journal of the European Union, December 7, 2013 (2013/C 358/11, PDF; 716 kB; accessed January 14, 2014): "Among other changes, the proposal has the following objectives: [...] To promote reporting through harmonized protection against punishment by superiors and against the prosecution of those who report the incidents." (Note: extract)
  9. S. Staender, J. Davies: The anesthesia critical incident reporting system: an experience based database. In: International journal of medical informatics. 47, No. 1-2, 1997, pp. 87-90.
  10. Barbara Hoffmann, Julia Rohe: Patient Safety and Error Management — What Causes Adverse Events and How Can They Be Prevented? In: Deutsches Ärzteblatt International. 107, No. 6, 2010, pp. 92-99. doi:10.3238/arztebl.2010.0092. PMID 20204120. PMC 2832110 (free full text).
  11. Action alliance for patient safety: Recommendations for the introduction of CIRS in hospitals (Memento of the original from October 9, 2007 in the Internet Archive) (PDF; 92 kB)
  12. 4-step method according to Cartes
  13. 3Be system according to Cartes (PDF; 346 kB).