Automated pain detection

Automated pain recognition is a method for the objective measurement of pain and, at the same time, an interdisciplinary field of research comprising parts of medicine, psychology, psychobiology, and computer science. The focus is on the computer-aided, objective recognition of pain, implemented on the basis of machine learning.

Automated pain detection enables valid and reliable detection and monitoring of pain in people who are unable to communicate verbally. The underlying machine learning methods are trained and validated in advance on unimodal or multimodal human body signals. Signals relevant to pain detection can be facial, gestural, (psycho-)physiological, or paralinguistic in character. So far the focus has been on recognizing pain intensity; recognizing the quality, localization, and temporal course of pain is a longer-term goal.

Clinical implementation remains controversial within pain research, however. Critics of automated pain detection take the view that pain can only be diagnosed subjectively, by humans.

Background

Pain diagnosis under conditions of restricted verbal report, for example in verbally and/or cognitively impaired people or in patients who are sedated or mechanically ventilated, relies on behavioral observation by trained staff. All established observational instruments (e.g. the Zurich Observation Pain Assessment, ZOPA, or the assessment of pain in dementia scale, BESD) require considerable professional expertise, and misjudgments rooted in the observer's perception and interpretation can compromise the process. Because the instruments differ in design, methodology, sample evaluation, and conceptualization of the pain phenomenon, comparing their quality criteria is difficult. Moreover, even if trained medical staff could in principle record pain intensity with observation instruments several times a day, a measurement every minute or second would not be possible. Automated pain recognition therefore pursues the goal of using multimodal, valid, and robust patterns of pain reactions for time-dynamic, high-resolution automated recognition of pain intensity.

Procedure

In automated pain detection, pain-relevant parameters are usually recorded with non-invasive sensor technology that captures the (physical) reactions of the person in pain. Camera technology records facial expressions, gestures, and posture, while audio sensors record paralinguistic signals. (Psycho-)physiological information such as muscle tone or heart rate can be derived from biopotential sensors (electrodes).

Recognizing pain requires extracting significant features or patterns from the collected data. This is achieved with machine learning methods which, after training (learning), can provide an assessment of the pain, e.g. "no pain", "slight pain", or "severe pain".
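To illustrate, the following minimal sketch (in Python with scikit-learn; not taken from any of the cited studies) shows how simple statistical features could be extracted from windowed signal data and fed to a classifier. The data, feature set, and labels are illustrative assumptions.

```python
# Minimal sketch of a feature-extraction-and-classification pipeline.
# Data, features, and labels are illustrative stand-ins, not the
# method of any particular study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple statistical features from one signal window."""
    return np.array([window.mean(), window.std(),
                     window.max() - window.min()])

# Synthetic stand-in for windowed biosignal recordings: 300 windows of
# 256 samples, labels 0 = "no pain", 1 = "slight pain", 2 = "severe pain".
windows = rng.normal(size=(300, 256))
labels = rng.integers(0, 3, size=300)

X = np.vstack([extract_features(w) for w in windows])
scores = cross_val_score(RandomForestClassifier(), X, labels, cv=5)
print("cross-validated accuracy:", scores.mean())
```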

Parameters

Although the phenomenon of pain comprises different components (sensory-discriminative, affective (emotional), cognitive, vegetative, and (psycho-)motor), automated pain recognition is currently based on the measurable parameters of pain reactions. These can be roughly divided into two main categories: physiological reactions and behavioral reactions.

Physiological reactions

Pain almost always triggers autonomic nervous system processes, which are measurably reflected in various physiological signals.

Physiological signals

Usually the electrodermal activity (EDA, also called skin conductance), electromyogram (EMG), electrocardiogram (ECG), blood volume pulse (BVP), electroencephalogram (EEG), respiration, and body temperature are recorded; these reflect the regulatory mechanisms of the sympathetic and parasympathetic nervous systems. The physiological signals are captured mainly via special non-invasive surface electrodes (for EDA, EMG, ECG, and EEG), a photoplethysmographic sensor (BVP), a respiration belt, and a thermal sensor (body temperature). In addition, endocrinological and immunological parameters can be recorded, although this sometimes requires invasive measures (e.g. blood sampling).
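As an illustration, the sketch below derives two such features, mean heart rate from ECG R-peaks and the tonic skin conductance level, from a synthetic signal. The sampling rate, thresholds, and signal model are assumptions, not values from the cited work.

```python
# Illustrative sketch: deriving simple physiological features
# (mean heart rate from ECG R-peaks, skin conductance level from EDA)
# on a synthetic signal. Sampling rate and thresholds are assumed.
import numpy as np
from scipy.signal import find_peaks

FS_ECG = 250  # assumed ECG sampling rate in Hz

def mean_heart_rate(ecg: np.ndarray) -> float:
    """Estimate mean heart rate (bpm) from R-peak intervals."""
    peaks, _ = find_peaks(ecg, height=0.6 * ecg.max(),
                          distance=int(0.4 * FS_ECG))
    rr_intervals = np.diff(peaks) / FS_ECG  # seconds between beats
    return 60.0 / rr_intervals.mean()

def skin_conductance_level(eda: np.ndarray) -> float:
    """Tonic EDA component, approximated as the window mean."""
    return float(eda.mean())

# Synthetic one-minute "ECG" with roughly 70 sharp beats per minute.
t = np.arange(0, 60, 1 / FS_ECG)
ecg = np.sin(2 * np.pi * (70 / 60) * t) ** 15 + 0.05 * np.random.randn(t.size)
print(f"mean heart rate: {mean_heart_rate(ecg):.1f} bpm")
```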

Behavioral reactions

Behavioral reactions to pain fulfill two functions: protecting one's own body (e.g. through protective reflexes) and communicating the pain to the outside world (e.g. as a request for help). The reactions are particularly evident in facial expressions, gestures and paralinguistics.

Facial expressions

Patterns of facial activity (expressive behavior) are recorded as behavioral signals, measured technically with the help of video. Facial expression recognition rests on the everyday clinical observation that pain often, but not necessarily, shows itself in a patient's facial expressions, since facial expressivity can be inhibited by self-control. Despite this possibility of voluntary influence, expressive facial behavior is an essential source for pain diagnosis and therefore also a source of information for an automated recognizer. One advantage of video-based facial expression recognition is contact-free measurement, provided the face can be captured on video, which is not possible in every posture (e.g. the prone position) and may be limited, for example, by bandages on the face. Relevant for facial expression analysis are rapid, spontaneous, and temporary changes in neuromuscular activity that lead to visually detectable changes in the face.
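A minimal sketch of the contact-free capture step follows, using OpenCV's bundled Haar cascade face detector as a stand-in; actual systems typically go on to extract facial-activity descriptors (e.g. action unit intensities) from the localized face region.

```python
# Minimal sketch of the video capture / face localization step,
# using OpenCV's bundled Haar cascade as a stand-in. Real systems
# typically extract facial action units rather than just a face box.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default camera; a video file path also works
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Each (x, y, w, h) box is a face region from which facial-activity
    # features would then be extracted.
    print(f"detected {len(faces)} face region(s)")
cap.release()
```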

Gestures

Here, too, acquisition is mostly contact-free, using camera technology. Motor pain responses are varied and depend strongly on the type and cause of the pain. They range from abrupt protective reflexes (e.g. spontaneous withdrawal of a limb or doubling over), through agitation (pathological restlessness), to protective behavior (hesitant, cautious movements). The head tends to move toward the site of the pain, or the painful part of the body is touched.

Paralinguistics

Pain leads, among other things, to non-verbal vocal behavior expressed in sounds such as sighing, gasping, moaning, and groaning. These paralinguistic signals are usually recorded with highly sensitive microphones.
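As a simple illustration, the sketch below computes one basic paralinguistic feature, short-time energy, from an audio signal. The sampling rate and frame length are assumptions, and real systems use far richer acoustic feature sets.

```python
# Illustrative sketch: short-time RMS energy of an audio signal, a
# basic paralinguistic feature. Sampling rate and frame length are
# assumed values; the "recording" here is synthetic noise.
import numpy as np

FS = 16_000   # assumed audio sampling rate in Hz
FRAME = 400   # 25 ms frames at 16 kHz

def short_time_energy(audio: np.ndarray) -> np.ndarray:
    """RMS energy per non-overlapping 25 ms frame."""
    n_frames = len(audio) // FRAME
    frames = audio[: n_frames * FRAME].reshape(n_frames, FRAME)
    return np.sqrt((frames ** 2).mean(axis=1))

# Synthetic one-second signal as a stand-in for a microphone recording.
audio = np.random.randn(FS)
energy = short_time_energy(audio)
print(energy.shape)  # (40,) -> one energy value per frame
```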

Algorithms

After recording, preprocessing (e.g. filtering), and extraction of relevant features, the information can optionally be fused: modalities from different signal sources are combined with one another in order to generate new or more precise knowledge.
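A minimal sketch of one common variant, early (feature-level) fusion, in which per-window feature vectors from different modalities are concatenated before classification; the modality names and dimensions are assumptions.

```python
# Illustrative sketch of early (feature-level) fusion: feature vectors
# from different modalities are concatenated per time window before
# classification. Shapes and modality names are assumed.
import numpy as np

n_windows = 300
video_features = np.random.randn(n_windows, 12)   # e.g. facial activity
audio_features = np.random.randn(n_windows, 8)    # e.g. paralinguistics
physio_features = np.random.randn(n_windows, 5)   # e.g. EDA/ECG/EMG

# One fused feature vector per time window.
fused = np.hstack([video_features, audio_features, physio_features])
print(fused.shape)  # (300, 25)
```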

Pain classification is based on machine learning methods. The choice of method has a significant influence on the recognition rate and depends heavily on the quality and granularity of the underlying data. As in the related field of affective computing, the following machine learners are currently the most widely used (see the sketch after this list):

Support Vector Machine (SVM): The aim of an SVM is to find a uniquely determined optimal hyperplane that separates the two (or more) classes with the largest possible margin. The hyperplane acts as a decision function for classifying an unknown pattern.

Random Forest (RF): An RF is an ensemble of randomized, largely uncorrelated decision trees. Each tree assesses an unknown pattern independently and assigns it to a class; the RF then makes the final classification by majority vote.

k-Nearest Neighbors (k-NN): The k-NN algorithm classifies an unknown object by a majority vote over the class labels of its k nearest neighbors. The neighbors are determined using a chosen similarity or distance measure (e.g. Euclidean distance or the Jaccard coefficient).

Artificial Neural Networks (ANN): ANNs are inspired by biological neural networks and model their organizational principles and processes in a greatly simplified manner. Patterns for the classes are learned by adjusting the weights of the individual neuron connections.
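For illustration, the following sketch trains and compares these four learner families on the same synthetic feature data with scikit-learn. The data are random stand-ins, not data from any pain database, so the scores demonstrate only the workflow, not real recognition rates.

```python
# Sketch comparing the four learner families named above on synthetic
# feature data. Feature values and labels are stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 25))      # fused feature vectors per window
y = rng.integers(0, 3, size=300)    # 0/1/2 = no/slight/severe pain

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "k-NN": KNeighborsClassifier(n_neighbors=5),  # Euclidean by default
    "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(32,), max_iter=500),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {acc:.2f}")
```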

Figure: Simplified process of automated pain detection, from left to right: origin of the pain, recording of the pain reactions, optional fusion of the modalities, and pain classification based either on the fused modalities or on individual parameters.

Databases

Valid pain classification requires representative, reliable, and valid pain databases that serve the machine learner as training material. An ideal database would be sufficiently large and built from natural (not experimentally induced) pain reactions of high quality. Natural reactions, however, are difficult to collect, available only to a limited extent, and usually of suboptimal quality. The currently available databases therefore contain experimentally or quasi-experimentally induced pain reactions, based on different pain models depending on the database. The following list shows a selection of the most relevant pain databases (as of April 2020):

  • UNBC-McMaster Shoulder Pain
  • BioVid Heat Pain
  • EmoPain
  • SenseEmotion
  • X-ITE Pain

Potential uses

In principle, automated pain detection can be used in a variety of clinical contexts, e.g. in the intensive care unit or the recovery room. The decisive focus, however, is on settings with limited pain communication. Another option could be pain detection at night, when clinics are understaffed. Ultimately, the crucial aim is to avoid both under- and oversupply of analgesics: undersupply tends to cause chronic pain and cardiovascular stress in high-risk patients, while oversupply can cause nausea, constipation, ulceration, and gastrointestinal bleeding. If further aspects of the pain (localization, quality, duration) can eventually be recorded as well, automated pain recognition could be used effectively in clinical diagnosis and therapy planning. A pain monitoring system could provide high-resolution, real-time information on a patient's pain condition, e.g. on a PC screen, tablet, or mobile phone (see figure below).

Figure: Exemplary vision of a pain monitoring screen. Such a system should provide the clinician with all relevant information (localization, quality, intensity, progression) about the patient's current pain in an effective way.

References

  1. S. Gruss et al.: Pain Intensity Recognition Rates via Biopotential Feature Patterns with Support Vector Machines. In: PLoS One. Vol. 10, No. 10, 2015, pp. 1–14, doi:10.1371/journal.pone.0140330.
  2. Elisabeth Handel: Practical Guide ZOPA: Pain Assessment in Patients with Cognitive and/or Consciousness Impairments. Huber, Bern 2010, ISBN 978-3-456-84785-6.
  3. BESD videos. Retrieved January 28, 2019.
  4. Henrik Kessler: Short Textbook of Medical Psychology and Sociology. 3rd edition. Thieme, Stuttgart/New York 2015, ISBN 978-3-13-136423-4, p. 34.
  5. S. Gruss et al.: Pain Intensity Recognition Rates via Biopotential Feature Patterns with Support Vector Machines. In: PLoS One. Vol. 10, No. 10, 2015, pp. 1–14, doi:10.1371/journal.pone.0140330.
  6. S. Walter et al.: Automatic pain quantification using autonomic parameters. In: Psychol. Neurosci. Vol. 7, No. 3, 2014, pp. 363–380, doi:10.3922/j.psns.2014.041.
  7. D. Lopez-Martinez, O. Rudovic, R. Picard: Physiological and behavioral profiling for nociceptive pain estimation using personalized multitask learning. November 2017, doi:10.1109/ACIIW.2017.8272611.
  8. P. Werner, A. Al-Hamadi, K. Limbrecht-Ecklundt, S. Walter, S. Gruss, H. C. Traue: Automatic Pain Assessment with Facial Activity Descriptors. In: IEEE Trans. Affect. Comput. Vol. 8, No. 3, 2017, doi:10.1109/TAFFC.2016.2537327.
  9. S. Brahnam, C. F. Chuang, F. Y. Shih, M. R. Slack: SVM classification of neonatal facial images of pain. In: Fuzzy Log. Appl. Vol. 3849, 2006, pp. 121–128, doi:10.1007/11676935_15.
  10. R. Niese et al.: Towards Pain Recognition in Post-Operative Phases Using 3D-based Features From Video and Support Vector Machines. In: JDCTA. Vol. 3, No. 4, 2009, pp. 21–33, doi:10.4156/jdcta.vol3.issue4.2.
  11. Philipp Werner, Ayoub Al-Hamadi, Kerstin Limbrecht-Ecklundt, Steffen Walter, Harald C. Traue: Head movements and postures as pain behavior. In: PLOS ONE. Vol. 13, No. 2, February 14, 2018, ISSN 1932-6203, p. e0192767, doi:10.1371/journal.pone.0192767.
  12. Patrick Thiam et al.: Multi-modal pain intensity recognition based on the SenseEmotion database. In: IEEE Transactions on Affective Computing, 2019, doi:10.1109/TAFFC.2019.2892090.
  13. S. Walter et al.: Data fusion for automated pain recognition. In: 9th International Conference on Pervasive Computing Technologies for Healthcare. 2015, pp. 261–264, doi:10.4108/icst.pervasivehealth.2015.259166.
  14. Philipp Werner, Daniel Lopez-Martinez, Steffen Walter, Ayoub Al-Hamadi, Sascha Gruss: Automatic Recognition Methods Supporting Pain Assessment: A Survey. In: IEEE Transactions on Affective Computing. 2019, ISSN 1949-3045, pp. 1–1, doi:10.1109/TAFFC.2019.2946774.
  15. P. Lucey, J. F. Cohn, K. M. Prkachin, P. E. Solomon, I. Matthews: Painful data: The UNBC-McMaster shoulder pain expression archive database. In: IEEE Int. Conf. Autom. Face Gesture Recognit. (FG). 2011, pp. 57–64, doi:10.1109/FG.2011.5771462.
  16. S. Walter et al.: The BioVid heat pain database: Data for the advancement and systematic validation of an automated pain recognition system. In: IEEE International Conference on Cybernetics (CYBCONF). 2013, doi:10.1109/CYBConf.2013.6617456.
  17. M. S. H. Aung et al.: The Automatic Detection of Chronic Pain-Related Expression: Requirements, Challenges and the Multimodal EmoPain Dataset. In: IEEE Trans. Affect. Comput. Vol. 7, No. 4, 2016, pp. 435–451, doi:10.1109/TAFFC.2015.2462830.
