Complex event processing

Complex event processing (CEP) is a branch of computer science that deals with the detection, analysis, and processing of bundles of interdependent events. CEP is therefore a collective term for methods, techniques, and tools for processing events as they happen, i.e. continuously and promptly. CEP derives higher-level, valuable knowledge from events in the form of so-called complex events, i.e. situations that can only be recognized as a combination of several events. In order to process various types of data streams in real time and to extract and analyze the events from them, the systems involved have to cope with high loads. Areas of application include network monitoring, public safety, disaster management, and energy management.

Overview

In event-driven computer applications, the program flow is controlled by a strictly sequential series of events. That is, individual events, for example a mouse click, the receipt of an e-mail, or the completion of a loading process, trigger further dependent events, such as saving a file to the hard disk, displaying a notification, or opening a program window.

CEP, by contrast, deals with events that occur with multiple redundancy, run in parallel in many instances, and are often only loosely chained together. Accordingly, the logics employed do not focus on the mere occurrence of events but on their fuzziness (fuzzy logic). CEP delivers adjusted events as statuses, eliminating undesired or unnecessary source events.
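
To make this concrete, the following minimal Python sketch (a hypothetical illustration, not taken from any particular CEP product) filters a stream of redundant raw events and emits an adjusted event only when the status of a source actually changes:

    # Hypothetical sketch: collapse redundant raw events into status changes.
    def status_changes(raw_events):
        """Yield an adjusted event only when a source's status changes."""
        last_status = {}  # last known status per event source
        for event in raw_events:
            source, status = event["source"], event["status"]
            if last_status.get(source) != status:
                last_status[source] = status
                yield {"source": source, "status": status}

    raw = [
        {"source": "sensor-1", "status": "OK"},
        {"source": "sensor-1", "status": "OK"},       # redundant, filtered out
        {"source": "sensor-1", "status": "FAILURE"},  # significant change, emitted
    ]
    for adjusted in status_changes(raw):
        print(adjusted)

Only the initial "OK" and the change to "FAILURE" are emitted; the redundant repetition of "OK" is suppressed.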

Events

Complex Event Processing uses the term event in the following meanings:

  • Everything that happens or is considered to have happened
  • An object that represents, encodes, or stores an event; usually for the purpose of machine-based processing
  • An event is an occurrence within a particular system or domain. It is something that has happened, or is regarded as having happened, within this application domain. The term event is also used for a programming entity that represents such an occurrence in an IT system

In particular, CEP deals with the handling of events that only arise from the interaction of several events. These may have, for example, the following characteristics:

  • the events can be interdependent and, for physical reasons, occur repeatedly: a complex event contains no repetitions, only significant changes of a status; this requires filtering the events.
  • the events alternate one after the other or occur simultaneously.
  • the events are hierarchically dependent on one another and occur repeatedly in the same chain.
  • the events are all independent: a complex event is a compound of several events and defines a status.
  • the events can be related: a complex event occurs when a certain relation, for example a fuzzy relation, is fulfilled within a time interval (see the sketch after this list).
  • some of the events occur in concurrent processes: a complex event occurs when a certain majority of events has occurred.
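
As an illustration of the time-interval case above, the following Python sketch (with hypothetical event names and thresholds, not taken from any specific CEP engine) derives a complex event when at least three failure events of the same source fall within a ten-second window:

    # Hypothetical sketch: a complex event is derived when a relation holds
    # within a time interval (here: >= 3 failures of one source within 10 s).
    from collections import defaultdict, deque

    WINDOW_SECONDS = 10
    THRESHOLD = 3
    recent = defaultdict(deque)  # source -> timestamps of recent failures

    def on_event(source, kind, timestamp):
        """Feed a simple event; return a complex event if the pattern matches."""
        if kind != "FAILURE":
            return None
        window = recent[source]
        window.append(timestamp)
        # drop events that have fallen out of the time window
        while window and timestamp - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= THRESHOLD:
            return {"complex_event": "REPEATED_FAILURE",
                    "source": source, "count": len(window)}
        return None

    # The third failure within ten seconds triggers the complex event.
    for t in (0, 4, 8):
        result = on_event("pump-7", "FAILURE", t)
    print(result)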

History of the term

The term was first introduced by David Luckham in his book The Power of Events. He derived it from event processing, a set of technologies and concepts first used around 1950. Over time, concepts from network technology, active databases, middleware, service-oriented architecture, and other areas that also deal with event processing have been incorporated.

Today the community is divided over the term complex, because instead of hiding complexity, it is often arbitrarily ignored. For this reason, event processing is now frequently used synonymously with complex event processing again. This conceptual simplification conflicts with the classification of event-driven processing known from the area of Event-Driven Architecture (EDA).

In addition to the terms complex event processing and event processing, the term business event processing is also used for marketing reasons.

Concepts

In complex event processing, high-level concepts have so far been described that deal with event processing and the recognition of event patterns. Some of these terms are heavily overloaded, since they are also used in other areas of information technology, in mathematics, and in some areas of business administration. Even though this overloading of terms has caused some discussion within the complex event processing / event processing community, the terms are compatible with the existing literature, and their meanings can be derived from the context in which they are used.

Complex event processing goes a step further and defines a virtual event as "an event that does not actually occur in the physical world but appears to suggest an event in the real world; an event that one imagines, models or simulates." A virtual event is treated like any other event in CEP.

Obviously, almost anything that exists in the real world or inside a computer can be viewed as an event for use by CEP. The definition is deliberately this broad because CEP aims to establish relationships both between the different levels of events and among the design patterns used to create these events, independently of their semantics, storage medium and transmission mechanism. In some areas, this definition is compatible with (albeit broader than) the event definition of probability theory.

Levels of abstraction and dependencies

The basic concept of CEP is a structuring of events into abstraction levels. Within one abstraction level, individual events depend on and influence one another (horizontal dependency). If several of these events are combined into a group and a higher-level event is formed from them, this is referred to as an aggregation or a complex event. The complex event is vertically dependent on the individual events that make it up, which lie one abstraction level lower.

In addition to vertical abstraction, there is also horizontal abstraction. Events on different levels, e.g. network monitoring events, database events and business process incidents, are linked in such a way that a higher-value complex event can be derived from them. There is currently little implementation experience with forming horizontal dependencies, since it is a non-trivial task to model the causality of occurrences between the individual levels.
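
The following Python sketch (with made-up event types, intended only as an illustration) shows vertical aggregation: several low-level events are combined into a single complex event one abstraction level higher, which remains vertically dependent on its constituent events:

    # Hypothetical sketch: vertical aggregation of low-level events into a
    # complex event on the next higher abstraction level.
    low_level_events = [
        {"type": "login_failed", "user": "alice"},
        {"type": "login_failed", "user": "alice"},
        {"type": "password_reset", "user": "alice"},
    ]

    def aggregate(events):
        """Form a higher-level complex event from a group of low-level events."""
        failures = [e for e in events if e["type"] == "login_failed"]
        resets = [e for e in events if e["type"] == "password_reset"]
        if len(failures) >= 2 and resets:
            # The complex event keeps references to its constituent events,
            # i.e. it is vertically dependent on them.
            return {"type": "possible_account_takeover",
                    "user": failures[0]["user"],
                    "constituents": events}
        return None

    print(aggregate(low_level_events))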

Academic and entrepreneurial activities

The topic of complex event processing is the subject of both academic research and product development by software houses.

It has been part of various research projects at universities since the turn of the millennium. The following list provides an overview.

Academic research projects

  • iPRODICT (German Research Center for Artificial Intelligence, Saarland University) : In an interdisciplinary team of researchers and industry experts, iPRODICT investigates an intelligent approach to the partially automated adaptation and improvement of business processes. In addition to analyzing collected process data and evaluating current context information from sensor networks in real time, the optimal process flow is anticipated using forecast calculations. In this way, processes can be adapted individually to the respective context situation in real time using big data analysis methods. The iPRODICT approach is implemented, tested and validated as an integrated prototype in an application scenario from process manufacturing at the application partner Saarstahl AG. This is intended both to demonstrate the feasibility of the approach and to make its potential benefits measurable, such as the early anticipation of process problems based on the analysis of large amounts of data.
  • STREAM (Stanford University) : The motivation of the STREAM project is the need for long-term, continuous queries on data streams, for example from network monitoring, telco data management, production and sensor networks, instead of one-off queries on stored data sets.
  • Rapide (Stanford University) : In order to describe events and their horizontal and vertical dependencies formally, an event processing language (EPL) called Rapide was developed at Stanford University. It is intended to be integrated as an extension into modern object-oriented languages such as Java or C#.
  • Telegraph (UC Berkeley) : Telegraph is an adaptive data flow system that allows data from different sources to be accessed, combined and analyzed. As a data flow system, Telegraph can process both stored data and data streams from sensor networks.
  • Aurora (Brandeis University, Brown University and MIT) : Aurora addresses three broad application types in a single framework: continuous real-time monitoring of applications, the processing of large quantities of persistent, archived data, and the efficient combination of real-time data with historical data.
  • Borealis (Brandeis University, Brown University and MIT) : Borealis is a distributed data stream engine that builds on the experiences from the Aurora and Medusa projects. The software is designed for Linux x86 based computers.
  • SASE (UC Berkeley / UMass Amherst) : A growing number of applications, for example in financial services, network monitoring and sensor networks, have to handle high volumes of real-time data. The SASE project addresses these requirements with a declarative event language that has formal semantics and maps the theoretical foundations of CEP onto an efficient machine-based implementation.
  • Cayuga (Cornell University) : Publish/subscribe is a popular paradigm for expressing a user's interest ("subscription") in events ("publications"). Cayuga deals with stateful subscriptions, enabling users to keep context across multiple events.
  • Odysseus (University of Oldenburg) : Odysseus is a framework for event and data stream processing whose architecture is based on flexible, extensible and adaptable components. This makes it possible to integrate the heterogeneous approaches of different systems into one system in order to combine their advantages or to add new concepts quickly. The processing supports different data models, which can be defined by arbitrary query languages, and thereby addresses additional concepts such as optimization, load management, scheduling, distribution and robustness.
  • PIPES (University of Marburg) : PIPES deals with the research problems of adaptive memory management for scheduling and query optimization in a generic runtime environment, the maintainability of data streams and the indexing of historical queries, the maintenance of non-parametric estimators over data streams, static and dynamic multi-query optimization, and sorted multi-way join operations over data streams.
  • CEPiL (University of Stuttgart, GeorgiaTech) : The central objective of the "CEP in the Large" (CEPiL) project is highly scalable complex event processing under high dynamics both of the components involved in event processing and of the producers and consumers of information. In particular, the system developed in the project is intended to meet the additional requirements of today's applications with regard to robustness and data security.
  • SpoVNet (Karlsruhe Institute of Technology, University of Stuttgart, University of Mannheim, University of Tübingen) : The aim of the SpoVNet architecture is to map decentrally organized applications spontaneously onto heterogeneous networks with the required quality of service. In particular, adaptive procedures for event-based communication were developed in this project, which adapt the overlay topologies of event brokers to dynamically changing network structures.
  • BeepBeep (Université du Québec à Chicoutimi) : BeepBeep 3 is an event stream engine: it receives an event stream generated by an application or a process and generates a new event stream in real time. Internally, BeepBeep analyzes and transforms the stream of events by going through a chain of basic event processors, routing the output of one (or more) processors to the input of the next.

In addition to the projects mentioned, research continues to address complex event processing and event processing. An overview of current publications from the research community can be obtained from the link collection event-based.org maintained by Arnd Schröter.

References

  1. Michael Eckert, François Bry: Complex Event Processing (CEP). Gesellschaft für Informatik, May 5, 2009, accessed July 29, 2020.
  2. Fraunhofer FOKUS Competence Center Public IT: The ÖFIT trend sonar in IT security - Complex Event Processing. April 2016, accessed May 30, 2016.
  3. D. Luckham, R. Schulte: Event Processing Glossary - Version 1.1. July 2008.
  4. O. Etzion, P. Niblett: Event Processing in Action. Manning Publications, 2010, ISBN 978-1-935182-21-4.
  5. D. Luckham: The Power of Events: An Introduction to Complex Event Processing in Distributed Enterprise Systems. Addison-Wesley Professional, 2002, ISBN 978-0-201-72789-0.
  6. D. Luckham: A Short History of Complex Event Processing, Part 1: Beginnings. (PDF; 198 kB) 2007.
  7. iPRODICT research project. Accessed December 7, 2015.