Paul presented on the role rules play in CEP applications, and how they relate to “events”. In the interest of disclosure: Paul was part of Neuron Data and/or Blaze Software in the past, and Carole-Ann and I both worked with him at one point or another. It’s interesting to see how clustered this world is…
Events are universal – and they represent the lens through which CEP people see the world, pretty much the same way analytics people will say that data is universal. My take is that these two notions are duals of each other: data has states, and events represent the transitions between states of that data (http://architectguy.blogspot.com/2009/11/events-and-semantics.html). Fundamentally, we should not try to contrast one against the other – they are just two facets of the same thing. I do believe that we need to be much crisper in defining the terms we use to avoid confusion (http://architectguy.blogspot.com/2008/11/state-events-time-confusion-around-cep.html) – confusion that seems to be prevalent in the CEP world. Paul and I engaged in some “discussions” on this in the past (http://architectguy.blogspot.com/2008/11/more-on-cep.html).
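To make that duality concrete, here is a minimal sketch (in Java, with hypothetical names – this is my illustration, not anything from Paul’s talk): the same information can be read either as data holding a state, or as the log of events that moved that data through its states.

```java
import java.util.List;

public class StateEventDuality {

    enum Status { OPEN, SUSPENDED, CLOSED }

    // Data: a snapshot of state at a point in time.
    static class Account {
        Status status = Status.OPEN;
    }

    // Event: a transition between two states of that same data.
    record StatusChanged(Status from, Status to) {}

    public static void main(String[] args) {
        Account account = new Account();

        // The event log and the final state carry the same information:
        // replaying the transitions reconstructs the state.
        List<StatusChanged> log = List.of(
                new StatusChanged(Status.OPEN, Status.SUSPENDED),
                new StatusChanged(Status.SUSPENDED, Status.CLOSED));
        for (StatusChanged event : log) {
            account.status = event.to();
        }
        System.out.println(account.status); // prints CLOSED
    }
}
```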
The business world’s interest in event processing stems from the need to react at high speed to the events that are relevant to the business.
As presented by Paul, CEP is essentially the continuous processing of different types of events in real time, through a number of techniques (ranging from correlation to analytics to human processing), connected to business decisions and processes. Each of these terms (real time, processing, etc…) is of course contextual.
Typical applications include efforts in energy (SmartGrid, predictive energy usage, etc…), finance, logistics, adaptive marketing, government, and telco. All of these share the same characteristics:
- they are real time
- they exhibit high volume
- they involve a large amount of correlation
Paul used the term “Operational Intelligence” to describe the overarching goal of CEP.
Paul sees CEP applied to the following functional areas in decision-heavy applications:
- Complex Event Processing
- Decision Management
- Straight-Through Processing
- Real-time dashboards
This of course overlaps with functional areas that are covered by other technologies, in particular BRMS and BPM.
Focusing on the role of rules in CEP, Paul made a distinction between Event Stream Processing (the big pipe, with low latency as the single driver) and Event Cloud Processing (low latency plus the ability to handle multiple event sources). Event Cloud Processing is a new term for me. Most Event Cloud Processing vendors rely on rules-based technologies.
CEP mixes …
- ECA rules (see the short code sketch after these lists)
- Event-Enabled Inference Rules (frequently Rete)
- Continuous Queries (I agree that we should include queries – or continuous queries – in AI technology discussions; most AI systems amount to very smart searches through very complex spaces anyway)
- State Machines
- Executable analytics
… all tuned to cope with
- very low latency
- very high throughput
- high reliability
… which means
- the engines are highly parallelized
- they rely on highly optimized (memory and speed) algorithms
- they work in memory (sometimes shared memory)
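To make that mix concrete, here is a minimal, self-contained sketch of the ECA shape referenced above: an Event-Condition-Action rule evaluated continuously as events arrive, correlating events per account over a sliding time window, entirely in memory. All names and thresholds are hypothetical – this is not any vendor’s engine, and a real one would add the parallelism and optimized algorithms just listed.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class EcaRuleSketch {

    // Hypothetical event type: a withdrawal observed on the stream.
    record Withdrawal(String accountId, long timestampMillis, double amount) {}

    static final long WINDOW_MILLIS = 60_000; // sliding one-minute window
    static final int THRESHOLD = 3;           // condition: 3+ withdrawals in the window

    private final Map<String, Deque<Withdrawal>> windows = new HashMap<>();

    // Event -> Condition -> Action, re-evaluated on every incoming event.
    void onEvent(Withdrawal event) {
        Deque<Withdrawal> window =
                windows.computeIfAbsent(event.accountId(), k -> new ArrayDeque<>());
        window.addLast(event);

        // Expire events that have fallen out of the sliding window.
        while (!window.isEmpty()
                && event.timestampMillis() - window.peekFirst().timestampMillis() > WINDOW_MILLIS) {
            window.removeFirst();
        }

        if (window.size() >= THRESHOLD) {     // Condition: correlated burst detected
            flagForReview(event.accountId()); // Action: hand off to a decision/process
            window.clear();
        }
    }

    void flagForReview(String accountId) {
        System.out.println("Rule fired: review account " + accountId);
    }

    public static void main(String[] args) {
        EcaRuleSketch engine = new EcaRuleSketch();
        long t = 0;
        for (int i = 0; i < 3; i++) {
            engine.onEvent(new Withdrawal("acct-42", t += 10_000, 200.0));
        }
    }
}
```

The sliding window is what separates this from a plain business rule: the condition is evaluated against a moving set of correlated events rather than a single fact, which is also where the latency and throughput pressures listed above come from.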
Paul went through a few use cases:
- Citibank – an adaptive marketing application (real-time, content-based, automatic campaign engine)
- Tibco itself
The use cases are interesting, but they do deepen my feeling that CEP remains a misleading term. Through the years, I have seen (and built) very similar systems using event correlation, business rules, analytics, business process and case management, following a model that I find significantly clearer – with the virtue that decision management is clearly positioned, centralized, and managed.
Paul ended his talk with the statement that CEP can be very rules-focused, and the thought that, while CEP will not displace rules, rules and CEP are both at the forefront of what IT managers worry about.
Learn more about Decision Management and Sparkling Logic’s SMARTS™ Data-Powered Decision Manager