Complex Event Processing
Did you guess right about the acquisition rumors that were spreading at BBC 2011?
Corticon announced today its acquisition by Progress Software. One more independent vendor ends up in the heart of a BPM-and-more platform. This is a great validation for BRMS technology. You can’t really fit all decisioning logic into process maps without crowding them. Another interesting conclusion is that CEP alone did not suffice in the BPM platform either. The quotes from the announcement were pretty telling:
Dr. John Bates, chief technology officer, Progress Software said: “Within modern responsive businesses, the need to make informed and accurate decisions ‘in the moment’ is critical. High quality real-time decisions are key to avoid fraudulent transactions, to comply with complex and evolving regulations and to generally make the right decision for the business at the right time. The acquisition of Corticon reinforces Progress’ commitment to deliver operational responsiveness by helping customers build highly agile, responsive business systems with models and tools that maximize simplicity and accelerate time-to-value.”
Dr. Mark Allen, founder and former chief executive officer of Corticon, now a member of the office of Progress’ CTO, added: “[…] rules alone are sometimes not enough; to meet the holistic needs of customers, a number of technology areas need to converge […]”
Effective Decision Management is all about Agility and Simplicity. I wholeheartedly agree with that!
Do you remember the buzz at Business Rules Forum 2008 when RulesBurst / Haley did not show up? We all suspected an acquisition, and the conspicuous absence of this “regular” at the show was soon explained: they had been acquired by Oracle.
This year again, one of the usual suspects did not come and, since then, the industry has been buzzing quite a bit. Well, technically the company was there, but the leadership team was not. We have reasons to believe that our colleagues have been acquired by another platform vendor. Do not look for the big names though. It is quite interesting to see the growing presence of BPM / CEP vendors in the Decision Management space, fast acquiring business rules capabilities…
Let’s wait and see how long it takes for the rumor to turn into a formal announcement!
On the BRMS side, though, we are left with fewer and fewer independent vendors. This puts more pressure on the vendor lock-in issue. The standards are not nearly mature enough to allow for interplay between rules vendors, which has not been a burning issue up to now. But as BRMS become an integral part of the entire platform, swapping them out becomes less of a per-project decision. As companies standardize on those ecosystems, their current investment in business rules assets might require a port from platform A to platform B, meaning from BRMS A to BRMS B…
What are the ways out? Investing in interoperability standards is the long road. My personal inclination would be to take a closer look at decision modeling environments, but I am biased of course 😉
I just got the word from Jason Morris, Chairman of the show. The show’s website is now open for registration!
If you have practical experience with Decisioning technologies like
- Business Rules,
- Complex Event Processing,
- Predictive Analytics,
- Optimization or
- Artificial Intelligence,
then you should consider submitting a paper to:
Please keep in mind that we are looking for hands-on experience, lessons learned, those kinds of things. Vendor pitches will not be accepted of course.
Well, of course. Who isn’t?
The very reason to implement Business Processes is to improve your overall business performance by:
- Automating straight-through processing
- Increasing customer satisfaction
- Reducing operational inefficiencies
Jim Sinur will once again host a show this year that provides invaluable insights on how to do it right. Sparkling Logic has decided to join Jim and his team for the very first time.
Decisions are a key part of those Business Processes. As we commented before, Business Rules, Event Processing and Business Processes are technologies that need to collaborate for best results. Carlos and I will be at the show to talk about the best practices that make it attainable. Sophisticated software can come with complexity that is tricky to set up effectively, but it does not have to be that way. We will show you how!
We are very excited at the prospect to meet you at
BPM Summit 2011 in Baltimore on April 27-29.
Feel free to drop us a note if you are going too and want to set up some private time to talk about Decision Management the Pattern-Based Strategy way!
Jacob did not present Constraint Optimization as we grew used to. Instead, he shared his work on a “Connecting the Dots” approach applied to Mortgage Origination.
He started the talk with a practical use case: Peter applies for a loan, but his income is not sufficient, so he gets backing from Joe and Dawn, allowing him to get the loan. While processing the paperwork, a clerk notices that Joe has a business loan with Bill Smith that invalidates the decision. Fortunately, Bill has equity in his house that can be put on the table. But in the end, Bill’s son, Tommy, has actually used up part of that equity, leading to a shortfall; a small one, but still a shortfall.
Exciting real-life adventure! Though I long for a happy ending here…
The fact is that reality can actually be *that* convoluted, and Decision Systems should be able to cope with such a progressive path to uncovering the facts. If you have applied for a loan, you may have experienced similar “day by day” requests from the lending organization.
Jacob assembled a number of technologies to accommodate the dot-by-dot decision. It boils down to a state machine that is invoked in a pub-sub architecture. The traditional Decision Management pieces we expect are there: the rules engine, a CEP module and of course a Constraint Solver. Personally, I would recommend adding some Analytics in there too for proactive fact discovery; maybe text mining or something like it.
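To make the idea concrete, here is a minimal sketch of a state machine invoked through pub-sub, in Python. The loan-approval domain, fact types and amounts are my own invention for illustration, not Jacob’s actual design:

```python
from enum import Enum, auto

class LoanState(Enum):
    PENDING = auto()
    APPROVED = auto()
    INVALIDATED = auto()

class DotByDotDecision:
    """A tiny state machine driven by published facts: each newly
    discovered fact may flip the running decision, as in the
    Peter / Joe / Bill story above."""
    def __init__(self):
        self.state = LoanState.PENDING
        self.subscribers = []

    def subscribe(self, callback):
        # Pub-sub: interested parties get notified of every transition
        self.subscribers.append(callback)

    def publish(self, fact):
        # Re-evaluate the decision every time a new fact arrives
        if fact["type"] == "backing" and fact["amount"] >= fact["required"]:
            self.state = LoanState.APPROVED
        elif fact["type"] == "conflicting_loan":
            self.state = LoanState.INVALIDATED
        for callback in self.subscribers:
            callback(self.state, fact)

loan = DotByDotDecision()
loan.subscribe(lambda state, fact: print(state.name, "after", fact["type"]))
loan.publish({"type": "backing", "amount": 50000, "required": 40000})
loan.publish({"type": "conflicting_loan"})  # the late-arriving fact flips the decision
```

In a real system the transition logic would be delegated to the rules engine rather than hard-coded, and the constraint solver would kick in when a shortfall needs to be covered.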
Jacob started and closed the talk with applicability for the Intelligence arm of the Government. Connecting the dots is something we wish we could have done prior to terrorist attempts like underwear bomber Umar Farouk Abdulmutallab.
Paul presented on the role played by rules in CEP applications, and how they relate to “events”. As a disclosure, Paul was part of Neuron Data and/or Blaze Software in the past. Carole-Ann and I worked with him at one or another point in time. It’s interesting to see how clustered this world is…
Events are universal – and they represent the lens through which CEP people see the world; pretty much the same way analytics people will say that data is universal. My take is that these two notions are dual of each other: data has states, events represent the transitions of state for data. Fundamentally, we should not try to contrast one against the other – they are just two facets of the same thing. I do believe that we need to be much crisper on the definition of the terms used to avoid confusion – something that seems to be prevalent in the CEP world. Paul and I engaged in some “discussions” on this in the past.
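A toy illustration of that duality, using a hypothetical account balance: the current state of the data is just a fold over its event history.

```python
from functools import reduce

# Each event is a state transition; replaying the stream reconstructs the state.
def apply_event(balance, event):
    kind, amount = event
    return balance + amount if kind == "deposit" else balance - amount

events = [("deposit", 100), ("withdraw", 30), ("deposit", 50)]
balance = reduce(apply_event, events, 0)
print(balance)  # 120: the state is fully determined by the event stream
```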
The interest from the business world in events processing stems from the need to react at high speed to the events that have relevance to their business.
As presented by Paul, CEP is essentially the continuous processing of different types of events in real time through a number of techniques (going from correlations to analytics to human processing), and connected to business decisions and processes. Each one of these terms (real time, processing, etc…) is of course contextual.
Typical applications include efforts in energy (SmartGrid, predictive energy usage, etc…), finance, logistics, adaptive marketing, government, telco. All these share the same characteristics:
– they are real time
– they exhibit high volume
– they involve a large amount of correlation
Paul used the “Operational Intelligence” term to describe the overarching goal of CEP.
Paul sees CEP applied to the following functional areas in decision heavy applications
– Complex Event Processing
– Decision Management
– Straight-Through Processing
– Real time dashboards
This of course overlaps with functional areas that are covered by other technologies, in particular BRMS and BPM.
Focusing on the role of rules in CEP, Paul makes the distinction between Event Stream Processing (the big pipe, low latency being the single driver) and Event Cloud Processing (with low latency and the ability to handle multiple event sources). Event Cloud Processing is a new term for me. Most Event Cloud Processing vendors rely on rules-based technologies.
CEP mixes …
– ECA rules,
– Event-Enabled Inference Rules (frequently Rete)
– Continuous Queries (I agree that we should include queries (or continuous queries) in AI technology discussions – most AI systems amount to (very smart) searches through (very complex) spaces anyway.)
– State Machines
– Executable analytics
… all tuned to cope with
– very low latency
– very high throughput
– high reliability
… which means
– engines are highly parallelized
– rely on highly optimized (memory and speed) algorithms
– work in memory (sometimes shared memory)
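To make the “continuous query” ingredient concrete, here is a minimal sketch assuming a made-up monitoring scenario (window size and threshold are arbitrary): an ECA-style rule fires whenever the sum over a sliding window of events exceeds a threshold.

```python
from collections import deque

class ContinuousQuery:
    """Sliding-window continuous query with an ECA-style rule:
    Event = a new value arrives, Condition = window sum exceeds the
    threshold, Action = the supplied callback."""
    def __init__(self, window_size, threshold, action):
        self.window = deque(maxlen=window_size)  # in memory, as CEP engines are
        self.threshold = threshold
        self.action = action

    def on_event(self, value):
        self.window.append(value)                 # Event
        if sum(self.window) > self.threshold:     # Condition
            self.action(list(self.window))        # Action

alerts = []
query = ContinuousQuery(window_size=3, threshold=100, action=alerts.append)
for value in [20, 30, 60, 40]:
    query.on_event(value)
print(alerts)  # [[20, 30, 60], [30, 60, 40]]
```

A production engine would of course run this incrementally, in parallel, over many queries at once; the point here is only the shape of the rule.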
Paul went through a few use cases:
– Citibank – an adaptive marketing application (real-time, content-based, automatic campaign engine)
– Tibco itself
The use cases are interesting, but they do deepen my feeling that CEP remains a misleading term. Through the years, I have seen (and built) very similar systems using event correlation, business rules, analytics, business process and case management following a model that I find significantly clearer – with the virtue that the decision management is clearly positioned, centralized and managed.
Paul ended his talk with the statement that CEP can be very rules focused, and the thought that, while CEP will not displace rules, rules and CEP are both in the forefront of what IT managers worry about.
George presented a fairly detailed analysis of the application of rules technologies to business problems at Union Pacific Railroad.
The key motivation behind using rules technology to solve those problems is the need to make the supporting systems adaptive and agile. The challenge is that this does not come cheap: you have to implement the discipline to separate business data and logic from implementation code, and make their management agile and adaptive. That touches data stores, transformations, logic execution, user interfaces, etc… It’s a hard problem.
The whole story behind the introduction and success of BRMS is precisely that – through business rules management, the business logic can be made adaptive and agile.
George went into the details of a classification for types of rules that he uses:
– “simple rules” – rules operating on a single business entity, following a common structure – usually represented in spreadsheets and decision tables
– “moderate rules” – rules that may apply to one or more business entities, but do follow a common structure – manageable by business users
– “complex rules” – rules that have no common structure, not manageable by business users
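To illustrate the “simple rules” end of that spectrum, here is what a single-entity decision table might look like once extracted from a spreadsheet; the shipping-rate domain and thresholds are hypothetical, not from George’s talk.

```python
# One business entity (a shipment), one common structure: a rate lookup.
# First matching row wins, as in a spreadsheet decision table.
SHIPPING_RATE_TABLE = [
    # (max_weight_kg, rate_per_kg)
    (1.0, 10.0),
    (5.0, 7.5),
    (float("inf"), 5.0),
]

def shipping_rate(weight_kg):
    for max_weight, rate in SHIPPING_RATE_TABLE:
        if weight_kg <= max_weight:
            return rate

print(shipping_rate(3.0))  # 7.5
```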
While I am all for classifications, I do not think this one is a good guide for managing representation choices. I do not think George suggested that either; it’s just that too frequently I have seen efforts fall into that trap. The representation choice is, for me, much more a question of how the business user thinks about the rules, not of what the rules are about or their level of commonality. Furthermore, I believe all representations need to be built to tolerate exceptions; otherwise they break and introduce enormous friction that defeats the purpose. That is another reason not to push hard-and-fast classifications.
George also proposed a model of the roles involved in managing business rules through their lifecycle, as well as their expected interactions. To support the corresponding lifecycle, a real Enterprise BRMS is needed – and George went through the options available in the commercial and OSS world, and some elements that can be used to decide what to select.
His example is a good illustration of what really goes on in a significant Enterprise Decision Management effort. I recognized many of the issues that I have seen our customers go through.
George agrees with Carole-Ann and myself that the #1 challenge in these efforts is getting the rules management UI right. This is absolutely correct, and it’s easy to underestimate its business value.
It has to be understandable, usable, expressive, reusable, customizable, easily accessible, and testable.
Making sure the rules are “good” (which actually combines technical correctness with business validity) is also important. Being able to simulate, test in a champion-challenger model, all that is key to ensure the success of an enterprise business rules deployment. George described how to use FIT – Framework for Integrated Test – for this purpose.
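A champion-challenger run can be as simple as replaying the same cases through both rule sets and measuring where they diverge; the credit-score rules below are invented for illustration.

```python
def shadow_test(cases, champion, challenger):
    """Champion/challenger in 'shadow' mode: the champion's decision is
    the one acted upon, while the challenger is evaluated on the same
    cases so outcomes can be compared offline."""
    results = [{"case": case,
                "champion": champion(case),
                "challenger": challenger(case)}
               for case in cases]
    agreement = sum(r["champion"] == r["challenger"] for r in results) / len(results)
    return results, agreement

champion = lambda score: "approve" if score >= 600 else "decline"    # current rules
challenger = lambda score: "approve" if score >= 650 else "decline"  # stricter candidate
results, agreement = shadow_test([580, 620, 700], champion, challenger)
print(agreement)  # the rule sets disagree only on the 620 case
```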
George then went into a discussion on what CEP is with respect to BRMS. This is one more polemic subject – both may use similar underlying technology (for example, some CEP implementations rely on Rete or derivations of it) but the focus is really different, and the rest of the non-engine features is different.
A good industry case of what it takes today to implement business rules in enterprise applications.
Rules Fest is only one week away!