
Best CEP Positioning I’ve ever seen

on April 6, 2010

I attended Gartner’s BPM Summit in Las Vegas a couple of weeks ago.  After some deserved time off from my blog, I finally found the time to tell you more about the most impressive moments.  I’ll keep Keith Ferrazzi’s talk for another time; it was a WOW experience for the whole audience!  Besides that, there was another great moment I want to share with you that was more in the context of Decision Management…

It is really a shame that many people were gone by the time the very last session, a general session, started.  Jim Sinur and Roy Schulte co-presented on “Living an Eventful Life With Agile BPM”.  It was absolutely fantastic.

My Human Body Analogy

I have seen many presentations and articles on Complex Event Processing (CEP) that do not do a good job of explaining the purpose of the technology and how it collaborates with other technologies.  Some go as far as comparing CEP with Business Process Management (BPM) and Business Rules.  I have heard statements more than once that CEP is better than Business Rules, or a wider technology than BPM.  I find those quite confusing and not very helpful.  I decided a while ago to demystify the role of CEP and came up with a description of my own.  Back then, in my previous life, I posted what I thought was a nice analogy to the human nervous system:

  • CEP represented the sensorial system: pre-processing the many inputs and potentially reacting locally
  • BPM represented the nerves: they carried the input from the sensorial systems to the central brain
  • BRMS represented the brain: it received timely information and made a concerted decision to be distributed to the involved systems (organs) by the BPM/nerves

I really liked this analogy, mostly because of my background in Life Sciences.  I find biology fascinating.  I am amazed by how much we still do not understand, and how much progress we’ve made nevertheless.  That being said, it takes extra effort for people to truly appreciate the synergy between all those various mechanisms…  This is not as straightforward an analogy as I had hoped.

Gartner’s Take

I was intrigued by Jim’s and Roy’s approach.  It was bold to go straight to a topology discussion rather than use an analogy, but they pulled it off very elegantly.

The main point that was very well illustrated in their presentation was the role of CEP in conjunction with BPM and BRMS.  I don’t have the rights to copy their materials here, of course, so I’ll try to describe it as well as I can.  I encourage you to download the proceedings if you have access to them.  In a nutshell, they identified a layer of execution called the Runtime BPM Technology, including:

  • Workflow engine
  • Orchestration engine
  • Business Rules engine: after all, it may make sense to just accept that business rules belong here, once and for all…

A traditional process flow would invoke any of those engines depending on the step involved: an application step, a human interaction step, or a decisioning step.
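
To make that layering concrete, here is a minimal sketch in Python of a flow dispatching each step to the engine that handles it.  Every name in it (the engine classes, the step types, the run_process helper) is mine, purely for illustration; none of it comes from Jim and Roy’s material.

    # A minimal sketch of the Runtime BPM Technology layer described above:
    # a process flow hands each step to the engine that knows how to run it.
    # All class, step, and case names are hypothetical.

    class WorkflowEngine:            # human interaction steps
        def execute(self, step, case):
            print(f"Queuing '{step}' for a human agent on case {case['id']}")

    class OrchestrationEngine:       # application / integration steps
        def execute(self, step, case):
            print(f"Calling an application service for '{step}' on case {case['id']}")

    class BusinessRulesEngine:       # decisioning steps
        def execute(self, step, case):
            print(f"Evaluating decision '{step}' on case {case['id']}")

    ENGINES = {
        "human": WorkflowEngine(),
        "application": OrchestrationEngine(),
        "decision": BusinessRulesEngine(),
    }

    def run_process(flow, case):
        """Invoke the right engine for each step, based on the step type."""
        for step_name, step_type in flow:
            ENGINES[step_type].execute(step_name, case)

    run_process(
        flow=[("validate request", "application"),
              ("assess risk", "decision"),
              ("review manually", "human")],
        case={"id": "C-42"},
    )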

The latest BPM developments have introduced an additional entity in the diagram, which Jim and Roy call the Intelligence Layer.  Always worried about terminology, I was wondering why Business Rules would not be in the Intelligence layer: they are smart, aren’t they?  My speaking French may very well have induced the vocabulary confusion…  Well, in that case, do not think of intelligence as intellect but rather as in “gathering intelligence”.  It is about providing extra information in real time.  With that in mind, CEP can be leveraged very effectively in two different scenarios.

CEP understands the Environment

CEP gathers, by definition, lots of events from sensors, applications, or any other sources.  Its sophisticated technology analyzes them and turns the collected intelligence into a complex event that is dispatched to the Runtime BPM Layer.  Knowing that “something” is going on, the process can adjust the flow to accommodate the new piece of information.

This is very much the Sensorial system I described before in my human nervous system analogy.  I implied, but maybe did not do a good job highlighting, the fact that those sensors are “always on”, as Steve Hendrick would put it.  Sensors work 24×7, not just when triggered.  Well, let’s put it this way: they receive tons of information 24×7 and will do something about it when they detect a pattern of interest.  It could be a simple pattern of “hey, this just happened” or an aggregate / trending thing: “hey, revenue has been going down steadily”, “hey, there is a spike in sales”…  This could happen at any point in time while you are processing.
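
To illustrate what I mean by “always on”, here is a minimal sketch, assuming a Python-style listener: it keeps a sliding window of revenue figures and only emits a complex event when it detects a steady decline or a spike.  The class name, the window size and the thresholds are all made up for illustration.

    # A sensor that receives every raw event but only raises a complex event
    # when a pattern of interest shows up. Window size and spike ratio are
    # illustrative values.
    from collections import deque

    class RevenueTrendSensor:
        def __init__(self, window=5, spike_ratio=1.5):
            self.window = deque(maxlen=window)
            self.spike_ratio = spike_ratio

        def on_event(self, revenue):
            """Called for every raw event; returns a complex event or None."""
            previous = list(self.window)
            self.window.append(revenue)
            if len(previous) < self.window.maxlen:
                return None                      # not enough history yet
            if all(a > b for a, b in zip(previous, previous[1:] + [revenue])):
                return ("steady-decline", revenue)
            if revenue > self.spike_ratio * (sum(previous) / len(previous)):
                return ("sales-spike", revenue)
            return None

    sensor = RevenueTrendSensor()
    for figure in [100, 98, 95, 90, 85, 80]:     # a declining revenue stream
        complex_event = sensor.on_event(figure)
        if complex_event:
            print("Notify the BPM layer:", complex_event)   # ('steady-decline', 80)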

Let’s take a realistic but sad scenario…  You are processing Joe’s case in Collections.  He has not been able to pay his bills for a few months, and you are looking for a way to get your money…  You follow the process (not manually, of course; you automate it via BPM):

  1. Joe’s case is dispatched to an agent based on his good tenure up to now,
  2. the agent tries to call him a few times per the script,
  3. notifies the system that he or she has not been able to reach him,
  4. a somewhat friendly letter is sent to Joe,
  5. a couple of weeks later a nastier letter is sent to Joe,
  6. etc.

At any point in time in the process, this company may receive a call from one of Joe’s relatives informing them that he unfortunately passed away.  This is not a pre-defined step in the process, but there is an exception that will know how to deal with it and undo some steps in the queue.  If it happens after step 4, the exception flow will definitely undo the request for the nastier letter.

Those exceptions could be hardcoded as processes.  It is typically the case in Collections systems of course, but it becomes apparent that a well-isolated layer, in charge of figuring out which processes need to be notified, adds value here.  You can build more agile Applications by adding those CEP services on top of the BPM layer.
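
Here is a minimal sketch of what such an isolated layer could look like, assuming a Python-style notification service sitting between the external event and the collections flow.  The “customer.deceased” event type and all the class names are hypothetical, purely for illustration.

    # The collections flow queues its steps; a separate notification service
    # listens for external events and cancels whatever is still pending.
    class CollectionsCase:
        def __init__(self, customer):
            self.customer = customer
            self.pending_steps = ["call customer", "send friendly letter",
                                  "send firmer letter", "escalate to agency"]
            self.closed = False

        def cancel_pending(self, reason):
            print(f"Cancelling {self.pending_steps} for {self.customer}: {reason}")
            self.pending_steps.clear()
            self.closed = True

    class NotificationService:
        """Sits on top of the BPM layer and reacts to external events."""
        def __init__(self):
            self.cases_by_customer = {}

        def register(self, case):
            self.cases_by_customer[case.customer] = case

        def on_event(self, event_type, customer):
            if event_type == "customer.deceased":
                case = self.cases_by_customer.get(customer)
                if case and not case.closed:
                    case.cancel_pending("customer passed away")

    notifier = NotificationService()
    joe_case = CollectionsCase("Joe")
    notifier.register(joe_case)
    notifier.on_event("customer.deceased", "Joe")   # the relative's call, at any point in the flow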

CEP understands the System

CEP is not limited to receiving external information, of course.  The second clever use of that technology is to collect and process events produced by the System itself.  This is brilliant, as you open the door to capabilities that complement your traditional Business Activity Monitoring (BAM).

  • First, you get the opportunity to publish events in real time.  Real-time analytics are very interesting because they give you this “right-now” view of your performance rather than the “way after the fact” view you get when you receive your dashboard.  You can combine the convenience of Business Intelligence dashboards with the real-time capabilities of your BPM systems.  This is where dashboards turn into cockpits, with active charts being drawn in front of your very eyes…  Very cool of course, but also extremely powerful when combined with the notification capabilities we talked about above.
  • Second, you can do some very interesting pre-processing.  Calculations and correlations can be injected into your dashboards (or cockpits) so that you can display interesting information that goes above and beyond the number of occurrences or the average over time…
  • Lastly, CEP offers you the extra benefit of combining this excellent information with external intelligence.  You open the door to sophisticated reports that can tell you what happens after a target event occurs in real life.  For example: “how are my call center volumes affected in the 3 hours that follow a 7+ earthquake?”  This is something you would never have been able to measure if your Application did not keep track of earthquakes and their magnitude.  CEP allows you to pull information from other sources and mix it with your internal intelligence, as sketched after this list.
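
Here is a minimal sketch of that last bullet, assuming the internal call events and the external earthquake feed both end up in the same place.  The 3-hour window and the 7+ magnitude come from the example above; the data and function names are made up for illustration.

    # Correlate internal call center events with an external earthquake feed.
    EARTHQUAKES = [(10.0, 7.2)]                       # (timestamp in hours, magnitude)
    CALLS = [10.5, 10.8, 11.2, 12.4, 13.9, 20.0]      # call timestamps, in hours

    def calls_after_quake(quakes, calls, min_magnitude=7.0, window_hours=3.0):
        """Count internal call events in the window following each big quake."""
        report = {}
        for ts, magnitude in quakes:
            if magnitude >= min_magnitude:
                in_window = [c for c in calls if ts <= c <= ts + window_hours]
                report[ts] = len(in_window)
        return report

    print(calls_after_quake(EARTHQUAKES, CALLS))      # {10.0: 4}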

The sky is the limit…

CEP is not all

I like the interesting point that Jim and Roy made in the presentation about being event-driven.  With any new technology, or the re-emergence of an old one, we get excited quite rapidly.  This is what Gartner also calls the Peak of Inflated Expectations in their legendary Hype Cycle.  I appreciate that, in that presentation, Jim and Roy did not state that everything was CEP just because we are always talking about services exchanging data, a.k.a. events.

They actually introduced the term Event-Driven Architecture (EDA), which is quite appropriate here.  When you think about it, services are loosely coupled in BPM by definition.  You want to be able to quickly assemble services together to reflect your process flows as they are, with the flexibility to change them over time.  So services may be event-driven considered individually, but the Application as a whole is not EDA, since a very clear contract must have been designed between the parts.  They fail the test of loose coupling on the consumer’s side.
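
Here is a minimal sketch of that loose-coupling distinction, assuming a tiny Python event bus: the producer publishes an event without knowing who, if anyone, consumes it, while a process step that calls a named service directly binds that consumer into its contract.  All the names are hypothetical, purely for illustration.

    # A producer that publishes to a bus stays loosely coupled; a step that
    # calls a specific service directly does not.
    class EventBus:
        def __init__(self):
            self.subscribers = {}

        def subscribe(self, event_type, handler):
            self.subscribers.setdefault(event_type, []).append(handler)

        def publish(self, event_type, payload):
            # The producer has no idea who, if anyone, is listening.
            for handler in self.subscribers.get(event_type, []):
                handler(payload)

    bus = EventBus()
    # A consumer can be added or removed without touching the producer.
    bus.subscribe("order.submitted", lambda order: print("Fraud check on", order))
    bus.publish("order.submitted", {"id": 1, "amount": 250})

    # By contrast, a flow step written as fraud_service.check(order) names its
    # consumer explicitly and fails the loose-coupling test on the consumer side.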


