SMARTS for data marketplaces

In an earlier blog post, I explained how decision management and business rules are suitable for micro-calculations, the type of computations that businesses often codify into large spreadsheets and use to score, rate, or price items. In this post, I explain how they are also well suited to simplifying data integration, aggregation, and enrichment when building and running data marketplaces.

Data marketplaces

Data marketplaces are a shift from data warehouses: the goal is not only to store large volumes of data, but also to make that data consumable as a service, without resorting to IT or requiring prior knowledge of a query language. Data marketplaces are often organized into three layers:

  • At the lowest layer, we find raw data stored in the form in which it was ingested from the data sources. Data sources can be global ERP and CRM systems, or even local MySQL databases and shared Excel spreadsheets.
  • At the middle layer, we find integrated data from multiple sources that is reconciled to resolve disparities and inconsistencies found in the original data. Often, the source systems do not have the same format for dates, names, phone numbers, and addresses. Sometimes the same object can have different attributes in different data sources.
  • At the topmost layer, we find aggregate data expressed in summarized forms, often to inform about groups rather than individuals. It is at this level that we also find data enriched by external data to make them directly consumable by the businesspeople.

To learn more about data marketplaces, I recommend the Eckerson Group white paper: The Rise of the Data Marketplace – Data as a Service by Dave Wells.

Easy to define, hard to construct

Defining a data marketplace, as we have just done, is simple; constructing one is complicated, for two reasons:

1) Data heterogeneity. It is not enough to bring together all the company’s data in a data marketplace for that data to be transformed into knowledge, forecasts, and decisions. Indeed, not all data has the same age, the same structure, the same format, the same quantity, the same quality, and above all the same utility. An attribute that is important for one business line is not automatically important for another, even within the same company.

Each business line has its own vision of the product, the customer, and any entity managed by the various actors of the company. In the luxury sector, for example, a dress, a bag, or a piece of jewelry, although it is a unique object, is seen through different attributes depending on the database where it is stored. Exploiting all the data available in a company to extract predictions and then decisions requires not only integration but also transformation, unification, harmonization, and enrichment.

2) Data reorganization. A data marketplace would work better if data, information, and needs were always stable. But in a dynamic and rapidly changing business world, groups reorganize, and companies are acquired, merged, or split. For example, to simplify financial reporting, a country can change the region it belonged to a year ago, and it is a safe bet that it will change yet again if a new boss is appointed or a region is split or added. To be successful, data marketplaces must be implemented as change-tolerant projects.

These two reasons are representative of the situations where decision management technologies are used: piecing together things that move independently of each other. Under the name of decision management, we group all the technologies that help organizations automate the simple but plentiful granular decisions and calculations that businesses often codify into decision tables, decision trees, or business rules.
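For illustration, here is a toy sketch of such a granular decision codified as a decision table, written in plain Python rather than any decision management product’s syntax; the discount bands are invented for the example:

```python
# Toy sketch: a granular pricing decision codified as a decision table.
# Each row pairs a condition on the input with an outcome; the first
# matching row wins, as in a classic decision table.

DISCOUNT_TABLE = [
    (lambda order: order["amount"] >= 10_000, 0.15),  # large orders
    (lambda order: order["amount"] >= 1_000,  0.05),  # medium orders
    (lambda order: True,                      0.00),  # default row
]

def discount_rate(order: dict) -> float:
    for condition, rate in DISCOUNT_TABLE:
        if condition(order):
            return rate

print(discount_rate({"amount": 2_500}))  # -> 0.05
```

Because the table is data rather than code scattered across an application, business analysts can review and change the rows without touching the surrounding logic.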

Decisioning technologies to the rescue

Decisioning technologies can be used here, but not everywhere. One can use a database programming language or a general-purpose scripting language to automate the loading of data from the data sources into the raw data layer, since the original format is kept; there is no real value in using a business rules engine for this straightforward job. The value of a decisioning technology starts at the integration layer, where attributes of objects are grouped together to form an updated version of an existing attribute or a new attribute.

Take the example of customer data coming from two different databases. Suppose that customers have both their home address and their business address in the first source database, but only their home address in the second. Suppose also that the two databases do not use the same address format. What should the integration layer hold? One address? If so, which one, and in what format? Two addresses? Then what business address should be recorded for customers who are present only in the second database, with just a home address? Such questions are easily answered with decision rules, as the sketch below illustrates.
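Here is a minimal sketch of such reconciliation rules, in plain Python rather than actual decision-rule syntax, with invented field names and an invented precedence policy:

```python
# Minimal sketch of address-reconciliation decision rules. The field
# names, date format, and precedence policy are illustrative assumptions.

def normalize_address(raw: str) -> str:
    """Rule 0: bring every address to one canonical format."""
    return " ".join(raw.upper().replace(",", " ").split())

def reconcile_customer(source1: dict | None, source2: dict | None) -> dict:
    """Decide which addresses the integration layer should hold."""
    customer = {}

    # Rule 1: the business address comes from source 1, the only system
    # that carries one; customers known only to source 2 get an explicit None.
    if source1 and source1.get("business_address"):
        customer["business_address"] = normalize_address(source1["business_address"])
    else:
        customer["business_address"] = None

    # Rule 2: the home address is the most recently updated one, normalized.
    candidates = [s for s in (source1, source2) if s and s.get("home_address")]
    if candidates:
        latest = max(candidates, key=lambda s: s["last_updated"])
        customer["home_address"] = normalize_address(latest["home_address"])

    return customer
```

The point is not the code itself but that each choice (which address, which format, which fallback) becomes an explicit, reviewable rule rather than an accident of the loading scripts.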

Now suppose that, at the aggregate data layer, we want to add the average turnover for a client that buys from two business units. Here again, calculation rules can easily be used, and SMARTS’ lookup model engine can automate such calculations.
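As a sketch of such a calculation rule, again in plain Python with an invented record layout rather than SMARTS lookup-model syntax:

```python
# Sketch of an aggregate-layer calculation: average turnover per client,
# across business units. The record layout is an illustrative assumption.

from collections import defaultdict

def average_turnover_per_client(orders: list[dict]) -> dict[str, float]:
    totals: dict[str, float] = defaultdict(float)
    counts: dict[str, int] = defaultdict(int)
    for order in orders:
        client = order["client_id"]  # the same client may appear in several units
        totals[client] += order["turnover"]
        counts[client] += 1
    return {c: totals[c] / counts[c] for c in totals}

orders = [
    {"client_id": "C1", "business_unit": "retail",    "turnover": 120.0},
    {"client_id": "C1", "business_unit": "wholesale", "turnover": 80.0},
]
print(average_turnover_per_client(orders))  # -> {'C1': 100.0}
```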

SMARTS

SMARTS is our all-in-one low-code platform for data-driven decision-making. It unifies the authoring, testing, deployment, and maintenance of the micro-decisions and micro-calculations described in this article. SMARTS comes as one product with four capabilities.

SMARTS has been extensively used for decisions and calculations in the finance, insurance, healthcare, retail, IoT, and utility sectors. To learn more about the product or our references, just contact us or request a free trial.

Wrap-up

  • Data marketplaces promise to change the way data is consumed by businesses. Compared to data warehouses, they are more complex to build, which makes delivering on that promise harder.
  • Decision management technologies simplify data integration, aggregation, and enrichment when building and running data marketplaces. They make decisions and calculations explicit and therefore easy to change whenever situations change.
  • SMARTS supplies multiple graphical representations* and engines to make such transformations, integrations, and enrichments easy to design, implement, test, deploy, and change as situations change.

* The following table, tree, and graph show three different representations of the same decision logic so that developers can use one that they are familiar with or that best fits the task at hand.

[Figure: the same decision logic represented as a decision table, a decision tree, and a decision graph]

About

Sparkling Logic is a Silicon Valley company dedicated to helping businesses automate and improve the quality of their operational decisions with a powerful decision management platform, accessible to business analysts and ‘citizen developers.’ Sparkling Logic SMARTS customers include global leaders in financial services, insurance, healthcare, retail, utility, and IoT.

Sparkling Logic SMARTS™ (SMARTS for short) is an all-in-one low-code platform for data-driven decision-making. It unifies the authoring, testing, deployment, and maintenance of operational decisions. SMARTS combines the highly scalable Rete-NT inference engine with predictive analytics, machine learning models, and low-code functionality to create intelligent decisioning systems.

Hassan Lâasri is a data strategy consultant, now leading marketing for Sparkling Logic. You can reach him at hlaasri@sparklinglogic.com.

Technical Series: Decision Management Platform Integrations


Decision Management and Business Rules Management platforms cater to the needs of business-oriented roles (business analysts, business owners, etc.) involved in operational decisions. But they also need to take into account the constraints of the enterprise and its technology environment.

Among those constraints are the ones that involve integrations. This is the first in a series of posts exploring the requirements, approaches, and trade-offs for decision management platform integrations with the enterprise ecosystem.

Why integrate?

Operational decisions do not exist in a vacuum. They

  • are embedded in other systems, applications or business processes
  • produce the decisions that other systems carry out
  • are core contributors to the business performance of automated systems
  • are critical contributors to the business operations and must be under tight control
  • must remain compliant, traced and observed
  • yet must remain flexible for business-oriented roles to make frequent changes to them

Each and every one of these aspects involves more than just the decision management platform. Furthermore, more than one enterprise system provides cross-application support for them. Enterprises want to use such systems because they reduce the cost and risk involved in managing applications.
For example, authentication across multiple applications is generally centralized to allow for a single point of control over who has access to them. Otherwise, each application implements its own, and management costs and risks skyrocket.

In particular, decision management platforms end up being a core part of the enterprise applications, frequently as core as databases. It may be easy and acceptable to use disconnected tools to generate reports or write documents, but it rarely is acceptable to leave part of a core system unmanaged. In effect, there is little point in offering capabilities that cannot cleanly fit into the enterprise’s management processes; the gain made by giving business roles control of the logic is negated by the cost and risk of operating the platform.

In our customer base, most customers do pay attention to integrations. Which integrations are involved, and with which intensity, depends on the customer. However, it is important to realize that the success of a decision management platform for an enterprise also hinges on the quality of its integrations with the enterprise’s systems.

Which integrations matter?

We can group the usual integrations for decision management platforms in the following groups:

  • Authentication and Access Control
  • Implementation Support
  • Management Audit
  • Life-cycle management
  • Execution
  • Execution Audit
  • Business Performance Tracking

Authentication and access control integrations are about managing which user has access to the platform, and, beyond that, to which functionality within the platform.
Implementation support integrations are those that facilitate the identification, implementation, testing and optimization of decisions within the platform: import/export, access to data, etc.
Management audit integrations enable enterprise systems to track who has carried out which operations and when within the platform.
Life-cycle management integrations are those that support the automated or manual transitioning of decisions through their cycles: from inception to implementation and up to production and retirement.

Similarly, execution integrations enable the deployment of executable decisions within the context of the enterprise operational systems: business process platforms, micro-services platforms, event systems, etc. Frequently, these integrations also involve logging or audit systems.
Finally, performance tracking integrations are about using the enterprise reporting environment to get a business-level view of how well the decisions perform.
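To make the execution side concrete, here is a minimal sketch of an operational system invoking a deployed decision service over HTTP and feeding the outcome to the logging and audit systems; the endpoint, payload, and response fields are hypothetical, not any particular platform’s API:

```python
# Hypothetical sketch of an execution integration: call a deployed decision
# service and log the outcome for audit. The endpoint and response fields
# are invented for the example.

import json
import logging
import urllib.request

logging.basicConfig(level=logging.INFO)

def invoke_decision(payload: dict) -> dict:
    request = urllib.request.Request(
        "https://decisions.example.com/loan-approval/1.0/execute",  # hypothetical
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        decision = json.load(response)
    # Execution-audit integration: record what was decided, and why.
    logging.info("decision=%s reasons=%s",
                 decision.get("outcome"), decision.get("reasons"))
    return decision
```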

Typically, different types of integrations interest different roles within the enterprise. The security and risk management groups will worry about authentication, access control and audit. The IT organization will pay attention to life-cycle management and execution. Business groups will mostly focus on implementation support and performance tracking.

The upcoming series of blog posts will focus on these various integrations: their requirements, their scope, their challenges and how to approach them.

In the meantime, you can read the relevant posts in the “Best Practices” series.

Decision Management Rumors Confirmed


Did you guess right on the acquisition rumors that were spreading at BBC 2011 this year?

Corticon announced today its acquisition by Progress Software.  One more independent vendor ends up at the heart of a “BPM and more” platform.  This is a great validation for BRMS technology.  You can’t really fit all decisioning logic into process maps without crowding them.  Another interesting conclusion is that CEP alone did not suffice in the BPM platform either.  The quotes from the announcement were pretty telling:

Dr. John Bates, chief technology officer, Progress Software said: “Within modern responsive businesses, the need to make informed and accurate decisions ‘in the moment’ is critical. High quality real-time decisions are key to avoid fraudulent transactions, to comply with complex and evolving regulations and to generally make the right decision for the business at the right time. The acquisition of Corticon reinforces Progress’ commitment to deliver operational responsiveness by helping customers build highly agile, responsive business systems with models and tools that maximize simplicity and accelerate time-to-value.”

Dr. Mark Allen, founder and former chief executive officer of Corticon, now a member of the office of Progress’ CTO, added: “[…] rules alone are sometimes not enough; to meet the holistic needs of customers, a number of technology areas need to converge […]”

Effective Decision Management is all about Agility and Simplicity.  I wholeheartedly agree with that!

Congratulations, Mark!

More Business Rules Consolidation?


Do you remember the buzz at Business Rules Forum 2008 when RuleBurst / Haley did not show up?  We all suspected an acquisition, and the reason for the suspicious absence of this “regular” at the show was soon confirmed: they had been acquired by Oracle.

This year again, one of the usual suspects did not come and, since then, the industry has been buzzing quite a bit.  Well, technically the company was there, but the leadership team was not.  We have reason to believe that our colleagues have been acquired by another platform vendor.  Do not look for the big names though.  It is quite interesting to see the growing presence of BPM / CEP vendors in the Decision Management space, fast acquiring business rules capabilities…

Let’s wait and see how long it will take for the rumor to turn into a formal announcement!

On the BRMS side though, we are left with fewer and fewer independent vendors.  This puts more pressure on the vendor lock-in issue.  The standards are not nearly mature enough to allow for interplay between the rules vendors, which has not been a burning issue up to now.  But as BRMS become an integral part of entire platforms, swapping them becomes less of a per-project decision…  and as a result, the current investment in business rules assets might require a port from platform A to platform B, meaning from BRMS A to BRMS B, as companies standardize on those ecosystems…

What are the ways out?  Investing in interoperability standards is the long road.  My personal inclination would be to take a better look at decision modeling environments, but I am biased of course 😉

BREAKING NEWS: Rules Fest Call for Papers is now Open!


I just got the word from Jason Morris, chairman of the show.  The show’s website is now open for registration!

Register!

If you have practical experience with Decisioning technologies like

  • Business Rules,
  • Complex Event Processing,
  • Predictive Analytics,
  • Optimization or
  • Artificial Intelligence,

then you should consider submitting a paper.

Join the Rules Fest Speaker Hall of Fame!

Please keep in mind that we are looking for hands-on experience, lessons learned, those kinds of things.  Vendor pitches will not be accepted of course.

Interested in BPM Success & High Performance?


Well, of course.  Who isn’t?

The very reason to implement Business Processes is to improve your overall business performance by:

  • Automating straight-through processing
  • Increasing customer satisfaction
  • Reducing operational inefficiencies

Jim Sinur will once again host a show this year that will provide invaluable insights on how to do it right.  Sparkling Logic has decided to join Jim and team for the very first time.

Why?

Decisions are a key part of those Business Processes.  As we have commented before, Business Rules, Event Processing, and Business Processes are technologies that need to collaborate for best results.  Carlos and I will be at the show to talk about the Best Practices that make this attainable.  Sophisticated software does come with complexity that may be tricky to set up effectively, but it does not have to be that way.  We will show you how!

We are very excited at the prospect of meeting you at

BPM Summit 2011 in Baltimore on April 27-29.

Feel free to drop us a note if you are going too and want to set up some private time to talk about Decision Management the Pattern-Based Strategy way!

 

Rules Fest Live: Jacob Feldman / Connecting the Dots


Jacob Feldman

Jacob did not present on constraint optimization, as we had grown used to.  Instead, he shared his work on a “Connecting the Dots” approach applied to mortgage origination.

He started the talk with a practical use case: Peter applies for a loan, but his income is not sufficient, so he gets backing from Joe and Dawn, allowing him to get the loan.  While processing the paperwork, the clerk notices that Joe has a business loan with Bill Smith that invalidates the decision.  Fortunately, Bill has equity in his house that can be put on the table.  But in the end, Bill’s son Tommy has actually used up part of that equity, leading to a small shortfall, but a shortfall nonetheless.

Exciting real-life adventure!  Though I long for a happy ending here…

The fact is that reality can actually be *that* convoluted, and decision systems should be able to cope with such a progressive path to uncovering the facts.  If you have applied for a loan, you may have experienced similar day-by-day requests from the lending organization.

Jacob assembled a number of technologies to accommodate this dot-by-dot decision-making.  It boils down to a state machine invoked in a pub-sub architecture.  The traditional decision management pieces we expect are all there: the rules engine, a CEP module, and of course a constraint solver.  Personally, I would recommend adding some analytics for proactive fact discovery, maybe text mining or something like it.  A minimal sketch of the state-machine-plus-pub-sub idea follows.
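Here is that sketch, with invented states, topics, and facts; it is a toy, not Jacob’s actual architecture:

```python
# Toy sketch of "connecting the dots": a case state machine subscribed to a
# pub-sub bus, revising its decision as each new fact arrives.

from collections import defaultdict

class Bus:
    """A tiny in-process publish-subscribe bus."""
    def __init__(self):
        self.subscribers = defaultdict(list)
    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)
    def publish(self, topic, event):
        for handler in self.subscribers[topic]:
            handler(event)

class LoanCase:
    """Each incoming fact may invalidate the current state and trigger a
    new round of rules / constraint solving."""
    def __init__(self, bus):
        self.state = "APPROVED_WITH_BACKERS"
        bus.subscribe("new-fact", self.on_fact)
    def on_fact(self, fact):
        if fact == "backer-has-business-loan" and self.state == "APPROVED_WITH_BACKERS":
            self.state = "NEEDS_COLLATERAL"
        elif fact == "collateral-partially-used" and self.state == "NEEDS_COLLATERAL":
            self.state = "SHORTFALL"

bus = Bus()
case = LoanCase(bus)
bus.publish("new-fact", "backer-has-business-loan")
bus.publish("new-fact", "collateral-partially-used")
print(case.state)  # -> SHORTFALL
```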

Jacob opened and closed the talk with the applicability of this approach to the intelligence arm of the government.  Connecting the dots is something we wish we could have done prior to terrorist attempts like that of the “underwear bomber”, Umar Farouk Abdulmutallab.

Rules Fest Live: Paul Vincent / Rules in CEP Applications


Paul Vincent

Paul presented on the role played by rules in CEP applications, and how they relate to “events”. As a disclosure, Paul was part of Neuron Data and/or Blaze Software in the past; Carole-Ann and I worked with him at one point or another. It’s interesting to see how clustered this world is…

Events are universal: they represent the lens through which CEP people see the world, pretty much the same way analytics people will say that data is universal. My take is that these two notions are duals of each other: data has states, and events represent the transitions between those states. Fundamentally, we should not try to contrast one against the other; they are just two facets of the same thing.  I do believe that we need to be much crisper in our definitions of the terms used, to avoid confusion that seems to be prevalent in the CEP world. Paul and I have engaged in some “discussions” on this in the past.
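A toy sketch of that duality: the current state of a datum is just the left fold of the events applied to it.

```python
# Toy sketch of the state/event duality: data has states, events are the
# transitions; folding the events reproduces the state.

from functools import reduce

def apply_event(balance: float, event: tuple[str, float]) -> float:
    kind, amount = event
    return balance + amount if kind == "deposit" else balance - amount

events = [("deposit", 100.0), ("withdraw", 30.0), ("deposit", 5.0)]
state = reduce(apply_event, events, 0.0)
print(state)  # -> 75.0
```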

The interest from the business world in events processing stems from the need to react at high speed to the events that have relevance to their business.

As presented by Paul, CEP is essentially the continuous processing of different types of events in real time through a number of techniques (going from correlation to analytics to human processing), connected to business decisions and processes.  Each one of these terms (real time, processing, etc.) is of course contextual.

Typical applications include efforts in energy (SmartGrid, predictive energy usage, etc.), finance, logistics, adaptive marketing, government, and telco. All these share the characteristics that
– they are real time
– they exhibit high volume
– they involve a large amount of correlation

Paul used the term “Operational Intelligence” to describe the overarching goal of CEP.

Paul sees CEP applied to the following functional areas in decision-heavy applications:
– Complex Event Processing
– Decision Management
– Straight-Through Processing
– Real-time dashboards
This of course overlaps with functional areas that are covered by other technologies, in particular BRMS and BPM.

Focusing on the role of rules in CEP, Paul makes the distinction between Event Stream Processing (the big pipe, low latency being the single driver) and Event Cloud Processing (with low latency and ability to handle multiple event sources). Event Cloud Processing is a new term for me. Most Event Cloud Processing CEP vendors rely on rules-based technologies.

CEP mixes …
– ECA rules,
– Event-Enabled Inference Rules (frequently Rete),
– Continuous Queries (I agree that we should include queries, or continuous queries, in AI technology discussions; most AI systems amount to (very smart) searches through (very complex) spaces anyway),
– State Machines,
– Executable analytics

… all tuned to cope with
– very low latency
– very high throughput
– high reliability

… which means
– engines are highly parallelized
– rely on highly optimized (memory and speed) algorithms
– work in memory (sometimes shared memory)
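As a toy illustration of two of the ingredients listed above, here is an ECA rule evaluated over a continuous query (a sliding window); the 5% threshold and the event shape are invented:

```python
# Toy sketch: an Event-Condition-Action rule over a sliding-window
# continuous query. The threshold and the event shape are invented.

from collections import deque

window = deque(maxlen=100)  # continuous query: the last 100 trade prices

def on_trade(trade: dict) -> None:          # Event
    window.append(trade["price"])
    moving_avg = sum(window) / len(window)
    if trade["price"] > 1.05 * moving_avg:  # Condition: 5% above the moving average
        raise_alert(trade)                  # Action

def raise_alert(trade: dict) -> None:
    print(f"price spike: {trade}")

on_trade({"price": 100.0})
on_trade({"price": 101.0})
on_trade({"price": 110.0})  # -> price spike: {'price': 110.0}
```

A real CEP engine would of course compile many such rules, run them in parallel, and keep the windows in optimized (sometimes shared) memory, per the constraints above.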

Paul went through a few use cases:
– Citibank – an adaptive marketing application (real-time, content-based, automatic campaign engine)
– Tibco itself

The use cases are interesting, but they do deepen my feeling that CEP remains a misleading term. Through the years, I have seen (and built) very similar systems using event correlation, business rules, analytics, business process and case management following a model that I find significantly clearer – with the virtue that the decision management is clearly positioned, centralized and managed.

Paul ended his talk with the statement that CEP can be very rules-focused, and the thought that, while CEP will not displace rules, rules and CEP are both at the forefront of what IT managers worry about.

