In my earlier blog post, I explained how decision management and business rules were suitable for micro-calculations, the type of computations that businesses often codify into large spreadsheets and use to score, rate, or price items. In this post, I explain how they are also suitable to simplify data integration, aggregation, and enrichment when building and running data marketplaces.
Data marketplaces

Data marketplaces are a shift from data warehouses: the goal is not only to store large volumes of data, but to make that data consumable as a service, without resorting to IT or prior knowledge of a query language. Data marketplaces are often organized into three layers:
- At the lowest layer, we find raw data stored in the form in which it was ingested from the data sources. Data sources can be global ERP and CRM systems, or even local MySQL databases and shared Excel spreadsheets.
- At the middle layer, we find integrated data from multiple sources that is reconciled to resolve disparities and inconsistencies found in the original data. Often, the source systems do not have the same format for dates, names, phone numbers, and addresses. Sometimes the same object can have different attributes in different data sources.
- At the topmost layer, we find aggregate data expressed in summarized forms, often to inform about groups rather than individuals. It is at this level that we also find data enriched by external data to make them directly consumable by the businesspeople.
To learn more about data marketplaces, I recommend the Eckerson Group white paper: The Rise of the Data Marketplace – Data as a Service by Dave Wells.
Easy to define, hard to construct

Defining a data marketplace as we have just done is simple; constructing one is complicated, for two reasons:
1) Data heterogeneity. It is not enough to bring together all the company’s data in a data marketplace for that data to be transformed into knowledge, forecasts, and decisions. Indeed, not all data has the same age, the same structure, the same format, the same quantity, the same quality, and above all the same utility. An attribute that is important for one business line is not automatically important for another, even within the same company.
Each business line has its own vision of the product, the customer, and any other entity managed by the various actors of the company. In the luxury sector, for example, a dress, a bag, or a piece of jewelry, although a unique object, is seen through different attributes depending on the database where it is stored. Exploiting all the data available in a company to extract predictions and then decisions requires not only integration but also transformation, unification, harmonization, and enrichment.
2) Data reorganization. A data marketplace would be easier to run if data, information, and needs were always stable. But in a dynamic and rapidly changing business world, groups reorganize, and companies are acquired, merged, or split. For example, to simplify financial reporting, a country can change the region it belonged to a year ago, and it’s a safe bet that it will change yet again if a new boss is appointed or a region is split or added. To be successful, data marketplaces must be implemented as change-tolerant projects.
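One way to picture such change tolerance is to keep volatile assignments, like country-to-region mappings, in an explicit lookup rather than scattered through code. The sketch below is a minimal, illustrative Python example (the country and region names are hypothetical); reorganizing a region then means editing one table, not the consuming logic.

```python
# A region assignment kept as a plain lookup ("decision table") so it can
# be updated when the organization changes, without touching the code
# that consumes it. Country and region names are purely illustrative.
REGION_BY_COUNTRY = {
    "France": "EMEA",
    "Morocco": "EMEA",
    "Japan": "APAC",
}

def region_of(country, default="UNASSIGNED"):
    """Return the reporting region for a country, or a default
    when the country has not (yet) been assigned to a region."""
    return REGION_BY_COUNTRY.get(country, default)
```

Moving, say, Morocco to a new region after a reorganization is a one-line change to the table, which is exactly the kind of externalized logic decision management tooling maintains for business users.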
These two reasons are representative of situations where decision management technologies are used: piecing together things that move independently of each other. Under the name of decision management, we group all the technologies that help organizations automate the simple but plentiful granular decisions and calculations that businesses often codify into decision tables, decision trees, or business rules.
Decisioning technologies to the rescue

Decisioning technologies can be used here, but not everywhere. One can use a database programming language or a general-purpose scripting language to automate the loading of data from the data sources into the raw data layer, since the original format is kept; there is no real value in using a business rules engine for this straightforward job. The value of a decisioning technology starts at the integrated data layer, where attributes of objects are grouped together to form an updated version of an existing attribute or a new attribute.
Take the example of customer data from two different databases. Suppose that customers have both their home address and business address in the first source database, but only their home address in the second. Suppose also that the format of the addresses differs between the two databases. What should the integrated data layer hold? One address? If so, which one, and in what format? Two addresses? Then what business address should be recorded for customers who appear only in the second database, with only a home address? These questions can be easily answered through decision rules.
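To make the shape of such rules concrete, here is a minimal Python sketch of the reconciliation logic, not taken from any actual product; the field names and the two-source layout are assumptions for illustration only.

```python
def reconcile_addresses(rec_a, rec_b):
    """Merge customer address data from two source databases.

    rec_a may hold 'home_address' and 'business_address'; rec_b holds
    only 'home_address'. Either record may be None if the customer is
    absent from that source. Field names are illustrative.
    """
    merged = {}
    # Rule 1: prefer the first source for the home address, falling
    # back to the second source when the customer is missing from it.
    if rec_a and rec_a.get("home_address"):
        merged["home_address"] = rec_a["home_address"]
    elif rec_b and rec_b.get("home_address"):
        merged["home_address"] = rec_b["home_address"]
    # Rule 2: the business address exists only in the first source;
    # for source-B-only customers, leave it empty rather than guessing.
    merged["business_address"] = rec_a.get("business_address") if rec_a else None
    return merged
```

The point is that each answer to the questions above becomes one explicit, reviewable rule, which is what makes the logic easy to change when a source system changes.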
Now suppose that, at the aggregate data layer, we want to add the average turnover with a client that buys from two business units. Here again, calculation rules can easily be used. One can use SMARTS’ look-model engine to automate such calculations.
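As a rough illustration of the kind of calculation involved (not SMARTS itself), the aggregation might look like this in Python; the input shape, mapping business units to yearly turnover figures, is an assumption made for the example.

```python
def average_turnover(purchases):
    """Average turnover with a client buying from several business units.

    `purchases` maps a business unit name to a list of yearly turnover
    figures for that client (shape and names are illustrative). Each
    unit's figures are averaged first, then the unit averages are
    averaged, so a unit with a longer history does not dominate.
    """
    per_unit = [sum(years) / len(years) for years in purchases.values() if years]
    return sum(per_unit) / len(per_unit) if per_unit else 0.0
```

Whether to weight by unit, by year, or by volume is precisely the kind of business choice a calculation rule makes explicit and easy to revise.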
SMARTS

SMARTS is our all-in-one low-code platform for data-driven decision-making. It unifies the authoring, testing, deployment, and maintenance of the micro-decisions and micro-calculations described in this article. SMARTS comes in the form of one product with four capabilities:
- A decision management platform that spans the entire life cycle of decisioning from modeling to deployment.
- A low-code/no-code environment in which users express decisions and calculations through point-and-click interactions and web forms.
- An AI & ModelOps environment that covers the full spectrum of ModelOps from importing existing models, to defining new ones, to executing learning tasks.
- A real-time decision analytics environment to manage the quality of decision and calculation performance.
SMARTS has been extensively used for decisions and calculations in finance, insurance, healthcare, retail, IoT, and utility sectors. To learn more about the product or our references, just contact us or request a free trial.
- Data marketplaces promise to change the way data is consumed by businesses. Compared to data warehouses, however, they are more complex to build, and therefore harder to deliver on their promises.
- Decision management technologies simplify data integration, aggregation, and enrichment when building and running data marketplaces. They make decisions and calculations explicit and therefore easy to change whenever situations change.
- SMARTS supplies multiple graphical representations* and engines to make such transformations, integrations, and enrichments easy to design, implement, test, deploy, and change as situations change.
* The following table, tree, and graph show three different representations of the same decision logic so that developers can use one that they are familiar with or that best fits the task at hand.
About

Sparkling Logic is a Silicon Valley company dedicated to helping businesses automate and improve the quality of their operational decisions with a powerful decision management platform, accessible to business analysts and ‘citizen developers.’ Sparkling Logic SMARTS customers include global leaders in financial services, insurance, healthcare, retail, utility, and IoT.
Sparkling Logic SMARTS™ (SMARTS for short) is an all-in-one low-code platform for data-driven decision-making. It unifies authoring, testing, deployment, and maintenance of operational decisions. SMARTS combines the highly scalable Rete-NT inference engine with predictive analytics, machine learning models, and low-code functionality to create intelligent decisioning systems.
Hassan Lâasri is a data strategy consultant, now leading marketing for Sparkling Logic. You can reach him at email@example.com.
In our last two blog posts in this series we discussed decision engine performance and how performance is impacted by deployment architecture choices. In addition to those considerations, you should also focus on business decision performance, the topic of this post.
Central to SMARTS’s approach to decision design is the idea that you need a strong focus on the expected business performance of your decision. The business performance of a decision is measured by multiple KPIs, defined by the different business stakeholders, that characterize how the decision contributes to the business.
Decision Analytics for Simulations
SMARTS provides you with fully integrated decision analytics, including aggregates, reports, and dashboards that you can configure to track those KPIs. As you implement and optimize the decision logic, you can run simulations to assess the impact your changes have on the decision, and take appropriate action. This allows you to ensure that the business performance of your decision is actually what you want before you deploy it in production.
Real-time Decision Metrics
SMARTS also provides you with streaming decision analytics, allowing you to monitor the same KPIs on live decisions as they are deployed, and to specify alerts that trigger if those KPIs deviate from limits you set. This gives you the peace of mind that you are always kept up to date on how well the deployed decision is behaving, and that you can take early action to update it should the situation require it.
There are also cases where it is not possible to know the impact of a change in advance. You may be exploring new decision options you have never attempted before. SMARTS allows you to deploy your decision in an experimental mode, where part of your invocations are routed to the new “experimental” version and the rest to the proven one, and where you monitor the relative performance to identify whether your “experimental” version is doing better than the proven one. In many financial services areas this is called Champion-Challenger; in marketing or design it is called A/B testing. With this approach, you can gradually and safely introduce decision optimizations that lead to better and better business performance.
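The routing idea behind Champion-Challenger can be sketched in a few lines of Python. This is an illustrative skeleton, not SMARTS’s actual mechanism; the function and parameter names are invented for the example, and the `draw` parameter only exists to make the routing testable.

```python
import random

def route_decision(request, champion, challenger, experiment_share=0.1, draw=None):
    """Champion-Challenger routing: send a share of invocations to an
    experimental decision. `champion` and `challenger` are callables
    holding the proven and experimental decision logic (names are
    illustrative). Returns the chosen action plus a variant label so
    downstream KPI monitoring can compare the two variants.
    """
    if draw is None:
        draw = random.random()  # uniform in [0, 1)
    if draw < experiment_share:
        return challenger(request), "challenger"
    return champion(request), "champion"
```

Because every response is tagged with its variant, the KPIs of the challenger can be compared against the champion on live traffic before promoting it.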
In summary, when considering performance of decision management systems it is critical to consider the topic from a business perspective as well as a technical perspective. We hope this series has helped clarify performance related issues pertaining to decision management.
In our last post, we looked at how predictive models are used in automated decisions. A key takeaway from that post is that a prediction is not a decision. Rather, predictive models provide us with key insights based on historical data so we can make more informed decisions.
For example, a predictive model can identify customers that are likely to churn, transactions that are suspicious, and offerings and ads that are likely to have the most appeal. But, based on these predictions, we still need to decide the best response or course of action. A decision combines one or more predictions with business knowledge and expertise to define the appropriate actions.
From Predictions to Decisions
Determining how to take action based on predictions is not trivial. Most likely, there are multiple business options for the actions an organization could take based on a prediction. Consider, for example, charges that are identified as potentially fraudulent: a card issuer could report the case to the fraud team for further investigation, shut down the card to prohibit further charges, or text the cardholder to verify the charge.
Or in the case of customers who have been identified as having a high probability of switching to a competitor, a company may decide to contact them with a special incentive or renewal offer. But the company still needs to decide exactly how many customers will receive the offer and how much to offer. The company could target a flat percentage of customers, or could focus only on those with the highest projected CLTV (customer lifetime value). Targeting too many customers with too large an offer might be too expensive to be worthwhile. The possible actions an organization can take based on predictions have different costs and benefits that need to be evaluated to determine the optimal decision. This is where decision simulation is applicable. Decision simulations help you identify the best decision strategy from among a set of alternatives.
Measure Your Decision Quality with KPIs and Metrics
The “best” decision strategy means the one that most closely meets your organization’s objectives. By defining KPIs and metrics that measure the quality of the decision in relation to these objectives, we have a basis to compare alternative decision approaches. Ideally these KPIs were identified early on, when you first decided to automate the decision.
Decision KPIs give us a clear understanding of how decision performance is related to business performance. They provide the basis for evaluating decision alternatives. To compare alternatives, you can run simulations using historical data. Using simulations you can compare one decision strategy to another, or you can compare how a given strategy performs on each of your customer segments as represented in your data.
Returning to the above customer churn example, we may decide we want to target customers who have an 80% or greater probability of churn based on our predictive model. One option would be to offer them a special 25% discount to attempt to re-engage and keep them as a customer. We can run a simulation against our historical data to learn how many customers fall into this bucket. From there, we can evaluate how much the discount offer would likely cost us. We can run multiple simulations using different thresholds, offers, and combinations until we find the best decision approach to deploy.
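A back-of-the-envelope version of that simulation can be written in Python. The field names, the assumed 30% acceptance rate, and the cost formula are all illustrative assumptions; in practice the acceptance rate would come from past campaigns, and a tool like SMARTS would run this over large historical samples.

```python
def simulate_retention_offer(customers, churn_threshold=0.80, discount=0.25,
                             uptake_rate=0.30):
    """Estimate the size and cost of a discount offer on historical data.

    Each customer is a dict with 'churn_probability' (from the predictive
    model) and 'annual_spend'; both fields are illustrative. The cost is
    the discount applied to the spend of targeted customers, scaled by an
    assumed acceptance rate.
    """
    targeted = [c for c in customers if c["churn_probability"] >= churn_threshold]
    est_cost = sum(c["annual_spend"] * discount * uptake_rate for c in targeted)
    return {"targeted": len(targeted), "estimated_cost": est_cost}
```

Re-running this with different thresholds and discounts is exactly the "multiple simulations" loop described above: each run prices one candidate decision strategy so the alternatives can be compared on the same data.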
Decision Simulation Helps You Evaluate Alternative Decision Strategies
Decision simulations help us evaluate alternative decision strategies to narrow to the best approach. Modern decision management technologies, like SMARTS, make it easy to set up and run these simulations, even on very large data samples. Of course, the ultimate quality of the selected decision approach is related to its success once deployed: how many customers do we manage to retain, and at what cost?
Once we deploy a decision we can monitor and track the KPIs but we have no way of knowing whether customers who did not accept our offer would have instead accepted a different offer. Or whether customers who did accept would also have accepted a 20% rather than a 25% discount. To answer these questions we need to use Champion / Challenger experiments. We’ll cover how Champion / Challenger works with decision management in an upcoming post.
In part 1, we saw that we could use knowledge, experience and intuition to build a model serving as a basis for making decisions. But when historical data is available, we can do more…
When large amounts of historical data are available (and the larger, the better), a predictive model can be built using predictive analytics: this basically uses statistics to comb through the data and find patterns. Such patterns can of course be found more easily when they occur frequently. It can be quite useful to make use of the results of BI (if available) to guide the predictive analytics algorithms so that they find the proper correlations.
When successful, the predictive model, used on new cases, will predict a given outcome –therefore based on past experience. Automation of the decision making, using the predictive model, can be performed by building business rules from that model.
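One common way to turn a scored predictive model into business rules is to put threshold rules on its output. The sketch below assumes a churn model scoring between 0 and 1; the thresholds and action names are invented for illustration and would normally be set by business experts on top of the model.

```python
def churn_action(score):
    """Map a predictive model's churn score (0..1) to a business action
    via simple threshold rules. Thresholds and actions are illustrative;
    they encode business judgment layered on top of the prediction."""
    if score >= 0.8:
        return "offer_retention_discount"
    if score >= 0.5:
        return "schedule_follow_up_call"
    return "no_action"
```

This separation is the point of the paragraph above: the model supplies the score, while the rules, which can be enriched and retuned independently, supply the decision.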
And the resulting business rules can, as usual, be enriched using existing knowledge or future knowledge acquired over time (from human experience, or other predictive analytics “campaigns”).
When the results of predictive analytics are used in a number of simulation scenarios, we end up with a number of possible outcomes, some of them better than others (and here we are talking about business performance).
These simulation scenarios may be run continually, as new historical data becomes available, in order to constantly optimize the predictive models – and also so that they correspond to a more current reality.
The possibility of obtaining a number of possible decisions trying to maximize an expected outcome, all based on historical data (and possibly also on existing knowledge) leads to a real prescription: “something that is suggested as a way to do something or to make something happen” (Merriam-Webster dictionary).
Automatically providing advice on decisions to make to reach a given target is a very appealing and powerful idea: you don’t just rely on “gut feeling” or experience or past knowledge; you rely on all of these, simultaneously. And the suggestions evolve as time passes, allowing quick refocusing.
Making informed decisions
The ability to make decisions based on so many different aspects that evolve over time is already something we, humans, do at our own level (both consciously and unconsciously).
Scaling this up to tactical and strategic levels in the Enterprise requires the use of prescriptive analytics, backed by knowledge, experience, and big data. So that we can have some comfort that we made those decisions based on all that we had at our disposal.
Now, should I eat some Thai food for lunch, or some Japanese food?
We spend our lives, both personal and professional, making decisions, all day long; some without consequences, and some with long-lasting and even perhaps game-changing ones.
Should I eat some Thai food for lunch, or some Japanese food?
Do we make targeted offers to customers that have been with us for more than 2 years, or to those that have been with us for more than 5?
How do we reduce the time it takes us to fix defective devices?
Although sometimes not making a decision is worse than making the wrong one, we all strive to make the best decisions possible. And to make the best decisions, we rely on experience and whatever information is at hand. With experience in the subject matter, decisions can be made very quickly; when the matter is new or information is scarce, we usually require more time to evaluate a number of possibilities, to make a few computations, to balance the pros and cons.
All this is part of our daily lives. But when a large number of decisions need to be made in a short amount of time, or when the data available to us is limited, or on the other hand enormous, automation can come to the rescue. But how can we make informed decisions at a large scale?
Last week we jointly hosted a webinar with our consulting and implementation partner, Mariner. Shash Hedge, BI Solution Specialist from Mariner, described operational BI, its challenges, and some traditional and recent implementation approaches. He concluded with a few case studies of operational BI projects that were missing an important piece — the ability to make decisions based on the operational insight provided by the system.
Operational BI systems provide critical insight on business operations and enable your front-line workers to make more informed decisions. But as Shash highlighted, insight delivered in the right format, to the right people, at the right time is often not enough; you need to make decisions based on that insight in order to take action…
I led the second half of the webinar, introducing decision management and describing how it complements operational BI. Watch the recorded webinar to learn more.
The recording is a bit rough when the video gets to my part; it sounds like I am presenting from another country! We’re planning another joint webinar in May where we will cover the topic in more depth and demonstrate how these two technologies complement each other. Stay tuned for dates and registration information. I’m sure we’ll get the sound issues resolved next time!
As is customary, let me share what I foresee as being big this new year… I would like to focus on just three points that strike me as important, in no particular order.
1. Predictive Analytics
Well, of course, we have been seeing that trend develop for a while. This is certainly not a surprising entry in this list.
The fact is that we see more and more projects combining predictive analytics and business rules. What is really interesting to me is the fact that more and more business analysts are getting trained to develop some of these predictive models.
Given the data scientist shortage, it makes total sense. If you do not have a modeling team in-house, or if it is swamped with high-priority projects, you may as well look for other ways to leverage the available data to inform your decisions.
I am optimistic that we will see more business analysts add predictive analytics to their skill set.
2. Business Intelligence
Sticking with analytics at large, I see also a greater synergy between business intelligence and business rules. We have talked about ‘Operational BI’ for a while now, but there seems to be a lot of activity finally taking shape.
I believe that there will be more projects that actually combine both in 2014, allowing companies to act on the insight gained from monitoring historical trends.
3. Internet of things
When I was still in my early years, we dreamed of ‘intelligent’ equipment, cars, and other things that would make our lives easier. While embedding computers in all things around the house was cost-prohibitive for the mass market back then, the Cloud is now making it a reality.
The beauty of having ‘things’ that can communicate is that they are immediately candidates for ‘higher intelligence’. By hooking them up with a decision service in the cloud, we can seamlessly allow them to react more appropriately and subtly to the signals they sense around them. They can adapt better, since changing their behavior does not involve any hardware changes, or more generally any changes in situ. The intelligence is located in the cloud, readily available to all connected things.
I am totally in awe of the progress we have made thus far, and of the potential for a global ‘increase of intelligence’ of the things around us. The future is now!
How do we prioritize our project portfolio according to our business objectives?
Why are our customers buying mountain bikes and not city bikes?
Couldn’t we try to sell more of those top-brand bagels instead of regular sandwiches?
What is the best time to alert our customers about our deals on Black Friday to maximize our profits?
When should I buy my flight tickets to Europe to get the best deal?
From the largest company to the individual, we all naturally strive to maximize, optimize, get better returns, reduce turnover, pay less…