Analytics

When They’re Searching for the Next Big Thing


How to Keep and Delight Fickle Customers in Competitive Insurance and Fintech Markets

Are your customers fickle? How well do you anticipate their needs, proactively offer packages at competitive prices, and react to regulatory and competitive changes before they leave you?

Today’s banks, insurance companies, and financial firms operate in a fast-moving, highly competitive, and rapidly changing market. Disruption is everywhere, and customers have choices they can make in an instant from their smartphones. Losing a customer to a more nimble competitor can take no longer than a cup of coffee and a few finger swipes on a Starbucks patio.

Particularly in the insurance market, customer interactions are precious and few. An insurance rep needs not only to delight the customer when an opportunity arises, but also to upsell them by offering a personalized product or service tailored to their needs, virtually instantly.

Doing the same thing as before is a certain way to lose business

Nimble competitors now use the latest AI and analytics technology to rapidly discover and deploy intelligent decision systems that can instantly predict customer needs and tailor the offering and pricing to the right customer at the right time.

To achieve and sustain such flexibility, a financial organization needs to modernize its underlying systems. The best companies build living decision intelligence into their systems, ready to be updated on a moment’s notice.

If a competitor offers a better deal, a customer has a life event, or data analysts discover a new risk or fraud pattern, core systems need to be updated virtually instantly. With an intelligent, AI-driven central decision management system at the heart of your core systems, everyone in your organization has the latest intelligence at their fingertips. Intelligent systems help verify customer eligibility, provide a custom product or bundle offering at a competitive price, speed up and automate claims adjudication, and automate loan origination across all sales and support channels.

The heart of this solution is a modern, AI-driven decision management and rules engine platform that uses the latest AI and analytics techniques and offers sophisticated cloud deployment for unparalleled flexibility and speed. The best systems are no longer just for IT – they allow business analysts to view, discover, test, and deploy updated business logic in real time.

A modern organization needs the latest decision analytics tools

These tools let you discover new patterns in historical data using machine learning, connect and cross-correlate multiple data sources, and incorporate predictive models from the company’s data analysts. Updating and deploying new logic is now as easy as publishing a web page and does not require changing the application itself, just the underlying business logic.

Sparkling Logic SMARTS AI Decision Management is the third and newest generation of decision management and rules engine offerings, built on cloud, AI, decision analytics, predictive models, and machine learning. We currently process millions of rules and billions of records for the most progressive companies. Find out how we created the sophisticated set of decision management and decision analytics tools that every modern financial institution should have in its competitive tool chest.

Analytics-Driven Automated Decisions

SMARTS Decision Manager White Paper

Automated decisions are at the heart of your processes and systems. These operational decisions provide the foundation for your digital business initiatives and are critical to the success and profitability of your business. Learn how SMARTS Decision Manager lets you define agile, targeted, and optimal decisions and deploy them to highly available, efficient, and scalable decision services.
Get the Whitepaper

The Convergence of Data Analysts and Business Analysts


Decision Management has been a discipline for Business Analysts for decades now.  Data Scientists have historically been avid users of analytic workbenches.  The divide between these two crowds has been crossed by sending predictive model specifications across, from the latter group to the former.  These specifications could be on paper, stating the formula to implement, or in an electronic format that could be seamlessly imported.  This is why PMML (Predictive Model Markup Language) has proven to be a useful standard in our industry.

The fact is that the divide that was artificially created between these two groups is not as deep as we originally thought.  There have been reasons to cross the divide, and both groups have seen significant benefits in doing so.

In this post, I will highlight a couple of use cases that illustrate my point.

Read more…

The Role of Data in Decision Management


In the early days, we were very focused on knowledge.  Figuring out how to extract, capture, and model knowledge biased our approach to business rules, the same way we can get obsessed with nails once we have a hammer in hand.

I am not saying that knowledge isn’t important or valuable of course.

The point I want to make is that knowledge in the abstract isn’t as valuable as knowledge paired with data.  Data is the lifeblood of decision management.  I came to realize that a few years ago, once I finally took a step back to rethink where we were as an industry.  It is ironic that, working for an analytics company back then, we did not see it sooner.

Read more…

From Business Intelligence to Better Decisions


Last week we jointly hosted a webinar with our consulting and implementation partner, Mariner.  Shash Hedge, BI Solution Specialist from Mariner, described operational BI, its challenges, and some traditional and recent implementation approaches.  He concluded with a few case studies of operational BI projects that were missing an important piece — the ability to make decisions based on the operational insight provided by the system.

Operational BI systems provide critical insight on business operations and enable your front-line workers to make more informed decisions.  But as Shash highlighted, insight delivered in the right format, to the right people, at the right time is often not enough; you need to make decisions based on that insight in order to take action…

I led the second half of the webinar, introducing decision management and describing how it complements operational BI.  Watch the recorded webinar to learn more.

The recording is a bit rough when the video gets to my part; it sounds like I am presenting from another country!  We’re planning another joint webinar in May where we will cover the topic in more depth and demonstrate how these two technologies complement each other.  Stay tuned for dates and registration information.  I’m sure we’ll get the sound issues resolved next time!

Hot Tech Trends for Machine Learning


I joined the Churchill Club this morning for an exciting breakfast on Machine Learning.  In May 2013, Steve Jurvetson of DFJ said on the Churchill Club stage that he believes machine learning will be one of the most important tech trends over the next 3-5 years for innovation and economic growth.  I was eager to hear what Peter Norvig and the other guys would say about that.

No surprise

What might be surprising is that none of them painted an ‘unfathomable’ picture of the future.  It was all about more power, faster modeling, more data…

Vision?

I can’t say that they shared a vision…  I wonder if we have all been dreaming since our younger years, watching Star Trek, as super-computers fueled our imagination.  A super-smart machine able to assist the crew, and eventually practice medicine or search for its ‘humanity’, is the vision.  We are all working hard at figuring out ways to make it real, ways to build technology that achieves the ideals we grew up dreaming about.

It has been a rocky road for Artificial Intelligence, but in the past few years, Watson, the self-driving car, and other wonders have made us believe that machine learning could actually live up to our expectations, and more.

Read more…

5 Things You Can Do to Improve Your Decisions


Automating decisions has its own Return on Investment (ROI), but it is only the very beginning of a Decision Management transformation.  The end goal should be to improve your decisions.  Having the underlying decision logic out of the system code gives you opportunities to analyze, understand, and experiment, which was not really possible before.

Don’t wait

It used to take time to ensure that your decision logic did what it was supposed to.  Business Rules had to be implemented as code, then compiled, then tested in QA, then deployed, then eventually executed for real, and the resulting Production reports would tell you if there was a problem.

The sooner you actually see the effect of those business rules on your production data, the sooner you can correct the course of action, or feel safe about pushing those rules into Production.

What you can do… is operationalize the gathering of data from your Production systems, and feed it into your Decision Management system to see how your business rules will be applied.
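
As a minimal sketch of that idea (the file name, record fields, and decide() logic below are all invented for illustration, not tied to any particular product), replaying a production extract through your decision logic could look like this:

    import csv

    def decide(record):
        """Hypothetical decision logic: approve, refer, or decline an application."""
        if float(record["risk_score"]) > 700 and float(record["debt_ratio"]) < 0.4:
            return "Approve"
        if float(record["risk_score"]) > 600:
            return "Refer"
        return "Decline"

    # Replay a production extract through the current decision logic
    # and keep the outcomes for later analysis.
    with open("production_extract.csv", newline="") as f:
        outcomes = [(row["application_id"], decide(row)) for row in csv.DictReader(f)]

    for app_id, outcome in outcomes[:5]:
        print(app_id, outcome)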

Measure early, measure often

Testing is good, and allows you to reduce the typos and logic errors in your business rules.  You can see the raw impact of your decision logic before it gets out the door.  What will make a key difference on your bottom line, though, is how this new or updated decision logic will behave in aggregate.

What you can do… is measure Key Performance Indicators (KPIs) in your systems.  KPIs are aggregated metrics that measure your business performance, your success.  For example, you might care about the distribution of Approve, Decline, and Refer decisions.  But that data point alone is not sufficient: you want to make sure you decline the bad risk and keep the good risk, while at the same time not undercutting your revenue.  In the fraud case study I presented with eBay, we had a different set of key metrics that were critical to the fraud expert: catch rate and hit rate.  Whatever those KPIs might be for your business, make sure you define them carefully and continuously measure the progress you make towards them.
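
To make the KPI idea concrete, here is a small sketch; the records are made up, and the readings of catch rate (share of actual fraud that was flagged) and hit rate (share of flagged cases that were actually fraud) are one common interpretation, not a description of any specific product:

    from collections import Counter

    # Hypothetical decision records: (decision, actually_fraud) pairs.
    records = [
        ("Decline", True), ("Approve", False), ("Refer", True),
        ("Approve", False), ("Decline", False), ("Approve", True),
    ]

    # Distribution of Approve / Decline / Refer decisions.
    distribution = Counter(decision for decision, _ in records)

    # Flagged = Decline or Refer; fraud_total = all actual fraud cases.
    flagged = [actual for decision, actual in records if decision in ("Decline", "Refer")]
    fraud_total = sum(actual for _, actual in records)

    catch_rate = sum(flagged) / fraud_total if fraud_total else 0.0
    hit_rate = sum(flagged) / len(flagged) if flagged else 0.0

    print(distribution, f"catch rate={catch_rate:.2f}", f"hit rate={hit_rate:.2f}")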

Look for more

There is what you see in the KPI reports, and there is what you don’t see…  Why not get extra help from the processing power of analytics?  It will likely not know better than you, but it can uncover patterns in your historical data that you can refine and operationalize.

What you can do… is crunch your data with analytic algorithms that help you ‘mine your business rules’.  Once you obtain the data-driven rules, massage them with your business expertise before you operationalize them.
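
As one generic illustration of what ‘mining your business rules’ can look like (this sketch uses scikit-learn on invented data and is not tied to any particular decision management tool), a decision tree trained on historical outcomes can be read back as candidate rules:

    from sklearn.tree import DecisionTreeClassifier, export_text

    # Hypothetical historical data: [amount, customer_age, prior_claims] -> paid/denied.
    X = [[1200, 34, 0], [300, 51, 2], [5000, 23, 1], [800, 45, 0], [4200, 29, 3], [150, 60, 0]]
    y = ["paid", "paid", "denied", "paid", "denied", "paid"]

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    # The printed tree reads as nested if/then conditions: candidate rules
    # for the business expert to review and massage before deployment.
    print(export_text(tree, feature_names=["amount", "customer_age", "prior_claims"]))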

Be creative

There might be more than one way to improve your bottom line.  If you are implementing compliance rules, then you may not have that many options to experiment on; but if you are looking to improve your profitability, you might have to try things for real across multiple segments to see for yourself which one is most effective.

What you can do… is start by setting up your simulation environment to run those business-rules comparisons at large scale.  It will give you a more realistic idea of your KPIs based on a larger sample.  The next step is to start experimenting live.  Marketers have done A/B testing for a long time.  In the Decision Management space, we call that experimental design, or champion/challenger.
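
A minimal champion/challenger sketch (the rule functions and the 10% traffic split below are placeholders) might route decisions like this:

    import random

    def champion(record):
        # Current production decision logic (placeholder).
        return "Approve" if record["score"] >= 650 else "Decline"

    def challenger(record):
        # Candidate logic tested on a small slice of traffic (placeholder).
        return "Approve" if record["score"] >= 620 and record["debt_ratio"] < 0.45 else "Decline"

    def route(record, challenger_share=0.1):
        """Send a small, random share of traffic to the challenger; the rest to the champion."""
        strategy = challenger if random.random() < challenger_share else champion
        return strategy.__name__, strategy(record)

    record = {"score": 640, "debt_ratio": 0.30}
    print(route(record))

Comparing the KPIs of the two branches over time tells you whether the challenger should be promoted.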

Challenge yourself

You are the business expert, right?  But how well do you know the contribution of your rules?  The decisions we make in life do not always pan out exactly how we expected them to, sometimes for the better and sometimes not…  It is the same for your business decisions.  They generally work the way you expect, but there can be surprises.

You could measure Key Performance Indicators (KPIs) in your systems and look at the reports on a regular basis.  We recommend that, of course.

What you could also do, which is even more powerful and a greater learning experience… is take the time to anticipate where you expect those KPIs to land and where you would like to take them.  With clear objectives in mind, you will be more attuned to the outcome and, as a result, more effective at moving those KPIs.

If you are interested in this topic and would like to hear practical illustrations of these techniques, please join us for a webinar on 9/19 or 10/3 on “Decisions by the Numbers”.

Data versus Expertise Dilemma


In the decade (or two) I have spent in Decision Management, and Artificial Intelligence at large, I have seen first-hand the war raging between knowledge engineers and data scientists, each defending its own approach to supporting ultimately better decisions.  So what is more valuable?  Insight from data?  Or knowledge from the expert?

Mike Loukides wrote a fantastic article on the O’Reilly Radar called “The unreasonable necessity of subject experts”, which illustrates this point very well and provides a clear picture of why and how we would want both.

Data knows stuff that experts don’t

In the world of uncertainty that surrounds us, experts can’t compete with the sophisticated algorithms we have refined over the years.  Their computational capabilities go way above and beyond the ability of the human brain.  Algorithms can crunch data in relatively little time and uncover correlations we did not suspect.

Adding to Mike’s numerous examples, the typical diaper-shopping use case comes to mind.  Retail transaction analysis uncovered that buyers of diapers at night were very likely to buy beer as well.  The rationale is that husbands help the new mom with shopping when diapers run low at the most inconvenient time of the day: inevitably at night.  The new dad wandering the grocery store at night ends up getting “his” own supplies: beer.
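
For intuition, the kind of correlation behind that story boils down to support, confidence, and lift over market baskets; the baskets and numbers below are invented:

    # Hypothetical market baskets; the diapers/beer figures are made up.
    baskets = [
        {"diapers", "beer", "wipes"}, {"diapers", "beer"}, {"beer", "chips"},
        {"diapers", "formula"}, {"milk", "bread"}, {"diapers", "beer", "milk"},
    ]

    n = len(baskets)
    p_diapers = sum("diapers" in b for b in baskets) / n
    p_beer = sum("beer" in b for b in baskets) / n
    p_both = sum({"diapers", "beer"} <= b for b in baskets) / n

    confidence = p_both / p_diapers   # P(beer | diapers)
    lift = confidence / p_beer        # > 1 means diapers make beer more likely

    print(f"support={p_both:.2f} confidence={confidence:.2f} lift={lift:.2f}")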

Mike warns against the pitfalls of data preparation.  A hidden bias can surface in a big way in data samples, whether it over-emphasizes some trends or cleans up traces of unwanted behavior.  If your data is not clean and unbiased, the value of the data insight becomes doubtful.  Skilled data scientists work hard to remove as much bias as they can from the data sample they work on, uncovering valuable correlations.

Data knows too much?

When algorithms find expected correlations, like Mike’s example of pregnant women being interested in baby products, analytics can validate intuition and confirm facts we already knew.

When algorithms find unexpected correlations, things become interesting!  With insight that is “not so obvious”, you have an advantage: you can deliver more targeted marketing messages.  Marketing campaigns can yield much better results than “shooting darts in the dark”.

Mike raises an important set of issues: Can we trust the correlation?  How do we interpret the correlation?

Mike’s article includes many more examples.  There are tons of football statistics that we smile about during the Super Bowl.  Business Insider posted some even more incredible examples such as:

  • People who dislike licorice are more likely to understand HTML
  • People who like scooped ice cream are more likely to enjoy roller coasters than those that prefer soft serve ice cream
  • People who have never ridden a motorcycle are less likely to be multilingual
  • People who can’t type without looking at the keyboard are more likely to prefer thin-crust pizza to deep-dish

There may be some interesting tidbits of insight in there that you could leverage.  But unless you *understand* the correlation, you may be misled by your data and draw some premature conclusions.

Experts shine at understanding

Mike makes a compelling argument that the role of the expert is to interpret the data insight and sort through the red herrings.

This illustrates very well what we have seen in the Decision Management industry with the increased interplay between the “factual” insight and the “logic” that leverages that insight.  Capturing expert-driven business rules is a good thing.  Extracting data insight is a good thing.  But the real value is in combining them.  I think the interplay is much more intimate than purely throwing the insight on the other side of the fence.  You need to ask the right questions as you are building your decisioning logic, and use the available data samples to infer, validate or refine your assumptions.

As Mike concludes, the value resides in the conversation that experts raise on top of the data.  Being able to bring those conversations to light, and to enable further ones, is how we will be able to understand and improve our systems.

DecisionStats – Predictive Models Ain’t Easy to Deploy


One of my articles was published on the DecisionStats blog.  Thanks, Ajay!  You can read it there.

This article highlights the main issues that Decision Management practitioners are facing when deploying Predictive Models with their Business Rules.

For your convenience, here it is:

Decision Management is about combining predictive models and business rules to automate decisions for your business. Insurance underwriting, loan origination or workout, and claims processing are all very good use cases for that discipline… But there is a hiccup… It ain’t as easy as you would expect…

What’s easy?

If you have a neat model, most tools will let you export it as a PMML model – PMML stands for Predictive Model Markup Language and is a standard XML representation for predictive model formulas. Many model development tools let you export it without much effort. Many BRMS – Business Rules Management Systems – let you import it. Ta-da… The model is ready for deployment.
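
Because PMML is plain XML, you can peek at what a model file actually carries with nothing more than the Python standard library; the file name, field layout, and PMML version in this sketch are assumptions:

    import xml.etree.ElementTree as ET

    # PMML elements live in a versioned namespace; 4.x is common, but match your file.
    NS = {"pmml": "http://www.dmg.org/PMML-4_3"}

    root = ET.parse("credit_score_model.pmml").getroot()

    # List the input fields the model formula expects -- the variables that must
    # exist (or be derived) in the operational object model at decision time.
    for field in root.findall("./pmml:DataDictionary/pmml:DataField", NS):
        print(field.get("name"), field.get("dataType"))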

What’s hard?

The problem that we keep seeing over and over in the industry is the issue around variables.

Those neat predictive models are formulas based on variables that may or may not exist as-is in your object model. When the variable is itself a formula over the object model, like the min, max, or sum of the dollar amount spent on groceries in the past 3 months, and the object model carries the transaction details, such that you can compute it by iterating through those transactions, then the problem is not “that” big. PMML 4 introduced some support for those derived variables.
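
As a concrete, hypothetical example of such a derived variable, computed by iterating through transaction details:

    from datetime import date, timedelta

    # Hypothetical transaction records from the operational object model.
    transactions = [
        {"category": "Groceries", "amount": 82.15, "date": date(2018, 5, 3)},
        {"category": "Fuel",      "amount": 40.00, "date": date(2018, 5, 9)},
        {"category": "Groceries", "amount": 57.60, "date": date(2018, 6, 21)},
        {"category": "Groceries", "amount": 19.99, "date": date(2018, 2, 1)},  # outside window
    ]

    def grocery_spend_last_3_months(transactions, as_of, days=90):
        """Sum of grocery dollar amounts in the trailing window -- the kind of
        derived variable a predictive model formula may expect as an input."""
        cutoff = as_of - timedelta(days=days)
        return sum(t["amount"] for t in transactions
                   if t["category"] == "Groceries" and cutoff <= t["date"] <= as_of)

    print(grocery_spend_last_3_months(transactions, as_of=date(2018, 7, 1)))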

The issue that is not easy to fix, and yet quite frequent, is when the model development data model does not resemble the operational one. Your Data Warehouse very likely flattened the object model, and pre-computed some aggregations that make the mapping very hard to restore.

It is clearly not an impossible project, as many organizations do it today. It comes with significant overhead, though, forcing modelers to involve IT resources to extract the right data for the model to be operationalized. It is a heavy process that is well justified for heavy-duty models developed over a long period of time, with a significant ROI.

This is a show-stopper, though, for other initiatives that do not have the same ROI, or that would require model refreshes too frequent to be viable. Here, I refer to a “real” model refresh that involves model reengineering, not just a re-weighting of the same variables.

For those initiatives where time is of the essence, the challenge will be to bring those two worlds, the modelers and the business rules experts, closer together, in order to streamline the development AND deployment of analytics beyond the model formula. The great opportunity I see is the potential for better, coordinated tuning of the cut-off rules in the context of model refinement. In other words: the opportunity to refine the strategy as a whole. Very ambitious? I don’t think so.
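
As a small sketch of what coordinated cut-off tuning can look like (the scores, labels, and cost assumptions below are invented), you can sweep candidate cut-offs against scored historical cases and keep the one that maximizes your business objective:

    # Hypothetical scored cases: (model_score, actually_bad) pairs.
    cases = [(0.92, True), (0.80, True), (0.65, False), (0.55, True),
             (0.40, False), (0.30, False), (0.22, False), (0.10, False)]

    def expected_value(cutoff, cases, loss_if_missed=500.0, margin_if_kept=50.0):
        """Value of declining everything at or above the cut-off: margin earned on
        accepted good cases, losses on accepted bad cases."""
        value = 0.0
        for score, bad in cases:
            if score < cutoff:                    # accepted
                value += -loss_if_missed if bad else margin_if_kept
        return value

    # Sweep candidate cut-offs and keep the best one.
    best = max((expected_value(c, cases), c) for c in [0.3, 0.5, 0.7, 0.9])
    print(f"best cut-off={best[1]} expected value={best[0]:.0f}")

Refreshing the model and re-running this kind of sweep together is one way the strategy can be refined as a whole.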

