
SMARTS Real-Time Decision Analytics


In this post, I briefly introduce the SMARTS Real-Time Decision Analytics capability, which manages the quality and performance of operational decisions from development, to testing, to production.

Decision performance

H. James Harrington, one of the pioneers of decision performance measurement, once said, “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” This statement is also true for decision performance.

Measuring decision performance is essential in any industry where a small improvement in a single decision can make a big difference, especially in risk-driven industries such as banking, insurance, and healthcare. Improving decisions in these sectors means continuously adjusting policies, rules, prices, etc. to keep them consistent with business strategy and compliant with regulations.

Decision performance management in SMARTS

SMARTS helps organizations make their operational decisions explicit, so that they can be tested and simulated before implementation — thereby reducing errors and bias. To this end, we added a real-time decision analytics capability to the core decision management platform.

Currently used in financial and insurance services, it helps both business analysts and business users define dashboards, assess alternative decision strategies, and measure decision quality and performance at every stage of the decision management lifecycle — all in the same interface, without switching from one tool to another.

Development. From the start, SMARTS focuses the decision automation effort on tangible business objectives, measured by Key Performance Indicators (KPIs). Analysts and users can define multiple KPIs through graphical interactions and simple yet powerful formulas. As they capture their decision logic, simply dragging and dropping any attribute into the dashboard pane automatically creates reports. They can customize these distributions, aggregations, and rule metrics, as well as the charts that display the results in the dashboard.

Testing and validation. During the testing phase, analysts and users have access to SMARTS’ built-in map-reduce-based simulation environment to measure these metrics against large samples of data. In doing so, they can estimate the KPIs for impact analysis before actual deployment. None of this testing work requires IT to code the metrics, because SMARTS translates them transparently.
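
To make the map-reduce idea concrete, here is a minimal sketch in plain Python of how a KPI such as an approval rate can be estimated over a large sample in parallel chunks. This is an illustrative stand-in, not SMARTS code; the decision function, data, and KPI are all hypothetical.

    from functools import reduce
    from multiprocessing import Pool

    def decide(applicant: dict) -> bool:
        # Toy decision rule standing in for a deployed decision
        return applicant["income"] >= 3 * applicant["payment"]

    def map_chunk(chunk: list) -> tuple:
        # Map step: each worker returns a partial (approved, total) count
        approved = sum(decide(a) for a in chunk)
        return approved, len(chunk)

    def combine(x: tuple, y: tuple) -> tuple:
        # Reduce step: merge the partial counts
        return x[0] + y[0], x[1] + y[1]

    if __name__ == "__main__":
        sample = [{"income": 9_000 + i, "payment": 3_000} for i in range(-5, 5)]
        with Pool(2) as pool:
            approved, total = reduce(combine, pool.map(map_chunk, [sample[:5], sample[5:]]))
        print(f"estimated approval-rate KPI: {approved / total:.0%}")  # -> 50%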

Execution. By defining a time window for these metrics, business analysts can deploy them seamlessly against production traffic. Real-time decision analytics charts display the measurements and trigger notifications and alerts when certain thresholds are crossed or certain patterns are detected. Notifications can be pushed by email or can generate a ticket in a corporate management system. Real-time monitoring also allows organizations to react quickly when conditions suddenly change. For example, under-performing strategies can be eliminated and replaced when running a Champion/Challenger experiment.
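
The windowed-metric idea can be sketched in a few lines. The following simplified illustration monitors a decline rate over a sliding time window and raises an alert past a threshold; the threshold value and the alert channel are assumptions, and SMARTS provides this behavior natively.

    from collections import deque
    import time

    WINDOW_SECONDS = 60.0
    DECLINE_RATE_ALERT = 0.30          # assumed threshold: alert above 30% declines

    events = deque()                   # (timestamp, was_declined) pairs

    def record(was_declined: bool) -> None:
        now = time.time()
        events.append((now, was_declined))
        while events and events[0][0] < now - WINDOW_SECONDS:
            events.popleft()           # evict events that fell out of the window
        decline_rate = sum(d for _, d in events) / len(events)
        if decline_rate > DECLINE_RATE_ALERT:
            print("ALERT: decline rate above threshold")  # e.g., email or ticket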

Use cases

Insurance underwriting. Using insurance underwriting as an example, a risk analyst can look at the applicants that were approved by the rules in production and compare them to the applicants that would be approved using the rules under development. Analyzing the differences between the two sets of results drives the discovery of which rules are missing or need to be adjusted to produce better results or mitigate certain risks.

For example, he or she might discover that 25% of the differences in approval status are due to differences in risk level. This insight leads the risk analyst to focus on adding and/or modifying risk-related rules. Repeating this analyze-improve cycle reduces the time needed to consider and test different rules until he or she reaches the best tradeoff between results and risks.
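
The compare-and-analyze step can be pictured as follows. This sketch uses hypothetical rule functions and toy data; it simply runs the production rules and the candidate rules over the same applicants and measures what share of the differing decisions is driven by risk level.

    def production_rules(app: dict) -> bool:
        return app["risk"] < 70                                # current rule (hypothetical)

    def candidate_rules(app: dict) -> bool:
        return app["risk"] < 60 and app["income"] > 30_000     # rules under development

    applicants = [{"risk": 50 + i, "income": 25_000 + 1_000 * i} for i in range(30)]

    diffs = [a for a in applicants if production_rules(a) != candidate_rules(a)]
    risk_driven = [a for a in diffs if a["risk"] >= 60]        # blocked by the new risk cut-off
    print(f"{len(diffs)} decisions differ; "
          f"{len(risk_driven) / len(diffs):.0%} of the differences are risk-driven")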

Fraud detection. Another example, from a real customer case, is flash fraud, where decisions had to be changed and new ones rolled out in real time. Here, the real-time decision analytics capability of SMARTS was essential: the customer could spot trends deviating from the normal situation directly in the dashboard and counter the fraud flood in the same user interface, all in real time.

Without this built-in capability, the time lag between identifying the fraud and implementing corrective actions would have been long, resulting in significant financial losses. In fact, with SMARTS Real-Time Decision Analytics, fraud response time for this client went from 15 days to 1 day.

Marketing campaign. The two examples above come from financial services, but SMARTS real-time decision analytics helps in any context where decision performance can be immediately affected by a change in data, models, or business rules, such as loan origination, product pricing, or marketing promotion.

In the latter case, SMARTS can help optimize promotions in real time. Let’s say you construct a series of rules for a marketing coupon campaign using SMARTS Champion/Challenger capability. Based on the rules you define, certain customers get a discount. Some get 15% off (the current offering — the champion), while others get 20% (a test offering — the challenger). You wonder whether the extra 5% discount leads to more coupons used and more sales generated. With the SMARTS real-time decision analytics environment, you find out the answer as the day progresses. By testing alternatives, you converge on the best coupon strategy with real data, on the fly.
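
A stripped-down sketch of the bookkeeping behind such an experiment might look like this (the variant names and the stream of events are illustrative); SMARTS would chart these redemption rates live in the dashboard as the day progresses.

    from collections import defaultdict

    stats = defaultdict(lambda: {"offered": 0, "redeemed": 0})

    def record(variant: str, redeemed: bool) -> None:
        stats[variant]["offered"] += 1
        stats[variant]["redeemed"] += redeemed

    # Events as they stream in during the day (toy data)
    for variant, redeemed in [("champion_15", True), ("champion_15", False),
                              ("challenger_20", True), ("challenger_20", True)]:
        record(variant, redeemed)

    for variant, s in stats.items():
        print(variant, f"{s['redeemed'] / s['offered']:.0%} redemption rate")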

Conclusion

As part of the decision lifecycle, business analysts obviously start by authoring their decision logic. As they progress, testing rapidly comes to the forefront. To this end, SMARTS integrates predictive data analytics with real-time decision analytics, enabling business analysts and business users to define dashboards and seamlessly associate metrics with the execution environment — using the same tool, the same interface, and simple point-and-click interactions.

Takeaways

  • SMARTS comes with built-in decision analytics — no additional or third-party tool is required
  • You can define metrics on decision results so you can measure and understand how each decision contributes to your organization’s business objectives
  • Decision metrics enable you to assess alternative decision strategies to see which should be kept and which rejected
  • SMARTS add-on for real-time decision analytics lets you monitor the decisions being made and make adjustments on the fly
  • SMARTS’ real-time decision analytics helps in any context where decision performance could be immediately affected by a change in data, models, or business rules

About

Sparkling Logic is a decision management company founded in the San Francisco Bay Area to accelerate how companies leverage data, machine learning, and business rules to automate and improve the quality of enterprise-level decisions.

Sparkling Logic SMARTS is an end-to-end, low-code/no-code decision management platform that spans the entire business decision lifecycle, from data import to decision modeling to application production.

Carlos Serrano is Co-Founder, Chief Technology Officer at Sparkling Logic. You can reach him at cserrano@sparklinglogic.com.

IBM expanding its footprint in Risk Management


Well, it feels like we should devote a complete section of this blog to IBM and its acquisitions: over $14 billion in acquisitions in the last 5 years, many of them in technologies and expertise squarely in, or related to, the field of Decision Management.

Over the past few days, IBM has announced its acquisition of UK-based I2, and of Canada-based Algorithmics. The official press releases – of course, similar in form – provide some insight on what motivated IBM to invest in these acquisitions: “accelerate big data analytics to transform big cities” for I2, and “accelerate business analytics into financial risk management” for Algorithmics. The terms of these acquisitions – not disclosed for I2 but believed to be in the $500M range, and in the $380M range for Algorithmics – are not enough to make them mega deals, but they position IBM squarely as a major provider of risk management solutions for multiple industries, in particular financial services and defense/security.

Both companies leverage data, and increasingly what is now called big data with its volume + variety + complexity + velocity challenge, through sophisticated analytics to support automated and human decision making by assessing and qualifying risk.

A lot of noise has been made about these two acquisitions increasing IBM’s presence in the big data analytics world. While that is correct at the technology level, I believe that they also do something else: they help make IBM a key provider of core enterprise risk management solutions.

I2 is not a well-known company outside the security, fraud, and crime risk management spaces. Of course, it has never helped that a much better-known supply chain management company had the same name… I2’s products allow organizations to track vast amounts of data, organize it, and search through it in order to identify patterns that may be indicative of terrorist, criminal, or fraudulent behavior. A number of techniques are used, but a big claim to fame for I2 is its leading position in the link analysis space. Link analysis, sometimes referred to as network analysis, and made popular in a particular form by social network analysis, identifies relevant relationships between entities, qualifies them (for example in terms of “betweenness”, “closeness”, etc.), and lets users navigate them across multiple dimensions, including time, leading to the recognition of patterns of entities and non-obvious relationships indicative of potential issues. The analysis is carried out on large sets of seemingly disparate data: transaction data, structured and semi-structured documents, phone records, email trails, IP data, etc. Its products, for example Analyst’s Notebook, have received great reviews.
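
For readers unfamiliar with link analysis, here is a minimal sketch using the open-source NetworkX library, a stand-in for illustration and not I2’s technology: entities become nodes, shared attributes become edges, and centrality scores flag brokers and hubs.

    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("acct_A", "phone_1"), ("acct_B", "phone_1"),   # accounts sharing a phone number
        ("acct_B", "addr_9"),  ("acct_C", "addr_9"),    # accounts sharing an address
    ])
    print(nx.betweenness_centrality(G))  # high score: entity brokering between clusters
    print(nx.closeness_centrality(G))    # high score: entity close to everything else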

I2 brings to the table not just the risk management products and expertise that have made the company famous in that space, but also solid expertise in the management of big data. IBM has made acquisitions in this space – Netezza a year ago, in September 2010, and NISC a little earlier – and I2 brings complementary solutions and expertise to the company.

Algorithmics is also well known in its space and little known outside of it. It specializes in measuring and managing the risk of financial investments. Up to now it has been part of the French holding company Fimalac, which also happens to own the Fitch Ratings agency, which issues credit ratings to both commercial companies and governments – I would expect them to use Algorithmics’ capabilities. The company was created in 1989, and its initial charter – to create solutions that characterize and manage the financial risk of investments – addressed some of the risk issues faced during the 1987 stock market crash. We will not elaborate on why similar risk management and rating issues remain at the forefront of preoccupations in the financial and political worlds…

Risk management is a fairly fragmented space, with specialized solutions focusing on different types of risks. In the short term, it is possible that IBM will not immediately compete with some of its partners in the risk management space, such as Fair Isaac (disclosure: I used to work there). However, risk management is becoming much more of a global enterprise affair than it used to be – the sources of risk are becoming multi-faceted, delivered through multiple channels, touching multiple processes at once. Customers are looking for, and assembling themselves, enterprise risk management solutions. This trend makes IBM’s acquisitions in this space well thought out, positioning the company at the core of these solutions, and I am certain that IBM will displace or acquire niche risk management vendors as its footprint in the space continues to grow. It should be noted that, as with big data, acquisitions in risk management areas have been quite frequent for (Ever) Big(ger) Blue: NISC, already mentioned, in January 2010, OpenPages in September 2010, PSS Systems in October 2010, and now these two.

Another important aspect is that a lot of the technology and solutions applied to the management of risk are also applicable to the optimization of processes and the increase of competitiveness. In a world where regulations will increase to rein in excesses, the search for incremental competitiveness will be combined with compliance to regulations and the management of risk in comprehensive solutions. Decision Management already plays and will continue to play a central role there, leveraging data management, analytics, business rules, optimization, and case management technologies in concert.

I do expect IBM to continue completing its portfolio in big data, decision management, and risk management. IBM clearly has its acquisition machine under control – and it is paying off. For example, the investments in analytics have enabled its business analytics software and services unit to post seven consecutive quarters of growth, including 20% growth in the first half of 2011 alone. IBM’s goal is to go from $10 billion in annual sales now to $16 billion by 2015 – a significant increase, but one it is giving itself the means to achieve.

I have a little list of companies I would not be surprised to see becoming part of all this (although I missed one: Autonomy, bought by HP not long ago, was one I expected to see acquired by IBM…).

BREAKING NEWS: Rules Fest Call for Papers is now Open!


I just got the word from Jason Morris, Chairman of the show. The show’s website is now open for registration!

Register!

If you have practical experience with Decisioning technologies like

  • Business Rules,
  • Complex Event Processing,
  • Predictive Analytics,
  • Optimization or
  • Artificial Intelligence,

Then you should consider submitting a paper to:

Join the Rules Fest Speaker Hall of Fame!

Please keep in mind that we are looking for hands-on experience, lessons learned, those kinds of things. Vendor pitches will not be accepted, of course.

Optimization vs. Business Rules: pick your evil


During my career, I have been asked over and over again for my opinion on two different approaches: Optimization or Business Rules. Which one is best?

People always expect more than the “it depends” answer. But in that case, how could anyone answer such a broad “apples and oranges” question? It is like asking whether I recommend a hammer or a screwdriver… Of course it depends on the job at hand!

Trying to satisfy my audience, I thought about it very carefully. It has always been clear to me that this technology choice was not so much about the nature of the problem as about the approach to solving it. Let me clarify what I mean after we settle on a common terminology.

What are those Technologies?

Optimization is about finding the best solution to a problem. You can find any solution that works, get all possible solutions, or select the best one based on some objective criteria. The premise is that you can describe the problem itself and the associated constraints that define a “good” solution. The technology is responsible for performing the optimized search.

Illustration – an optimization model might look like those problems we used to solve in school.  For example:

  • X is monthly income
  • Y is the monthly mortgage payment which is known to be $3,000
  • X must be greater than 3 times Y for the primary borrower
  • What is the lowest income you need to get documentation for? *** this is what the engine figures out ***

Granted, this example is extremely simplistic. In reality, a traditional optimization model might look at a loan request and define the terms that are best for the bank (i.e., highest margin) while having the best chance of being competitive for the applicant. Interestingly, optimization can also deal with much larger search spaces — for example, allocating gates in airports or scheduling crews over a large geography.
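
For the curious, the toy model above can be fed to any LP solver. Here is a minimal sketch using SciPy’s linprog (the variable names are illustrative):

    from scipy.optimize import linprog

    Y = 3_000                       # known monthly mortgage payment
    # Minimize X (monthly income) subject to X >= 3 * Y.
    # linprog minimizes c @ x subject to A_ub @ x <= b_ub,
    # so X >= 3Y is written as -X <= -3Y.
    res = linprog(c=[1.0], A_ub=[[-1.0]], b_ub=[-3.0 * Y], bounds=[(0, None)])
    print(res.x[0])                 # -> 9000.0, the lowest income to document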

Business Rules are about describing how you go about getting to the end result. You can describe the various steps you might perform to reach an end state — often a decision, but it could also be a configuration or, more generally, a set of values. The rules engine is responsible for “sequencing” and executing those IF-THEN-ELSE statements.

Illustration – a rulebase might look like this:

  • If the total income of the primary borrower is less than 3 times the loan monthly payment then require guarantor
  • If the total income of the guarantor is less than 5 times the loan monthly payment then decline

For two rules only, a rules engine would definitely be overkill, but consider that you may have thousands or millions of rules in your system. Regulations may force you to create many variations of the same rule per state or country; different products may have slightly different governing rules; Gold customers might be treated differently than general customers. The beauty of an engine that executes them is that it does not care about overlap or sequencing. Those potentially huge volumes of rules will be executed efficiently, with repeated tests shared across rules, and consistently. I should add a caveat here that I am mostly referring to RETE-based engines. Non-inferencing engines would not necessarily provide the same convenience.
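
For readers who prefer code, the two-rule illustration above can be hand-rolled in a few lines of plain Python — which also shows what an engine spares you from once the rules number in the thousands. The class and function names are illustrative, not any engine’s API.

    from dataclasses import dataclass

    @dataclass
    class Application:
        primary_income: float       # total monthly income of the primary borrower
        guarantor_income: float     # total monthly income of the guarantor, 0 if none
        monthly_payment: float      # loan monthly payment

    def decide(app: Application) -> str:
        if app.primary_income < 3 * app.monthly_payment:
            # Rule 1: primary income too low -> a guarantor is required
            if app.guarantor_income < 5 * app.monthly_payment:
                return "decline"    # Rule 2: guarantor income too low
            return "approve with guarantor"
        return "approve"

    print(decide(Application(8_000, 0, 3_000)))        # -> decline
    print(decide(Application(8_000, 16_000, 3_000)))   # -> approve with guarantor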

So What Makes Them Similar?

With those definitions in mind, you might wonder why anyone would use one in place of the other… Well, there is overlap. The most obvious case is the problem of CONFIGURATION.

If you want to find a valid configuration as a manufacturer, you could possibly go either way. I have seen either technology used for:

  • Computer assembly as well as Trucks — in both cases, the problem was to fit parts that may be mutually exclusive (you can only have one motherboard in the PC and only one type of tires on the truck), including some physical limitations (only so many slots in your PC, a V8 engine may not fit under certain hoods), etc.
  • Manufacturing processes — Some gases may not mix too well; some drugs may interact…
  • Services — Life insurance policies can be extremely complex to set up; Telecom services now including so many options do require some intelligence as to what is compatible / required…
  • Even Sports! Games can be scheduled according to the league’s demands: alternating home and away games, avoiding blocked dates (the football stadium may be used for concerts or baseball games), etc. Similar demands may be made by the announcers for ad scheduling… You may have attended my landmark presentation on the NFL at Business Rules Forum last year.

The nature of the problem does not dictate the technology that is most appropriate.  You need to focus on the type of outcome and the approach that stakeholders are comfortable with.

Usage Guidelines for Optimization

Optimization, per its name, is about finding the best solution to a problem. Especially when the problem is heavily constrained, that technology is very efficient at finding a feasible solution if one exists, or at letting you know that none exists, which is valuable in itself. Furthermore, if the problem lends itself to ranking — where one solution might be more valuable than another and you have a way to measure that — optimization technology will typically outperform other techniques.

Let me illustrate with a typical crew scheduling example.

Let’s say you have 1,000 engineers to dispatch.  Jobs have been contracted and include a specific deadline.  You need to make sure that each engineer is assigned to a job that matches his/her credentials.  Your objective is to perform all the work within the allocated timeframe to avoid penalties.
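
The assignment core of this problem can be sketched with SciPy’s Hungarian-algorithm solver. Real crew scheduling adds deadlines, credentials, and travel, but the toy cost matrix below conveys the flavor (all data is illustrative):

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # cost[i][j]: cost (say, travel time) of sending engineer i to job j;
    # an infeasible pairing (missing credentials) would get a prohibitive cost.
    cost = np.array([[4, 1, 3],
                     [2, 0, 5],
                     [3, 2, 2]])
    engineers, jobs = linear_sum_assignment(cost)      # minimizes the total cost
    print(list(zip(engineers, jobs)), cost[engineers, jobs].sum())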

Granted, the standard Miss Manners rules benchmark solves a somewhat similar problem (not scheduling within a timeframe, but placing guests next to each other according to topics of interest). However, the rules used in that test are certainly not easy to understand, since they refer to the mechanics of allocating guests to their seats rather than describing the end result. And more importantly, that design is very opaque… How would you inject additional rules to reflect that Joe never works on Fridays and that Jack and Jill can’t be collaborating on a project?

Usage Guidelines for Business Rules

Business Rules are very good at applying a technique that you know will get you to the end result. If the problem is not constrained, meaning that you will not paint yourself into a corner as you assemble your configuration, then business rules offer some advantages. Since a heuristic leads you to a solution rather than the set of all possible solutions, business rules make more sense when all solutions are equal. You simply care about finding any configuration that works.

I have met end users who were not comfortable using optimization because it departs radically from the way they have always done business, and who would not trust that the outcome would be satisfactory, or in other words “as good as if they had done it manually”. For them, business rules offer the flexibility to describe the logic step by step, one rule at a time, and they can also review execution traces that justify how the system arrived at its recommendation.

Imagine you had a Sudoku grid and fed it into an engine that filled it in completely. You might go and check each row, each column, and each square… just to be sure. But if you had a step-by-step animation explaining that this “3” completes the top-right square, you would not worry as much. You would likely make sure that the square logic, the row logic, the column logic, etc. work correctly, then trust the system. People like to understand and be able to anticipate how systems work in order to trust them.

Let me use an example that is a little more serious. In the Telecom space, you might sign up for products and services. In a package, you might get to select phone, internet access, television, and cell phone. Product managers can describe, using business rules, which components are mandatory or optional. Business rules can also define incompatibilities and requirements: you cannot have DSL and fiber at the same time; you need local service to enable long distance; you must select a data plan for an iPhone or smartphone. The advantage of using business rules in this example is that you can reuse them in different contexts: to guide a consumer on the self-serve website, to script an agent’s conversation in a call center, or to validate applications transmitted electronically or by mail.
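
A hedged sketch of such compatibility rules as reusable data (the product names and tables are illustrative): the same tables could back the self-serve website, the call-center script, and batch validation alike.

    INCOMPATIBLE = {frozenset({"dsl", "fiber"})}
    REQUIRES = {"long_distance": {"local"},
                "iphone": {"data_plan"},
                "smartphone": {"data_plan"}}

    def validate(selection: set) -> list:
        errors = []
        for pair in INCOMPATIBLE:
            if pair <= selection:                      # both incompatible items selected
                errors.append("incompatible: " + " + ".join(sorted(pair)))
        for item, needed in REQUIRES.items():
            if item in selection and not needed <= selection:
                errors.append(item + " requires " + ", ".join(sorted(needed - selection)))
        return errors

    print(validate({"dsl", "fiber", "local"}))   # -> ['incompatible: dsl + fiber']
    print(validate({"iphone", "local"}))         # -> ['iphone requires data_plan']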

In a Nutshell

Searching for a needle in a haystack? And not just any needle? Then pick Optimization!

Your experts know the recipe?  A solution is a solution?  Then pick Business Rules!

