RuleFest 2011 – George Williamson: Scalable Rules Implementation

on October 25, 2011

George returns to present, having also presented last year.

George works for Union Pacific, for which car scheduling is one of the most critical operations. As with most resource scheduling problems, the number of constraints and objectives involved in scheduling railway cars is significant, making the process impractical to handle manually. George's approach relies on rules engines, which is in itself interesting: this kind of problem has traditionally been solved through other means such as linear programming or constraint satisfaction. I have seen rules used for this before, most notably for contract configuration and advertisement placement, and the key reason for choosing rules over those other approaches was essentially the requirement to manage the decision the way the business users already did. Carole-Ann presented a few years ago on this issue, illustrated by the combined use of rules and linear programming (http://www.slideshare.net/cmatignon/the-science-of-sports-matignon-2009) to solve similar problems (and that was a fun delivery too).

Resource allocation and scheduling tend to be fairly complex problems with huge solution spaces, and ones for which business domain expertise translates mostly into heuristic choices that allow the algorithms to shortcut the search and reach, more or less efficiently, good or locally optimal solutions.
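
To make the heuristic idea concrete, here is a minimal, hypothetical sketch in Java (assuming a recent Java with records) of a greedy shortcut for a car-to-train assignment: rather than exploring every combination, each car is placed on the first compatible train with remaining capacity. The classes, fields, and the "first compatible train" heuristic are all invented for illustration; none of this comes from George's talk.

```java
import java.util.*;

// Minimal sketch of a domain heuristic shortcutting a scheduling search.
// The Car/Train model and the greedy "first compatible train" choice are
// illustrative assumptions, not Union Pacific's actual logic.
public class GreedyScheduler {

    record Car(String id, String destination) {}
    record Train(String id, String destination, int capacity) {}

    // Greedy pass: one linear scan instead of a combinatorial search.
    // The result is a feasible, locally good plan, not a guaranteed optimum.
    static Map<Car, Train> schedule(List<Car> cars, List<Train> trains) {
        Map<Train, Integer> load = new HashMap<>();
        Map<Car, Train> plan = new LinkedHashMap<>();
        for (Car car : cars) {
            trains.stream()
                  .filter(t -> t.destination().equals(car.destination()))
                  .filter(t -> load.getOrDefault(t, 0) < t.capacity())
                  .findFirst()
                  .ifPresent(t -> {
                      plan.put(car, t);
                      load.merge(t, 1, Integer::sum);
                  });
        }
        return plan;
    }

    public static void main(String[] args) {
        List<Train> trains = List.of(new Train("T1", "CityA", 2), new Train("T2", "CityA", 1));
        List<Car> cars = List.of(new Car("C1", "CityA"), new Car("C2", "CityA"), new Car("C3", "CityA"));
        schedule(cars, trains).forEach((c, t) -> System.out.println(c.id() + " -> " + t.id()));
    }
}
```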

George went into detail on the parameters that contribute to the complexity of processing the rules, and on the approach taken to address it. He did not go into the details of the rules themselves, but at the end of the day the application has a large number of facts and a very small number of rules. This is very consistent with what I have seen in other cases, although I do remember a 30,000-rule custom truck configuration application…

George shared his experience with performance analysis:

  • In his implementation, he achieved around 2 seconds per request with the engine's default configuration.
  • By distributing the facts across a pool of engines, he got better throughput, at the price of steeper memory consumption (a rough sketch of this fan-out follows the list).
  • Finally, he converted the facts into rules, ending up with a larger number of rules (20,000) and fewer facts. Throughput became significantly better, essentially at the price of maintainability, as the rules end up being a combination of decision logic and fact translation (also sketched below).
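
As a rough illustration of the second bullet, the sketch below fans a request out to a pool of engines, each holding a shard of the fact base, and merges the results. The RuleEngine interface and its methods are hypothetical stand-ins; real engines each have their own session APIs, and George did not describe his setup at this level of detail.

```java
import java.util.*;
import java.util.concurrent.*;

// Sketch of distributing facts across a pool of rule engines. Each engine
// holds a partition of the facts, so matching is cheaper per engine, but the
// pool as a whole consumes more memory. "RuleEngine" is a hypothetical
// stand-in for whatever engine API is actually used.
public class EnginePool {

    interface RuleEngine {
        void insertFact(Object fact);           // add a fact to this engine's working memory
        List<Object> evaluate(Object request);  // fire the rules, return conclusions
    }

    private final List<RuleEngine> engines;
    private final ExecutorService executor;

    EnginePool(List<RuleEngine> engines) {
        this.engines = engines;
        this.executor = Executors.newFixedThreadPool(engines.size());
    }

    // Round-robin the facts across the pool so each engine sees only a shard.
    void loadFacts(List<Object> facts) {
        for (int i = 0; i < facts.size(); i++) {
            engines.get(i % engines.size()).insertFact(facts.get(i));
        }
    }

    // Evaluate a request against every shard in parallel and merge the results.
    List<Object> evaluate(Object request) throws InterruptedException, ExecutionException {
        List<Future<List<Object>>> futures = new ArrayList<>();
        for (RuleEngine engine : engines) {
            futures.add(executor.submit(() -> engine.evaluate(request)));
        }
        List<Object> merged = new ArrayList<>();
        for (Future<List<Object>> future : futures) {
            merged.addAll(future.get());
        }
        return merged;
    }
}
```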

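And to make the third bullet concrete: instead of one generic rule joining each request against thousands of reference facts held in working memory, the reference data can be compiled into many specialized rules with the data baked into their conditions. The sketch below is a deliberately simplified, hypothetical rendering of that idea in plain Java; the record types and sample data are invented and do not come from the talk.

```java
import java.util.*;
import java.util.function.Predicate;

// Hypothetical illustration of trading facts for rules. "Before", one generic
// lookup joins the request to many RouteFact objects; "after", each fact is
// compiled into its own specialized rule with the constants inlined.
public class FactsToRules {

    record Request(String origin, String destination) {}
    record RouteFact(String origin, String destination, String assignedYard) {}
    record GeneratedRule(String name, Predicate<Request> condition, String action) {}

    // Few rules, many facts: the engine joins the request against the fact base.
    static Optional<String> evaluateWithFacts(Request req, List<RouteFact> facts) {
        return facts.stream()
                    .filter(f -> f.origin().equals(req.origin())
                              && f.destination().equals(req.destination()))
                    .map(RouteFact::assignedYard)
                    .findFirst();
    }

    // Many rules, few facts: each reference fact becomes its own rule. Matching
    // gets cheaper, but the generated rules now mix decision logic with data.
    static List<GeneratedRule> compileFactsIntoRules(List<RouteFact> facts) {
        List<GeneratedRule> rules = new ArrayList<>();
        for (RouteFact f : facts) {
            rules.add(new GeneratedRule(
                "route-" + f.origin() + "-" + f.destination(),
                req -> req.origin().equals(f.origin())
                    && req.destination().equals(f.destination()),
                "assign to " + f.assignedYard()));
        }
        return rules;
    }

    public static void main(String[] args) {
        List<RouteFact> facts = List.of(new RouteFact("CityA", "CityB", "Yard-1"));
        Request req = new Request("CityA", "CityB");
        System.out.println(evaluateWithFacts(req, facts).orElse("no match"));
        compileFactsIntoRules(facts).stream()
            .filter(r -> r.condition().test(req))
            .forEach(r -> System.out.println(r.name() + ": " + r.action()));
    }
}
```

The point of the sketch is the trade-off George described: matching becomes faster once the data lives in the rules, but the generated rules then encode both logic and data, which is what hurts maintainability.
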
This is an interesting use case for those interested in the tuning of the performance of rule systems.

I was expecting Jacob Feldman, CTO of OpenRules and a key leader of the JSR on constraint programming, to jump in, and he did…

