
Evolution of the Rete Algorithm


Written by: Carole-Ann Berlioz | Published on: May 4, 2012

The Rete Algorithm Demystified blog series attracted a huge crowd.  I want to thank you for your readership!  Following popular demand, let me continue the series and add a few words on the latest and greatest Rete-NT.

What’s new?

Well, this is where I can’t say much without violating Charles’s trade secrets…  Sorry!

So what can I share?

For the evolution of the Rete Algorithm, Charles has focused primarily on runtime performance, looking for ways to accelerate rule evaluations and reduce memory usage.  Mission accomplished.

Faster: With Rete III, the speed increase came from the ability to efficiently handle a larger number of objects per transaction.  With Rete-NT, the speed increase comes from optimizations of complex joins in the Rete network.  As described in part 2, the discrimination tree performs a product of object lists.  For example, the list of all drivers satisfying requirements x, y, and z is combined with the list of all vehicles matching some other requirements, producing a Cartesian product.  The more patterns you add, the more joins are added.  This has been referred to as the multi-pattern problem.  The latest algorithm keeps this combinatorial explosion under control, in a dramatically different way than previously attempted, achieving unprecedented performance.  The algorithm shines when business rules involve complex conditions, which tends to be the case in real-life applications.
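To make the multi-pattern problem concrete, here is a minimal sketch in Python of a naive two-pattern join.  The drivers/vehicles data and the conditions are invented for illustration, and this is emphatically not how Rete-NT performs the join (those internals are not public); it only shows why candidate tuples multiply as patterns are added.

```python
# Illustrative sketch of a naive multi-pattern join (hypothetical data).
# Not Rete-NT's optimization, which remains a trade secret.
from itertools import product

drivers = [
    {"name": "Ana", "age": 17, "state": "CA"},
    {"name": "Bob", "age": 45, "state": "NV"},
    {"name": "Cyd", "age": 32, "state": "CA"},
]
vehicles = [
    {"plate": "7ABC123", "type": "truck", "state": "CA"},
    {"plate": "4XYZ987", "type": "sedan", "state": "CA"},
]

# Pattern 1: drivers satisfying some condition (here, adults).
adult_drivers = [d for d in drivers if d["age"] >= 18]

# Pattern 2: vehicles matching some other condition (here, registered in CA).
ca_vehicles = [v for v in vehicles if v["state"] == "CA"]

# Joining the two patterns enumerates the Cartesian product of the two lists;
# each additional pattern multiplies the number of candidate tuples again.
candidate_pairs = [
    (d, v)
    for d, v in product(adult_drivers, ca_vehicles)
    if d["state"] == v["state"]  # join test linking the two patterns
]

print(len(adult_drivers), "x", len(ca_vehicles), "->", len(candidate_pairs), "pairs")
```

With only a handful of facts the product is harmless, but with thousands of objects per pattern the number of candidate tuples, and the memory needed to hold them, grows multiplicatively; that is the explosion the new join optimizations aim to keep under control.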

Slimmer: This is related to the complex-join speed increase.  The smaller the combinatorial explosion, the lower the memory usage.  It is actually a lot more sophisticated than that, but I am unfortunately bound to secrecy…  The most important thing to remember is that memory usage goes down quite a bit.  This is a concern that software architects can truly appreciate!

James Owen scrutinized the actual performance increase and published the results in InfoWorld last year.  Although the overhead may slightly slow down performance on overly simple tests, the overall gain is huge: an order of magnitude faster than the previous generation.

Does performance matter?

Most rule engines have achieved excellent levels of runtime performance, so performance for its own sake is not an objective in itself.

I am excited about Rete-NT because it improves performance where it is needed.  Previous generations of Rete engines put pressure on rule writers to design rules that avoid multi-pattern conditions as much as possible.  This algorithmic innovation removes a painful hurdle, or at least moves the boundary.  In my career, and especially in the past 2 years at Sparkling Logic, I have come across new use cases that require more flexibility and more expressiveness than would be practical with less efficient algorithms.  One thing we can always count on is that complexity will keep increasing…

How does that compare to non-inference engines?

You can over-simplify the inference versus compiled sequential debate by saying that:

  • Rete shines when the number of rules is large and the number of objects in memory is small
  • Non-inference shines when the number of rules is small and the number of objects in memory is large

Rete-NT changes the game a bit by expanding the scope of problems that can be handled effectively.  As a result, non-inference engines dominate a smaller and smaller number of use cases, while Rete keeps its lead on large rulebases.

Call to Action

Sparkling Logic SMARTS is a decision management platform that empowers business analysts to define decisions using business rules and predictive models and deploy those decisions into an operational environment. SMARTS includes dashboard reporting that allows organizations to measure the quality of decisions both during development and post deployment. Learn more about how SMARTS can help your organization improve decisions.