Cloud

Hot Tech Trends for Machine Learning


I joined the Churchill Club this morning for an exciting breakfast on Machine Learning.  In May 2013, Steve Jurvetson of DFJ said on the Churchill Club stage that he believes machine learning will be one of the most important tech trends for innovation and economic growth over the next 3-5 years.  I was eager to hear what Peter Norvig and the other panelists would say about that.

No surprise

What might be surprising is that none of them painted an ‘unfathomable’ picture of the future.  It was all about more power, faster modeling, more data…

A Star Trek Vision?

I can’t say that they shared a single vision…  I wonder if we all spent our younger years dreaming in front of Star Trek, letting super-computers fuel our imagination.  A super-smart machine able to assist the crew, eventually practicing medicine or searching for its ‘humanity’: that is the vision.  We are all working hard at figuring out ways to make it real, ways to build technology that achieves the ideals we grew up dreaming about.

It has been a rocky road for Artificial Intelligence, but in the past few years, Watson, self-driving cars and other wonders have made us believe that machine learning could actually live up to our expectations, and more.


Intelligence in the Cloud


The Cloud is one of the top priorities for most CIOs.  They want to better understand its risks and opportunities.  This disruptive technology is forcing both technologists and business owners to think outside the box in order to take real advantage of the “new way” of doing things.


At Sparkling Logic, we have been thinking pretty hard about it for a couple of years.  BPM has been an interesting case study for Decision Management on the Cloud, as it is a sister technology with wider adoption in the Enterprise world and, predictably, wider mind-share in Cloud solutions.  My objective here is to share some of the key benefits that the Cloud brings to Decision Management practitioners.  This is a topic that I have seen popping up in some forums, and it is also the main theme this year for IntelliFest, the new name for the Rules Fest conference, chosen to reflect its wider scope on Artificial Intelligence.

Why the Cloud?

Have you not heard it a thousand times already?  I am pretty sure I am not saying anything new here:

  • It’s easy to get started: Software-as-a-Service is all about having the infrastructure ready for your usage.  You get a login and a password, and you are all set.  No more working with IT to procure equipment and software.  No more internal delays.
  • Opex versus Capex: this is indeed a financial argument, but it plays a role in your budgeting decision.  Cloud subscription means that you can pay as you go, based on your needs.  You can start small and then, over time, add usage as you need it.  Admittedly, this business model change is not tied to the technology itself, but it is expected with Software-as-a-Service.  Given the current economic situation, it is only fair that organizations want to be more cautious about the rate of adoption.  What used to be “one project then expand” is becoming “one seat then expand”.
  • Elasticity: this is the equivalent argument on the IT side.  Many projects start with a painful discussion on sizing…  Sizing is hard because of the many unknowns.  In the end, it often boils down to someone putting a stake in the ground and hoping for the best.  The truth is that usage evolves over time, so it is far better to adjust capacity at the click of a button than to wait for IT to upgrade equipment.  That works for scaling both up and down, by the way.
  • It is secure…  yes, really!  There have been tons of arguments about security in the Cloud.  The reality is that the Cloud has to be secure.  The media would make a fuss over every instance of a security breach on those servers.  In fact, it has been said that the Cloud is actually more secure than private facilities.  For some reason, I believe I might get some push-back on that one!
  • It feeds off Big Data: Being on the Cloud, it is no surprise that Big Data is just a click away.  SaaS products are so much more ready for harnessing Big Data.

What is driving Cloud adoption for BPM?

As I see it, BPM gained acceptance on the Cloud mostly for its collaborative capabilities, albeit primarily for capture and modeling purposes.  Being able to get started quickly with no IT intervention is highly valuable, even with such a restriction.

In fact, many of the BPM vendors on the Cloud do not really focus on execution.  The perceived value is in business process capture or authoring.  Some refer to the “walk before you run” analogy.

Think about it: there are so many processes that are hardly documented.  Visio only helps organizations so much in getting things well documented.  Having the opportunity to collaborate with little or no set-up brings real value to the organization.

Running business processes on the Cloud is certainly doable technically, but it is still limited to a few projects.  Why?  My guess is that most businesses are either worried about outsourcing their operations to the Cloud, or simply are not ready to do so.  It could also be that the data required to run on the Cloud is not available there, for reasons ranging from the practical to corporate policy.

Why is Decision Management behind?

BPM has been well ahead of BRMS and Decision Management technologies.  Though the technical arguments for BPM would equally apply to Decision Management, there is a key difference in how the technologies are applied.  I may not go as far as Boris in his BI-CRM comparison…  I am actually very optimistic about the Cloud uptake for those technologies, but I think that he has a very good point, or maybe a couple.

First of all, and this is shared with BPM and other Enterprise tool technologies, there have been few efforts to create dedicated cloud-based Decision Management tools.  Software-as-a-Service is much more complicated than putting “cloud” lipstick on a pig.  Software that is not architected to be horizontally scalable, elastic, multi-tenant, etc. can be thrown “out there”, but it will not really meet the needs and expectations of those adopting the Cloud.  It is not just administration and management that need to be made as simple as consumer applications.  The awkward usability and complex set-up that many hastily assembled cloud solutions require are, in my opinion, much bigger issues.

Second, Decision Management requirements are still hard to formulate for prospective users, and availability of the technology on the Cloud does not change that fact.  This is partially because Decision Management is new to many industries, and partially because the needed interactions are still new to an industry that has been feeding off the idea that IT would be there for a great deal of customization.

Those are some of the “excuses” for Decision Management being new to the Cloud.  I do not think they represent barriers that will limit its adoption going forward, though: the stakes are significant enough to make those initiatives appealing to the traditional consumers of those technologies, and eventually to a much wider audience.

I think that Decision Management is following a similar trajectory to BPM in terms of Cloud adoption.  Though some users are already jumping on the Cloud for the execution of their Decisions, a great deal of opportunity lies in simply capturing and collaboratively refining business rules in a “business-ready” environment, which the Cloud greatly simplifies.

To learn more, join us at IntelliFest.  Carole-Ann will address this very topic in more detail.


Want to know how strongly I feel about it?  Check out my video:

http://www.youtube.com/watch?v=tuM7dQs9blM

Shaping Serendipity on the Job


The announcement of the acquisition of SocialCast by VMware caught my eye last week.  This is not surprising, since we have been very interested in the dynamics of Social and Collaboration for over a year, as you know.  Let me point you to a very good blog post by Mike Fauscette from IDC that describes the value-add of SocialCast to VMware’s portfolio; I will then share my own thoughts on the subject.

I view Social as several waves of capabilities that are gradually penetrating the Enterprise with increasing value.

Social = Communication

The early Enterprise 2.0 companies and capabilities focused primarily on social updates that keep a user’s entourage aware of his or her status.  Facebook excels at sharing, tagging and commenting on pictures between friends.  Twitter allows small pieces of news to travel the world in record time.

Applying those technologies to the Enterprise required some thinking.  When I visited the Enterprise 2.0 conference last year, I was only partially surprised that large companies were still struggling to find the right use case for this technology.

The obvious first step was to leverage it with current users, the “consumers”, for marketing or public relations.  The success stories we kept hearing about back then were about Comcast trying to turn around its image by servicing its customers via Twitter — looking out for angry and loud customers and proactively giving them red-carpet treatment — or Dell creating a new sales channel for refurbished equipment.  I must admit that the early-mover advantage and creative thinking here gave them extra credit and (more importantly) exposure.  It is refreshing to see marketing dollars routed to a value-added service rather than pure advertising.

Product Management / Marketing also started creating some inbound traffic by allowing their customers to express themselves in their communities and share ideas on what they like / dislike in the current offerings as well as ideas on how to make them better.  With voting capabilities, you can filter a lot of the noise that could be generated on mass market products.

Social = Serendipity

The next move, with the likes of Moxie or Jive, has been to shape serendipity.  I do not recall who coined this expression but I love it.  By communicating at large to an available audience, you increase your odds of coming across the right information at the right time.  In our Encounter with Geoffrey Moore post, he amusingly referred to “the serendipity of the guy with chocolate running into the guy with peanut butter”.

The typical Salesforce example is for Sales automation, of course.  As a sales guy (or gal), you may be looking for nuggets of information in your ecosystem at the time you need it — which is inevitably minutes before a sales call.  You certainly do not care to know about every single call into tech support in real time, but when you meet this important customer, it is invaluable to know that he or she has a dozen open tickets, including 2 critical ones that have been pending for over a week with very little activity, and possibly some angry language exchanged.  If you do not have the time and energy to look for it, you may want to post a quick note asking if anyone has anything to report on that very important customer.  The answer, happy or not, may come from tech support or fulfillment or training or professional services or legal or marketing, etc.  The beauty of the social platform is that only those who are available will look into it and feel compelled to share what they know and think is relevant.  Company-wide emails were the old way of doing it, but they tend to be pooh-pooh’d if not ignored by most employees.

Social software allows employees to connect and get those conversations going.  Employee communication is, for me, a much greater animal than the Voice-of-the-Customer initiatives I referenced earlier.  Having a Product Management background, and a relatively niche market (B2B), I feel quite comfortable about getting the meat of what my customers want.  Corporate efficiency is a real challenge though.  Optimizing one division is hard enough, but breaking the silos between those divisions is extremely complicated.  Whatever can be done to improve that situation has the potential to reach a very high ROI with little effort, given where we are starting from.

At Enterprise 2.0, a large insurance company asked a great question, though: how do you make those tools effective?  Having the ability to engage others is great, but you still need some guidance to drive conversations with more value-add than comments on the cafeteria food…

Social = Serendipity on the Job

Granted, you can post tweets to let your ecosystem know that your plane is late, serendipitously discover that you are stranded with an old buddy, and meet up for a drink.  But you would get far greater value-add, at least in the corporate sense, if you could marry the social “icing” to the corporate “cake”.  Michael Fauscette points out that the ability to bring those activity streams and collaboration tools into the context of actual applications is critical to enterprise adoption.  This is what we call “serendipity on the job”, and I agree that those capabilities will enable Social software to soar throughout the enterprise, bringing tremendous value.

The raw capability of exchanging information puts the burden on users to self-organize and find a sense of purpose.  When those capabilities are intrinsically integrated with day-to-day tasks, they can be used by the stakeholders without excessive thinking or a learning curve.

When Salesforce released Chatter in the context of the Sales Automation application, it unlocked something big: the ability to work collaboratively, to leverage the collective in the context of day-to-day activities.  As a Sales exec, I can look at my portfolio of customers and post information that is targeted to a captive audience.  Only service reps in charge of my account, or interested in it for some other reason, will subscribe to the status updates and be notified.  This reduces the chatter (no pun intended) that goes around in company-wide emails.  It also captures the thoughts and contributions of the involved stakeholders on the spot — eliminating unnecessary follow-up discussions as well as capturing tacit knowledge.

Do not underestimate the value of being in the context of your application.  I love Twitter but I don’t have the time to read all the tweets from my friends.  Nobody does.  It serves a purpose of communication and trend-watching.  Integrated Social / Collaborative capabilities serve a different purpose of connecting “doers” for a well-defined purpose.

It is not rare to hear about the “intangible” value of Social Software.  I would argue that, when it is clearly applied for a given purpose, its value is much more obvious and measurable in terms of productivity and eventually bottom-line results.

I believe that this acquisition is a brilliant move from VMware and we shall hear about more Social Software acquisitions from the platform players that are building the “next generation”.

What is Social Logic?


This is a big moment for all of us at Sparkling Logic.  After a few months of work with customers and prospects, and of intense design and implementation work, we are finally announcing the company’s first product: Sparkling Logic SMARTS™.  We had the privilege of launching the product at Gartner’s BPM Summit in Baltimore last week, and the satisfaction of a great reception and significant customer and partner interest.  Having lived through a number of product launches in this space, this is the one I am the proudest of – please forgive me for using this post to convey our enthusiasm for this new approach and product.

Sparkling Logic SMARTS is a new kind of Decision Management product. One that, we believe, represents a radical shift from the current way of thinking about Decision Management – something even more momentous than the introduction of the Business Rules Management Systems that we were responsible for with Blaze Advisor circa 1997, bringing the AI rules engines and the Business Rules methodologies together.

The current Decision Management industry is stuck in a no-man’s land.  On one hand, Decision Management remains at the core of many mission-critical systems, with large implementations that impact our daily lives in many different ways – from what we get marketed to how we get sued.  On the other hand, it has not lived up to the expectations we had, expectations that the value it brings to the table should justify.  Even more worrisome, the number of Decision Management projects that fail to deliver on the promised ROI – regardless of the stated reasons – is much higher than it should be: too many projects fail before delivery, or take too long or too much effort to complete.  At Business Rules Forum last year, I asked the vendor panel why, in their opinion, this is the case, and what the industry should do to overcome the issue.  To their credit, the representatives from Pegasystems and InRule did provide some constructive insight, but in general terms, very little introspection has effectively taken place lately.  The industry remains in that no-man’s land.

Stating this is just part of the journey to the solution.  The next step is to identify the root causes of the situation so that we can envision a way out of it.  After leaving our former employer, Carole-Ann and I invested a significant amount of time discussing with industry leaders and with the technical and business leaders of enterprise applications, collecting our vision and ideas, and identifying what we believe are the key attributes of a Decision Management solution that will deliver on its promise: putting control of the automated and hybrid decisions in the hands of the decision makers.

In a few words: empower all decision makers, in whatever role they have – from decision area specialist to case worker – to work on their decisions in the context they naturally evolve in, using the paradigms they naturally use.

What does that mean?

First, it is important to recognize that most complex decisions in enterprise applications end up involving both an automated part and a manual part.  In general terms, the automated part deals with most of the cases presented to it, and the manual part deals mostly with the business exceptions, where the flexibility of a human is necessary.  For example, take the decision to accept or not an electronic payment: the automated system may identify that a particular payment presents a level of risk for a customer that needs to be well treated; instead of rejecting the transaction, decision control will be passed to a human, who will complete the decision.  The human, what he or she does with the information received, the process he or she follows to get to a conclusion, the decision taken: all these are part of the actual business decision.  Case workers are part of the decision management system, and an essential part of it, dealing with the complex cases, those cases that in the end may represent where most of the actual risk is taken and/or the most opportunity is possible.
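To make the pattern concrete, here is a minimal sketch of the hybrid decision described above.  All names (PaymentDecision, Outcome, the threshold value) are hypothetical illustrations, not part of any actual product API:

```java
// Hypothetical sketch: an automated decision that escalates business
// exceptions to a case worker instead of deciding them automatically.
public class PaymentDecision {
    enum Outcome { APPROVE, DECLINE, REFER_TO_CASE_WORKER }

    private static final double RISK_THRESHOLD = 0.8;  // illustrative value

    Outcome decide(double riskScore, boolean isValuedCustomer) {
        if (riskScore < RISK_THRESHOLD) {
            return Outcome.APPROVE;              // automated part: the bulk of cases
        }
        if (isValuedCustomer) {
            return Outcome.REFER_TO_CASE_WORKER; // manual part: a human completes the decision
        }
        return Outcome.DECLINE;
    }

    public static void main(String[] args) {
        PaymentDecision decision = new PaymentDecision();
        System.out.println(decision.decide(0.92, true)); // REFER_TO_CASE_WORKER
    }
}
```

The point is that both branches, the threshold rule and the case worker’s eventual judgment, belong to the same business decision and should be managed together.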

There are thus at least two key roles involved in making, codifying and operationalizing decisions: the traditional business user, who knows enough of the business and the problem to work on automating and improving decisions, and the case worker, whose permanent role is to participate in the decisions as part of the enterprise application eco-system.

This simple fact has enormous implications for what an effective Decision Management system must be.  It must take into account all the roles in the decision; it cannot simply ignore some of them.  Traditional BRMS-based Decision Management systems stay away from considering case management as part of the decision, the same way Case Management-based decision support systems stay away from automated decisions.  As a result, most large enterprise decision management applications include both BRMS-based and Case Management-based decision systems, and these communicate and share nothing but routing logic and operational data schemas, at best.

We created Sparkling Logic SMARTS with an approach that allows decisions to be managed consistently by all roles involved, and thus achieves full visibility on the decisions and their management through their complete, real lifecycle – not just the part that is automated or the part that is manual.

The challenge for such a product is to create an environment that enables these different roles to all be effective on the same decision logic.  This is not simple…  Technically savvy business experts convey their view of the decisions in particular forms, frequently resorting to graphical representations such as flows or trees.  Business domain experts tend to think in terms of policies.  Case workers tend to think in terms of cases.

But they all view their decisions in the context of both data and objectives supported by metrics.  In the process of harvesting rules, it is typical that time will be spent trying to come up with all the “prototypical cases”, to categorize them and create the corresponding abstraction (decision tree, rule template, etc.) that represents the way the cases corresponding to those “prototypical cases” should be treated.  The process starts with data – the prototypical cases.  It also starts with identified objectives – in the example above, we want to reduce the number of fraudulent payments we authorize while minimizing the number of situations in which we do not authorize a legitimate payment, in particular between a good customer and a good network partner…  And it starts with metrics supporting those objectives: the number and total amount of fraudulent payments not blocked, the number of legitimate payments denied, and the impact on customer retention.  All business users and case workers have data, objectives and supporting metrics concretely present as part of their daily decision-making work.

Taking them out of their context in order to put them through a methodology that has nothing to do with the way they do their daily work is actually dangerous.  They are being asked to think about all the fine details of their decisions and the way they make them, at the same time as they are being asked to change their vocabulary, adopt new tools and learn new approaches.  And they are asked to do that at specific points in time, setting aside the evolving nature of the environment that decision-making – more than any other part of the enterprise application – is part of.  It is not a surprise that we have seen, a number of times, the disconnect between what the business user thought they communicated through the process and what they got back through a traditional implementation.  Even metaphor-based (for example, rule tables) or template-based systems face the problem – the approach does give flexibility, but only to the extent that it actually captures the reality of how the business user decides, and only to the extent that it continues to do so as the way the business user decides evolves.

Sparkling Logic SMARTS is the first Decision Management System that enables the business user to design decisions by actually making them.  Design by doing – a very powerful concept, one that is starting to see the light in the BPM world under names such as Dynamic BPM or Adaptive Case Management.  The business user continues working with the concepts they usually manipulate, dealing with business data, with the objectives and metrics always present and contextually available, focusing on making decisions and capturing the decision logic in the process.  Sparkling Logic SMARTS uses patent-pending technology to let business users shape decisions by actually making them – and, in the process, enables collaboration between all the different roles and stakeholders in the management of decisions through their lifecycle.  We have invented a very pragmatic approach to solving one of the key problems in the adoption of Decision Management Systems, and we have implemented it in Sparkling Logic SMARTS.

Furthermore, we recognize the fact that decisions are social.

They involve multiple stakeholders – even at the objectives level, different stakeholders have different objectives for a single decision, yet in general only one decision is taken: in the previous example, accepting or denying the internet payment.  So, by nature, the making of a decision and its codification for repeated execution represent a compromise among multiple stakeholders – a compromise that will not be perfect at any point in time and that will need to evolve.

Similarly, making or improving a single good decision may involve deep expertise that is available within the organization, but not directly to each and every one of the decision makers, rule designers or codifiers.  The roles cooperate.  And they do it today in an ad-hoc, semi-formal manner – discussions in meetings, email exchanges, and, more and more, ephemeral instant messages.

Enabling that collaboration, and gathering information on how the decision is managed in the process, extends the reach of the individual business user to the collective, and increases the quality of the decision codification and improvement processes.  The Enterprise 2.0 movement has seen it clearly – companies like Moxie Software are the new alternative to the old Knowledge Management systems, and one that is far more effective and adaptive.  It is not a surprise that one of the most popular Salesforce products is Chatter, for the same reasons.

We built Sparkling Logic SMARTS around a social collaboration platform, implementing effective collaborative decision management through a number of social techniques, including some patent-pending ones. Sparkling Logic SMARTS is the first Social Decision Management system that covers the full lifecycle of decisions and their operationalization through both systems and humans.

We call that Social Logic!

Sparkling Logic SMARTS represents a pragmatic revolution in the way Decision Management is approached, and enables business users to design and manage operationalized decisions by making them, making the system accessible to all roles involved in decision-making, resilient to changes in objectives and environment, and adaptive to new conditions, risks and/or opportunities.

Stay tuned for further announcements and discussions. We’ll talk more about the details on how the product delivers on its promises.

And in the meantime, tell us what you think!

Big Data meets Analytics… again


Well, another month, another acquisition…  Teradata has announced the acquisition of Aster Data.  You can find a less formal yet official post on the acquisition on Aster Data’s blog – Mayank and Tasso go into more detail on what Aster Data is all about and why the deal makes a lot of sense for the industry.

Aster Data has focused its energy on developing a low-cost (from a systems-footprint perspective) platform to manage and process data at large scale without imposing restrictions on the types of data being managed or the type of processing being carried out.  The resulting platform is a showcase for a solid commercial implementation of the much-talked-about map-reduce approach to big data processing, and it has enabled companies from different industries to extract analytic insight from both structured and unstructured data.  As a result, they have been able to make better decisions leveraging not just the traditional operational data, but also the social data, web click data, etc. that is generated in huge volumes around their products and services.
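For readers new to the pattern, here is a toy illustration of the map-reduce idea in plain Java streams.  To be clear, this is not Aster Data’s actual interface, just the concept: map each raw record to a key, then reduce by key to an aggregate.

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ClickCounts {
    // Hypothetical record standing in for a raw, semi-structured click log entry.
    record Click(String productId, String rawLogLine) {}

    public static void main(String[] args) {
        List<Click> clicks = List.of(
            new Click("p1", "..."), new Click("p2", "..."), new Click("p1", "..."));

        Map<String, Long> clicksPerProduct = clicks.stream()
            .map(Click::productId)                  // "map": extract a key from each record
            .collect(Collectors.groupingBy(         // "reduce": aggregate per key
                Function.identity(), Collectors.counting()));

        System.out.println(clicksPerProduct);       // counts per product, e.g. {p1=2, p2=1}
    }
}
```

A platform like the one described above applies the same two phases, but distributed across many nodes and over far larger, messier data.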

The approach also shortens the time needed to bring that insight back into decisions.  This type of close-to-real-time insight makes the understanding of decision impact, as well as its evolution, much more dynamic, giving the companies that can leverage it an edge in managing risk and benefiting from trends.

The acquisition is a good move for Teradata.  It also reinforces the following key trends:

  • Platform players continue acquiring young innovative companies which solve complex data, analytics and/or decision management problems. Just in this space, EMC bought Greenplum some time ago, IBM bought Netezza, HP bought Vertica…
    The consolidation trend will continue
  • The big data management and processing spaces are merging on unified platforms. There is less and less distinction between managing vast amounts of data and processing them to gain insight, generating more data on the fly.
  • Managing and processing unstructured data – which makes up most of big data – is becoming an integral part of what companies need to do to manage the decisions around their products and services.  And contrary to popular belief, this is as important in B2B as in B2C.
    This is also a consequence of the growing importance of the decision data that can be extracted from social data.  It will accelerate with Enterprise 2.0.
  • And finally, platform vendors are morphing into Cloud-based/backed SaaS providers, making tasks such as the ones enabled by Aster Data accessible at a low entry cost.

Exciting times. Congratulations to the Aster Data team.

This, of course, reduces the pool of independent big data management and processing products.  InfoBright and ParAccel come to mind – and HP, Dell and the like still need to move into this space.  Who wants to place bets?

Rules Fest Live: Alex Guazzelli / Follow your Rules but Listen to your Data



Alex presented a well-structured case for why decision applications should include knowledge extracted from data in addition to knowledge captured from experts.  Alex used a fish processing plant as the illustration for his presentation – an interesting choice.

If you have enough data relative to your problem available for processing, applying predictive analytics modeling enables you to extract knowledge in the form of risk scores, fitness scores, etc. that can then be leveraged by business rules.  To do so, you use standard predictive model building approaches, using tools such as the free ones provided by Project R, or one of the commercial ones.
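A minimal sketch of that pattern, in the spirit of Alex’s fish-plant example: a trained model produces a score, and expert-authored rules consume it.  The RiskModel interface, the thresholds and the outcomes are all hypothetical placeholders, not Zementis or Drools APIs.

```java
import java.util.Map;

public class ScoreThenDecide {
    // Stand-in for a deployed predictive model (e.g. one described in PMML).
    interface RiskModel {
        double score(Map<String, Object> features);
    }

    // Knowledge captured from experts, expressed as rules over the score.
    static String decide(RiskModel model, Map<String, Object> batchFeatures) {
        double risk = model.score(batchFeatures);   // knowledge extracted from data
        if (risk > 0.9) return "REJECT_BATCH";
        if (risk > 0.6) return "MANUAL_INSPECTION";
        return "ACCEPT_BATCH";
    }

    public static void main(String[] args) {
        RiskModel stub = features -> 0.72;          // fake score for illustration
        Map<String, Object> features = Map.of("temperature", 4.2);
        System.out.println(decide(stub, features)); // MANUAL_INSPECTION
    }
}
```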

However, one of the key issues you will face is how to deploy the models created through predictive analytics model building.  Alex referred to typical model update processes in financial services, where the friction in the deployment process results in release cycles of six months or more.  I am very familiar with those cases – as a disclosure, Alex and I both worked at HNC a few years ago.

Alex has been very active in the efforts to solve this deployment problem, and has contributed significantly to the creation and growth of the PMML standard.  PMML is an interesting effort – its first generations resulted in a well-supported model description format, but it suffered from the lack of support for expressing variables.  That was a key barrier to adoption, and one of the key difficulties we had to work on when designing and implementing the PMML support for Blaze Advisor.  That problem has been largely (and possibly completely – I have not checked the details) solved in the latest version of the standard, which is not yet supported by everybody but will be.

Alex wrote a book on PMML, PMML in Action, which does cover PMML 4.

Zementis, the company Alex works for, combines Drools with a PMML deployment capability, letting you create decision applications that combine rules capturing the expertise with models managing the risk, and that are deployed to the cloud.

Rules Standard: Yay or Nay?



I do not recall sitting on a panel at a conference and not getting “The Question”…

What do you think about a Rules Standard?

Good question…  Although my answer always remains the same: “Standards are incredibly useful and powerful but only when they address a real need from the user community”.

There have been zillions of standard initiatives in the world for all kinds of technologies but only those that were indeed solving a real problem became real standards.  There are typically two main ways to get to that point:

  1. Standard Body Route: a Standard body such as OMG (Object Management Group) or W3C (World Wide Web Consortium) or an Industry Standard body such as ACORD (Insurance) unites vendors and users to brainstorm on needed standard representations
  2. Vendor Route: either a major technology becomes a de-facto standard due to overwhelming adoption, or a group of leading vendors gets together to agree on a common representation

My strong conviction is that having a Standard for the sake of having a Standard never succeeds.  We need to focus on the objectives.

A little History…

Has anyone tried to come up with a Business Rules Standard?  If you have been hanging with the Business Rules crowd, you know that many attempts have taken place over time…  Let me tell you about a few of them…

A long time ago, over a decade ago, while J2EE was emerging and the JSRs were still furiously active, a group of companies dedicated time to work on a standard interface for rules engines.  It was called JSR94.  The idea was that you could bind different engines with no source code changes.  The same code could target vendor A’s libraries or vendor B’s with a simple change in configuration and compilation.
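For the curious, here is roughly what that vendor-neutral invocation looks like through the javax.rules API: a minimal sketch, assuming a vendor provider has been registered, where the provider class, provider URI and rule execution set name are all placeholders.

```java
import java.util.List;
import javax.rules.RuleRuntime;
import javax.rules.RuleServiceProvider;
import javax.rules.RuleServiceProviderManager;
import javax.rules.StatelessRuleSession;

public class Jsr94Client {
    public static void main(String[] args) throws Exception {
        // Loading the vendor's provider class registers it (hypothetical class name);
        // swapping vendors is meant to be just a configuration change here.
        Class.forName("com.vendorA.rules.RuleServiceProviderImpl");
        RuleServiceProvider provider =
            RuleServiceProviderManager.getRuleServiceProvider("com.vendorA.rules");

        RuleRuntime runtime = provider.getRuleRuntime();
        StatelessRuleSession session = (StatelessRuleSession)
            runtime.createRuleSession("urn:example:creditRules",   // placeholder rule set
                null, RuleRuntime.STATELESS_SESSION_TYPE);

        // The engine evaluates its rules against the input facts and returns results.
        List<?> results = session.executeRules(List.of(new Object()));
        session.release();
        System.out.println(results);
    }
}
```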

That effort successfully closed.  A standard was born.  The JSR94 checkbox appeared on countless Requests for Proposals (RFPs) from that day on, and I think it still shows up as a selection criterion here and there.  The reality, though, is that very few companies use it.  Why?

  • Because it limits what you get out of the vendor…  The major vendors have designed and promoted server-side architectures that wrap the rules engine in order to deliver more sophisticated services (transaction-safe on-the-fly rules upgrades, jukebox-style dynamic rules loading…).  Selecting the JSR94 APIs is a deliberate choice to give up both flexibility and performance optimizations that are typically sought after by architects!
  • Because it does not serve a realistic purpose…  Why would you switch vendors when your rules are all implemented in a proprietary format?  Changing the invocation source code was 2 lines of code anyway, granted, with a few imports to adjust…  Changing the rules…  That is a huge investment…

Service-Oriented Architecture (SOA) may very well be doing a much better job at solving that integration issue at the rules service level rather than at the rules engine level.  A web service is a web service is a web service…  As long as the contract exposed by the new web service is consistent with the original web service contract, it should be super easy to plug in one service instead of the other.  From that standpoint, JSR94 does not make much sense to me nowadays…  With Cloud Computing, this is pretty much history…
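To illustrate the point, a hedged sketch: if two decision services honor the same contract, switching vendors is a configuration change, and the calling code never moves.  The endpoint URL and JSON payload below are invented for the example.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DecisionServiceClient {
    public static void main(String[] args) throws Exception {
        // Point this at vendor A or vendor B; the calling code stays the same.
        String url = System.getProperty("decision.service.url",
            "https://vendor-a.example.com/decide");

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString("{\"amount\": 120.0}"))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());   // the decision, per the shared contract
    }
}
```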

If converting the rules appears to be the major piece of work, it seems logical that most Standard efforts revolved around a universal representation for those business rules.

The major issue we face here is the scope of the language.  Rules syntax varies from vendor to vendor.  As if that were not enough, the semantics vary as well.  Some vendors offer lots of expressivity, others are more constrained…  Natural Language does not fit easily into this vision of a universal language.

RuleML was an early attempt to define a rules language that could be used by any vendor.  It got close, at least in terms of publicity, but the focus on the “minimum set” made it very impractical.  Working for one of the major vendors at the time, we took a serious look at that initiative, but it fell short: most if not all real-life implementations leveraged constructs that were not supported by RuleML.

It was a crazy time back then…  So you may be able to find a few “Standards” that made their way into RFPs as such but were in fact the proprietary syntax of a single vendor…  Sneaky strategy…

Colleen McClintock, from ILOG, reached out to me a couple of times to “make a standard”.  Granted, if we had been able to agree on one, we could have swayed the whole industry…  At the time, our combined marketshare was around 70% of the overall market.  Those two efforts never went anywhere because we could not get any end-users to be vocal about the need, so the investment was never made.

More recently, OMG and W3C launched a few initiatives.  The OMG efforts turned into 2 distinct initiatives: SBVR and PRR.  The former focuses on the modeling-time representation and the latter on the executable business rules.  PRR eventually merged with the W3C RIF effort, which is more focused on the interchange aspect.  Those efforts, though, really lack the voice of the customers.  They are driven primarily by academics who care about artificial intelligence.  When sitting on those working groups, I was shocked that the simple notion of a rule entity did not make it in…  How could it be a rules standard?

I am not advocating that those people are not capable…  I worry, again with my Product Manager hat on, that this group may be too esoteric in its definitions and not down-to-earth enough…  Without end-users to remind the group what the end goal is — an interchange format for projects like “mine” — we may end up, yet again, with a standard for the sake of having a standard: something that collects dust on the shelf.

For what purpose?

This may be the Product Manager voice constantly whispering in my ear, but the truth is that I cannot conceive that design happens by chance.  If you know the WHAT, then the HOW will become much easier to figure out.

In the case of Business Rules (BRMS), I see a couple of reasons why we may want one Standard…

First, there is the “vendor lock-in” argument…  When selecting a technology vendor, you invest in licenses, of course, but you also turn your internal business knowledge into a proprietary format.  Having an interchange format helps reduce the switching cost from vendor to vendor.

Please keep in mind that there are two parts here:

  1. Seamless Integration:  keep in mind the JSR94 and SOA stories…
  2. Rules language:  This is where we do not yet have a good solution…

And again, what is your use case?  Do you worry about upgrading from an entry-level or open-source engine to a full-blown BRMS?  Or do you want the flexibility to swap your Ferrari for a Rolls Royce (or vice versa)?

The other argument I hear from time to time is to decouple the execution side from the authoring side.  My analogy here would be SQL…  Although you might be using Oracle or DB2 for execution, you may want to author your SQL code with a cool UI from any vendor.  Do not take the analogy too far…  I realize, of course, that SQL is a fantastic solution to the vendor lock-in problem as well.  Yes, it has a few specificities from vendor to vendor, but it is very portable.

The interchange format solution would add a layer of transformation between the Business Rules producer and the Business Rules consumer…  similar to the Predictive Model Markup Language (PMML) solutions that exist today…  Intuitively, it sounds like a standard storage representation would allow enterprises to capture their rules in their favorite tool and deploy them seamlessly to any execution engine, skipping the transformation stage and enabling a true universal business rules repository.  There is merit in that story, at least for management purposes (maintaining one source of truth, ensuring simultaneous deployment across systems, etc.).

How do we succeed?

With passion and determination!  Well, the key here is that the passion and determination need to be shared.  We are missing that level of engagement from the practitioners, from the true end-users.

I have made an open call to end-users when asked that very question at shows like Business Rules Forum or October Rules Fest.  As far as I know, no end-user has joined the effort at OMG…  Please chime in if you learn about anyone who did!

I met some people at BPM Summit who were interested in extending the standards work done on BPEL and BPMN to cover Business Rules, so that may be a direction for a Business Rules Standard.


Since some of you seem to be very interested in Standards, I will beg you again to be vocal.  You are welcome to comment here, of course, but we have also set up a thread on our brand-new community to discuss this very topic.  Feel free to join and express your opinion on whether you think we need one or not…  This is an open community dedicated to practitioners but also open to vendors (as long as you do not pitch your products blatantly): Sparkling Logic Community

Lessons learned in 2009 – Dealing with uncertainty


 


2009 was a rough year for business.  We had to deal with the aftermath of the 2008 economic crisis and start rebuilding.

In those times, I try to keep a French proverb in mind “À quelque chose malheur est bon”.  It literally means “Unhappiness is good for something”.  In the US, we would rather say “Every cloud has a silver lining”.  Given the emergence of Cloud technology as a consequence of the recession, it seems to be quite appropriate.

Capital vanished into thin air; confidence went down the drain as well.  Most businesses had to do more with less in those circumstances.  The industry focused on better managing finances to run operations.  Cloud computing, Software as a Service, that technology with many names, became attractive as a way to finance new projects or to shift Capital Expenditures into Operational Expenditures.  Less risk, faster execution.  Sounds logical.

How we got there, though, was to my mind a more interesting ounce of wisdom that, if well applied, could turn into real value.  Nassim Taleb captured it quite well in The Black Swan.  In addition to expected changes — regulations that are announced in advance, competitive product announcements, or market predictions — there are unexpected changes that need to be dealt with.  They may be totally unexpected, or events with a probability so low that nobody gets ready for them — the unknown unknowns.  Those will never be taken out of the equation.  Although we may be tempted to ignore them at times, doing so would only lead to incomplete, insufficient models, such as those that brought us where we are.

Learning to live with those unknowns, learning to better handle uncertainty, is a skill that became more appreciated in 2009 and that businesses will develop further in 2010 and beyond.  Our systems should help us make decisions that are not overly sensitive to market conditions we believe are here to stay.

Jim Sinur at Gartner is spot-on in his blog post about business agility.  Agility is, as we all know, required in our systems, but it is not enough.  New capabilities such as Decision Simulation and Decision Optimization are key.

I am predicting that 2010 will see a lot more emphasis on this topic.

