In this post, we present how Sparkling Logic continues its involvement in the DMN standard, through its graphical tool SMARTS Pencil, which business analysts use to model business decisions by drawing a diagram to form a decision process.
DMN, a bit of history
The Decision Model and Notation (DMN) was formally introduced by the Object Management Group (OMG) as a v1.0 specification in September 2015. Its goal was to provide a common notation understandable by all the members of a team modeling their organization’s decisions.
The notation is based on a simple set of shapes organized in a graph. This allows a top-level decision to be decomposed into smaller, simpler decisions whose results must be available before the top-level decision can be made. These sub-decisions can themselves be decomposed, and so on, until the model is complete. In addition, the implementation of the decisions can be provided, notably in the form of decision tables (also a very common means of representing rules).
The normalization of the graphical formalism (the DMN graph) and of the way the business logic is implemented (e.g., decision tables) allows teams to talk about their decisions, using diagrams with a limited set of shapes.
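The dependency structure this describes can be sketched in a few lines of Python. This is purely illustrative, not SMARTS or DMN syntax; the decisions and their requirements below are invented. A decision is evaluated only after all of its required sub-decisions have produced their results:

```python
# Illustrative sketch of a DMN-style requirement graph (names invented):
# a decision is evaluated only after the decisions it requires.

decisions = {
    "eligibility": {"requires": [], "logic": lambda r: True},
    "risk":        {"requires": [], "logic": lambda r: "low"},
    "approval":    {"requires": ["eligibility", "risk"],
                    "logic": lambda r: r["eligibility"] and r["risk"] == "low"},
}

def evaluate(name, results=None):
    """Recursively evaluate required sub-decisions, then the decision itself."""
    results = {} if results is None else results
    for dep in decisions[name]["requires"]:
        if dep not in results:
            evaluate(dep, results)
    results[name] = decisions[name]["logic"](results)
    return results[name]

print(evaluate("approval"))
```

Evaluating "approval" first forces "eligibility" and "risk" to be resolved, mirroring how requirements flow through a DMN graph.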
Sparkling Logic was one of the early vendors to provide a tool to edit (and execute) these decision models: Pencil Decision Modeler. It was released in January 2015, before the standard was officially approved.
Since then, the DMN standard has evolved significantly, adding new diagram elements, constructs, and language features while clarifying some of the existing notions. It is now at version 1.3. And we didn’t rest on our laurels either: in SMARTS Ushuaia, we made Pencil Decision Modeler a first-class feature of SMARTS and added full compliance with DMN 1.3! This post describes how SMARTS supports DMN 1.3.
Basics
DMN 1.3 still defines the building blocks which were in the original standard and which I mentioned in Talking about decisions.
As a recap:
- A Decision determines its output based on one or more inputs; these inputs may be provided by an input data element, or by another decision
- An input data is information used as input by one or more decisions, or by one or more knowledge sources
- A business knowledge model represents knowledge which is encapsulated, and which may be used by one or more decisions, or another business knowledge model. This knowledge may be anything which DMN does not understand (such as a machine learning algorithm, a neural network, etc.) or a DMN construct (called a “boxed expression”, see below)
- A knowledge source represents the authority for a decision, a business knowledge model, or another knowledge source: this is where the knowledge can be obtained (be it from a written transcription or from someone)
These blocks are organized in a graph and the links between them are called requirements.
What’s new in SMARTS’ DMN Support
More building blocks
In DMN 1.3, the following elements may also be added to a graph:
- A decision service exposes one or more decisions from a decision model as a reusable element (a service) which might be consumed internally or externally
- A group is used to group several DMN elements visually (with whatever semantics may be associated with the grouping)
- A text annotation is a shape which contains a label and can be attached to any DMN element
Custom types and variables
Input data, decision, and business knowledge model elements all have an associated variable of a given type (string, number, etc., or custom). A variable is a handle for accessing, from within a decision implementation, the value passed directly by an input data element or calculated by the implementation of a decision or a business knowledge model.
Custom types may be defined to group multiple properties under a single type name (with structure) or to allow variables which will hold multiple values (arrays).
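As an analogy only (this is Python, not DMN's FEEL type system, and the names are invented): a custom type with structure behaves like a record, and a variable holding multiple values behaves like a list:

```python
from dataclasses import dataclass
from typing import List

# Analogy only: a structured DMN custom type maps to a record/dataclass,
# and a multi-valued variable maps to a list. Names are invented.

@dataclass
class Applicant:                      # structured custom type
    name: str
    age: int

@dataclass
class Application:
    applicants: List[Applicant]       # variable holding multiple values

app = Application(applicants=[Applicant("Ada", 34), Applicant("Max", 29)])
print(len(app.applicants))
```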
Boxed Expressions
A few constructs are available to provide an implementation for a decision or a business knowledge model; they are termed boxed expressions since such expressions are shown in boxes which have a normalized representation. The following types of boxed expressions are available in DMN 1.3:
- Literal expression: this is a simple expression which can use the available variables to calculate a result
- Context: this is a set of entries, each combining a variable and a boxed expression. Each entry in the context can use the variables of the entries defined before it, which is like using “local variables” in some languages
- Decision table: this is a tabular representation where rows (called rules) provide the value of outputs (supplied in action columns), depending on the value of inputs (supplied in condition columns)
- Function: a function can be called using an invocation, by passing arguments to its parameters. The result of a function is the result of the execution of its body (which is an expression that can use the values of the passed parameters). A business knowledge model can only be implemented by a function
- Invocation: this is used to call a function by name, by passing values to the function’s parameters
- List: this is a collection of values calculated from each of the boxed expressions in the list
- Relation: this is a vertical list of horizontal contexts, each with the same entries
In addition to these, SMARTS defines an additional boxed expression, called the rule set. This is a set of named rules, where each rule is composed of a condition (an expression evaluating inputs) and an action (an expression providing some values to outputs).
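To make the decision-table construct concrete, here is a minimal first-hit evaluator sketched in Python. This is illustrative only, neither SMARTS nor FEEL syntax, and the risk rules below are invented:

```python
# Minimal sketch of first-hit decision-table evaluation (illustrative only).
# Each rule pairs condition columns (predicates) with action columns (outputs).

def evaluate_decision_table(rules, inputs):
    """Return the outputs of the first rule whose conditions all match."""
    for rule in rules:
        if all(cond(inputs) for cond in rule["conditions"]):
            return rule["outputs"]
    return None  # no rule matched

risk_table = [
    {"conditions": [lambda i: i["age"] < 25, lambda i: i["claims"] > 0],
     "outputs": {"risk": "high"}},
    {"conditions": [lambda i: i["age"] < 25],
     "outputs": {"risk": "medium"}},
    {"conditions": [lambda i: True],          # catch-all row
     "outputs": {"risk": "low"}},
]

print(evaluate_decision_table(risk_table, {"age": 22, "claims": 1}))
```

Real DMN decision tables also define other hit policies (e.g. collecting all matching rows); first-hit is just the simplest one to sketch.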
Helping Industry Adoption
With SMARTS Ushuaia, decision models are first-class citizens. Full compliance with DMN 1.3 means that all the DMN elements and boxed expressions, as well as the ability to interchange diagrams with other tools, are part of the package.
As usual, any model can be tested and executed in the same context as your SMARTS decision: a decision is never made in isolation, and a model is never used in isolation either. And of course, you will benefit from the great tooling we provide.
Finally, we at Sparkling Logic strongly believe that decision management technologies should be put in the hands of all business analysts. This is why we are part of the DMN On-Ramp Group, whose mission is to provide a checklist to help customers find the DMN tool that suits their needs, educate and raise awareness about DMN, and help with DMN compliance. For a great presentation of the group, check it out here.
About
Sparkling Logic is a Silicon Valley company dedicated to helping businesses automate and improve the quality of their operational decisions with a powerful digital decisioning platform, accessible to business analysts and ‘citizen developers’. Sparkling Logic’s customers include global leaders in financial services, insurance, healthcare, retail, utility, and IoT.
Sparkling Logic SMARTS™ (SMARTS for short) is a cloud-based, low-code, AI-powered business decision management platform that unifies authoring, testing, deployment, and maintenance of operational decisions. SMARTS combines the highly scalable Rete-NT inference engine with predictive analytics, machine learning models, and low-code functionality to create intelligent decisioning systems.
Marc Lerman is VP of User Experience at Sparkling Logic. You can reach him at firstname.lastname@example.org.
If you envision modernizing or building a credit origination system, an insurance underwriting application, a rating engine, or a product configurator, Sparkling Logic can help. Our SMARTS digital decisioning platform automates decisions by reducing manual processing, accelerating processing time, increasing consistency, and freeing expert resources to focus on new initiatives. SMARTS also improves decisions by reducing risk and increasing profitability.
In this post, I briefly introduce SMARTS Real-Time Decision Analytics capability to manage the quality and performance of operational decisions from development, to testing, to production.
Decision performance
H. James Harrington, one of the pioneers of decision performance measurement, once said, “Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.” This statement is also true for decision performance.
Measuring decision performance is essential in any industry where a small improvement in a single decision can make a big difference, especially in risk-driven industries such as banking, insurance, and healthcare. Improving decisions in these sectors means continuously adjusting policies, rules, prices, etc. to keep them consistent with business strategy and compliant with regulations.
Decision performance management in SMARTS
SMARTS helps organizations make their operational decisions explicit, so that they can be tested and simulated before implementation — thereby reducing errors and bias. To this end, we added a real-time decision analytics capability to the core decision management platform.
Currently used in financial services and insurance, it helps both business analysts and business users define dashboards, assess alternative decision strategies, and measure the quality of performance at all stages of the decision management lifecycle, all within the same interface, without switching from one tool to another.
Development. From the start, SMARTS focuses the decision automation effort on tangible business objectives, measured by Key Performance Indicators (KPIs). Analysts and users can define multiple KPIs through graphical interactions and simple, yet powerful formulas. As they capture their decision logic, simply dragging and dropping any attribute into the dashboard pane automatically creates reports. They can customize these distributions, aggregations, and/or rule metrics, as well as the charts to view the results in the dashboard.
Testing and validation. During the testing phase, analysts and users have access to SMARTS’ built-in map-reduce-based simulation environment to measure these metrics against large samples of data. In doing so, they can estimate the KPIs for impact analysis before the actual deployment. And none of this testing work requires IT to code the metrics, because SMARTS translates them transparently.
Execution. By defining a time window for these metrics, business analysts can deploy them seamlessly against production traffic. Real-time decision analytics charts display the measurements and trigger notifications and alerts when certain thresholds are crossed or certain patterns are detected. Notifications can be pushed by email, or generate a ticket in a corporate management system. Also, real-time monitoring allows organizations to react quickly when conditions suddenly change. For example, under-performing strategies can be eliminated and replaced when running a Champion/Challenger experiment.
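The threshold-and-alert mechanism described above can be sketched as follows. This is an illustrative Python sketch, not SMARTS code: a KPI is computed over a sliding window of recent decisions, and a notification fires when it crosses a threshold (the metric, window size, and threshold are invented):

```python
from collections import deque

# Illustrative sketch (not SMARTS code): a rate-style KPI measured over a
# sliding window of decisions, with an alert when a threshold is crossed.

class WindowedRate:
    def __init__(self, window_size, threshold, notify):
        self.events = deque(maxlen=window_size)   # last N decision outcomes
        self.threshold = threshold
        self.notify = notify                      # e.g. email / ticket hook

    def record(self, flagged: bool):
        self.events.append(flagged)
        rate = sum(self.events) / len(self.events)
        if rate > self.threshold:
            self.notify(f"alert: rate {rate:.0%} exceeds {self.threshold:.0%}")
        return rate

alerts = []
monitor = WindowedRate(window_size=100, threshold=0.2, notify=alerts.append)
for flagged in [False] * 70 + [True] * 30:        # 30% of decisions flagged
    monitor.record(flagged)
print(alerts[-1])
```

In a real deployment, `notify` would be wired to email or a corporate ticketing system rather than a list.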
Use cases
Insurance underwriting. Using insurance underwriting as an example, a risk analyst can look at the applicants approved by the rules in production and compare them to the applicants that would be approved by the rules under development. Analyzing the differences between the two sets of results drives the discovery of which rules are missing or need to be adjusted to produce better results or mitigate certain risks.
For example, he or she might discover that 25% of the differences in approval status are due to differences in risk level. This insight leads the risk analyst to focus on adding and/or modifying the risk-related rules. Repeating this analyze-improve cycle reduces the time needed to consider and test different rules until the best tradeoff between results and risks is reached.
Fraud detection. Another example, from a real customer case, is flash fraud, where decisions had to be changed and new ones rolled out in real time. In this case, the real-time decision analytics capability of SMARTS was essential: the customer could spot trends deviating from the normal situation directly in the dashboard and stop the flood in the same user interface, all in real time.
Without this built-in capability, the time lag between the identification of fraud and the implementation of corrective actions would have been long, resulting in significant financial losses. In fact, with SMARTS Real-Time Decision Analytics, the fraud management for this client has gone from 15 days to 1 day.
Marketing campaign. The two examples above come from financial services, but SMARTS real-time decision analytics helps in any context where decision performance can be immediately affected by a change in data, models, or business rules, such as loan origination, product pricing, or marketing promotions.
In the latter case, SMARTS can help optimize promotions in real time. Let’s say you construct a series of rules for a marketing coupon campaign using the SMARTS Champion/Challenger capability. Based on rules you determine, certain customers get a discount. Some get 15% off (the current offering, the champion), while others get 20% (a test offering, the challenger). You wonder whether the extra 5% discount leads to more coupons used and more sales generated. With the SMARTS real-time decision analytics environment, you find out as the day progresses. By testing alternatives, you converge on the best coupon strategy with real data, on the fly.
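The mechanics of such an experiment can be sketched in a few lines. This is an illustrative Python sketch, not SMARTS code; the traffic split, metric, and sample numbers are invented. A share of customers is deterministically routed to the challenger, and a redemption-rate metric is tracked per variant:

```python
import random

# Illustrative champion/challenger sketch (not SMARTS code): route a share
# of traffic to the challenger offer and compare a redemption-rate metric.

def assign_variant(customer_id, challenger_share=0.2):
    """Deterministic split so a customer always sees the same offer."""
    random.seed(customer_id)                 # a hash would be used in practice
    return "challenger" if random.random() < challenger_share else "champion"

results = {"champion": [0, 0], "challenger": [0, 0]}   # [offers, redemptions]

def record(variant, redeemed):
    results[variant][0] += 1
    results[variant][1] += int(redeemed)

def redemption_rate(variant):
    offers, redeemed = results[variant]
    return redeemed / offers if offers else 0.0

# Invented sample traffic:
record("champion", False); record("champion", True)
record("challenger", True); record("challenger", True)
print(redemption_rate("champion"), redemption_rate("challenger"))
```

Watching the two rates diverge during the day is what lets you retire the losing offer on the fly.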
Conclusion
As part of the decision lifecycle, business analysts obviously start by authoring their decision logic. As they progress, testing rapidly comes to the forefront. To this end, SMARTS integrates predictive data analytics with real-time decision analytics, enabling business analysts and business users to define dashboards and seamlessly associate metrics with the execution environment — using the same tool, the same interface, and just point and click.
- SMARTS comes with built-in decision analytics — no additional or third-party tool is required
- You can define metrics on decision results so you can measure and understand how each decision contributes to your organization’s business objectives
- Decision metrics enable you to assess alternative decision strategies to see which should be kept and which rejected
- SMARTS add-on for real-time decision analytics lets you monitor the decisions being made and make adjustments on the fly
- SMARTS’ real-time decision analytics helps in any context where decision performance could be immediately affected by a change in data, models, or business rules
About
Sparkling Logic is a decision management company founded in the San Francisco Bay Area to accelerate how companies leverage data, machine learning, and business rules to automate and improve the quality of enterprise-level decisions.
Sparkling Logic SMARTS is an end-to-end, low-code no-code decision management platform that spans the entire business decision lifecycle, from data import to decision modeling to application production.
Carlos Serrano is Co-Founder, Chief Technology Officer at Sparkling Logic. You can reach him at email@example.com.
Low-code no-code is not a new concept to Sparkling Logic. From the beginning, the founders wanted to deliver a powerful yet simple product, so that a business analyst could start with data and build decision logic with built-in predictive data analytics and execution decision analytics.
Version after version, they have achieved this vision through SMARTS, an end-to-end decision management platform that features low-code no-code for business analysts and business users to develop and manage decision logic through point-and-click operations.
For business analysts, SMARTS offers a low-code development environment in which users can express decision logic through a point-and-click user interface to connect data, experiment with decisions, and monitor execution, without switching between different tools to get the job done. Depending on the nature of the decision logic at hand and user preferences, business analysts can choose on the fly the most appropriate representation to capture or update their decision logic. The resulting decision logic is seamlessly deployed as a decision service without IT intervention.
To push the simplification even further, Sparkling Logic founders turned to their customers for inspiration on their needs and developed three complementary technologies:
- RedPen, a patented point-and-click technology that accelerates rule authoring without a need to know rule syntax or involve IT to author the rules
- BluePen, another patented point-and-click technology to quickly create or leverage a data model and put it into production without involving data scientists or IT
- A dynamic questionnaire to produce intelligent front-ends that reduce the number of unnecessary or redundant questions
In addition to low-code development capability for business analysts, SMARTS also elevates the decision logic to a simple web form-based interface for untrained business users. They can configure their decision strategies, test the updated decision logic, and promote the vetted changes to the next staging environment — without learning rules syntax.
These business apps offer a business abstraction for most tasks available in SMARTS related to configuration, testing and simulation, invocation and deployment management, as well as administration.
For example, credit risk specialists can configure loans, credit cards, and other banking products, and pricing specialists can control pricing tables, through a custom business app specific to their industry. The no-code business app enables business users to cope with environment changes whether they are related to internal policies, competition pressure, or industry regulation.
Furthermore, SMARTS tasks can also be automated through orchestration scripts. Business users can trigger these scripts through the click of a button, or schedule them to be performed automatically and seamlessly.
In our last post we discussed Decision Engine Performance and how SMARTS provides different engines that are optimized to their specific application. In this post we will cover how deployment architecture choices impact performance.
SMARTS provides higher-level support for implementing your decision management system than a bare decision engine. In particular, it supports building decision services for micro-service architectures as well as other service approaches.
Delivered as either repository-based or decision-based Docker containers or virtual appliances, SMARTS decision services add the following to the decision engine, among other features:
Support for Secure Service Invocations (typically JSON over HTTPS)
Decisions may be invoked through services in an authenticated (access-token) context. Many users may invoke the service concurrently, using any client technology that can interact with services. Sparkling Logic provides Java, .NET Standard, Python 3, and NodeJS SDKs to facilitate the client implementation.
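The shape of such an invocation can be sketched as follows. This is a hypothetical Python sketch, not the actual SMARTS API: the URL path, token placeholder, and payload fields are all invented, and the snippet only assembles the call rather than performing it:

```python
import json

# Hypothetical sketch of an authenticated JSON-over-HTTPS decision-service
# call. The URL path, token, and payload fields are invented, not SMARTS API.

def build_invocation(base_url, token, payload):
    """Assemble the URL, headers, and JSON body of a decision-service call."""
    return {
        "url": f"{base_url}/decision-services/loan-approval/invoke",
        "headers": {
            "Authorization": f"Bearer {token}",    # access token
            "Content-Type": "application/json",
        },
        "body": json.dumps(payload),
    }

call = build_invocation("https://decisions.example.com", "<access-token>",
                        {"applicant": {"age": 34, "income": 52000}})
# An HTTP client would then perform the request, e.g.:
#   requests.post(call["url"], headers=call["headers"], data=call["body"])
print(call["url"])
```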
Support for Horizontal and Vertical Scalability
Decision engines executing within the decision service will leverage all cores available to them within the installation. Adding more cores results in the ability for the engine to support more concurrent executions, and the scalability is typically linear.
You may also deploy multiple microservice installations behind a load balancer. You will typically do that using an orchestration technology and leveraging the load balancer technologies available within your environment (on-premise or cloud). SMARTS allows you to have multiple instances leveraging their own replicated repositories or leveraging an external repository, all implementing the same set of services. These instances are deployed behind a load balancer and provide you with scalability by adding more instances. You may also add on-demand instances (with replicated or delegated repositories) to cope with elastic loads. SMARTS automates the whole process of keeping all those services in sync and updating them as you change the decision logic.
Support for High Availability
SMARTS also allows you to have redundancy in your decision services. Having multiple instances with replicated repositories removes single points of failure: if an instance is taken out, the remaining replicated instances continue to carry the load.
Support for No-downtime Hot Swap of Decision Logic with Full Traceability
SMARTS provides multiple levels at which you can swap decision logic. At the highest level, your lifecycle manager, without any IT intervention, can change the release of the decision logic being executed with one click, and no downtime. SMARTS will load the new release and hot swap it atomically if there is no problem. You can configure the strategy to take in case the new release is not loadable or has compilation problems: continue using the previous one and notify, stop providing the service, etc. Of course, you can also hot swap your decision logic using other orchestration mechanisms, but those tend to involve IT.
Support for Ready-To-Execute Decision Logic
SMARTS allows you to specify when a decision service is declared to be ready to receive invocations. Typically, you want that to be the case only when the service is actually loaded, compiled, and cached in memory, so that the first invocations hitting it do not pay the price of an update.
In addition to providing support for all these performance related features, SMARTS does it all in a secure and auditable way. Decision services are configured to use read-only project releases, and the information of what release is used on any service invocation is returned to the invoker.
Finally, you should also focus on the business decision performance. We’ll discuss that topic in our next blog post.
A key benefit of using a Decision Management System is to allow the life-cycle of automated decisions to be fully managed by the enterprise.
When the decision logic remains in the application code, it becomes difficult to separate access to decision logic code from the rest. For example, reading through pages of commit comments to find the ones relevant to the decision is close to impossible. And so is ensuring that only resources with the right roles can modify the logic.
Clearly, this leads to the same situation you would be in if your business data were totally immersed in the application code. You would not do that for your business data; you should not do it for your business decision logic, for exactly the same reasons.
Decision Management Systems separate the decision logic from the rest of the code. Thus, you get the immense benefit of being able to update the decision logic according to the business needs. But the real benefit comes when you combine that with authentication and access control:
- you can control who has access to what decision logic asset, and for what purpose
- and you can trace who did what to which asset, when and why
Of course, a lot of what is written here applies to other systems than Decision Management Systems. But this is particularly important in this case.
Roles and access control
The very first thing to consider is how to control who has access to what in the DMS. This is access control — but note that we also use authorization as an equivalent term.
In general, one thinks of access control in terms of roles and assets. Roles characterize how a person interacts with the assets in the system.
And the challenge is that there are many roles involved in interacting with your automated decision logic. The same physical person may fill many roles, but those are different roles: they use the decision management system in different ways. In other words, these different roles have access to different operations on different sets of decision logic assets.
Base roles and access control needs
Typically, and this is of course not the only way of splitting them, you will have roles such as the following:
- Administrator
The administrator role administers the system but is rarely involved in anything else. In general, IT or operations resources hold this role.
- Decision definer
The decision definer role is a main user role: this role is responsible for managing the requirements for the automated decision and its expected business performance. Typically, business owners and business analysts are assigned this role.
- Decision implementer
The decision implementer role is the other main user role: this role designs, implements, tests and optimizes decisions. Generally, business analysts, data analysts or scientists, decision owners, and sometimes business-savvy IT resources are given this role.
- Decision tester
The decision tester role is involved in business testing of the decisions: validating they really do fit what the business needs. Usually, business analysts, data analysts and business owners fill this role.
- Life-cycle manager
The life-cycle manager role is responsible for ensuring that enterprise-compliant processes are followed as the decision logic assets go from requirements to implementation to deployment and retirement.
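A minimal sketch of how such role-based checks work in practice is shown below. This is illustrative Python, not the API of any specific DMS; the role names follow the list above, but the asset classes and operations are invented. A user may hold several roles, and access is the union of their grants:

```python
# Illustrative role-based access control (not a specific DMS API).
# Each role grants (asset-class, operation) pairs; asset names are invented.

PERMISSIONS = {
    "decision_definer":     {("requirements", "edit"), ("requirements", "view")},
    "decision_implementer": {("rules", "edit"), ("rules", "view"),
                             ("requirements", "view")},
    "decision_tester":      {("rules", "view"), ("tests", "run")},
}

def can(roles, asset, operation):
    """A user may hold several roles; access is the union of their grants."""
    return any((asset, operation) in PERMISSIONS.get(r, set()) for r in roles)

print(can(["decision_tester"], "rules", "edit"))   # testers cannot edit rules
```

Finer-grained schemes (e.g. restricting a role to one specific rate table) replace the asset class with individual asset identifiers in the same structure.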
More advanced needs
There may be many other roles, and the key is to realize that how the enterprise does business impacts what these roles may be. For example, our company has a number of enterprise customers who have two types of decision implementer roles:
- General decision implementer: designs, implements the structure of the decision and many parts of it, tests and optimizes it
- Restricted decision implementer: designs and implements only parts of the decision — groups of rules, or models
The details on what the second role can design and implement may vary from project to project, etc.
Many other such roles may be defined: for example, those who can modify anything except the contract between the automated decision and the application that invokes it.
It gets more complicated: you may also need to account for the fact that only specific roles can manage certain specific assets. For example, you may have a decision that incorporates a rate computation table that only a few resources can see, although it is part of what the system manages and executes.
Requirements for the Decision Management System
Given all this, the expectation is that the DMS support directly, or through an integration with the enterprise systems, the following:
- Role-based access control to the decision logic assets
- The ability to define custom roles to fit the needs of the enterprise and how it conducts its business
- The ability to have roles that control access to specific operations on specific decision logic assets
This can be achieved in a few ways. In general:
- If all decision assets are in a system which is also managed by the enterprise authentication and access control system: you can directly leverage it
- And if that is not the case: you delegate authentication and basic access control to the enterprise authentication and access control system, and manage the finer-grained access control in the DMS, tied to the external authentication
Of course, roles are attached to a user, and in order to guarantee that the user is the right one, you will be using an authentication system. There is a vast number of such systems in the enterprise, and they play a central role in securing the assets the enterprise deals with.
The principle is that for each user that needs to have access to your enterprise systems, you will have an entry in your authentication system. Thus, the authentication system will ensure the user is who the user claims, and apply all the policies the enterprise wants to apply: two-factor authentication, challenges, password changes, etc. Furthermore, it will also control when the user has access to the systems.
This means that all systems need to make sure a central system carries out all authentications. And this includes the Decision Management System, of course. For example:
- The DMS is only accessible through another application that does the proper authentication
- Or it delegates the authentication to the enterprise authentication system
The second approach is more common in a services world with low coupling.
Requirements for the Decision Management System
The expectation is that the DMS will:
- Delegate its authentication to the enterprise authentication and access control systems
- Or use the authentication information provided by an encapsulating service
Vendors in this space have the challenge that in the enterprise world there are many authentication systems, each with potentially more than one protocol. Just in terms of protocols, enterprises use:
- OpenID Connect
- and more
Additionally, enterprises are interested in keeping a close trace of who does what and when in the Decision Management System. Of course, authentication, and the fact that users always operate within the context of an authenticated session, largely enables them to do so.
But this is not just a question of change log: you also want to know who has been active, who has exported and imported assets, who has generated reports, who has triggered long simulations, etc.
Furthermore, there are three types of usages for these traces:
- Situational awareness: you want to know what has been done recently and why
- Exception handling: you want to be alerted if a certain role or user carries out a certain operation. For example, when somebody updates a decision in production.
- Forensics: you are looking for a particular set of operations and want to know when, who and why. For example, for compliance verification reasons.
A persisted and query-able activity stream supports the first type of usage. Integration with the enterprise log management and communication management systems supports the other two.
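The forensics-style queries described above boil down to filtering a stream of who-did-what-when events. The sketch below is illustrative Python with an invented event schema, not the storage or query API of any particular system:

```python
from datetime import datetime

# Illustrative query-able activity stream (schema invented): each event
# records who did what to which asset, and when.

events = [
    {"user": "alice", "action": "edit",   "asset": "pricing-rules",
     "time": datetime(2021, 3, 1, 10, 0)},
    {"user": "bob",   "action": "deploy", "asset": "pricing-rules",
     "time": datetime(2021, 3, 1, 11, 30)},
    {"user": "alice", "action": "export", "asset": "risk-model",
     "time": datetime(2021, 3, 2, 9, 15)},
]

def query(stream, user=None, action=None, since=None):
    """Forensics-style filter: who did what, and when."""
    return [e for e in stream
            if (user is None or e["user"] == user)
            and (action is None or e["action"] == action)
            and (since is None or e["time"] >= since)]

print(len(query(events, user="alice")))   # all of alice's activity
```

Exception handling (the second usage) is the same filter run continuously, with matches pushed as alerts instead of returned as a list.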
Requirements for the Decision Management System
The expectation is that the DMS will:
- Provide an activity stream users can browse through and query
- And support an integration with the enterprise systems that log activity
- And provide an integration with the enterprise systems that communicate alerts
There are many more details related to these authentication, access control, and trace integrations. One interesting trend is the move toward taking all of these into account from the beginning, as IT infrastructure moves to the models common in the cloud, even on-premise.
This blog is part of the Technical Series, stay tuned for more!
It has been quite a while since Carlos and I blogged about our hikes. A quick blog on the topic is long overdue. You can’t blame us though for being more passionate about technology!
If you grew up watching lots of westerns like we did, the view of the hanging tree is likely to bring back a lot of memories. It makes the place a little more dramatic under the hot summer sun (which was long overdue too). People used to throw rocks at the tree as a symbol of disgust for the despicable crimes committed. As a Decision Management person, my mind contemplated the fast decisions made here and the lack of process that surely led to some mistakes.
Many tunnels and shafts spring up here and there. The San Cristobal tunnel is worth stopping by. You won’t be able to follow the old track very far, but that is certainly enough to imagine the boring yet nerve-racking days that constituted the miners’ daily life. Their version of fun back then was competing on their drilling abilities on a boulder brought there, I believe, from the Sierra Nevada. It is interesting that they went through so much trouble only to have a reliable point of comparison for their mining skills. Something to inspire us on being Performance-Driven, I suppose.
The top of the hill overlooks San Jose. Back in those days, it was a small downtown next to the Santa Clara mission, surrounded by miles and miles of orchards. How I wish I could see that spring panorama of fruit blossoms. It must have been absolutely gorgeous!
This week, we finally found a park with some elevation. We enjoyed a beautiful day at Alum Rock Park. Weather was gorgeous. Ideas were sparkling. View was fantastic.
The first picture was taken from the parking lot. We started with a hike to the rock up there. It was a little steep for a change. Not a stroll around the lake! It is not as far as it looks though. It was a quick walk up to a gorgeous view.
Straight ahead we could see downtown San Jose in the valley. That day was very nice, and hot, but due to the rain we have been having this year, it was a bit foggy.
The contrast of the green hills and the blueish bay was stunning. When you look around, you have a hard time believing that you are only minutes away from the legendary worldwide technology headquarters.
We were not alone on the trail. Besides other hikers and bikers, we met a fair number of ground squirrels and a flock of raptors. I am not sure if they were hawks or vultures; we only saw them from a distance. As we walked, though, they flew constantly above our heads, closer than I am used to. The big, wide shadows were quite impressive!
The park is very extensive with loads of trails. We decided to follow the North Rim Trail to the ancient mineral springs.
The geography of the area is conducive to charging the water with at least 7 different minerals. As a result, some of the springs are salty, carbonated, or sulfurous — yes, that would be smelly! I did not study the geology too much, but I found it interesting that the park used to be a beach… a long, long time ago…
The park opened in the late 1800s as a nationally renowned health spa. The 27 naturally enriched springs were used to fill baths and an indoor pool.
Fancy baths and grottoes were constructed back then using rocks from the park itself. They make a fancy decor for professional pictures. The area is absolutely gorgeous.
If you prefer less "constructed" parks, just keep walking down the path… The stream may not be as wild as it was right after the heavy rain we've had recently, but when we hiked, it was definitely neat, with many rapids along the way.
You can tell that we enjoyed that park a lot. I will definitely come back with the boys. For pictures or just to go wild on the playground, they would love it.
Fresh air does wonders on an open mind. It is amazing how much work, thinking, and analysis you can get done when you get out of your office or, in our case, just out of the house.
I wish they still had those sparkling baths, especially the ones that were a constant 98 °F… Although, it is said that a Midwesterner visiting the springs in the 1890s sent a postcard home with the message that he was sure he'd experienced a taste of purgatory!
So far, we have had a very mild winter but a lot of rain. This is obviously good for a state that has experienced at least 3 years of drought. Good for the land, but not so good for our hikes…
We had a hard time scheduling one that week. We had to plan for an outing over the weekend instead of our traditional weekday. That also meant having one of my sons with us, and therefore little real hiking.
So we opted for a local park in San Jose that educates city kids on the basics of a farm. Kids can feed all kinds of farm animals or learn about compost.
The Emma Prusch Farm Park is a surprisingly neat little park on King St, just at the 101 and 280 intersection.
Carlos and I had a lazy hike, to say the least, but a fun time running after the roosters with Lucas. We strolled through the small animals area and mostly spent time in the playground. It was very crowded with kids and families, but quite pleasant.
Definitely not the typical Thinking Outside the Cube outing!