Better, Faster, Cheaper Part II - The Decision Model Meets Data Quality Head On


It has been almost a year and a half since I published my experiences from my first decision modeling project in a chapter titled “Better, Cheaper, Faster” in the book The Decision Model: A Business Logic Framework Linking Business and Technology by Barbara von Halle and Larry Goldberg.  Since then, I have completed eleven more decision modeling projects and pressure-tested and stretched The Decision Model in each project in ways that I could not have originally imagined.  I can confidently report that The Decision Model principles have held up well in the toughest projects.

In this article, I explain a project completed in the financial services industry.  A client asked me to lead a project to redesign a failed sub-process that had resulted in billions of dollars of backed up financial transactions.  This particular financial process had a history of failed and abandoned process improvement projects.  The pressure was on and, I must confess, I was not entirely sure that The Decision Model would be a good fit.

Before presenting the details of this project, I want to report that the results were impressive!  The team that I led completed this project in ten weeks.  We reduced the processing of 200 “transactions with issues” from 90 hours to 3 minutes and 30 seconds.  We reduced the processing time of “transactions with no issues” from an average of 30 hours to 3 minutes and 30 seconds.  The ten weeks included discovering and modeling the old process, decision modeling, process redesign, system development, and testing.  This is pretty impressive considering that I initially was not entirely sure I could help them!

Therefore, let me start with an overview of the project and how I used The Decision Model to achieve these results. From there, I drill into the results a little deeper.

Project Approach and Steps
In the first step of the project, I led the team through documenting the business motivations for the project and aligning those motivations with business performance metrics; these metrics would measure our success.  For this, we used parts of the Business Motivation Model, an Object Management Group (OMG) specification addressing the practice of business planning.  We used it to manage the project scope from a business perspective, keeping the team focused only on tasks that directly related to achieving these business goals.

In the next step of the project, I led the team through discovering the current process and the decisions buried in it.  As is typical on other projects, the “as-is” process had been documented in a recently created eighty-page document that was also used to train new staff.  The team quickly discovered that the document was difficult to understand and ambiguous, contained errors, and was missing information.  I find that most companies document their processes in a similar fashion.

No one is really at fault.  After all, it took decades to understand the value of separating different dimensions, such as separating data from process.  It also took decades to understand that each separated dimension is best represented in its own unique way, unbiased by the other dimensions.  Prior to The Decision Model, we did not have a unique way to separate decisions from process as their own standalone deliverable.

In spite of the existing documentation, the team quickly created a high-level process model and discovered the decisions in the process.  We separated the process from the decisions, casting each decision into its own decision model.  Separating the decisions from the process model greatly simplified the process model.  Figure 1 contains a generic version of the resulting high-level to-be process model.

Figure 1: High-Level To-Be Process Model with The Decision Model


The new process model illustrates the important improvement The Decision Model brings to existing process improvement methodologies.  I’m a big fan of process improvement methodologies such as LEAN, Six Sigma, and other proven approaches, and I have used them to improve and redesign many processes.  However, until I learned about The Decision Model, process redesign and improvement were not as clear and straightforward as I was led to believe.  This is because, when decisions are separated from the process, as Larry Goldberg has observed to me (only somewhat tongue in cheek!), a process principally exists to collect data to make decisions and take appropriate action!  With this in mind, I always ask a simple question: at what point in a process is it most valuable to make each decision?  This question reveals a level of separation and abstraction that makes process improvement and redesign much easier to envision and more intuitive.  The tangible separation of process and decisions takes process improvement and reengineering to a new level.

In the next step, the team completed the business logic in each decision model.  No one on the team had previous decision modeling experience, with the exception of one team member certified in The Decision Model.  I introduced the team to decision modeling by giving them a short primer.  The team then dove directly into decision modeling.  All of the business SMEs quickly grasped The Decision Model and, almost immediately, became our main project contributors and evangelists. 

Confession about The Decision Model
In my introduction, I confessed that I was originally not sure that The Decision Model would be a good fit for this project.  The reason for my hesitation was that this was primarily a data quality project.  The Decision Model was originally advertised as an unlikely fit for data quality projects (in fact, von Halle and Goldberg say as much in their book!).  However, during a previous engagement, I was assigned the task of determining how The Decision Model could be used to solve some tough data quality challenges.  After careful evaluation, I concluded that The Decision Model is an excellent technique for documenting and managing data quality logic.  I wrote a paper documenting a data quality framework using The Decision Model, and portions of this framework were presented at the MIT 2010 Information Quality Industry Symposium.  A more complete version of the framework is in Table 1.

Table 1: Data Quality Framework

For each data quality dimension below, a definition is followed by the repository of the data quality logic (the Glossary, Decision Model Views, or both).

1. Completeness

The determination that no additional data is needed to determine whether a Single Family Loan is eligible for settlement.

Completeness applies to fact types that are always required, those that are sometimes required (depending on circumstances), and those whose population is irrelevant.

  • The Glossary will indicate, for each fact type, whether it is always required or always optional for the general context of a business process (applies to all circumstances regardless of process context)

  • Decision Model Views will indicate the logic by which a fact type is sometimes required or irrelevant (i.e., those circumstances are tested in the logic)

2. Data Type

The determination that a fact value conforms to the predefined data type for its fact type (for example: date, integer, string, Boolean).

  • The Glossary will indicate, for each fact type, its data type

3. Domain Value

The determination that a fact value falls within the fact type's valid range.

  • The Glossary will indicate, for each fact type, its valid values

  • Decision Model Views will indicate the logic by which the domain for a fact type is further restricted for a specific process context to be compliant with Policy

4. Consistency

The determination that a fact value makes business sense in the context of the fact values of related fact types.

  • Decision Model Views will provide the logic to determine consistency of values for related fact types.

5. Reasonableness

The determination that a fact value conforms to predefined reasonability limits.

  • Decision Model Views will provide the logic to determine whether a fact value is within sensible limits.

6. Accuracy

The determination that a fact value (regardless of the source) approaches its true value. Accuracy is typically classified into the following categories:

  1. Accuracy to authoritative source: A measure of the degree to which data agrees with an original, acknowledged authoritative source of data about a real world object or event, such as a form, document, or unaltered electronic data from inside or outside the organization.
  2. Accuracy to real-world data (reality): A characteristic of information quality measuring the degree to which a data value (or set of data values) correctly represents the attributes of the real-world object, such as past or real-time experience.
  • Decision Model Views will indicate the logic by which a fact value is determined to be as accurate as is necessary.
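
To make the framework concrete, the first three dimensions can be sketched as executable checks against glossary metadata.  The sketch below is purely illustrative (the project itself kept this logic in Rule Families built in Excel and ran it in a Java rules engine); the glossary entries, fact type names, and function are hypothetical.

```python
# Hypothetical glossary: for each fact type, whether it is always required,
# its expected data type, and its valid domain values (None = unrestricted).
GLOSSARY = {
    "loan_amount":     {"required": True,  "type": float, "domain": None},
    "property_type":   {"required": True,  "type": str,
                        "domain": {"Single Family", "Condo", "Multi Family"}},
    "coborrower_name": {"required": False, "type": str, "domain": None},
}

def check_fact_values(facts):
    """Return the data quality issues found in a transaction's fact values."""
    issues = []
    for name, meta in GLOSSARY.items():
        value = facts.get(name)
        # 1. Completeness: always-required fact types must be populated.
        if value is None:
            if meta["required"]:
                issues.append(f"{name}: missing required value")
            continue
        # 2. Data Type: the fact value must conform to the predefined type.
        if not isinstance(value, meta["type"]):
            issues.append(f"{name}: expected {meta['type'].__name__}")
            continue
        # 3. Domain Value: the fact value must fall within the valid range.
        if meta["domain"] is not None and value not in meta["domain"]:
            issues.append(f"{name}: '{value}' is not in the valid domain")
    return issues
```

The remaining dimensions (Consistency, Reasonableness, Accuracy) involve relationships between fact types and external sources, which is why the framework assigns their logic to Decision Model Views rather than to the Glossary alone.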

Once the team discovered the high-level process and decisions, I led them iteratively through populating the decision models, analyzing them for integrity, populating the project glossary, and updating the process model. We also frequently reviewed the business motivations with the SMEs to make sure that each decision’s logic aligned with its business motivations.

We iterated through the decision models and process model four times.  Each iteration improved the decision models, process model, and eliminated business logic that did not contribute to the business objectives.

Decision Conclusion Messages
A critical success factor for this project was that each decision provide messages explaining the reason for each conclusion and the steps needed to resolve any issues.  For example, if a decision determined that a transaction was not ready to move forward in the process, the rules engine needed to create a precise message explaining the issues that arose from the logic and exactly what needed to be done to fix each issue.  We also had to implement separate messages for internal and external (customer) use.

In traditional applications this functionality is complex, difficult to implement, and expensive.  The Decision Model greatly simplifies it because every Rule Family row suggests a message.  The business SMEs simply added messages to the Rule Family rows, and the rules engine, following the messaging methodology, automatically displayed all and only the messages relevant to a specific decision conclusion value.
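
As a rough illustration of this messaging approach, each Rule Family row can carry its own message, and the engine can return only the messages of the rows that actually fired.  The rule family, conditions, and message wordings below are hypothetical; in the project, the real logic and messages lived in the SMEs' Excel Rule Families.

```python
# Hypothetical Rule Family: each row pairs a condition with the message that
# row suggests. The engine displays all and only the messages of the rows
# that fired for a given transaction.
RULE_FAMILY = [
    (lambda t: t["amount"] > t["limit"],
     "Amount exceeds the approved limit; obtain a limit increase."),
    (lambda t: t["settlement_date"] is None,
     "Settlement date is missing; enter the scheduled date."),
]

def evaluate(transaction):
    """Return the decision conclusion plus the messages of the fired rows."""
    messages = [msg for condition, msg in RULE_FAMILY if condition(transaction)]
    conclusion = "Not Ready" if messages else "Ready"
    return conclusion, messages
```

In the project, the SMEs maintained these row-level messages directly in the Rule Family spreadsheets, with separate internal and customer wordings for each row.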

Business Logic Transparency and Agility
Another critical requirement of this project was business decision logic that is easy for all stakeholders to understand and possible to interpret in only one way.  The client had suffered through too many projects where this was not the case (as is the situation for most companies).  They no longer wanted business logic documented in a form (usually a textual statement) that stakeholders could not understand, or where the business logic was subject to interpretation and ultimately buried in the program code.

The Decision Model exceeded these expectations. 

Testing the Decision Models and Process
Many years ago, I was in charge of my first enterprise process improvement and reengineering project, in which I implemented business logic across 130 countries. I came to believe back then that fully testing the business logic would be impossible. During that project, I consulted an expert from a BPM vendor who told me it is impossible to test all business logic and that we would likely be able to test only a small percentage of it in my project. He also told me that he had never seen a project where more than a small percentage of the business logic was tested. The Decision Model completely changes this.

Using The Decision Model testing methodology and automating it, the team thoroughly tested all of the logic in a short period. We quickly identified and fixed the source of each test case failure. And, whenever the SMEs asked for business logic to be changed, we were able to test the change and perform regression testing in a matter of minutes.

I normally dread the test phase of a project, but The Decision Model made this task a pleasure. The Decision Model is a testing game changer!

Executive Review Sessions
During this project, I held several executive review sessions. The top executives needed to understand the decisions and the business logic in each decision because these executives are responsible for those business decisions complying with policy. These types of meetings are always challenging because the executives need to gain a thorough, shared understanding of the business logic and process in a very short period.

I start these meetings by taking a few minutes to review the process model and point out where each decision occurs in the process. I then review decision models that are of interest to an executive to give an understanding of the structure of each decision. This generates discussion, and I am able to drill into Rule Families whenever an executive has questions or needs clarification of the business logic. Using this approach, I can guide an executive quickly from a comprehensive, high-level view to a detail view and back with amazing clarity and shared understanding.

I used to dread executive meetings because of the pressure to get participants to quickly reach a thorough, shared understanding of a project. You can always expect tough questions. I now look forward to these meetings because The Decision Model always delivers. The Decision Model methodology has not let me down in any of these meetings. The executives are always able to grasp the model and the business logic quickly!

Technology Implementation
IT implemented the decision models and process by creating a rules engine using Java. They initially coded each decision manually (even with manual coding, The Decision Model saved a significant amount of time), but they later improved the engine by adding the ability to run decision models directly from the Rule Families (we built the Rule Families in Excel). To deploy a decision model, we placed its Rule Families in a predetermined directory, and the rules engine automatically read the Rule Families and executed the decision. Whenever we updated a decision model, there was no need for additional coding!
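
A minimal sketch of that deployment pattern might look like the following. This is not the client's Java engine; for illustration it assumes that each Rule Family is exported from Excel as a CSV whose columns are condition fact types plus a final Conclusion column, and every name in it is hypothetical.

```python
import csv
from pathlib import Path

def load_rule_families(directory):
    """Load every Rule Family CSV found in the deployment directory."""
    families = {}
    for path in Path(directory).glob("*.csv"):
        with open(path, newline="") as f:
            # Each CSV row becomes a dict of condition columns plus "Conclusion".
            families[path.stem] = list(csv.DictReader(f))
    return families

def execute(family_rows, facts):
    """Return the conclusion of the first row whose conditions all match."""
    for row in family_rows:
        conditions = {k: v for k, v in row.items() if k != "Conclusion"}
        if all(str(facts.get(k)) == v for k, v in conditions.items()):
            return row["Conclusion"]
    return None
```

Dropping an updated file into the directory redeploys the decision on the next run, with no recompilation, which mirrors the "no additional coding" behavior described above.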

IT also automated the running of the test cases. When we completed a decision model, we documented the test cases in Excel and put them in a directory. The application automatically read the test cases and produced a detailed report. This dramatically reduced the time to complete the initial and regression testing cycles.
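
The automated test-case runner can be sketched in the same spirit: documented test cases map input fact values to an expected conclusion, and the runner reports a pass or fail for each. The decision function and test cases below are hypothetical stand-ins for the project's spreadsheets.

```python
def run_test_cases(decision, test_cases):
    """Run documented test cases against a decision and report each result."""
    results = []
    for case in test_cases:
        actual = decision(case["inputs"])
        results.append({
            "name": case["name"],
            "expected": case["expected"],
            "actual": actual,
            "passed": actual == case["expected"],
        })
    return results

# Hypothetical decision under test: a transaction is ready for settlement
# only when no data quality issues remain open.
def settlement_ready(facts):
    return "Ready" if facts.get("open_issues", 0) == 0 else "Not Ready"

report = run_test_cases(settlement_ready, [
    {"name": "clean transaction", "inputs": {"open_issues": 0}, "expected": "Ready"},
    {"name": "one open issue",    "inputs": {"open_issues": 1}, "expected": "Not Ready"},
])
```

Because every Rule Family row implies a test case, a report like this can cover the logic exhaustively, which is what made minutes-long regression cycles possible.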

This decision modeling project was highly successful, just like all of the decision modeling projects in which I have participated. In these projects, the business owners, SMEs, and IT became very excited and involved because they quickly saw that they could accomplish goals previously believed to be impossible. Here are a few of the comments that I have heard:

Comment 1: Within the first week of a project, one business SME told me that she had been working for years to accomplish what this project accomplished in a few weeks. She told me she previously could not envision a method to accomplish her vision because she had seen so many failed projects. She went on to tell me that her excitement was because the business logic was easy to understand, could only be interpreted one way, and it was not going to get buried in the code.

Comment 2: Another business SME told me that this was the most exciting project he had worked on since he was hired. He told me that he was excited because all of the logic was visible and easy for everyone to understand. This project quickly accomplished real results where other projects seemed to drag on and on, only to be abandoned or not achieve the project goals.

Comment 3: Within the first two weeks of another project, one SME told me that she had learned and accomplished more in two weeks than she had done in a previous project that took six months and failed.

You can see why I love my job!

Author: David Pedersen is a Senior Decision Architect at Knowledge Partners International, LLC (KPI).  He has over 25 years of experience in finance, technology, and developing solutions for global infrastructure initiatives for industry and non-profit organizations.  Working with KPI clients, he leverages his diverse background and experience to implement BDM, BPM and Requirements solutions in a variety of industries. 

With over 3 years of experience in decision modeling, Mr. Pedersen is one of the most experienced decision modelers to date.  He has created decision models rich in complex and recursive calculations, containing DQ logic, and simplifying complex business processes.  His decision models include large ones, spanning many pages and having many Decision Model Views.  He has led tightly scoped time-boxed pilots as well as projects in which decision models solved challenges in business transformation and crisis.  As a leading authority on The Decision Model, Mr. Pedersen has contributed to its advancement including Decision Model Views, hidden fact types, data quality framework, and Decision Model messaging. 

Prior to KPI, he served as a Director at Ernst & Young, LLP, where he led the development, implementation and support of complex global infrastructure systems.  His work included Enterprise Architecture and the development and implementation of complex global business processes re-engineering/improvement initiatives that were among the firm’s top global priorities.

Mr. Pedersen is an author of many papers, a frequent contributing writer for the BPM Institute, BPTrends and a conference guest speaker. 

Article edited by Barbara von Halle and Larry Goldberg.



zarfman posted on Saturday, April 9, 2011 11:59 PM

Hi David:

You wrote: We reduced the processing of 200 “transactions with issues” from 90 hours to 3 minutes and 30 seconds. We reduced the processing time of “transactions with no issues” from an average of 30 hours to 3 minutes and 30 seconds.

Zarfman writes: Under the old system of processing transactions with issues, was human intervention required? Was the new system for processing transactions with issues completely automated using the decision model? What DBMS was used to store the approved transactions? How many transactions/second were you able to achieve for transactions without issues?


Alan posted on Tuesday, April 12, 2011 8:23 PM
Hi David

Thank you for that very good article.

I am working through 'The Decision Model' book, and the additional example of such a successful project was very helpful.

I was an auditor and computer auditor before I eventually progressed to becoming a BA, and I noticed that the data quality framework dimensions in your article are very similar to the Audit Internal Control Objectives we used 30 years ago.

For example:
- ACCURACY - Ensure all calculations are accurate and data obtained from the correct source.
- AUDIT TRAIL/TRACEABILITY - Ensure that can prove all source transactions are processed and their impact is reflected in end use summaries, and that the summary amounts in those systems can be traced back to source systems.
- AUTHORISATION - Ensure all transactions are the responsibility of the reporting entity and that only authorised parties are able to create, change and approve transactions and carry out other important activities.
- CLASSIFICATION - ensure apportionment and allocation of transactions is in accordance with available classifications and is consistently and correctly applied.
- COMPLETENESS - Prevent omission of relevant data from the system.
- TIMELINESS - Ensure that transactions and other changes are allocated to the correct period and that the transactions are recorded and completed within allowed timeframes.
- VALIDITY - Prevent fictitious, duplicate or non-existent transactions being introduced.

This is core knowledge that does not become old and useless, and I have usefully applied it to many projects. It is good to see it used outside of the audit area and that others also find it useful.

Many thanks

David Pedersen posted on Monday, May 23, 2011 10:41 AM
Hi Zarfman,

Under the old system, staff used existing systems (applications) to look up information, and then they made decisions manually. In the improved process, collecting the data to make decisions and the decisions themselves were automated, but correcting erroneous data remained manual. We could not automatically correct data due to system limitations, not Decision Model limitations.

Even though data correction remained manual, this process was greatly improved because the Decision Model conclusions provided users with the exact cause of the problem and precise instructions on how to fix them.

Best Regards,

zarfman posted on Monday, May 23, 2011 5:00 PM
Hi David:

Now I’m thoroughly confused, HELP!

You wrote: Even though data correction remained manual, this process was greatly improved because the Decision Model conclusions provided users with the exact cause of the problem and precise instructions on how to fix them.

Zarfman writes: Doesn’t someone have to construct the decision model? Even IBM’s Watson had to be constructed and tested before it could compete on Jeopardy. As far as I know someone has to discover the problem, someone has to determine a solution, and then wait some period of time to find out if the proposed solution worked.

Not that you don’t have anything else to do, but you forgot to answer the transactions/second part of my question.


David Pedersen posted on Tuesday, May 31, 2011 7:23 AM
Hello Zarfman,

I'm sorry that I confused you. In regards to your first question about constructing the Decision Model: the as-is process analysis, the new Decision Models, and the improved process were created, tested, and deployed in 10 weeks.

In regards to your second question about transactions per second: processing time for 200 transactions that did not have any issues went from 30 hours to 3 minutes and 30 seconds (approximately 1 transaction per second). My client on this project chose to implement the Decision Models with a simple home-grown rules engine built in Java. Had they used an industrial-strength rules engine such as iLog or Blaze, I'm sure the throughput could have been higher, but the technology that they chose more than met their needs.

I hope I have answered your questions.

Best Regards
randycole posted on Tuesday, June 14, 2011 12:05 PM
Hi David.

I found your article very interesting coming at it from a QA background. I have been looking for a process and tool to model data quality.

I have a question: after abstracting the decision logic from the process, how did improving the business decision logic map to improvements in the application logic, and how was alignment with the business rules verified?

Also, how did you ensure business logic was validated and thoroughly tested and automated using the decision model testing methodology?

I am interested in the rules engine and how that was used to run decision models.
Do you have more details on any of these questions or suggestions on good resources for further investigation?

Thanks much.

Randy C
David Pedersen posted on Wednesday, July 27, 2011 12:11 PM
Hello Randy,

You have some very good questions. My answers to each of your questions follow:

RE: How did improving Business Decision logic map to improvement in application logic:

The Decision Model improved the application logic because it follows fifteen principles, is technology independent, and can be interpreted by all stakeholders in only one way. For the business, this means that each piece of business logic belongs in only one place, and they have complete confidence that their business logic is error free, does not contain duplicate or overlapping logic, and aligns with the business intent. It also enables the business to accurately assess the impact of business logic changes.

For IT this means that they no longer have to interpret the business logic as written in requirements, which reduces implementation time and rework.

RE: Business logic verification:

The business SMEs provided all of the input to build the Decision Models in this project, and they validated and signed-off all of them. I have found, from practice, that business SMEs love the Decision Model because it is easy for them to understand, can be interpreted in only one way, and it gives them control of their business. Business SMEs like the Decision Model so much that they often become certified modelers.

RE: Testing Decision Models:

By examining a Rule Family, you can see that test cases can be built from it directly. The Decision Model methodology tests Rule Families, Decisions, and the process that consumes Decisions. In this project, the client's rules engine directly read the Rule Family and Decision Model test cases from spreadsheets and produced a report showing the results of each test case. We had 100% confidence that the business logic ran correctly as specified in the Decision Models that the business SMEs approved.

RE: Rules engine and how that was used to run Decision Models:

In this project, my client chose to build a simple rules engine that reads the Decision Models documented in Excel files and then automatically runs them. This was a big advantage for this project because the business SMEs could update the logic in a Decision and run it within minutes. Other clients have implemented a variety of methods, ranging from coding the Decisions directly from their Decision Models to storing them in a repository and creating applications that automatically download the Decision Models into a rules engine such as iLog or Blaze. In the case of the latter solution, they created Decision Model governance and sign-off processes ensuring that only approved Decision Models are moved to production.

RE: Suggestions on resources for further investigation:

You can find more information on The Decision Model and the success that other companies are experiencing on KPI's web site. KPI also offers webinars, Lunch and Learn meetings, training, certification, and engagements to rapidly enable companies to implement The Decision Model.

I hope I have answered all of your questions. I apologize for not responding to your questions sooner but, for an unknown reason, the Modern Analyst site did not notify me that you entered a comment.

Best Regards,



Copyright 2006-2024 by Modern Analyst Media LLC