The subject of linking Enterprise Architecture efforts and Master Data Management with Business Process Management recurs frequently. Let me just say that this direction is an absolute necessity. One approach is theoretical top-down modeling as employed, for example, by the DoDAF concepts, the US Department of Defense standards efforts; the other uses the free market forces of software vendors acquiring other vendors to integrate functionality. There is some fuzziness in the definitions of Enterprise and Business Architecture, but EA is usually more IT-oriented and contains details on applications and infrastructure. Because I am concerned with a real-time link to the Business Architecture as the input to process goals, I prefer a third, consolidated approach, in which the business (capability) model entities are defined as structures of the IT execution environment.
The first approach is well exemplified by the efforts of Michael zur Mühlen (MzM) in his work to integrate architectures and business processes around the DoDAF models and BPMN. I recognize much of it in the work he did for SAP and the ASAP 7 methodology. It is an all-encompassing and very theoretical approach that will fit government organizations that plan and operate slowly. I am not sure whether a large, dynamic enterprise could survive such a rigid structure. I would like to hear your opinion on this. Please comment.
Sandy Kemsley has given a great overview of MzM’s approach on her blog, and it was there that I easily found the strong similarities and subtle differences to my approach. The differences can also be seen as substantial, depending on your viewpoint. In short, while we model the same things, I don’t work top-down but open all definitions up to the business hierarchy and the customers in real time, for transparency.
The common approach is to place the Business Architecture or Model design on the strategic planning level and have Master Data Management as a design-time tool on the IT execution side. That produces the very disconnect that is the problem. It also disconnects the process from strategy and targets. To make the planning capabilities available to the business, I propose a business goal-driven approach with the same capabilities, activities, resources and performers (CARP for MzM) as the architectural base for process management. This is clearly visible in the FIVE ELEMENTS of ACM, which are available to the business for interaction and not just to business and process designers. I propose to also design and adapt the business model in the executable IT environment.
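To make the idea concrete, here is a minimal sketch of what it means to define capabilities, activities, resources and performers (CARP) as entities of the execution environment rather than as a separate design-time model. All class and instance names are hypothetical illustrations, not any vendor's actual API:

```python
from dataclasses import dataclass

# Hypothetical sketch: CARP entities live in the executable environment,
# so the business can inspect and adapt them directly at runtime.

@dataclass(frozen=True)
class Performer:
    role: str

@dataclass
class Activity:
    name: str
    performer: Performer

@dataclass
class Capability:
    name: str
    activities: list

# The business model entity is a runtime structure, not a diagram:
capability = Capability(
    "Handle customer claims",
    [Activity("Assess damage", Performer("Adjuster"))],
)
print(capability.activities[0].performer.role)  # Adjuster
```

The point of the sketch is only that the same objects the business planner defines are the ones the process engine executes against, so there is no translation step between architecture and execution.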
Like Michael zur Mühlen, I came to the conclusion that the modeling requires a business (organizational) perspective more than a BPMN perspective, which is why he decided to disallow a number of BPMN features and focus on milestones, handoffs, decisions and procedures. I propose the same and took it one step further. Rather than just using milestones to structure process phases, I decided to use GOALS as the main structuring concept of a process or case. A goal not only structures the process; it also directly defines the rationale, the WHY, so there is no need for later process monitoring. It is right there in the process, visible to everyone. Rather than handoffs, we use SUBGOALS to keep the process paths asynchronous. The big advantage is that these are event-driven and can simply be used across process/case boundaries as well. There is no longer a happy path, but simply fulfilled or unfulfilled goals, with the opportunity of alternative goals defining the happy path on the fly. All real-world processes are event-driven, not flows, and therefore an event should never throw an exception but simply be dealt with through a new subgoal. At decision points, users input path-relevant data, decide to add a new goal to the process, or cancel one that is there. When decision points are linked to customer-outcome-relevant processes, such a point becomes, in my terminology, a leverage point that directly influences business results.
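The goal/subgoal idea can be sketched in a few lines. This is a minimal illustration under my own assumptions (all names are hypothetical): a case is structured by goals, an unexpected event adds a subgoal at runtime instead of raising an exception, and completion means all goals are fulfilled rather than a happy path having been followed:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A goal structures a case and carries its own rationale (the WHY)."""
    name: str
    rationale: str
    fulfilled: bool = False
    subgoals: list = field(default_factory=list)

    def add_subgoal(self, goal: "Goal") -> None:
        # Events add subgoals at runtime; no exception is thrown
        # for a path the designer did not foresee.
        self.subgoals.append(goal)

    def fulfill(self) -> None:
        self.fulfilled = True

    def is_complete(self) -> bool:
        # There is no happy path: a goal is complete only when it
        # and all of its subgoals are fulfilled.
        return self.fulfilled and all(g.is_complete() for g in self.subgoals)

# Usage: an event during execution adds a new subgoal to a running case.
case_goal = Goal("Resolve claim", "Customer outcome: claim settled")
coverage = Goal("Verify coverage", "Policy must cover the incident")
case_goal.add_subgoal(coverage)
case_goal.fulfill()
print(case_goal.is_complete())  # False: the subgoal is still unfulfilled
coverage.fulfill()
print(case_goal.is_complete())  # True
```

Because subgoals are plain event-attached objects rather than edges in a flowchart, the same mechanism works across case boundaries.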
In terms of the product-merging approach, Forrester’s Clay Richardson highlights the relationship between MDM and BPM by referencing Software AG’s acquisition of Data Foundations. Clay is absolutely right about this requirement too, but I wonder about the suggested strength of the combination. Clay references a few more vendors who have BPM products and acquired MDM capability to strengthen his reasoning. No need: the benefits, if it works, are compelling. But does it work? Can a vendor simply plug an MDM tool into another product? I know from experience that there is no easy way to simply merge two products that were built by different labs and have their own distinctive customer bases. OneData MDM has its own Model, Acquisition/Import, Create, Maintain, Distribute, Data Quality, Process Flow, Survivorship, Security, Governance, Workflow, Change Management, Collaboration, Business Rules, Messaging, SOA, Reporting, and Auditing. It will be interesting to see the integration. I prefer consolidation around a common executable model.
While it is great for the developer to have access to an MDM during development, what happens when someone changes anything in the MDM? How are changes migrated to all the products and applications that use compiled code? So a ‘design-time-only’ MDM is only part of the story. The business architecture and processes may be in ARIS with a link to OneData MDM, and there goes your promised agility. Any change has to go through the ARIS toolkit and the complete BPM bureaucracy plus the MDM, because ARIS doesn’t execute but just manages and measures definitions and process resources. Development, deployment and execution all have to be done and controlled someplace else.
Even if you have a central MDM piece serving multiple SILOS, I haven’t seen one yet that allows a change in the MDM definition to translate automatically into changes in all the ECM, CRM and BPM fragments: their DB tables, content definitions, program calls, business rules, processes and GUIs. Maybe once you rewrite everything, such a solution might be ready. Only a consolidated solution can do such a focused deployment, by dynamically reading and interpreting the MDM repository at runtime. Let me reiterate that the MDM should also hold the data (object) definitions for the business architecture and enable their link to process management!
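To illustrate what "reading and interpreting the MDM repository at runtime" means in contrast to compiled-in definitions, here is a deliberately simplified sketch. The in-memory dictionary standing in for the MDM repository and all function names are my own hypothetical constructions:

```python
# Hypothetical sketch: the application interprets entity definitions from
# the MDM repository at runtime instead of compiling them into code, so a
# definition change takes effect without redeploying anything.

mdm_repository = {
    "Customer": {"fields": {"name": "str", "segment": "str"}},
}

def load_entity(entity: str) -> dict:
    # In a consolidated system this would query the live MDM repository.
    return mdm_repository[entity]

def validate(entity: str, record: dict) -> bool:
    # Validation is driven by the current definition, not by generated code.
    definition = load_entity(entity)
    return set(record) == set(definition["fields"])

print(validate("Customer", {"name": "Ada", "segment": "retail"}))  # True

# A change to the MDM definition is picked up immediately, with no
# recompilation or redeployment of the consuming application:
mdm_repository["Customer"]["fields"]["region"] = "str"
print(validate("Customer", {"name": "Ada", "segment": "retail"}))  # False
```

In a design-time-only MDM, the same change would instead have to be propagated by hand into every silo's tables, rules and GUIs.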
Conclusion: All these really brilliant people got it right. I am in total agreement with Michael zur Mühlen, as well as Sandy Kemsley and Clay Richardson (I finally did meet both in Washington), and all those others who see process not simply as flowcharts but want to connect it to business strategy and architecture efforts. YES, as directly as possible, please. But given the current lack of software consolidation, they still need to propose a large BPM bureaucracy that is supposedly a sign of process maturity. Common sense says that processes are mature when business users can achieve a customer outcome without needing weeks to months of process design and implementation. Humans are mature when they do their own thing, not when they need to be told continuously what to do. Hence, once you can get rid of the BPM bureaucracy, your processes might really be mature.