A lot has been written about process maturity. The different approaches proposed share the ‘Optimized Stage’ as their highest rating of maturity, where continuous process improvement is enabled by feedback and by applying innovative ideas in an agile manner. Sounds good, so what is the point of process maturity assessments? In my mind nothing other than to sell consultancy services. Assessing process maturity is a pointless exercise that only adds more bureaucracy to an already deficient approach. How can one start doing process management without making goal orientation and continuous improvement the starting point? Much worse than poorly defined processes are processes that cannot be optimized – regardless of the reason. Optimization is why we do process management in the first place. Allow me to use the Higgs boson to guide you to my proposal for why there is no need for expensive and slow process governance if you do it right.
Processes do not exist
BPM experts promote the idea that processes have an independent existence (somehow disconnected from the real world) and follow odd rules that they seemingly make up when they are bored. First of all – in the most natural sense – a process is defined as no more than an observed sequence of (inter)actions that as a whole produce some outcome. The idea of making this observed process mandatory to achieve the same outcome over and over again only works in controlled environments such as factories or laboratories. It is not applicable in the real world of complex adaptive systems. Process flows are an over-simplification by ‘experts’ who do not understand the system they are messing with. Because processes are event/action patterns and not rigid flows, I applied for a pattern-driven process mining patent in 1997, which was recently awarded. But one cannot even start observing patterns effectively without a priori modeling. Yes, patterns can be seen without meaningful models, but then we need a long time to identify meaning – much as a child learns over many years. One cannot model a flow of activity without the relevant capabilities (business objectives), which first need data or content models to make any sense. The illusion truly lies in assuming predictability and controllability of a standalone sequence of actions – in my mind a consequence of human arrogance and pseudo-expertise!
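To make the pattern-versus-flow distinction concrete, here is a minimal sketch of mining recurring event/action patterns from an event log. This is my own toy illustration of the general idea, not the patented method or any particular process-mining product; the event names and thresholds are hypothetical.

```python
from collections import Counter

def mine_patterns(traces, n=2, min_support=2):
    """Count n-step action patterns across observed traces.

    A 'pattern' here is simply a recurring subsequence of actions,
    not a prescribed flow: the same outcome can be reached via
    different patterns in different cases.
    """
    counts = Counter()
    for trace in traces:
        for i in range(len(trace) - n + 1):
            counts[tuple(trace[i:i + n])] += 1
    # Keep only patterns seen often enough to carry meaning.
    return {p: c for p, c in counts.items() if c >= min_support}

# Hypothetical event log: every trace reaches the same outcome,
# but the action sequences vary between cases.
log = [
    ["receive", "check", "approve", "ship"],
    ["receive", "approve", "ship"],
    ["receive", "check", "check", "approve", "ship"],
]
print(mine_patterns(log))
```

Note that a rigid flow diagram would force all three traces into one sequence; the pattern view keeps the variation visible and only reports what actually recurs.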
The prediction of the Higgs Boson in the Standard Model
I have used quantum-physical similes to explain the above before, and I would like to add another one here in honor of those who measured the Higgs mechanism on July 4th at the LHC. The Higgs boson has not been ‘discovered’ or ‘found’ as many say; rather, an energy signature was measured after several hundred trillion particle collisions that can be interpreted to fit the predictions of the Standard Model for the Higgs boson with acceptable probability. In the real world the same process is executed hundreds of trillions of times to smash protons into each other, and only a few thousand times can we identify patterns that we can ‘probably approximately’ match with the model. That is the real world of processes, and human interactions are no different. While the model requires (predicts?) the Higgs mechanism to be valid, it cannot predict the outcome of processes!
A real-world process is defined by nothing other than some starting condition or event that causes (inter)actions leading to some outcome (more about gauge theory later).
How does the above relate to process models?
There are many models one can create related to processes. The most common – and at the same time most misused – is the flow diagram. Without a complete model of why and how people and resources interact, and toward what end, a flow diagram is utterly pointless and meaningless. Without a business data model one cannot even observe the patterns that so many now rave about without knowing what they really mean. And to make it all work, the models used for observation and analysis have to be aligned with the one used for execution!
Ideally, a process definition to be used by people contains an explanation of why the outcome is desirable. In the real world a process cannot be achieved by declaring a work sequence, because the starting and working conditions will be chaotic. The process must thus be defined by the knowledge required to achieve its goals. Knowledge is not data, not information, and not contained in a predefined sequence. Knowledge lives in the heads of the performers (as experience patterns) who interpret the resources. Contextual business data, statistical information (Big Data?) and work instructions are all process resources, not knowledge.
It thus makes no sense to start a process management effort without explicitly defining what the goals, outcomes and targets actually are, because otherwise no one will know if the process being designed or already executed makes any sense. But once you define them, they have to be measured and made transparent to everyone. Be aware that no BPM system does that today. Yes, some BPM suites have a monitoring component, but what they measure is service levels, not process outcomes.
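The outcome-versus-service-level distinction can be sketched in a few lines. This is a hypothetical illustration with invented names (`ProcessGoal`, a retention target of 0.9), not the API of any BPM product: the point is simply that what gets recorded and checked is the business outcome the process exists for, not a response time.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessGoal:
    """A hypothetical goal definition: an outcome target, not a service level."""
    name: str
    target: float                       # desired outcome, e.g. 0.9 retention rate
    outcomes: list = field(default_factory=list)

    def record(self, value):
        """Record one measured process outcome."""
        self.outcomes.append(value)

    def achieved(self):
        """Transparent check: do measured outcomes meet the target on average?"""
        return bool(self.outcomes) and sum(self.outcomes) / len(self.outcomes) >= self.target

# A service-level monitor would track response times; here we track
# the outcome itself and make the verdict visible to everyone.
goal = ProcessGoal("customer retention", target=0.9)
goal.record(0.92)
goal.record(0.95)
print(goal.achieved())  # → True
```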
To be able to do so, the business has to define the capabilities in the value stream, the related targets and performance indicators, and where in the organization the process owners will take responsibility. This is called a Business Architecture. It is actually irrelevant how the organizational structure relates to the value stream and capabilities. By defining end-to-end value stream processes this way, the goals and targets of each milestone become the handover definitions between organizationally disconnected capabilities. Process owners who perform these processes are empowered to do them in any way they see fit, as long as goals and targets are achieved. BPM vendors now use the term ‘empowerment’ when they let performers and customers participate in a rigid process, for example via mobile or web. That is once again a very far-fetched use of the term, just like ‘agile’ or ‘adaptive’ for a complex governance bureaucracy.
Maturity implies self-sustaining independence for people
A company has process maturity when it no longer needs a huge bureaucracy to achieve its goals. A nanny-state government that writes laws about how to wipe your behind is not empowering its citizens but treating them like idiots. More rules and regulations do not make life and processes simpler just because people no longer have to think; overall they make everything more complex. Compliance has become a major complexity problem for each and every corporation. Defining ever more rigid processes and rules makes it nearly impossible for people to use their knowledge and experience for fear of violating them. Any form of process rigidity kills the ability of people to engage the customer on a one-to-one basis, and it also kills the creativity that drives innovation at any level. We now live in a world in which knowing the rules (i.e. being a lawyer) is considered a mark of intelligence! Is this really where we want to go?
So starting to do BPM any other way than by defining a top-down Business Architecture (which can be done for a single end-to-end process) and using a goal-oriented, adaptive approach is doomed to fail. As much as you will hate to hear it, if companies want to achieve process maturity, the first step is to get rid of orthodox BPM software that lacks the embedded architecture capability needed for continuous improvement.
The Limits of Model Theory predictions (including processes)
So what can the measurement of the Higgs mechanism teach us? The only way we are currently able to predict anything in quantum physics is by applying so-called gauge theories: an action integral characterizes ‘allowable’ physical situations according to the principle of least action. The observed system does not follow a single path whose action is stationary; rather, the behavior of the system depends on all permitted paths and the value of their action. The action corresponding to the various paths is used to calculate the path integral, which gives the probability amplitudes of the various outcomes.
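In standard textbook notation (added here for reference, not from the original text), the path integral sums a phase over every permitted path between two configurations, weighted by that path's action S:

```latex
\langle x_b, t_b \,|\, x_a, t_a \rangle
  = \int \mathcal{D}[x(t)]\; e^{\, i S[x]/\hbar},
\qquad
S[x] = \int_{t_a}^{t_b} L\bigl(x, \dot{x}, t\bigr)\, dt
```

Paths near the stationary (least) action interfere constructively, which is why the classical path dominates the amplitude without ever being the only path taken.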
In layman's terms, this means that even at the lowest level of our universe there are no predictable processes! We find the most probable outcome by integrating over all possible ones. We cannot predict what the outcome will be, but the principle of least action will take care of performing anything with the least expenditure of energy.
While we discovered in the 17th century that “light travels between two given points along the path of shortest time,” known as Fermat’s principle, our model theories do not explain how a photon/light wave knows which path to take to reach its target in the least amount of time (not in a straight line!). It just does!
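Fermat's principle can be demonstrated numerically. In this toy setup (my own assumed geometry and speeds, not from the original text), light travels from a point in a fast medium to a point in a slow medium; minimizing travel time over all crossing points makes the path bend at the interface, and Snell's law emerges from the minimization rather than being put in by hand:

```python
# Light goes from A = (0, 1) in medium 1 (speed v1) to B = (1, -1) in
# medium 2 (speed v2), crossing the interface y = 0 at some point (x, 0).
from math import hypot, sin, atan2

v1, v2 = 1.0, 0.5          # light is slower in medium 2

def travel_time(x):
    """Total travel time for the path that crosses the interface at x."""
    return hypot(x, 1.0) / v1 + hypot(1.0 - x, 1.0) / v2

# Brute-force search for the crossing point of least travel time.
best_x = min((i / 10000 for i in range(10001)), key=travel_time)

# Snell's law should emerge: sin(theta1)/v1 == sin(theta2)/v2.
theta1 = atan2(best_x, 1.0)        # angle of incidence
theta2 = atan2(1.0 - best_x, 1.0)  # angle of refraction
print(best_x, sin(theta1) / v1 - sin(theta2) / v2)
```

The straight line from A to B would cross the interface at x = 0.5; the least-time path crosses at roughly x ≈ 0.7, spending more of its length in the fast medium – the path of shortest time is not the path of shortest distance.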
I propose to empower people to allow them to follow Fermat’s principle as described by the words of Pierre Louis Maupertuis: “The laws of movement and of rest deduced from this principle being precisely the same as those observed in nature, we can admire the application of it to all phenomena. The movement of animals, the vegetative growth of plants … are only its consequences; and the spectacle of the universe becomes so much the grander, so much more beautiful, the worthier of its Author, when one knows that a small number of laws, most wisely established, suffice for all movements (and thus processes – MJP).”