Naive Intervention – Part 3: Illusions of Predictability in Investment Theory and BPM
In part 1 of this series I discussed the concept of Naive Intervention as a response to the purely human need for causal narratives, even though no such narratives exist outside our brains. In part 2 I discussed the priority of survival over efficiency in our intuitive responses to events.
In part 3, I want to conclude my essay by discussing the similarities between the illusion of managing a business with processes and illusory investment theories. In both cases intellectuals use claims of unproven, hypothetical benefits to justify acts of naive intervention.
Let me add once more that I am not referring to process management in manufacturing, where the benefits of repeatable and solid processes are obvious, but to industries where service and customer interaction are the product. A lot can be learned from manufacturing, but you can’t turn the people who service your customers into robots or monkeys.
Less information about the more important thing makes decisions safer.
Investment theories use mathematical models to predict, for example, the future value of a stock, broadly based on the Efficient Market Hypothesis. I argued in parts 1 and 2 that it is arrogant to pretend that the vagaries of the business world and the economy can be predicted by a few formulas. Similarly, Business Process Management uses symbols to express a model of how processes should provide a certain value in the future. Apart from a functional syntax, there is not even a definition of what these symbols truly mean. They express a logic that does not exist in business interactions outside process management theory. It is an unproven assumption that BPMN models can actually express business interactions or, more importantly, represent the value a business wants to deliver. The appealing simplicity of process graphs is understandable, but their use is actually naive.
As a further point I propose that deciding in a larger business ‘which processes to optimize, and how’ is similar to deciding which stocks to invest in on the stock market. Why? Because they follow the same investment principles and require a similar risk assessment, and therefore they are exposed to similar fallacies. You buy a stock at the current market price and take the risk of it going up or down. You might buy a future from a counterparty who guarantees you a stock at a certain price at some future date. Both parties take a risk in doing so, and that risk estimate defines the risk premium. You ‘buy’ a better process through its implementation and maintenance cost. In return it provides reduced costs or improved value. There is a slight difference in that you can’t sell the process to someone else, though some process outsourcers actually try to do so. The main return on a stock is not its dividend but the price difference between purchase and sale; in a future-value consideration that does not matter. It is the return that counts, regardless of how it is achieved. The risk is the uncertainty about how much return the ‘process stock’ I invested in will deliver. With processes I am even less sure what the purchase price is, as the implementation costs are upfront purely assumptions, as are the expected returns in terms of savings or improvements. More information about past process implementations does not help me at all in that consideration.
I thus suggest that a BPM process implementation could likewise be measured by an ALPHA (how much better you are doing than the average) and a BETA (the average return of the market). Alpha makes BPM worthwhile; beta doesn’t. But how can we calculate these numbers in terms of the return of a process? It is really difficult. For that simple reason the main selling point of BPM in the past has been cost reduction. Firing people is a simplistic short-term cost benefit that any idiot can understand. But that does not automatically turn a BPM investment into long-term benefits, and it does not make it a competitive measure. Cost reduction is a naive intervention performed by clueless management. An important point is that there are limits to cost reduction. There are only three reasons for the suggested increase in efficiency in industrial production, and they are not related to managerial skill or BPM: miniaturization in electronics, automation in manufacturing, and outsourcing to cheap third-world countries. These drivers are running out of room to move. In the long run it is much harder to make money by lowering cost than by spending money to improve quality.
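To make the alpha/beta analogy concrete, here is a minimal sketch using the simplified definitions above (beta as the average return of comparable investments, alpha as the excess over that average). All names and figures are hypothetical, purely for illustration:

```python
# Illustrative sketch of the alpha/beta analogy for process investments.
# beta = the average return delivered by comparable investments,
# alpha = how much better a given process investment does than that average.
# All figures below are hypothetical.

def alpha_beta(process_return, peer_returns):
    beta = sum(peer_returns) / len(peer_returns)   # average peer/market return
    alpha = process_return - beta                  # excess return over average
    return alpha, beta

# A process returning 12% while comparable investments average 7%:
alpha, beta = alpha_beta(0.12, [0.05, 0.07, 0.09])
print(f"alpha = {alpha:.2f}, beta = {beta:.2f}")   # alpha = 0.05, beta = 0.07
```

The catch the text points out applies here directly: for a real process, neither `process_return` nor the peer returns are known numbers; they are upfront assumptions.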
Apple has driven up perceived value and changed the world.
Others merely drove down cost in a spiral to extinction. Apple made customers pay a pleasure premium over the cost of the product simply because they offered a unique emotional benefit. The problem is that the pleasure is not a simple measurable quantity at all, and neither is displeasure from a cheaper, but lower quality product. BPM should thus be about improving perceived quality and not about reducing costs.
Because customer-perceived value cannot be determined by a model or guaranteed by certain processes, all decisions in this arena must be based on intuition and not on probabilistic prediction. It cannot take you out of BETA territory. So the question is: how could we capture reality and make things better by responding to the real-time perception of value by the customer? Quality, as current BPM methodology defines it, is conformance to some abstract, usually measurable spec of a deliverable, when the real thing is solely the emotional reaction of a real person! Therefore we simply can’t predict or measure the true outcome of processes. We thus need to focus on creating enough potential for perceived value.
The prediction that a particular process will instill a certain perceived value reeks of ignorance. Even probabilistic events such as coin tosses in the physical world are influenced by chains of uncertainties. By the law of large numbers, coin tosses are utterly predictable on average, but that won’t tell you what the next toss will show. Customers are further influenced by emotional interactions with people rather than by platonic, theoretical perfection. In the real world, where things are chaotic, small variations in starting conditions produce substantial variation in the causal chain. Probability chains in human interactions can’t be calculated at all. An emotional, intuitive response to a customer is the only real-world means to influence perceived value and thus true outcome. Only people who care are able to deliver such value. Process management’s ONLY real-world benefit can be achieved by improving how people interact effectively, including how to make that interaction more efficient by not losing incidental information or missing goals. A hard-coded flow doesn’t do that.
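The coin-toss point can be shown in a few lines of simulation: the long-run average is utterly predictable, while the next individual toss is not. The seed is fixed only to make the illustration reproducible:

```python
import random

random.seed(1)  # reproducible illustration

tosses = [random.randint(0, 1) for _ in range(100_000)]
average = sum(tosses) / len(tosses)

# By the law of large numbers the average converges towards 0.5 ...
print(f"average of 100,000 tosses: {average:.3f}")

# ... but the average tells us nothing about the next individual toss:
next_toss = random.randint(0, 1)
print("next toss:", "heads" if next_toss else "tails")
```

Replace "toss" with "customer reaction to a process outcome" and the analogy in the text follows: the distribution is knowable, the individual case is not.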
There is however no point in asking people what they want, where they want to go or where they will be tomorrow. Like you, they don’t know. They only know what they want once they have had the experience of getting it. Even if there is a statistical distribution of people reacting to a physical outcome with different emotional perceptions, that, as with coin tosses, in no way predicts an individual reaction. Which simply means that at best ALL your process management efforts will deliver no more than the BETA: you will never go beyond average. There is no potential upside. You are wasting your money. Simply do nothing. It’s cheaper.
BPM experts use the same approach as economists – simply ignore complexity!
The standard process flow assumes that only minor variation around the mean will take place in the future. The radically naive assumption of BPM is that future variations in process outcome will diffuse at the same rate as past observed variations, so that the variation in perceived value is (as in stock markets) proportional to the square root of time.
Variations in processes must however be considered in three ways: variation of the outcome over time, variation across different processes, and third, the unpredictable outside influences that move the outcomes of all variations away from their starting values. From what I have seen, the consequences of this are not understood by most BPM experts. They do not even consider it but assume that a governance structure will take care of those problems. But as I pointed out previously, governance does not actually align the process with changes in context; it just demands more governance to enforce previously defined processes. The common practice of standardizing processes is proof of that: it assumes that if the principal outcome is the same, then clearly the process and the cost leading to it should be the same. The standard process is the holy grail.
That one can come to the same endpoint from many starting points and through different paths produces a huge potential gap between the true cost and the perceived value across all process variations and all process starting points. The problem is that these processes work with implied parameters, meaning that some common performance indicator is construed to be relevant across quite disparate processes. The optimal process must be the cheapest one, and therefore, if we apply it across all customers and across the whole business, it must also be good for the whole business and good for the future. That in effect is idiocy!
Just because you averaged out indicators does not mean that all your processes will perform at the average. Cutting cost this way carries the risk that some processes may be quite out of line while your numbers still look good on average. There are no risk distribution profiles you could use. Some processes may experience huge volatility in outcome over time, and some may simply not work at all. Your models won’t tell you, and the glorious BPM dashboards are no more than fairytales. The only one who can tell you is your customer! If you now say that you are doing customer satisfaction surveys, these may again give you an averaged-out indication much too late, but they won’t tell you which part of which process is the culprit. Big-Data-like analysis of processes just produces more statistical noise about less important numbers. The problem is that a single bad experience in your customer’s perception can wipe out years of great process performance. Emotions do not average out over the years and across processes. They are REAL and NOW, and they can make the customer switch in an instant.
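How a good average can hide a broken process is easy to demonstrate: two portfolios of processes with identical mean performance, one of which contains a process that barely works at all. The satisfaction scores below are invented for illustration:

```python
from statistics import mean, stdev

# Hypothetical customer-satisfaction scores (0-100) for five processes each:
steady   = [78, 80, 82, 79, 81]
volatile = [98, 97, 99, 96, 10]   # one process is failing badly

# Both portfolios average exactly 80, yet one hides a disaster:
print(mean(steady), mean(volatile))    # both 80
print(stdev(steady), stdev(volatile))  # the spread tells a different story
```

A dashboard reporting only the mean would rate both portfolios identically; only the customer on the receiving end of the failing process knows the difference.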
Antifragile – A favorable asymmetry of winning options!
Antifragility (allowing gain from disorder) applied to processes creates more options to win than to lose, a favorable asymmetry that no amount of risk assessment or future-value assumption can beat. It basically means playing it safe in the large and allowing for potential upside in the small, not vice versa by cutting costs across all processes. This is not the same as trying to standardize 80% of a process, as you still have the same problem within that 80%, while the other 20% have no upside potential at all and remain part of the average. It is more important not to lose that one customer (the first step towards losing your business) than to have a large cost-saving potential that could, however, bankrupt you. If you choose not to service a particular customer segment, that is a different thing altogether. Becoming efficient is secondary to becoming effective or safe! If you choose a process that offers more options, it is more likely to satisfy the customer even if you are less sure about the future costs and outcomes. More options mean more antifragility and more potential for upside. As long as we maintain a bottom line of process performance we keep all customers reasonably happy, and where possible we excel!
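The favorable asymmetry can be sketched as a payoff comparison: an optional process whose downside is capped (you can always abandon the option) against a standardized process that is symmetrically exposed to whatever disorder delivers. The payoff functions and the shock distribution below are hypothetical:

```python
import random

random.seed(42)  # reproducible illustration

def optional_process(shock):
    """Capped downside: we can abandon the option, so we lose at most 1."""
    return max(-1, shock)

def standardized_process(shock):
    """Symmetric exposure: takes whatever disorder delivers."""
    return shock

# Random shocks standing in for disorder in the business environment:
shocks = [random.gauss(0, 5) for _ in range(100_000)]

print(sum(map(optional_process, shocks)) / len(shocks))      # clearly positive
print(sum(map(standardized_process, shocks)) / len(shocks))  # close to zero
```

Under the same disorder, the capped-downside payoff has a positive expectation while the symmetric one averages out to nothing, which is the sense in which more optionality means more antifragility.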
The same is true for the staff who perform these processes. Their caring attitude is the only thing standing between a customer staying or leaving. This age of highly educated individuals demands a change in thinking. We must acknowledge that they collaborate as social networks of autonomous individuals. Studies show that only security, autonomy and appreciation keep them in their jobs. Managing a business by individual performance is utterly futile. Averaging out these individual performance numbers tells you nothing about how your business is doing. It is the people network that makes individuals produce value, not a process straitjacket. The quality of their outcomes is proportional to the quality of the relationships they entertain. An executive does not manage individuals; he is just a node in a network of people with some stronger ties. It is well understood that it is the weak ties (Granovetter) that make the network tick!
The difference between humans and animals lies in the ability to thoughtfully collaborate. Purposeful collaboration (= business) has an explosive upside, an additive capability that leads to the evolutionary emergence of new things. All you need to do is create collaborative environments such as the Apple App Store, which connects developers (musicians) and users (audience), or a system like ACM that empowers the various departments and their internal and external customers to collaborate freely. Your processes become antifragile: they benefit from the disorder!
Let me close with a summary of various snippets and conclusions from ‘Antifragile’:
MBAs love strategic planning, but there is no evidence that it works; rather the opposite. Don’t invest in business plans but in people. You don’t need a plan, but goals and an environment that enables people to collaborate towards them.
Everything theoretical in business and economics has been exposed as pseudo-science. Absence of evidence is not evidence of absence, meaning that even if you can’t see it, it can still be there. Good news tends to be absent from past data, but that does not make it bad news. Empirical evidence therefore misses positives and underestimates overall benefits, leading to the conclusion that something must be done. There is little recorded evidence of the good things that came from doing nothing.
The true question for a better future is not what we must add but what we can remove from our over-technologized world. Make it simpler, not by adding rules but by removing complexity. For cooking we still use the same tools as 300 years ago, just slightly improved to be non-stick or easier to clean. The iPad and most tablet computers are so appealing because they do not require technological knowledge to be used; they mostly remind us of how we worked before we started using computers. A two-year-old can use them without being able to read or needing to be taught. Likewise, I propose that ACM is a return from the rigidity of BPM to the simplicity of people collaborating with content in the context of a case folder. There were no processes, but each performer basically knew what to do with each piece of content. ACM further adds guidance, context and auditing. While simplification is good, BPM flow diagrams are an oversimplification.
The governance required to do BPM does not simplify; it complicates by means of rules. The greatest hindrances to developing human capital are, per E. O. Wilson, the soccer mom (enforcing structure that keeps kids from experimenting and adventuring) and formal education or HR programs. According to Nietzsche, not all that is unintelligible to humans is necessarily unintelligent. Nietzsche saw two forces in us: the Apollonian (measured, balanced and reasonable) and the Dionysian (visceral, wild and untamed), or, as the Asians called it, Yin and Yang. For progress we need both.
It was Nietzsche, rather than Joseph Schumpeter, who first spoke of creative destruction. It is the wild and untamed in us that will destroy the boundaries that the measured and reasonable set up. If you are an executive, you should be using both forces wisely. The larger and higher the walls of rules that we create, the harder and more profound will be the earthquake that takes them down. Looking at what is going on in the economy and in politics, such an earthquake may be unavoidable there.