Naive Intervention Part 2: It is a Matter of Survival, not Efficiency!

In Part 1 of this series I covered how humans grossly overestimate our rational capability and the power of non-emotional narrative in the form of theories and models. In this post on risk assessment and decision making I continue to quote freely from Taleb's and Derman's books.

We demand that the risks we take be calculable.

While that is reasonably possible in the physical world, it is impossible in complex systems. Because our world is inherently unpredictable, human decision making does not rely on such rational and logical processing but on the emotional weighting of available information from experience (Gigerenzer, Damasio, et al.).

Most certainly you come to your decisions the same way:

  1. What can we know? -> How valid and plausible does our input FEEL?
  2. What can we understand? -> How do we FEEL about the potential options and outcomes?
  3. What can we influence? -> Do we FEEL capable of changing the course of events?

Our human ability to make decisions is thus not linked to reason as is mostly assumed. Russell Ackoff – a leading systems thinker – and others tried to structure this into the DIKW pyramid: data, information, knowledge and wisdom are gained in that sequence to allow us to decide wisely. Ackoff preferred the concept of 'understanding' rather than wisdom in his explanations, I think because wisdom is clearly not about being logical or rational. Systems thinkers, too, tend to consider themselves rational and reasonable, and that in itself can become a fallacy. Many systems thinkers build model illusions that they then take for reality. A true systems thinker is foremost humble about his lack of knowledge.

Theories are right when they work (e.g. QED, quantum electrodynamics, is accurate to 12 digits) while models require explanation and verification. Just as an MRI scan doesn't show a human's emotions, a utility function like the one in the Efficient Market Hypothesis will not model a human's buying decisions even if it seems statistically correct. Because the agents of a complex adaptive system act individually, there is no such thing as predictable cause and effect in economics, finance and business.

More data is not automatically better!

Coming back to the 'less is more' and 'remove what is wrong' themes from Part 1, we are entertaining another huge fallacy with Big Data. To come to good decisions you must REMOVE superfluous data until you retain the bare essentials that will keep you alive. Looking at everything through the illusion of Big Data analytics will most likely hide the real dangers in the noise. Daily changes in revenues or stock prices are no indicator of where the economy or a business is truly going. All they do is provide an emotional background noise for people's decisions. Obvious decisions require just a SINGLE good reason, not many, and they don't need statistical trends. Trends may be interrupted at any time by emotionally relevant news. So statistics are an observation but provide nothing for a prediction. They might actually do the opposite.
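The claim that an observed trend carries no predictive information can be illustrated with a toy simulation (my own sketch, not from the original text): in a driftless random walk, the direction of the first half tells you nothing about the direction of the second half.

```python
import random

random.seed(1)

def random_walk(n):
    """A driftless random walk: each step is independent Gaussian noise."""
    x = [0.0]
    for _ in range(n - 1):
        x.append(x[-1] + random.gauss(0, 1))
    return x

# Fit a naive "trend" on the first half and use it to predict
# the direction of the second half.
hits = 0
trials = 2000
for _ in range(trials):
    w = random_walk(200)
    trend = w[99] - w[0]       # observed drift in the first half
    future = w[199] - w[100]   # actual drift in the second half
    if trend * future > 0:     # did the trend "predict" the direction?
        hits += 1

print(hits / trials)  # hovers around 0.5: the trend carries no information
```

Because the increments are independent, the hit rate stays at coin-flip level no matter how convincing the first-half trend looks on a chart.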

In the 1980s Time Magazine published an article that predicted that the then-current trend of falling oil prices would continue for some time and lead to prices well below $10 a barrel. The article caused OPEC to take notice, and in an emergency meeting they decided to introduce strict export quotas for all member countries. While in my mind that makes OPEC an illegal price-fixing cartel, there is in fact not much we or anyone can do about it except not buy oil. As a consequence oil prices have been rising to their current levels ever since, and most changes to the trend had little rational explanation up front. It was the observation of a trend combined with a naive model prediction that actually caused its reversal.

In practice, good decisions thus come from bad experience only!

… and luckily they don't have to be our own experiences. It is not clever thinking that makes airplanes safer, but ugly and painful crashes. Even Fukushima was a valuable lesson to all the PhDs who design nuclear power plants – and that in the country that created the word 'tsunami'. Intellectuals tend to focus on avoiding the negative responses of fragility rather than recognizing their positive side-effects. That comes from working in theoretical lab environments that are, however, built to remove exactly these outside influences! They tend to think that innovation comes from planning and a Harvard education rather than from intuitive responses to bad experiences. In business, what nature and experienced people consider a safety redundancy, MBAs see purely as inefficiency. Risk estimation and planning is just there to motivate people to take risks they DO NOT understand. But not building a nuclear power plant is safer than building a theoretically safe one. Not buying a complex derivative is still a lot safer than buying one with an illusionary future value.

But there is no additional business in not doing something. So let's give people a calculable risk and off we go. The calculated risk theoretically allows for higher risk premiums. People want to buy high-return, risky stocks with a calculated risk to make them less risky. Does no one see the paradox here? These are obviously not rational buying decisions! Yes, a well-balanced portfolio might have the potential for a substantial upside on a small part of it with a smaller risk of losing everything. But in the long run, if you rely on trends and statistics your return will be at best average, with a substantial risk of loss as the system becomes more volatile through everyone doing the same thing. If you DO NOT INVEST in risky stocks you might lose a little to inflation, but you will only lose if there is a big crash that takes everything down. As long as we believe we can calculate the risks of complex systems we are continuously increasing the risk of that total collapse.
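The asymmetric alternative hinted at above – a small risky fraction on top of a mostly safe base – can be sketched as a toy 'barbell' portfolio. The payoff probabilities and multiples below are illustrative assumptions of mine, not data from the text; the point is only that the worst case is known in advance while the upside stays open.

```python
import random

random.seed(7)

def barbell_outcome(capital=100.0, risky_fraction=0.1):
    """90% in a near-safe asset, 10% in a speculative bet.
    The worst case is fixed by construction: losing the risky 10%."""
    safe = capital * (1 - risky_fraction) * 1.01   # ~1% near-riskless return
    if random.random() < 0.95:
        risky = 0.0                                # bet usually expires worthless
    else:
        risky = capital * risky_fraction * 20      # rare 20x payoff
    return safe + risky

outcomes = [barbell_outcome() for _ in range(100_000)]
print(min(outcomes))  # floor stays near 90.9: losses are capped by construction
print(max(outcomes))  # upside is open-ended relative to that floor
```

Unlike a 'calculated risk' spread over the whole portfolio, the exposure here is asymmetric: no model of the speculative asset is needed, because the maximum loss was decided before the outcome was known.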

In the long run it is a matter of survival!

Rationality and mathematics miss that past worst-case events were, at the time, worse than all previous worst-case events. Decision making under uncertainty therefore becomes even more relevant, as we also have to consider matters of survival. Fitness does not mean being just as strong as currently needed, but being able to survive the next worst case. The notion of efficiency and optimization to improve profits is a naive rationalism that follows statistical theory and brings no other information than that it all fits under the Gauss curve. Survival issues are considered too statistically rare to be of current relevance. As you make your business more efficient and more stable (for example by using BPM or outsourcing) you unavoidably reduce its resilience, because it can no longer react continuously to changing external events.

But this is not just about business! Naive medical intervention (in the form of recognized medical errors) kills several times more people than car accidents. Because cars crash frequently they continuously get safer, and people learn to avoid danger. Medical procedures do not change as fast, because their use is highly regulated and their benefits are only considered in terms of statistical averages. Additionally, patients are given statistical information about risks to make an INFORMED decision. In reality doctors are misinforming patients by saying 'This procedure is a calculable risk, so it is ok to take it.' And if it fails it was not their decision. Some of that has to do with the practice of making money (for lawyers and patients) through malpractice lawsuits. The immense cost and ineffectiveness of the American healthcare environment is largely caused by the shortsighted gain of a few who misuse the naive intervention of the justice system.

Most medications are naive interventions with unknown risks.

Too many people who fought a supposedly heroic battle against cancer (like my sister-in-law) were in fact fighting to stay alive despite the medical treatment. Cancer treatment has only two treatment goals: tumor size and average survival after diagnosis. It does not take quality of life or time of survival without treatment into account! As we diagnose more (also less risky) cancers earlier, the statistics do show more and longer survivors, but to attribute that to treatment is a statistical lie. Someone who accepts the illness and doesn't fight is branded a coward and a quitter. How horribly arrogant and cruel can we become? When you know how treatment studies are performed (my late mother was an MD), the pharmaceutical industry is not a life saver but purely a money printing press. And most of its grand discoveries (e.g. penicillin and Viagra) were pure chance and not directed research! The combined side-effects of the many naive-intervention medications (e.g. to lower cholesterol) that people are taking today are way beyond even being calculable!

You can get REALLY sick in LARGE hospitals, where only antibiotic-resistant strains of bacteria survive. The constant antibiotic stress makes bacteria mutate into resistant variants, while destroying our bacteriological microbiome with antibiotics makes us weaker. Many viral infections are treated with cure-all antibiotics even though antibiotics have no effect on viruses and only damage the bacteria of our microbiome. Treating children too early with antibiotics is now suspected to be a cause of the rise in allergies. Getting an infection is in fact healthy, as it tunes the immune system to the changes in the environment. A natural birth and the immediate contact of the baby with the mother are so important because they transfer the most common bacteria from mother to child and jump-start the baby's immune system. Without the constant stress of new bacteria we do not develop the ability to survive infections. Removing bacterial stress through too much hygiene actually makes us a lot less resilient. Most standard medical practice today completely ignores the long-term effect of treatments on our microbiome, making it largely naive intervention to suppress the apparent symptoms. A tumor, too, is only a symptom. The real disease is a miscommunication between a body of cells and its biological context.

The hidden fragility of large systems with ‘industrial strength’ components.

A cat will survive a fall from ten feet unhurt, while an elephant most likely won't. Because banks are so big, bailing them out, fixing prices or eliminating small-scale speculation (similar to killing bacteria with disinfectants outside and antibiotics inside the body) brings only illusionary stability until the crash. Suppressed volatility hides the truly existing risks until systems implode. In a complex adaptive system, constant stress is not to be mistaken for overreaction to noise but must be understood as environmental tuning information.

What is claimed to be robust or 'industrial strength' is not, and it is also not the simple opposite of fragile. Robust will fail at some point just as fragile will, because all it offers is a tested strength to resist a well-known stress. ANTIFRAGILITY is a property of (complex adaptive) systems that improve when they are stressed. We need to re-learn that in a complex world the notion of a single logical cause or a predictable outcome of an action is suspect. The constant, random stress is information that aligns the small anti-fragile system with the changes in its environment. Large, apparently strong and efficient systems that have lost their ability to react to constant stresses are truly extremely fragile once the next large event happens or the system jitters. The true fragility of a system multiplies when most of its too-large entities are apparently robust – hence our financial system.

The same is true for projects. The larger the project, the worse the outcome, unless the project is cut into many independently run elements. In too-large, too-rigid systems, variations never produce a positive effect but just worsen the situation, as they provoke more intervention to avoid them. Governments and global corporations completely underestimate the non-linear convexity effects (see Jensen's Inequality) and the multiplication of risks that come with size. The economy and businesses seem to become more and more efficient, but the resulting fragility makes the outcomes of errors and/or events substantially worse.
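Jensen's Inequality says that for a convex function f, E[f(X)] ≥ f(E[X]); applied to losses, it means one large shock hurts more than many small shocks of the same total size. A toy simulation (my own sketch, assuming a quadratic loss function as the convexity) makes the size effect concrete:

```python
import random

random.seed(42)

def loss(shock):
    """Convex loss: damage grows with the square of the shock size."""
    return shock ** 2

def simulate(n_units, trials=100_000):
    """Average total loss when the same shock budget hits n independent units."""
    total = 0.0
    for _ in range(trials):
        # each unit takes an independent shock between 0 and 1/n_units,
        # so total shock exposure is identical in both scenarios
        total += sum(loss(random.uniform(0, 1 / n_units)) for _ in range(n_units))
    return total / trials

one_big = simulate(1)     # one large entity absorbs the whole shock
ten_small = simulate(10)  # ten small entities absorb a tenth each

print(one_big, ten_small)  # the single large system loses roughly 10x more
```

The same total stress, split across ten small independent units, produces about a tenth of the expected damage – which is why cutting a too-large project into independently run elements reduces rather than merely redistributes the risk.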

Where is the evidence for anti-fragile benefits in social systems?

Such evidence can only be found in real-world situations. Lab environments are not able to simulate real-world complexity; simulating business processes is thus utter nonsense. In the Dutch town of Drachten they removed all street signs under a traffic concept called Shared Space. As a result, traffic became a lot safer AND more efficient, because pedestrians and drivers became more alert and participated more actively. Fewer rules, more common sense. Good decisions are not about 'knowing the future,' predicting, calculating or enforcing outcomes, but about creating an asymmetric potential of more positive opportunities versus fewer bad outcomes. Good decisions focus on survival, or better, on 'not biting the dust.' Fewer people being hurt or killed is VERY efficient in the long run. Survival is more important than current profit. It makes no sense to be efficiently dead.

When, however, the survival of 'too-large-to-fail' global corporations brings about the response of transferring their fragility to our economic system through bailouts or large loans, then interventionism is no longer naive, it becomes outright stupid and ignorant!

It would be time to stop being so arrogant in pretending that we know it all and have it all under control. We obviously do not!

In Part 3, I will discuss the similarities of illusions in Investment Theory and Business Process Management.

I am the founder and Chief Technology Officer of Papyrus Software, a medium-sized software company offering solutions in communications and process management around the globe. I am also the owner and CEO of MJP Racing, a motorsports company focused on Rallycross or RX, a form of circuit racing on mixed surfaces that has been around for 40 years. I hold 8 national and international championship titles in RX. My team participates in the World Championship alongside Petter Solberg, Sébastien Loeb and Ken Block.

3 comments on “Naive Intervention Part 2: It is a Matter of Survival, not Efficiency!”

  2. I’m a little disappointed in the example used as evidence. By the Shared Space report’s own conclusions it was not appropriate for scaling, and certainly not appropriate in all traffic-related situations – does it therefore give us any cause for evidential hope in the very different cases you cited as problematic? Don’t get me wrong – I’m very interested in the positions you are outlining. I’d love to have some better examples of potential alternatives to investigate; if you know where I can read more on this I would appreciate it…


    • Martin, thanks for reading and commenting. Systems theory does propose that not all systemic concepts can be scaled. But Shared Spaces does not need to scale. Each traffic intersection can be its own shared space.



© 2007-19 Max J. Pucher. All rights reserved.
