Black Box Thinking: The Surprising Truth About Success - Matthew Syed
I learnt about this book on episode #484 of The Tim Ferriss Show, featuring Daniel Ek, the CEO of Spotify.
This book is about failure and how we treat it, citing examples from fields ranging from aviation and healthcare to coding and cycling teams. It talks about the psychology behind our attitudes to failure, and about how failures serve as the core building block for both marginal and evolutionary successes.
It talks about the techniques used by some of the most innovative people and industries in the world, and about the dangers of failing to learn from our mistakes.
Highlights:
"We create vague goals, so that nobody can point the finger when we don’t achieve them."
"We all have a sophisticated ability to delete failures from memory"
"Self - justification , allied to a wider cultural allergy to failure, morphs into an almost insurmountable barrier to progress."
"When people don’t interrogate errors, they sometimes don’t even know they have made one (even if they suspect they may have)."
"Learning from mistakes is not a drain on resources; it is the most effective way of safeguarding resources – and lives."
Learning from mistakes can be done under two conditions, or environments:
- Under practice conditions, when the aim is to learn from mistakes and push the boundaries.
- Under real-world conditions, when the tendency is to avoid mistakes.
"When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether."
Why is it so hard to learn from mistakes, even in the face of evidence to the contrary?
Because all of us think of ourselves as rational and intelligent. And when evidence comes up showing that the view we held was wrong, we have two options:
- To pause and accept that we were wrong. The issue with this approach is that it is threatening. It forces us to acknowledge we might be wrong about things on which we have staked a great deal.
- The second option is to simply deny the evidence. We reframe the evidence, filter it, say it did not happen.
This cognitive dissonance also becomes important in cases where we have invested effort, for example in being part of a group. Even if the group is bad, we feel motivated to ignore everything and apply cognitive filters, because our ego is involved. A similar example: you purchase a costly gadget and it fails. You cannot say that it was a bad gadget. You purchased it. Your ego is staked on the decision.
"The pattern is rarely uncovered unless subjects are willing to make mistakes – that is, to test numbers that violate their belief. Instead most people get stuck in a narrow and wrong hypothesis, as often happens in real life, such that their only way out is to make a mistake that turns out not to be a mistake after all. Sometimes, committing errors is not just the fastest way to the correct answer; it’s the only way."
This is an example of confirmation bias in action. Confirmation bias basically states:
We find it much easier to retain facts that conform to what we believe in.
Which, of course, has its pitfalls.
Trying to falsify our hypothesis leads us to the correct hypothesis more quickly.
This is why confirmation bias is dangerous. It is easy to form an early hypothesis and then keep finding additional data that conforms to it. Approached more critically, by actively trying to falsify the theory, we learn far sooner whether it actually holds.
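A toy sketch of this in Python (assuming the number-testing experiment quoted above is Wason's 2-4-6 task; the hidden rule and test triples below are made up for illustration):

```python
# Toy model of Wason's 2-4-6 task: the experimenter's hidden rule is
# broader than the "obvious" one suggested by the seed triple (2, 4, 6).

def hidden_rule(triple):
    """The actual rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

# A confirming tester only tries triples that fit their narrow
# hypothesis ("ascending by 2"), so every answer is "yes" and the
# wrong hypothesis is never challenged.
confirming_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]

# A falsifying tester deliberately tries triples that would break the
# narrow hypothesis. The "yes" for (1, 2, 3) and (5, 10, 20) reveals
# that the real rule is broader; the "no" for (6, 4, 2) bounds it.
falsifying_tests = [(2, 4, 6), (1, 2, 3), (5, 10, 20), (6, 4, 2)]

for name, tests in (("confirming", confirming_tests),
                    ("falsifying", falsifying_tests)):
    print(name)
    for t in tests:
        print(f"  {t} -> {hidden_rule(t)}")
```

The confirming strategy never produces a "no", so the narrow hypothesis survives untested; the falsifying strategy makes "mistakes" that turn out not to be mistakes, exposing the broader rule.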
The higher people are up the hierarchy, the more profound the effects of cognitive dissonance, and the harder it gets to admit a mistake.
"Memory is a system dispersed throughout the brain, and is subject to all sorts of biases. Memories are suggestible. We often assemble fragments of entirely different experiences and weave them together into what seems like a coherent whole. With each recollection, we engage in editing."
"Cumulative selection works, then, if there is some form of ‘memory’: i.e. if the results of one selection test are fed into the next, and into the next, and so on. This process is so powerful that, in the natural world, it confers what has been called ‘the illusion of design’: animals that look as if they were designed by a vast intelligence when they were, in fact, created by a blind process."
Cumulative selection, or the evolutionary principle, is about trial and error: learning from the errors and improving the system.
"If we view the world as simple, we are going to expect to understand it without the need for testing and learning. The narrative fallacy, in effect, biases us towards top-down rather than bottom-up. We are going to trust our hunches, our existing knowledge, and the stories that we tell ourselves about the problems we face, rather than testing our assumptions, seeing their flaws, and learning."
The narrative fallacy means that in practical situations we stop seeing error/failure as the inevitable result of the gap between our understanding of a system and its actual complexity, a reality in which error/failure is a fairly normal event. Instead, if we assume there is no complexity, the ego gets involved, confirmation bias comes into the picture, and it becomes very difficult to accept errors/failures.
In coding, and as I think about it, in writing as well, this top-down approach of planning everything and leaving nothing to trial and feedback can stop one from doing anything at all.
As an example, I want to put poems on the web. Instead of just doing that, I decide that I need a website, which requires me to learn all sorts of back-end and front-end technologies. And so I end up doing nothing.
In the case of a coder, she decides that the system would perform better if she designed a new language. That takes four years, and she ends up putting out nothing.
The ideal approach has to be a hybrid: plan ahead, but stop yourself from doing no writing at all because you are too busy planning.
"Success is not just dependent on before-the-event reasoning, it is also about after-the-trigger adaptation."
The desire for perfectionism rests upon two fallacies:
- The miscalculation that you can create the perfect thing sitting at home.
- Fear of failure: basically pre-empting the closed loop. You are so worried about messing up that you do not enter the arena.
"Closed loops are often perpetuated by people covering up mistakes. They are also kept in place when people spin their mistakes, rather than confronting them head on. But there is a third way that closed loops are sustained over time: through skewed interpretation."
This whole evolutionary method of thinking about success would not work without a proper control test, i.e. an RCT, or Randomised Controlled Trial. What this entails, basically, is testing against a control set for which no change happens: they are not given a dose of the vaccine, do not see the new website, etc. Because there can be many reasons for success outside of the change(s) we made, the control group isolates the effect of the change.
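A minimal sketch of that logic (the conversion rates and visitor count below are invented purely for illustration):

```python
import random
import statistics

random.seed(42)

# Invented conversion probabilities, purely for illustration:
# the control page converts 10% of visitors, the changed page 13%.
RATES = {"control": 0.10, "treatment": 0.13}

outcomes = {"control": [], "treatment": []}
for _ in range(10_000):
    # Random assignment spreads every other cause of success evenly
    # across both groups, so any gap reflects the change itself.
    group = random.choice(["control", "treatment"])
    outcomes[group].append(1 if random.random() < RATES[group] else 0)

control_rate = statistics.mean(outcomes["control"])
treatment_rate = statistics.mean(outcomes["treatment"])
print(f"control:   {control_rate:.3f}")
print(f"treatment: {treatment_rate:.3f}")
print(f"estimated effect of the change: {treatment_rate - control_rate:+.3f}")
```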
"'marginal gains,’ he said. ‘The approach comes from the idea that if you break down a big goal into small parts, and then improve on each of them, you will deliver a huge increase when you put them all together.’"
"Marginal gains is not about making small changes and hoping they fly. Rather, it is about breaking down a big problem into small parts in order to rigorously establish what works and what doesn’t. Ultimately the approach emerges from a basic property of empirical evidence: to find out if something is working, you must isolate its effect."
"Creativity not guided by a feedback mechanism is little more than white noise. Success is a complex interplay between creativity and measurement, the two operating together, the two sides of the optimisation loop."
Marginal gains is about reaching a local maximum by dividing a big thing into smaller portions and continually improving each of them. Combined, these small improvements result in a bigger positive change. The issue with this approach is that there may well be an even bigger maximum lying in the vicinity of this local one, but we can't reach it using marginal gains. What we need is a big leap, a jump of faith.
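A toy sketch of that local-maximum problem as a one-dimensional hill climb (the landscape values are made up):

```python
# Heights of a made-up performance landscape, indexed by position.
landscape = [1, 3, 5, 6, 4, 2, 3, 6, 9, 8]

def hill_climb(pos):
    """Take a small step to a neighbour whenever it improves things."""
    while True:
        neighbours = [p for p in (pos - 1, pos + 1) if 0 <= p < len(landscape)]
        best = max(neighbours, key=lambda p: landscape[p])
        if landscape[best] <= landscape[pos]:
            return pos  # no small step improves: a local maximum
        pos = best

print(hill_climb(0))  # climbs to index 3 (height 6) and stops
print(hill_climb(6))  # a leap past the dip (restart at 6) finds index 8 (height 9)
```

Marginal gains is the while-loop; the jump of faith is choosing a new starting point altogether.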
The creative process starts with a problem. A failure. Something that does not work. And then comes the response to the broken thing, the problem.
"Imagination is not fragile. It feeds off flaws, difficulties and problems."
This is true of writing. While writing, critiquing or thinking in terms of plausibility usually leads to more and better ideas.
"Contradictory information jars, in much the same way that error jars. It encourages us to engage in a new way."
"Creativity is just connecting things."
In most cases, creativity is joining two things which exist across different domains and making something new.
These epiphanies happen in two sorts of environments:
- When we are switching off. We have to take a step back for the associative state to emerge.
- When we are being sparked by the dissent of others, i.e. when we have to respond to challenges and critiques.
An epiphany is basically the start of the creative process. Once you get the epiphany, you have to use the marginal-gains principles to make the idea work.
"We have to engage with the complexity of the world if we are to learn from it; we have to resist the hardwired tendency to blame instantly, and look deeper into the factors surrounding error if we are going to figure out what really happened and thus create a culture based upon openness and honesty rather than defensiveness and back-covering."
The people working on the ground have important data to share in case of any failure. However, if an opaque culture exists in the company, then blaming others for failures becomes the easy way out. Blaming others for failures, and taking credit for work others have done: both scenarios are bad.
Most people think this is what accountability looks like: a disproportionate response to failure, meant to scare the blamed people straight. But it does not.
"Trying to increase discipline and accountability in the absence of a just culture has precisely the opposite effect. It destroys morale, increases defensiveness and drives vital information deep underground."
People with a growth mindset are more likely to look at failure as an opportunity to learn and get better. Because for them their capabilities are a mixture of talent and practice. And errors are an inherent part of practice.
Techniques to improve:
In the pre-mortem technique, at the outset the leader tells the group that the project is dead: now tell us why it has died. Similar to Ali's video where he talks about a goal and also lists out the reasons why he might not be able to complete it.
This approach does not kill the project, but rather strengthens it.