Decision making in a complex world
Streetlights and Shadows is a book by Gary Klein that takes commonly held maxims about decision making and overturns them, revealing cases where these practices break down. The book’s title comes from the story of the man who searched for his lost house keys under a streetlight rather than in the shadows where he dropped them – he went where there was more light. We do the same when we fall back on old maxims that only help in ordered situations, without realizing we need something different. Klein does an impressive job of showing where these problems occur, and he prescribes how we can become more resilient decision makers in these scenarios.
He begins by outlining ten of the most widely held beliefs taught about systems and decision making, and after telling a few stories about each he describes a replacement maxim. The ten claims and their replacements are:
- Teaching people procedures helps them perform tasks more skillfully. Replacement: In complex situations people will need judgment skills to follow procedures effectively and go beyond them when necessary.
- Decision biases distort our thinking. Replacement: Decision biases reflect our thinking. Rather than discouraging people from using heuristics, we should help them build expertise so they can use their heuristics more effectively.
- (A sub-point of the previous claim) Successful decision makers rely on logic and statistics instead of intuition. Replacement: We need to blend systematic analysis and intuition.
- To make a decision, generate several options and compare them to pick the best one. Replacement: Good decision makers use their experience to recognize effective options and evaluate them through mental simulation.
- We can reduce uncertainty by gathering more information. Replacement: In complex environments, what we need isn’t the right information but the right way to understand the information we have; too much information can get in our way.
- It’s bad to jump to conclusions – wait to see the evidence. Replacement: Speculate, but test your speculations instead of committing to them.
- To get people to learn, give them feedback on the consequences of their actions. Replacement: We can’t just give feedback; we have to find ways to make it understandable.
- To make sense of a situation, we draw inferences from the data. Replacement: We make sense of data by fitting them into stories and other frames, but the reverse also happens: our frames determine what counts as data.
- The starting point for any project is a clear description of the goal. Replacement: When facing wicked problems, we should redefine goals as we try to reach them.
- Our plans will succeed more often if we identify the biggest risks and then find ways to eliminate them. Replacement: We should cope with risk in complex situations by relying on resilience engineering rather than attempting to identify and prevent risks.
- Leaders can create common ground by assigning roles and setting ground rules in advance. Replacement: All team members are responsible for continually monitoring common ground for breakdowns and repairing the breakdown when necessary.
All of his replacement claims share common themes: adaptability, experience, and expertise winning out over stricture, linear thinking, and risk management. I’ve found almost all of his ideas directly applicable to running a startup and to the challenge of making decisions in that kind of environment. You are constantly in the shadows, with only hints of illumination peeking through.
Some of his replacements are no-brainers on the surface. Procedures are dangerous and often have glaring holes you may suspect but cannot pinpoint. Of course data and your frame for interpreting it are equally important, and one would expect that, with time and experience, you will be able to make more successful decisions from intuition alone than from careful analysis.
His subtler examples involve goal setting, risk management, and heuristics. These are especially applicable to teams of people and early-stage projects. It’s nearly impossible to set goals at the beginning of a project when there are so many unknowns. Admitting this, and acknowledging that you will have to discover and revise your goals along the way, lets you clear your mind of these issues and get back to doing. If you stay aware during the process, the goals you were looking for will emerge on their own.
An early-stage company is essentially a group of people trying to do something together. Each person will have quirks and challenges that make them difficult to work with. You can tinker with a lot of things in a startup’s early days, but not with your founding team and stars. By their nature they will usually add a ton of value, but they are unlikely to change. It’s up to you to build a culture where the erratic geniuses around you can get their work done while minimizing their disturbances to progress elsewhere. A lot of my job is just supporting systems that create a more resilient environment where these folks can thrive.
These people are just another kind of risk. Accepting that you can’t change or get rid of them is the same as accepting that there are risks you are blind to – that you are in a complex situation where, instead of focusing on risk elimination, you should focus on risk tolerance. You will never be able to list everything that could go wrong and root out every issue, so you must create an environment that is robust and resilient against these risks.
Some of the biggest ideas missing from Klein’s book come from war. Although he uses a few examples from the modern military to show where problems occur due to misguided belief in some of the claims above, he makes no mention of John Boyd and his work on adaptability and the OODA loop, in which expertise and intuition are keystones. Klein even includes a diagram of his own that looks like a dumbed-down version of the loop.
Clausewitz’s idea of friction also gives us a useful vantage point from which to appreciate Klein’s work. Friction is an apt term for thinking about startup problems. Acknowledging that it is ever-present, and taking care to combat it with the toolkit Klein provides, just might help us make better decisions in our complex world.