The Suicide Bomber Approach to Decisions

Conviction – of choice and reason – has been one of the hardest qualities of good leadership for me to develop.  An appreciative and empathetic mind can usually find multiple angles from which to view a story, which makes it hard to call someone or something clearly wrong.  Usually, once you have entered the mind of the other party or explored enough angles, you recognize why the other story may make sense.

Somewhere there is a line to draw between being overly empathetic and fearing your own humanity – fearing the experience of guilt.  Getting hung up on a decision is a key way to flame out as a leader.  In that case you become no more than an invisible man, someone who refuses to run the risk of their own humanity.

There are ways around this.  One of my favorite anecdotes comes from Zizek’s story about suicide bombers and their belief process.  As he tells it, the suicide bomber isn’t as convinced of the cause as we think before committing the act.  All the way up to it, doubt weighs on his mind, and it is only through the act itself that he proves to himself that he was actually committed to the cause.

Empathy can muddy the waters in a way that becomes entrenching and ambiguous.  Over time you reach a point where you are mired in cognitive dissonance – holding two opposing mindsets at the same time.  Recently I discovered a possible way out of this process.

Picture the mind of a suicide bomber just before he blows himself up.  Do you really think it’s very different from your own at this moment, when you have been teetering over a choice for the last hour or even for days?  The severity of his choice is greater than your own, but this is precisely why we shouldn’t expect him to have any easier time convincing himself it is the right course of action.  Where your choice is between sticking to your diet or indulging at your upcoming birthday dinner, he is erasing himself from his family and culture, all with the aim of serving his country or cause.

If a person in that situation can commit himself to an action so severe, while you, with your decades of education and first-world dilemmas, deliberate back and forth over some inconsequential matter, how does that reflect on you?

The example is important because I think it points to a very special mechanism you have to develop if you’re going to be a leader.  You need a “suicide button” for yourself – a button that says you’ve committed to something with absolutely no regard for its consequences to yourself.  In fact, you might as well look at the consequences in the same light: certain death.  This becomes like the Stoic practice of negative visualization.  The worst thing that can happen is that you are destroyed.  If you’re prepared for this in light of the choice and the possible benefits for you and the organization, you might just have the conviction needed to carry the group through.

The suicide button is a model for yourself to use against the back-and-forth.  For the bomber, hitting his button is actually the final note by which he convinces himself that he is indeed a believer in the cause.

Where leadership depends so much on a vision-driver, a person who holds the accountability stick, you have to be very sure of your decisions and the ultimate success of your organization, even against overwhelming evidence to the contrary.  Your job is to know when and where you must push your suicide button and just dive in.

In a remarkable irony, I originally wrote this draft post several years ago and shied away from publishing it for whatever reason, probably rooted in fear or embarrassment.  I’m just now coming back out of blog hiding, hoping not to repeat that pattern.

Decision making in a complex world

Streetlights and Shadows is a book by Gary Klein that takes commonly held maxims for decision making and overturns them, revealing cases where these practices break down.  The book’s title comes from the story of the man who went looking for his lost house keys under a streetlight instead of in the shadows where he lost them – he went for where there was more light.  We do the same when we resort to old maxims that only help in ordered situations, without realizing we need something different.  Klein’s book does an impressive job of showing us where these problems occur, and he prescribes how we can be more resilient decision makers in these scenarios.

He begins by outlining ten of the most widely held beliefs taught about systems and decision making, and after telling a few stories about each he describes a replacement maxim.  The ten claims and their replacements are:

  1. Teaching people procedures helps them perform tasks more skillfully. Replacement: In complex situations people will need judgment skills to follow procedures effectively and go beyond them when necessary.
  2. Decision biases distort our thinking.  Replacement: Decision biases reflect our thinking.  Rather than discouraging people from using heuristics, we should help them build expertise so they can use their heuristics more effectively.
  2a. (Sub-point to 2) Successful decision makers rely on logic and statistics instead of intuition.  Replacement: We need to blend systematic analysis and intuition.
  3. To make a decision, generate several options and compare them to pick the best one.  Replacement: Good decision makers use their experience to recognize effective options and evaluate them through mental simulation.
  4. We can reduce uncertainty by gathering more information.  Replacement: In complex environments, what we need isn’t the right information but the right way to understand the information we have – too much information can get in our way.
  5. It’s bad to jump to conclusions – wait to see the evidence.  Replacement: Speculate, but test your speculations instead of committing to them.
  6. To get people to learn, give them feedback on the consequences of their actions.  Replacement: We can’t just give feedback; we have to find ways to make it understandable.
  7. To make sense of a situation, we draw inferences from the data.  Replacement: We make sense of data by fitting them into stories and other frames, but the reverse also happens: our frames determine what counts as data.
  8. The starting point for any project is a clear description of the goal.  Replacement: When facing wicked problems, we should redefine goals as we try to reach them.
  9. Our plans will succeed more often if we identify the biggest risks and then find ways to eliminate them.  Replacement: We should cope with risk in complex situations by relying on resilience engineering rather than attempting to identify and prevent risks.
  10. Leaders can create common ground by assigning roles and setting ground rules in advance.  Replacement: All team members are responsible for continually monitoring common ground for breakdowns and repairing the breakdown when necessary.

All of his replacement claims have common themes of adaptability, experience, and expertise winning over stricture, linear thinking, and risk management.  I’ve found almost all of his ideas to be directly applicable to running a startup and the challenges of making decisions in that kind of environment.  You are constantly in the shadows, with only hints of illumination peeking through.

Some of his replacements are no-brainers on a surface level.  Procedures are dangerous and often have glaring holes you may suspect but cannot pinpoint.  Of course data and your frame for looking at it are equally important, and one would expect that, over time and with experience, you will be able to make more decisions successfully from intuition alone rather than from careful analysis.

His more subtle examples include statements about goal setting, risk management, and heuristics.  These examples are especially applicable to teams of people and early-stage projects.  It’s nearly impossible to set goals at the beginning of a project when there are so many unknowns.  Admitting this, and acknowledging that you will have to discover and revise your goals along the way, lets you clear your mind of these issues and get back to doing.  During the process, if you stay aware, the goals you were looking for will emerge on their own.

An early-stage company is essentially a group of people trying to do something together.  Each person will have quirks and challenges that make them difficult to work with.  You can tinker with a lot of things in a startup’s early days, but not your founding team and stars.  By their nature they will usually add a ton of value, but they are unlikely to change.  It’s up to you to build a culture where the erratic geniuses around you can get their work done, while minimizing their disturbances to progress elsewhere.  A lot of my job is just supporting systems that create a more resilient environment where these folks can still thrive.

These people are just another kind of risk.  Accepting that you can’t change or get rid of them is the same as accepting that there are risks you are blind to – that you are in a complex situation where, instead of focusing on risk elimination, you should focus on risk tolerance.  You will never be able to list everything that could go wrong and root out the issues, so you must create an environment that is robust and resilient against these risks.

Some of the biggest ideas missing from Klein’s book come from war.  Although he uses a few examples from the modern military to show where problems occur due to a misguided belief in some of the claims above, he makes no mention of Boyd and his work on adaptability and the OODA loop, in which expertise and intuition are the keystones.  Klein even has a diagram of his own that looks like a simplified version of the loop.

Clausewitz’s idea of friction also gives us a unique stance from which to appreciate Klein’s work.  Friction is a very apt term for thinking about startup problems.  Acknowledging its ever-presence, and taking care to combat it using the toolkit Klein provides, just might help us get better at making good decisions in our complex world.

Making decisions

My friend just returned from the Dominican Republic.  He is a photographer and took dozens of pictures during his stay in Santo Domingo.  One house in particular fascinated him: a gorgeous palace of stone and brick, with a beautiful wooden interior.  It was built a few hundred years ago by a wealthy Italian man.  His daughter missed their home country, so he had the house built in an Italian style to remind her of home.  The father must have spent a fortune on this building.

This got me thinking.  A millionaire by today’s standards, and he was still making decisions on basic emotions.  His daughter cried for home, so he pulled out the purse, providing work and a livelihood for the architect, suppliers, and laborers on a massive project.  Imagine that – his desire to make his daughter happy supported hundreds of people because of one decision.

Now imagine eBay’s former CEO Meg Whitman championing the purchase of Skype for $2.6B: everyone around her agreeing to it, an $8.5B bureaucratic machine of lawyers and bean-counters mobilizing to make it happen.  In the end it was a bad decision, a poor fit.  It should never have happened.

What emotions were at play here?  What basic feelings made her and the people around her go through with this?  Whatever they were, they weren’t enough.  The real issues never made it onto the table with enough force, and the quest itself was wrong.

Infochimps is still a small startup, and any decision we make only affects a swath of 3-10 people.  But it is so important for us to maintain our focus while we burn cash.  A slight deviation from course at this stage can lead to a difference of hundreds of miles at the end.

I try to ask these questions whenever an important decision is on our minds.  What is really pulling us in this direction?  Where are the biases?  Why should we do this?  What would our advisors say?  What could this mean 6 months, 1 year, and 5 years down the road?  What have we learned from decisions like this in the past that can make this experience better?