Emotions play a significant part in the decision-making process, according to the latest in our academic essays series


The 13th-century Persian poet Ibn Yamin Faryumadi described four types of man:

One who knows and knows that he knows - his horse of wisdom will reach the skies

One who knows, but doesn’t know that he knows - he is fast asleep, so you should wake him

One who doesn’t know, but knows that he doesn’t know - his limping mule will eventually get him home

One who doesn’t know and doesn’t know that he doesn’t know - he will be eternally lost in his hopeless oblivion! [Wikipedia 2012].

The words were popularised afresh by Donald Rumsfeld, then US Secretary of Defense, in a 2002 press briefing (US Department of Defense transcripts), in which he referred to ‘known unknowns’ and ‘unknown unknowns’ when questioned about Iraq and the existence of weapons of mass destruction.

So, the strategy of the USA and UK governments in their offensive action against Iraq early in the millennium was founded on the ‘known known’ that weapons of mass destruction existed inside Iraqi borders and that Saddam Hussein was capable of deploying them. As we now know, this information was flawed and in reality the strategy was predicated on an ‘unknown unknown’; US and UK governments did not know the actual state of weapon development in Iraq and appeared not to be aware of their imperfect knowledge.

In his briefing to the press, Rumsfeld said that without infinite knowledge there were limits to his abilities. This is a brave and unusual admission for a leader of high rank and responsibility to make, for surely people reach the top of the hierarchy and make strategic decisions precisely because they have proven themselves to be reasoned and capable analysts?

He must also have had access to vast amounts of intelligence, as well as years of experience to draw on. Even with all this at his disposal, Rumsfeld stated that his effectiveness was limited by the knowledge he did not have and did not know he did not have.

More than 60 years ago, Herbert Simon (1947) observed that people in organisations do not always act on perfect information, or make purely rational decisions. He proposed the reason for this was the cognitive limitations of our minds. Simon used the term “bounded rationality” to describe this.

Despite this early seminal work, and the many excellent and persuasive studies since, the popular ‘textbook’ description of strategy remains a logical, optimising process that largely depends on ‘unbounded rationality’, that is:

• decisions are made on sound information;

• thorough analysis reveals all the information that is relevant; and

• choices are made between the available options that maximise the chance of goal achievement for the decision-maker.

It seems that acting within the ‘known knowns’ domain is highly attractive to us, even when common sense tells us this perfection is unattainable. Why is this?

Influential researchers in the field of decision-making take a fairly dim view of our inability to act rationally. They describe our behaviour as flawed - rather than natural. For example, we fall into traps (Hammond, Keeney and Raiffa, 1998), we are blinded (Bazerman and Chugh, 2006), we have delusions and emotions (Lovallo and Kahneman, 2003; Morse, 2006), we are subject to bias and act on gut feel (Kahneman et al, 2011; McKinsey, 2010).

The language used to describe our behaviour reinforces the taken-for-granted superiority of rationality. Is this helpful? Is intuition always bad? And if it is unnatural for us to think rationally all the time, will we be able to cure our behaviour in order to be good, rational strategists? If not, what can we do?

Because we have only so much brainpower and only so much time, we often solve difficult problems quickly rather than rationally. We adopt rules of thumb (heuristics), rely on memory and past experience, use our gut feeling and intuition as shortcuts and as ways to economise on the use of our cognitive faculties. According to the rational view, this translates into bias and mistakes in our behaviour and, ultimately, poor strategic performance.

We admire confident people

So, it is natural for us to behave as if we know more than is possible to know and as if we are more in control than it is possible to be. Confidence attracts followers and this has organisational consequences.

In a recent conversation between two of the leading researchers in the field of decision-making, Kahneman and Klein agreed that leaders are selected more often for their confident risk-taking than for inherent wisdom (McKinsey Quarterly, 2010a). Kahneman said: “There really is a strong expectation that leaders will be decisive and act quickly. We deeply want to be led by people who know what they are doing and who don’t have to think about it too much …”

He went on to link this with hindsight bias, explaining how many lucky plays are converted into post-rationalised stories about clever, deliberate strategies created by leaders gifted with foresight. This builds people’s confidence in their choice of who to follow and in the depth of that leader’s knowledge.

There is also evidence that leaders and ‘experts’, over periods of time, become disproportionately more assured of their own abilities, which is often an unjustified belief (Tetlock, 2005). Coupled with the behaviours described in the previous paragraph, we have a self-supporting circle of false confidence with no get-out. This is how strategies can develop based on charisma and luck, and when they happen to be successful, are often justified in the telling as visionary and heroic.


Emotion is instrumental in creating this behaviour. Counter-intuitively perhaps, fear is probably providing us with this shield of confidence. Our reputation is our credibility, so we take care of what others think about us, motivated by the fear of losing status.

This makes it harder to admit mistakes than to place the blame elsewhere - on external causes or ‘the system’, for example (sometimes labelled attribution errors [Ross, 1977]). This is more likely to occur when there is hectic activity, pressure or stress within the system (Senge, 2006; Ormerod, 2005; Edmondson and Cannon, 2005).

Therefore, we are predisposed to follow confidence and trust confident people. To admit openly that we doubt our actions, or to listen actively to others who have a different point of view, is subconsciously painful and unlikely to bring us organisational respect.

Armed with stories of small, past successes, most likely not representative of reality, we can keep anxiety and doubt at bay and, on the positive side, we are able to take decisive action even though it is often unlikely to be rationally sound.

We are anxious when uncertain

Emotions are always part of the decision-making process (Damasio, 2000; Pinker, 1997). They are powerful drivers of behaviour and, counter to what we want to believe, very often call the tune over reasoning and logic. Emotional responses are triggered by some of the most primitive parts of the brain and have been responsible for our survival over millennia.

Fear, anger, disgust, surprise, sadness and joy - the six universally recognisable human emotions (Ekman, 1993) - enable fast and mindless reactions to external events that would have been highly beneficial in the ‘on edge’ dangerous world of our primitive ancestors. Because these behaviours happen with speed and without conscious thought, they are particularly hard to temper with rational reasonableness, even when their relevance is limited in our modern context [fear of snakes is commonplace but, in most contemporary situations, is rarely needed as a protection device].

The fear of loss, for example, makes us cautious and possessive about what we have, even when it might have little value for us (the endowment effect). We feel the pain of loss disproportionately more than we feel good about gains; the evidence suggests that gains have to outweigh losses by about 2:1 before we are persuaded to risk making a choice for change (Kahneman and Tversky, 1979).
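The 2:1 threshold can be sketched with the value function from Kahneman and Tversky’s prospect theory, in which losses are weighted roughly twice as heavily as equivalent gains. The function names and the simple linear curve below are illustrative assumptions for this sketch, not part of the original studies:

```python
def subjective_value(x, lam=2.0):
    """Prospect-theory-style value: losses loom about lam times larger than gains.

    A linear curve is assumed here for simplicity; the published function
    is mildly concave for gains and convex for losses.
    """
    return x if x >= 0 else lam * x  # a loss of 100 feels like -200


def accepts_coin_flip(gain, loss, lam=2.0):
    """Accept a 50/50 gamble only if its subjective expected value is positive."""
    return 0.5 * subjective_value(gain, lam) + 0.5 * subjective_value(-loss, lam) > 0


# A fair coin flip to win or lose 100 is rejected: the possible loss feels doubled.
print(accepts_coin_flip(100, 100))  # False
# Offer a win of 250 against a loss of 100 and the gamble becomes attractive.
print(accepts_coin_flip(250, 100))  # True
```

On this sketch, a person with a loss-aversion coefficient of 2 turns down any even bet whose upside is less than twice its downside, which is the 2:1 pattern described above.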

Thinking about uncertainty also triggers parts of our brain to make us physically afraid (Camerer et al, 2004; Morse, 2006). This gives us a predisposition in favour of the status quo. It also explains the sunk-cost trap, which describes our reluctance to abandon projects in which we have invested time, money, energy, passion and reputation - all of which will be lost if the project is abandoned.

Loss aversion can also explain why we are often poor at making choices. Provided with a choice where the difference is small (an orange-flavoured ice cream versus a lemon-flavoured one), and assuming both flavours are equally palatable, it is hard to give one up. Having both seems the best option, since then neither has to be lost!


When choices are widely different or complicated we also struggle, but probably for different reasons. In this situation, our brains appear lazy, preferring the shortcut and avoiding the analysis necessary to take the decision. Ariely (TED talks) illustrates this powerfully, showing how we defer to the default option rather than consider a weighty choice. Again, it is easier to defer to the status quo, or to what is already decided, than to make the effort to consider a complex problem.

The human predisposition favours stability, especially when we are in a comfortable situation. Habits, routines, the expected, defaulting to the status quo are reassuring and allow our minds to expend less effort. This explains a lot about organisational life and the issues many organisations have trying to implement strategic change.

While in their comfort zones, people find it hard to risk change because of the losses they are likely to incur. Strangely, when there is uncertainty and anxiety is heightened, productivity often rises, our natural reaction being to work hard to protect against loss. But when loss is certain and cannot be avoided, we will scramble and take many more and bigger risks in an attempt to cover for and/or recover from our losses, often adopting a mindset of: what else have we got to lose?

As Kahneman summarises: “Utility [the value people place on something] cannot be divorced from emotion, and emotion is triggered by changes. A theory of choice that completely ignores feelings such as pain of losses and the regret of mistakes is … unrealistic”.


Julie Verity is a senior visiting lecturer at Cass Business School.

This essay is an excerpt from Chapter 5 of The New Strategic Landscape: Innovative Perspectives on Strategy by Julie Verity, published by Cass Business Press/Palgrave Macmillan (2012) ISBN 0230358373 and reproduced with permission