
New Voices: Does it pay to be analytical?

Stephanie Rhodes with the latest in our series for budding writers (see www.bps.org.uk/newvoices for more information)

22 May 2012

If you were to ask a group of people whether they would like to be more rational and analytical when making decisions in life, many would nod avidly in agreement. Sometimes it seems that we are mere slaves to our emotions, ruled by our hearts rather than our heads. If only we had a switch that could turn off the gut feeling or the nagging, lingering suspicion that led us to a particular decision, perhaps things would have turned out differently, and for the better.

Social psychologist Jonathan Haidt offers a metaphor that captures this battle of wills perfectly, describing the roles of rational choice and intuition as 'a rider on the back of an elephant' (Haidt, 2006, p.4). This simple, illustrative picture of the divided mind mirrors well-known dual-process theories of decision making in cognitive psychology.

The core message of dual process theory is that human thinking is divided into two distinct but interacting systems, known inventively as System 1 and System 2 (Stanovich, 1999). System 1, our elephant, is described as a default mechanism that produces quick, automatic and intuitive answers to decision-making dilemmas; it is most commonly referred to as the autonomous system (Stanovich, 2009). In contrast, System 2, our rider, is slow and far more analytical in its approach, and it has been associated with conscious activation and a heavy demand on cognitive resources.

Our poor elephant often receives a bad press. It is fair to say that our intuitions are deemed less reliable than our deep analytical thoughts, given that they occur quickly and are driven by our affective responses. Our rider, on the other hand, may have been granted too much credit. If our analytical decisions are indeed slow and cognitively demanding, then surely all that effort should buy us correct, or at least optimal, answers. But does it always, in reality? And what happens if your rider gets it wrong?

Let’s be hasty…
Hasty decision making is considered common among patients with schizophrenia who experience delusional beliefs. This inclination to jump to conclusions when faced with decision-making tasks has been labelled a 'tendency or bias to the early acceptance and, to a lesser extent, the early rejection of hypotheses' (Garety & Freeman, 1999, p.127). In plainer terms, those who jump to conclusions are prone to making snap judgements and may form decisions quickly on the basis of little evidence (Dudley & Over, 2003).

Not all people experiencing delusional beliefs exhibit the jump-to-conclusions bias. As Freeman et al. (2008) outlined, only around half of people with delusions jump to conclusions on probabilistic reasoning tasks, and, interestingly, 10–20 per cent of individuals without delusions display the data-gathering bias (Fine et al., 2007). Nevertheless, whether you engage in delusional thinking or not, a tendency to jump to conclusions when forming decisions has been treated as a key marker of flawed reasoning.

Interestingly, however, it has recently been proposed that delusion-prone individuals may display a tendency to jump to conclusions in scenarios that induce a sense of feeling rushed (White & Mansell, 2009). Similarly, delusion-prone individuals have been found to report greater confidence in their ideas when they encounter a stressful situation or feel particularly hurried in their decision making (Keefe & Warman, 2011).

To rely on your intuitions in scenarios such as these, and consequently jump to conclusions, may make perfect evolutionary sense. In situations of perceived danger and stress, it would be of little use to engage the analytical system and ponder the best possible action to take. To ponder is to waste time and prolong exposure to stress. Hasty decisions ensure speedy action and a swift response, and, as highlighted by Sutherland (1992, p.137), 'It is better to be wrong than eaten'.

Hastiness in decision making may therefore reflect an adaptive response to perceived stressors, promoting efficient and speedy decisions free from the interference of analytical deliberation. Indeed, this concept is certainly not a new one. Gerd Gigerenzer was one of the first psychologists to emphasise the importance of fast and frugal heuristics (Gigerenzer & Goldstein, 1996). A fast and frugal heuristic is essentially 'a strategy, conscious or unconscious, that searches for minimal information' (Gigerenzer, 2008, p.22). Gigerenzer stressed that heuristics, our cognitive shortcuts, can produce good solutions to an array of complex decisions in life, including whom we choose to marry, which job offer to accept and which chess piece to move next. It is possible that our intuitive systems are capable of producing accurate decision-making judgements far quicker than our analytical systems ever could.
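For the curious, the flavour of a fast and frugal heuristic can be captured in a few lines of code. The sketch below is a loose rendering of the 'take the best' strategy associated with Gigerenzer and Goldstein (1996); the city data and cue ordering are invented purely for illustration and are not drawn from their studies.

```python
# A loose sketch of a "take the best" style heuristic (after Gigerenzer &
# Goldstein, 1996): compare two options on cues ordered from most to least
# valid, and let the FIRST cue that discriminates decide. All data below
# are invented for illustration.

def take_the_best(option_a, option_b, cues):
    """Return whichever option the first discriminating cue favours."""
    for cue in cues:
        a, b = cue(option_a), cue(option_b)
        if a and not b:
            return option_a  # cue favours A; remaining cues are ignored
        if b and not a:
            return option_b  # cue favours B; remaining cues are ignored
    return None  # no cue discriminates, so guess

# Hypothetical question: which of two cities is larger?
cities = {
    "Dortmund": {"capital": False, "top_league_team": True},
    "Hanover":  {"capital": False, "top_league_team": False},
}
cues = [
    lambda city: cities[city]["capital"],          # most valid cue first
    lambda city: cities[city]["top_league_team"],  # checked only if needed
]

print(take_the_best("Dortmund", "Hanover", cues))  # -> Dortmund
```

The point is the stopping rule: the first cue that discriminates settles the matter, and all remaining information is simply never consulted. That is what makes the strategy fast and frugal.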

Analytically incorrect...?
Bayes' rule, or Bayes' theorem as it is commonly known, is a mathematical method of describing precisely the degree to which a prior belief should change in response to new information, resulting in a posterior degree of belief. Bayes' rule combines a hypothesis with the data and/or evidence in order to provide a normative system for probability judgement, highlighting the judgements that one ought to make in a given scenario.
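In symbols (a standard statement of the rule, not specific to any study cited here), with H a hypothesis and E the evidence:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

Here P(H) is the prior degree of belief, P(H | E) the posterior degree of belief once the evidence is in, and P(E | H) the likelihood of that evidence if the hypothesis were true.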

Whilst the application of Bayesian theory to probability judgement should result in the most rational outcomes, provided the relevant probabilities are known, people appear particularly clumsy and misguided when estimating outcomes from given probabilities, as highlighted by Tversky and Kahneman (1974).

One specific weakness in our basic ability to reason about probabilities is the conjunction fallacy, demonstrated by Tversky and Kahneman (1982) in the renowned 'Linda the bank teller' scenario. The problem presents participants with an account of Linda, outlining her personality characteristics, hobbies and interests. Participants are then instructed to rank a set of descriptions of Linda in order of how likely each is to be true. Participants who commit the fallacy erroneously report that the conjunction of two events (Linda is a bank teller and is active in the feminist movement) is more likely than one of the events alone (e.g. Linda is a bank teller) (Tversky & Kahneman, 1983). This apparent deviation from the logic of Bayes' rule has been identified in numerous studies (Moro, 2008) despite attempts to eliminate its recurrence (Stolarz-Fantino & Fantino, 1990).
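Why is this necessarily an error? For any two events A and B, the conjunction can never be more probable than either event alone, since

$$P(A \land B) = P(A)\,P(B \mid A) \le P(A)$$

because P(B | A) can be at most 1. However representative of Linda the combined description feels, 'bank teller and feminist' must be at most as probable as 'bank teller'.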

People also appear particularly vulnerable to neglecting base rate information when calculating probabilities (Kahneman & Tversky, 1972). Whilst it has been argued that such slip-ups are products of our heuristic-based intuitive systems (Kahneman & Tversky, 1972), this research could equally suggest that both our analytical and our intuitive thinking systems are failing us. If we can't rely on our rider to lead the way, we are at the mercy of the haphazard directions of our elephant. Jonathan Evans (2010) has explicitly emphasised this point, stressing that neither intuitive nor reflective processes can be directly and independently tied to accurate or inaccurate decision outcomes.
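A worked example shows how heavily base rates can weigh (the numbers are hypothetical, chosen only for arithmetic clarity, and do not come from the studies cited). Suppose a condition affects 1 in 100 people, a screening test detects it 90 per cent of the time, and it falsely flags 9 per cent of healthy people. Bayes' rule gives the probability of the condition given a positive result:

$$P(D \mid +) = \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.09 \times 0.99} = \frac{0.009}{0.0981} \approx 0.09$$

Intuition, ignoring the base rate, often says something close to 90 per cent; the correct answer is under 10 per cent, because healthy people vastly outnumber affected ones.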

Flaws in our analytical reasoning may not be specific to our calculations of probabilities; there is evidence to suggest that our analytical thinking system also struggles to anticipate our future affective states. Analysing alternatives when making decisions can lead to an in-depth and over-weighted analysis of our future emotions (as in regret theory: Loomes & Sugden, 1982).

In other words, we reflect and attempt to anticipate the consequences of our decision options upon our future emotional state. However, it has been shown that we are often misguided in this endeavour and are surprisingly poor at predicting how future events will make us feel in reality (Gilbert, 2006).  

If faced with the prospect of either winning the lottery jackpot or being paralysed from the neck down, many of us would confidently claim that the former would bring far greater happiness in the years to come. Analytically speaking, winning the lottery seems to be all gains and no losses: we could secure a luxury lifestyle, an expensive house, a lavish car and long, exotic holidays, and conclude that our happiness would soar for the foreseeable future. Being paralysed, by contrast, could seemingly only strip us of our current happiness, taking our independence with it. Yet in a classic and intriguing study, within the course of only 12 months both lottery winners and paraplegics had drifted back towards their baseline levels of happiness (Brickman et al., 1978).

You may argue that it is your intuition that leads you to draw such a conclusion: winning the lottery must make you happier in the long term because it provides opportunities, whilst paralysis takes them away. Lottery winners can indulge in pleasant activities; they look happier, therefore they are happier. But perhaps your rider has failed to inform your elephant that it is going the wrong way. Your rational system has failed to engage, and you haven't taken into account the losses that winning the lottery jackpot can bring, such as losing close friends, relationships and trust. Equally, you have failed to consider the gains that paralysis can bring, such as becoming closer to loved ones and the potential to realise what really matters in life. Essentially, you have failed to appreciate your ability to adapt (see Ariely, 2010, for a discussion of the impact of adaptation upon happiness).

Relying on our intuitions
It appears that our analytical thinking system is forever failing to intervene. For example, in the throes of passionate love we form decisions, and ultimately relationships, that we may later come to regret, swept away in the spur of the moment by our hearts rather than our heads. But even if our analytical side had engaged and discouraged us from becoming so romantically involved, there is no guarantee that its decision would be the right one. Passionate love is, evolutionarily speaking, vital: it encourages partners to commit and ultimately to conceive. Maybe ignoring the advice of our rider is, at times, perfectly rational in evolutionary terms.

Logical calculations of prior and posterior odds appear so impressive that surely they must produce reliable and accurate decisions. And they may, if calculated correctly, which is something many of us struggle to do daily. When a computer is presented with a conundrum that falls outside the rules and logic of its programming, it is at a complete loss to respond. Humans, however, may have a completely unique back-up system.

It has been consistently stressed that System 2, our analytical system, is required to intervene in our autonomous processes where necessary (Stanovich, 2004). However, it is possible that our intuitions exist as an insurance policy that pays out when our analytical system fails. It has long been assumed that our rational system is superior, given that our analytical processes are slower and cognitively demanding. But we must also consider the possibility that our analytical system is prone to error (Evans, 2010), and when our rider makes a mistake our elephant must realise this quickly and respond appropriately. So the next time it appears that your elephant has taken charge, don't feel bad about it; maybe it's because your rider can't read the map.

Stephanie K. Rhodes
is Demonstrator in Psychology and part-time PhD student at the University of Wolverhampton
[email protected]