Behavioural science has given us concepts, theories, and frameworks that help make sense of the complexities of human behaviour and how people make decisions. Some of these focus on understanding the critical role of context or explore concepts such as anchoring or framing of information. Others look at how everyone is wired with inherent predispositions, otherwise known as cognitive biases, that cause us to act in certain ways. Examples of these biases include availability bias and confirmation bias.
Arguably the most famous theory in the behavioural science world was popularised by Nobel Laureate Daniel Kahneman and describes the process of ‘thinking fast and slow’, otherwise known as System 1 and System 2 thinking. This two-system model has been widely adopted due to its simplicity and intuitive nature. Nowadays, even if you don’t know anything about behavioural science, you’ve probably heard of Kahneman and would recognise the phrase ‘System 1 and 2’.
This article, the fifth in our series exploring new frontiers in behavioural science, is all about System 1 and System 2 thinking. At this point in the series we want to take a pause from looking forward to re-establish the foundations that underpin our behavioural science knowledge. We cannot explore new frontiers on unstable footing, and as such we need to investigate how misperceptions about this pivotal theory have arisen over the years.
Firstly, we summarise System 1 and 2 as described by Kahneman and other behavioural scientists, before examining the claim that the theory is an oversimplification of the human mind. We then identify three key misconceptions that have developed following the extensive discussion of system 1 and 2 in the popular media, outlining evidence which ‘debunks’ these myths, improves our understanding, and strengthens our behavioural foundations.
System 1 & 2 - A Refresh
For centuries, philosophers, psychologists, and scientists alike have distinguished between intuitive and conscious reasoning; from Descartes’ mind-body dualism in the 17th century to Posner and Snyder’s formal depiction of the (first) dual process model of the mind in 1975. However, it was not until Daniel Kahneman included the terms system 1 and system 2 in his 2011 bestselling book “Thinking, Fast and Slow” that the distinction between automatic and deliberate thought processes became popularised (it is worth noting that he was not the first to use these terms; that honour goes to Stanovich & West in 2000[1]).
Kahneman’s model divides the mind’s processes into two distinct systems:
- System 1 “is the brain’s fast, automatic, intuitive approach”[2]. System 1 activity includes the innate mental activities that we are born with, such as a preparedness to perceive the world around us, recognise objects, orient attention, avoid losses - and fear spiders! Other mental activities become fast and automatic through prolonged practice.
- System 2 is “the mind’s slower, analytical mode, where reason dominates” [3]. Usually, system 2 activity is activated when we do something that does not come naturally and requires some sort of conscious mental exertion.
A common example used to demonstrate the two systems is the following puzzle: A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?
Faced with this puzzle, the majority of people instantly guess 10 cents. The correct answer, however, is 5 cents - which, again, most people can work out after spending more time thinking about the question. For years, this has been used as a perfect example of how the way we think is ruled by two types of mental processes: fast and intuitive, versus slow and analytical.
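For readers who want the slow, system 2 calculation spelled out, here is a minimal arithmetic check. This is a simple illustrative sketch added for this article, not part of Kahneman’s original text:

```python
# A quick check of the bat-and-ball arithmetic (illustrative only).
# Let the ball cost x dollars; the bat then costs x + 1.00.
# Total: x + (x + 1.00) = 1.10  =>  2x = 0.10  =>  x = 0.05
ball = 0.05
bat = ball + 1.00
assert round(bat + ball, 2) == 1.10   # together they cost $1.10
assert round(bat - ball, 2) == 1.00   # the bat costs $1 more than the ball
# The intuitive guess fails: 0.10 + 1.10 = 1.20, not 1.10.
```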
The intention of the theory was to provide a helpful analogy that can guide our understanding of how our minds process information - and it does an admirable job of this. Analysing behaviours through a system 1 and 2 lens has been invaluable for furthering our understanding of human decision-making and behaviour, as well as exploring ways we can influence or ‘nudge’ behaviour in different directions.
This theory quickly moved from academia to the non-academic world
The distinction between system 1 and 2 is appealing and the dual-system theory has travelled from the world of academia into popular language and mainstream thinking, in part due to its accessible nature. Although it is not the easiest read, the myriad examples Kahneman uses to illustrate concepts and ideas throughout the chapters result in a book that is entertaining and can be appreciated across the board.
On the one hand this is a welcome bridge across the often-criticised gap between academia and the ‘real world’. On the other, however, in its transition from academia to pop-culture, the original theory of system 1 and 2 seems to have lost some of its depth, nuance and detail, and has been replaced with over-generalisations and (often false) simplifications of how the human mind operates. Central to this misunderstanding seems to be the idea that system 1 and system 2 are literal representations of our brain structure. Additionally, many of the more popular ‘sound-bites’ from the book have been reproduced and disseminated without the context and constraints provided when read in situ.
As the dual-system theory has stuck in ‘everyday language’ and is used liberally by a variety of non-academic sources, the original descriptions of system 1 and 2 are somewhat glossed over, resulting in a variety of oversimplified assertions (myths) about how our mind operates. As behavioural science progresses, it is important to be wary of the myths that have arisen and to remain vigilant in our understanding.
This article aims to ‘de-bunk’ three key myths that have emerged in popular media by focussing on the following facts - as per our current understanding - of system 1 and 2:
- The brain is not literally divided into two!
- System 1 and 2 work in tandem, not as separate entities
- Both systems can be biased and can make mistakes - neither one is categorically “good” or “bad”
In this way we ‘re-establish’ an important section of the foundations of behavioural science, which will ultimately allow us to continue moving forward towards new frontiers.
FACT 1: The brain is not literally divided into two
Just as the common myth that people are either right or left brained has been proved false, we also know there aren’t actually sections of the brain with system 1 or system 2 stamped on them. A misconception many people have is that our brain is physically divided into two parts, but this is not the case. Indeed, Kahneman clearly states that “there is no one part of the brain that either of the systems would call home”[4].
The idea of left-brain and right-brain thinking is persistent, and many people continue to believe that the left side of the brain is responsible for analytical thinking, whilst the right side is more creative. It is easy to understand why system 1 and system 2 type thinking have been mistakenly associated with this idea. System 2’s rational, logical thinking is analogous with the ‘left brain’ and similarly system 1 thinking seems easily associated with the idea of an intuitive, artistic right brain.
These ideas are fundamentally incorrect, however. The brain is not physically divided in any way and as such system 1 and system 2 type thinking cannot be physically divided either. This debunks our first myth.
All this being said, neuroscientists have found that some regions of the brain are slightly more associated with one of the two systems[5].
For example, this body of evidence indicates that affective cognition (system 1-type thinking for emotional responses) is located in the mesolimbic dopamine reward system. This pathway is responsible for the release of dopamine. Given that human beings tend to seek instant gratification, dopamine plays a key role in “thinking fast”[6]. On the other hand, the frontal and parietal cortex have been linked to the analytic system of decision-making (system 2), therefore this region is more associated with our complex reasoning and higher-order “slow” thinking.
The separation of brain functions for decision-making and perceived specialisation has given rise to the multiple systems hypothesis. However, it is important to note that it is the combination of information gathered from the multiple systems - mesolimbic pathway, frontal cortex, and parietal cortex - that helps to produce our decisions. In other words, whilst different regions may be more or less relevant for either system 1 or system 2, no region is restricted solely to system 1-type decision-making while another handles only system 2.
FACT 2: System 1 and 2 work in tandem, not as separate entities
Another myth or common misconception is that system 1 and 2 are hierarchical processes, with one occurring before the other. In more general terms, this means that people often think system 1 thinking occurs first, with system 2 thinking following later if necessary. The dual-system approach actually imagines the two forms of reasoning as integrated and mutually supportive. Indeed, Kahneman points out that almost all processes are a mix of both systems, and it is important to emphasise that the systems are complementary.
Importantly, unconscious processes such as emotion (system 1) play a vital role in our more logical reasoning (system 2), and it is this integrative approach that makes our decision-making meaningful, and often more effective and purposeful[7]. The philosopher David Hume, for example, recognised the importance of the heart (system 1) for the head (system 2) in decision-making, as reason alone rarely provides any clear motivation and drive. Without emotion or feeling, reason is merely a cold, mechanical method of calculation; informing us of what the consequences of our actions may be, but not whether they are desirable.
Ellen Peters and her colleagues provide further evidence of the mutually supportive nature of system 1 and 2, demonstrating how decisions are most effective when drawing from both systems. They conducted an experiment in which they gave participants tasks that required processing numbers. Unsurprisingly, participants who had high levels of numeracy outperformed those who were less numerate. Numeracy has been previously linked to an improved ability to use system 2 reasoning effectively.
However, they also found that participants with greater numeracy skills were also able to use system 1 reasoning more frequently and reliably. The more numerically able participants’ unconscious responses guided their initial decisions, which then triggered the conscious thought needed to complete the task. Importantly, they also found that over time, the consistent and effective use of system 2 reasoning calibrates system 1 processing, making it more effective, which in turn promotes better systematic (system 2) reasoning, essentially creating a feedback loop. These findings suggest that the two systems do not work in isolation but are in fact integrated and mutually influential.
Outside of an experimental setting, everyday tasks provide further evidence for the teamwork of systems 1 and 2. Language is just one example; we communicate deliberately, but during the flow of conversation we don’t consciously rehearse grammatical rules; they are applied automatically, without conscious thought. Physical activity is another. Recent research suggests that exercise is partly habit-driven, yet also requires conscious oversight to be successfully completed[8]. We can also see the integrated nature of systems 1 and 2 in tasks such as driving a familiar route, typing, or playing a well-rehearsed tune on an instrument[9], all of which require a combination of deliberate and automatic action.
FACT 3: Both systems can be biased and can make mistakes - neither one is categorically “good” or “bad”
A particularly interesting myth that has developed around systems 1 and 2 is the idea that system 1 is the source of bias, and system 2 is called up as the ‘voice of reason’ to correct such biases in our thinking. This may have developed off the back of the common mistake of using the terms ‘emotional’ and ‘irrational’ interchangeably - particularly in the context of describing a person’s disposition.
Whatever the reason may be for the development of this misconception, it is in fact, a myth. It is not the case that system 1 is biased and system 2 is not. Both are actually susceptible to bias and both can make mistakes. For example, system 1 may have gathered accurate information, yet system 2 may process this poorly and make a mistake. Conversely, system 1 may have gathered biased information and so despite system 2 processing it accurately, the conclusion may be incorrect due to a biased starting point.
Confirmation bias is a good example of how both systems can be affected by bias: we may notice and more easily remember information that supports our existing beliefs (a system 1 activity), while also being motivated to analyse new information in a way that supports our existing belief (a system 2 activity).
This is to say, system 2 is just as prone to error as system 1; we ignore evidence we dislike, overthink seemingly simple or irrelevant decisions, rationalise our biases and produce questionable justifications for bad decisions: I only had a small breakfast, so it is fine to have a big slice of cake.
In the medical field, for example, it was long thought that diagnostic errors were caused by system 1 type reasoning, and clinicians were consequently advised to think more slowly and gather as much information as possible. However, later reviews found that experts were just as likely to make errors when attempting to be systematic and analytical. Research by behavioural scientists such as Gerd Gigerenzer has shown that more information and slower processing do not always lead to the most accurate answer. Diagnosing patients and making treatment decisions using mental shortcuts and evidence-based rules of thumb can perform just as well, if not better. This discovery led to the creation of ‘fast and frugal’ decision trees for patient diagnosis, where doctors only needed to ask three crucial diagnostic questions. When tested, this method improved accurate diagnosis of heart disease by 15-25%[10].
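To make the ‘fast and frugal’ idea concrete, the sketch below shows the general shape of such a three-question decision tree in code. The cue names and question order are simplified assumptions made for this article, not the exact clinical questions from the original study[10]:

```python
# A minimal sketch of a 'fast and frugal' decision tree in the spirit of
# Green & Mehr (1997). The cue names below are simplified assumptions,
# not the exact clinical questions used in the original study.

def admit_to_coronary_care(patient: dict) -> bool:
    # Q1: key ECG anomaly present? If yes, admit straight away.
    if patient["ecg_anomaly"]:
        return True
    # Q2: is chest pain the chief complaint? If not, send to a regular bed.
    if not patient["chest_pain_chief_complaint"]:
        return False
    # Q3: is any single additional risk factor present?
    return patient["other_risk_factor"]

# Hypothetical patient record, purely for illustration:
patient = {"ecg_anomaly": False,
           "chest_pain_chief_complaint": True,
           "other_risk_factor": True}
print(admit_to_coronary_care(patient))  # -> True
```

The point is that each question is checked in a fixed order and the first decisive answer ends the process; no weighting or integration of all the evidence is required.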
Indeed, more information, more computation, and more time do not always result in better performance; in certain circumstances, system 1 type strategies can be just as effective as more complex decision-making strategies, if not more so. Researchers decided to test the accuracy of various heuristics (a system 1 activity) across a number of real-world situations and compare this accuracy against more complex decision-making strategies as the benchmark[11].
One of the tasks in their experiment was to predict which of two cities (Los Angeles or Chicago) had a higher rate of homelessness based on some basic initial data points provided. They compared the accuracy of predictions using three common heuristics (take-the-best, tallying and minimalist[12]) with two baseline complex predictive strategies (linear regression and naive Bayes[13]) and found that when faced with limited initial data, heuristic strategies actually outperform complex strategies. These results suggest that increased effort does not always result in increased accuracy[14].
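As an illustration of how simple these heuristics are, here is a rough sketch of the take-the-best rule described in note [12]. The cue names and values are hypothetical and the two city profiles are stand-ins; this is not the researchers’ code or data:

```python
# A rough sketch of the take-the-best heuristic (see note [12]): cues are
# checked in order of validity, and the first cue that discriminates
# between the two options decides. All names and values are hypothetical.

def take_the_best(option_a, option_b, cues_by_validity):
    for cue in cues_by_validity:          # most valid cue first
        a, b = option_a[cue], option_b[cue]
        if a != b:                        # first discriminating cue decides
            return "A" if a > b else "B"
    return "tie"                          # no cue discriminates: guess

# Hypothetical binary cue profiles (1 = cue present, 0 = absent):
city_a = {"high_housing_cost": 1, "warm_climate": 1, "large_population": 1}
city_b = {"high_housing_cost": 0, "warm_climate": 0, "large_population": 1}
cues = ["high_housing_cost", "warm_climate", "large_population"]
print(take_the_best(city_a, city_b, cues))  # -> "A"
```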
Kahneman finds the misconception that system 1 is error-prone and system 2 is analytic and therefore correct “ridiculous”. He explains that “system 1 is not a machine for making errors, it usually functions beautifully”[15]. Indeed, critics and proponents of system 1 and 2 alike agree on the pressing need to dispel the ‘good/bad’ (biased/ unbiased) fallacy.
Conclusion
Since Kahneman first published ‘Thinking, Fast and Slow’, the theory of system 1 and system 2 thinking has quickly spread through both the academic and non-academic worlds, referenced not only in the behavioural sciences but across a variety of disciplines and in popular media. Its popularity rests, in part, on its intuitive simplicity, but this has led to misunderstandings and misconceptions about how this dual-system of decision-making actually works.
In this article we have debunked three pervasive myths by demonstrating the true facts behind the fictions.
- Fact 1 - The two systems are not physically tied to any specific area of the brain
- Fact 2 - System 1 & 2 are complementary systems that work in tandem to produce more effective and efficient decision-making
- Fact 3 - Neither system is accurate 100% of the time; both can make mistakes!
While these myths possess considerable intuitive appeal, it would be a shame - and, more importantly, damaging to the field - if their simplistic descriptions drowned out the more fascinating story of how our brains really work. The theory of system 1 and system 2 is incredibly useful as a way to understand the complexities of human decision-making. Clearing up these pervasive myths can only help us utilise the theory even more effectively in the future.
New Frontiers in Behavioural Science Series:
Article 1 - The Past, The Present and The Future
Article 2 - Default Settings - The most powerful tool in the behavioural scientist's toolbox
Article 3 - Social norms and conformity part 1
Article 4 - Social norms and conformity part 2
About the authors:
Crawford Hollingworth is co-Founder of The Behavioural Architects, which he launched in 2011 with co-Founders Sian Davies and Sarah Davies. He was also founder of HeadlightVision in London and New York, a behavioural trends research consultancy. HeadlightVision was acquired by WPP in 2003. He has written and spoken widely on the subject of behavioural economics for various institutions and publications, including the Market Research Society, Marketing Society, Market Leader, Aura, AQR, London Business School and Impact magazine. Crawford is a Fellow of The Marketing Society and Royal Society of Arts.
Liz Barker is Global Head of BE Intelligence & Networks at The Behavioural Architects, advancing the application of behavioural science by bridging the worlds of academia and business. Her background is in Economics, particularly the application of behavioural economics across a wide range of fields, from global business and finance to international development. Liz has a BA and MSc in Economics from Cambridge and Oxford.
[1] Stanovich, K.E. & West, R.F. (2000). Individual Differences in Reasoning: Implications for the Rationality Debate. Behavioural and Brain Sciences, 23, 645-665
[2] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/
[3] The Harvard Gazette (2014). Layers of choice. Retrieved from https://news.harvard.edu/gazette/story/2014/02/layers-of-choice/
[4] Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux. Pg29.
[5] Camerer, C., Loewenstein, G., & Prelec, D. (2005). Neuroeconomics: How Neuroscience Can Inform Economics. Journal of Economic Literature, 43(1), 9-64.
[6] TheEconReview. (2017). What Neuroscience Has to Say about Decision-Making. Retrieved from https://theeconreview.com/2017/01/13/what-neuroscience-has-to-say-about-decision-making/
[7] Peters, E., Västfjäll, D., Slovic, P., Mertz, C. K., Mazzocco, K., & Dickert, S. (2006). Numeracy and decision making. Psychological science, 17(5), 407-413.
[8] Gardner, B., & Rebar, A. L. (2019). Habit Formation and Behaviour Change. In Oxford Research Encyclopaedia of Psychology; Rhodes, R. E., & Rebar, A. L. (2018). Physical activity habit: Complexities and controversies. In the Psychology of Habit. 91-109. New York: Springer
[9] New Scientist. (2018). We've got thinking all wrong. This is how your mind really works. Retrieved from https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/
[10] Green, L., & Mehr, D. R. (1997). What alters physicians’ decisions to admit to the coronary care unit? The Journal of Family Practice, 45(3), 219–226.
[11] Katsikopoulos, K.V., Schooler, L.J., Hertwig, R. (2010). The robust beauty of ordinary information. Psychological Review, 117, 1259–1266.
[12] The take-the-best heuristic assumes that cues are processed in order of validity, comparing both alternatives on a single cue, one at a time, until a cue is found that distinguishes between the alternatives; tallying is a heuristic that simply tallies cues for or against each alternative; and the minimalist heuristic assesses options against random cues, applying a stopping rule when one of the alternatives has a positive cue and the other does not (all cues receive the same weight for the tallying and minimalist heuristics).
[13] A linear regression predicts a statistical relationship between cues with a linear functional form; naive Bayes selects the alternative with the higher probability of having the higher criterion value, given the alternatives’ entire cue profile.
[14] Hertwig, R., & Pachur, T. (2015). Heuristics, history of. In International encyclopaedia of the social & behavioural sciences. Elsevier.
[15] New Scientist. (2018). We've got thinking all wrong. This is how your mind really works. Retrieved from https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/