
From psychology to economics (Daniel Kahneman) | Part A
‘Economists think about what people ought to do. Psychologists watch what they actually do.’
Daniel Kahneman
Kahneman won the Nobel Prize in Economic Sciences in 2002. His citation was for ‘integrating insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty – thereby laying the foundation for a new field of research’.
But from a broader perspective, Kahneman is the prime example of a growing trend towards taking a multidisciplinary approach to economic problems, drawing on maths, computer science, biology, evolutionary science, geology and psychology as well as the core areas of economics and finance.
The idea of a homo economicus – or economic human – motivated by self-interest and capable of rational decision-making is now increasingly seen as a model that fails to explain how the economy operates in practice. People can behave in ways that appear irrational, and a failure to acknowledge that is likely to lead to a misinterpretation of how the economy works.
Daniel Kahneman is a key mover behind the discipline of behavioural economics. It seeks to improve our understanding of how people make economic and financial choices and decisions. By blending economic analysis with techniques from psychology we can gain a better understanding of how people make important economic decisions that affect both their own well-being and that of the economy as a whole.
At the heart of Kahneman’s contribution to economics is his analysis of how people make decisions, particularly financial ones. Given that an integral part of economics is the collective weight of the decisions that millions of people and businesses make every day, it is important to understand how each one makes those decisions, especially when the outcomes are uncertain. The first step in this analysis is to accept that people often appear not to behave rationally.
There is a division within rationality: between what people understand to be rational behaviour and how they in fact act. In other words, there is a gap between how people think they should behave and how they do behave in real life (what economists and psychologists call normative and positive behaviour respectively). It became increasingly obvious to psychologists such as Kahneman that traditional models of rational decision-making, which make assumptions about people’s behaviour, do not fully reflect human nature. For instance, it is unclear why homo economicus would help a friend, care for other people or donate money to charity.
One of the first insights from Kahneman and his long-time collaborator Amos Tversky was that people make decisions either by intuition or by deliberate reasoning, which they called System 1 and System 2 respectively. Kahneman described intuitive decisions as fast, automatic and effortless, generating impressions of what is going on. In other words, people take mental short cuts even when making extremely large and important financial decisions. Reasoned judgements, by contrast, are made more slowly, in a serial, effortful and deliberately controlled fashion.
If people make many of their decisions intuitively, it is critical to understand the thought processes that lead so-called rational people to choices that might appear, in the cold light of day, to be irrational.
Kahneman focused on the mental short cuts that people use when faced with uncertain and often complex decisions and which psychologists call heuristics (from the ancient Greek meaning to find out or discover), or what we might call rules of thumb. While these work a lot of the time and are tools people find useful, they involve making mental connections that are not logical and can lead to severe and systematic errors. Kahneman and Tversky called the errors that follow ‘cognitive biases’.
Source: Thornton, Phil. The Great Economists: Ten Economists Whose Thinking Changed the Way We Live. Pearson Education Limited, Kindle edition, locations 2801-2806.
In their 1974 paper, ‘Judgment under Uncertainty: Heuristics and Biases’, Kahneman and Tversky identified three heuristics:
Availability, where someone assesses the probability of an event by the ease with which similar occurrences can be brought to mind. For instance, a middle-aged person whose friends have suffered heart attacks will attach a relatively higher risk to heart disease than to other, statistically more significant threats.
Representativeness, where people look to see whether two events resemble each other to help them decide whether there is a connection between the two. For example, when hearing someone described as bookish, shy and retiring and asked whether they are more likely to be a librarian or a shop worker, many people will choose librarian. That answer feels instinctively right (System 1), but it is only when one remembers (using System 2) that shop workers vastly outnumber librarians in the working population that one realises the odds of any given person being a librarian are tiny (a point worked through in the sketch after this list).
Anchoring and adjustment, where different starting points or reference values lead to different estimates. For instance, workers negotiating a pay rise will probably start bargaining from the first offer made by their employer rather than from their own target.
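To make the base-rate point in the representativeness example concrete, here is a minimal sketch of the Bayes’ rule calculation that System 2 would need to perform. The probabilities and the shop-worker-to-librarian ratio below are purely illustrative assumptions, not figures from Kahneman and Tversky’s work.

```python
# A minimal sketch of the base-rate (Bayes' rule) logic behind the
# librarian example. All numbers are illustrative assumptions, not
# figures from Kahneman and Tversky's paper.

def prob_librarian(p_desc_given_librarian: float,
                   p_desc_given_shop_worker: float,
                   librarians: int,
                   shop_workers: int) -> float:
    """P(librarian | description) via Bayes' rule."""
    prior_lib = librarians / (librarians + shop_workers)
    prior_shop = 1.0 - prior_lib
    evidence = (p_desc_given_librarian * prior_lib
                + p_desc_given_shop_worker * prior_shop)
    return p_desc_given_librarian * prior_lib / evidence

# Assume the 'bookish, shy and retiring' description fits 80% of
# librarians but only 10% of shop workers, and that shop workers
# outnumber librarians 50 to 1.
print(round(prob_librarian(0.8, 0.1, librarians=1, shop_workers=50), 2))
# -> 0.14: even a strongly librarian-like description still points to a
#    shop worker most of the time once the base rates are respected.
```

Even with a description that fits librarians far better than shop workers, the sheer number of shop workers keeps the probability of ‘librarian’ low; that is the logical deduction the intuitive, resemblance-based answer skips.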
Each of these heuristics gives rise to a range of biases that emerge when the decision-making short cut is misapplied.
What these complex-sounding phenomena have in common is that people use similarities or associations rather than logical deduction to come to a conclusion.
The availability heuristic in turn leads to biases, including the retrievability, imaginability and illusory correlation biases. Events that are easy to recall or which readily spring to mind – and so can easily be ‘retrieved’ – are more likely to be used by someone making a quick decision than information that would have to be researched. Similarly, events that can be imagined – such as the possible risks involved in taking an airplane – will be given greater weight than ones that are harder to conceive. Thus a bomb on board a plane – a thankfully rare event – may seem a greater risk than the failure of a microprocessor in an engine or human error.
Biases emerging from the anchoring heuristic include the insufficient adjustment bias that results from people’s inability to abandon their first impressions when new information arrives late in the process. Kahneman and Tversky highlighted the ease with which it is possible to ‘anchor’ people’s thinking.
Part B: http://www.lecturesbureau.gr/1/from-psychology-to-economics-part-b-1209/?lang=en
Image: https://www.wobi.com/wbf-nyc/daniel-kahneman/