Nobel Lectures — Introductory Distillations · a personal reading project
Economics · 2002
Behavioural Economics & Psychology

Maps of Bounded Rationality

How a psychologist infiltrated economics — and proved that the human mind, brilliant as it is, cuts corners in predictable ways.

Lecturer Daniel Kahneman
Delivered December 8, 2002 · Stockholm
Prize Nobel Memorial Prize in Economic Sciences
Based on joint work with Amos Tversky (1937–1996)

Kahneman won the Nobel Prize in Economics despite never having taken an economics course. His weapon was something far older: careful observation of how ordinary people actually think — and how reliably they go wrong.

01 — The Big Picture

A Friendship That Changed How We See the Mind

This lecture is not the work of one man. Kahneman opens it with a dedication to Amos Tversky, his longtime collaborator who died in 1996. Their partnership, which began in Jerusalem in the late 1960s, produced decades of research that upended a foundational assumption of modern economics: that humans are rational.

Economists had long built their models on the idea of the rational agent — a person who weighs options carefully, assigns accurate probabilities, and makes decisions that maximise their own wellbeing. Kahneman and Tversky spent their careers documenting, systematically and patiently, all the ways this picture is flatly wrong.

The lecture presents a unified theory of why our minds err — not randomly, not occasionally, but in predictable, mappable ways. Think of it as a field guide to the shortcuts your brain takes when it thinks no one is looking.

02 — The Framework

Two Systems Inside Your Head

The lecture's central architecture is a distinction between two modes of thought. Kahneman calls them, simply, System 1 and System 2 (terms he borrows from the psychologists Keith Stanovich and Richard West).

System 1
The Instinctive Mind
  • Fast & automatic
  • Runs without effort
  • Associative — links by feeling
  • Hard to switch off
  • Like perception — just happens
System 2
The Deliberate Mind
  • Slow & intentional
  • Demands real effort
  • Rule-governed reasoning
  • Can override System 1
  • But often doesn't bother

System 1 is the voice that instantly answers "what's 2 + 2?", reads the emotion on a face, or flinches at a spider. System 2 is the voice that slowly works through a tax return or tries to suppress a prejudice it knows is unfair.

Most of life runs on System 1. That's fine — it's fast, efficient, and usually right. The trouble begins when System 1 handles problems it isn't equipped for, while System 2 sits back, too willing to trust the quick answer.

Try It Yourself

A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?

Your System 1 shouted "10 cents!" almost immediately. But the correct answer is 5 cents. (If the ball were 10¢ and the bat $1 more, together they'd cost $1.20, not $1.10.)

Kahneman reports that 50% of Princeton students and 56% of University of Michigan students got this wrong — not because they lacked intelligence, but because they trusted System 1 and didn't ask System 2 to check.
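For readers who want the System 2 check spelled out, here is a tiny sketch in Python (the code is mine, not Kahneman's; the lecture poses the puzzle in prose):

```python
# Let ball = x. Then bat = x + 100 (working in cents avoids float noise),
# and x + (x + 100) = 110, so x = 5.
total, difference = 110, 100
ball = (total - difference) // 2
bat = ball + difference
assert ball + bat == total and bat - ball == difference
print(f"ball = {ball} cents, bat = {bat} cents")  # ball = 5, bat = 105
```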

03 — The First Illusion

How the Same Fact Becomes Two Different Feelings

One of Kahneman and Tversky's most striking discoveries was framing: the same information, presented differently, produces different choices. This directly contradicts the rational-agent assumption that only the underlying facts should matter.

The Asian Disease Problem

Imagine 600 people will die from a disease. Two programs are proposed:

Version A: "Program A saves 200 people. Program B has a one-third chance of saving everyone, two-thirds chance of saving nobody." Most people choose Program A — the safe option.

Version B (same facts, different words): "Program A means 400 people will die. Program B has a one-third chance that nobody dies, two-thirds chance that all 600 die." Now most people choose Program B — the risky option.

Mathematically identical. Emotionally opposite.
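A one-line arithmetic check of the "mathematically identical" claim (the computation is mine, not the lecture's):

```python
# Expected number of people saved under each program, in either framing.
n = 600
program_a = 200                     # "saves 200" is the same fact as "400 die"
program_b = (1/3) * n + (2/3) * 0   # one-third chance everyone is saved
print(program_a, program_b)         # 200 200.0: identical expected outcomes
```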

Why? Because System 1 responds to the emotional texture of words, not their logical content. "Saving 200 people" activates hope. "400 people will die" activates dread. The framing changes what's accessible in your mind — and what's most accessible controls the response.

People do not spontaneously transform the representation of decision problems. They adopt the formulation that is given, much as they perceive what is presented to their eyes.

— paraphrased from the lecture's discussion of framing

This has real-world consequences. A related study by McNeil, Pauker, Sox and Tversky (1982) found that doctors make different recommendations about surgery versus radiation therapy depending on whether survival statistics are framed as "90% survive" or "10% die." The framing effect was just as strong among experienced physicians as among patients.

04 — The Core Theory

Why Losses Hurt Twice as Much as Gains Feel Good

Classical economics assumed that what matters in any financial decision is your final state of wealth. If you end up with $1,000, it doesn't matter whether you started with $900 and gained $100, or started with $1,200 and lost $200.

Kahneman and Tversky showed this is completely wrong. What we actually respond to is change — gains and losses relative to a reference point. And losses are felt roughly twice as intensely as equivalent gains.

The Asymmetry of Loss

Which matters more to you: winning $150 or not losing $100?

Research shows most people will decline a 50/50 bet to win $150 or lose $100 — even though the expected value is positive. The pain of the potential loss outweighs the pleasure of the potential gain. You'd need the chance to win roughly $200 before most people would accept the risk of losing $100.

They called this Prospect Theory, and it describes the shape of how we actually value outcomes. The value function is kinked at the reference point: steeper for losses than for gains, so a small loss stings disproportionately while an equivalent small gain barely registers in comparison. This is why people hold onto losing stocks too long (refusing to "realise" the loss), why a bad review hurts more than five good ones feel good, and why "don't lose what you have" motivates more than "gain something new."
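To make the kinked value function concrete, here is a minimal sketch. The functional form is prospect theory's; the parameter estimates (alpha = beta = 0.88, lambda = 2.25) come from Tversky and Kahneman's 1992 follow-up paper, not from this lecture, and probability weighting is left out for simplicity:

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of a change x relative to the reference point.
    Parameters are the Tversky & Kahneman (1992) estimates; lam > 1
    encodes loss aversion: losses loom larger than equivalent gains."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# The 50/50 bet from above: win $150 or lose $100.
ev = 0.5 * 150 + 0.5 * (-100)              # expected value: +$25
pv = 0.5 * value(150) + 0.5 * value(-100)  # prospect value: negative
print(f"expected value = {ev:+.1f}, prospect value = {pv:+.1f}")
# expected value = +25.0, prospect value = -23.6: most people decline.
```

The sign flip is the point: with losses weighted more than twice as heavily as same-sized gains, losing $100 outweighs winning $150, and an objectively favourable bet feels like a loser.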

The carriers of utility are gains and losses — changes of wealth — rather than final states. The pain of losing is roughly twice the pleasure of an equivalent gain.

— The central claim of Prospect Theory

Kahneman also highlights what he calls "Bernoulli's error" — the centuries-old assumption (from Daniel Bernoulli's influential 1738 essay) that people evaluate outcomes based on absolute wealth levels. The error persisted for so long, Kahneman suggests, because it fit neatly with economic theories of rationality — even though anyone who has felt the sting of a loss knows it isn't true.

05 — The Shortcuts

When Your Brain Answers a Different Question Than the One It Was Asked

The lecture introduces one of Kahneman's most elegant ideas: attribute substitution. When System 1 faces a difficult question, it quietly swaps it for an easier one — and answers that instead, without telling you.

The technical term for this substitution is a heuristic. Heuristics are mental shortcuts — not stupid ones, but ones that work well enough most of the time, while creating systematic, predictable errors in certain situations.

Meet Linda

Linda is 31, single, outspoken, and very bright. She majored in philosophy and was deeply involved in social justice causes as a student.

Which is more likely? (A) Linda is a bank teller. (B) Linda is a bank teller who is active in the feminist movement.

Most people choose (B). But (B) cannot be more probable than (A) — it's a subset of (A). To be a feminist bank teller, you first have to be a bank teller. This is the conjunction fallacy: we judge by how well Linda "fits" the description, not by mathematical probability.
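The subset logic is easy to verify mechanically. Here is a minimal sketch on a synthetic population (the base rates are invented for illustration, not taken from the lecture):

```python
import random

random.seed(0)
# Each person: (is a bank teller, is an active feminist); rates are made up.
people = [(random.random() < 0.05, random.random() < 0.30)
          for _ in range(100_000)]

p_teller = sum(is_t for is_t, _ in people) / len(people)
p_both = sum(is_t and is_f for is_t, is_f in people) / len(people)

# Feminist bank tellers are a subset of bank tellers, so this always holds.
assert p_both <= p_teller
print(f"P(teller) = {p_teller:.4f}, P(teller and feminist) = {p_both:.4f}")
```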

The substitution happens silently: you were asked about probability, but your brain answered an easier question: "how much does Linda resemble a feminist bank teller?" The resemblance feels high, so it gets reported as high probability.

This isn't a quirk of naive minds. Kahneman and Tversky found the same pattern in statistically sophisticated graduate students. Statistical knowledge doesn't eradicate the heuristic — it only enables people to catch the error when conditions are favourable.

06 — Memory vs. Experience

You Don't Remember Your Life — You Remember Its Highlights

One of the lecture's most striking findings comes from medical research. Kahneman and colleagues studied patients undergoing colonoscopies — a procedure that causes varying degrees of discomfort. They tracked pain moment-by-moment throughout each procedure.

When they later asked patients to rate the overall unpleasantness of the experience, the answers were not driven by how long the procedure lasted or by the total accumulated pain. Instead, patients' memories were dominated by two moments: the peak of pain and the pain at the very end.

The Counterintuitive Experiment

In a follow-up experiment, half the patients had the procedure extended by a minute — but at a milder level of discomfort than the peak.

The extended group reported a better overall experience, even though they endured more total discomfort. The gentler ending rewrote their memory of the whole procedure.

Kahneman calls this the Peak/End Rule. Your remembered experience of any event is essentially an average of its worst (or best) moment and its final moment, with duration barely entering the calculation at all; he calls this companion finding duration neglect.
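A minimal sketch of the rule with invented minute-by-minute pain scores (not the study's data), reproducing the counterintuitive result above:

```python
def remembered(pain):
    """Peak/End rule: memory is roughly the mean of the worst and last moments."""
    return (max(pain) + pain[-1]) / 2

short = [2, 4, 7, 8]        # procedure ends at its most painful moment
extended = [2, 4, 7, 8, 3]  # same procedure plus one milder extra minute

print(sum(short), remembered(short))        # total 21, remembered 8.0
print(sum(extended), remembered(extended))  # total 24, remembered 5.5
# More total pain, yet a milder memory: the gentler ending rewrites the past.
```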

This matters far beyond medicine. It explains why a wonderful holiday spoiled by a bad last day feels worse overall than a shorter, simpler trip. Why a film's ending shapes the whole memory of it. Why the way a relationship ends colours how we remember it in its entirety.

A Moment of Wonder

People were willing to pay almost the same amount to save 2,000 birds from oil spills as to save 200,000 birds.

The number didn't matter. What mattered was the mental image — a single oiled bird, struggling. System 1 responded to the image, not the statistic. Our moral intuitions, it turns out, are insensitive to scale.

07 — The Bigger Picture

What This All Means

Kahneman's lecture is ultimately about a single proposition: highly accessible impressions produced by System 1 control our judgments and preferences — unless System 2 steps in to check them.

The lecture doesn't portray us as hopelessly irrational. It maps the terrain precisely: which shortcuts we use, when they lead us astray, and what conditions make System 2 more or less likely to catch the errors. This is a science of cognitive humility.

And there's something quietly radical in the enterprise. Kahneman — a psychologist — was given the Nobel Prize in Economics. The prize committee was, in effect, acknowledging that the most important discoveries about economic behaviour came not from equations about rational agents, but from careful, humble attention to how real human beings actually feel and think and choose.

The lecture closes with a call for more such work — using the frameworks of accessibility and dual-process thinking as a bridge between psychology and economics, between the quick gut reaction and the slower considered judgment. It's an invitation to keep studying the map of our bounded rationality, so we might navigate it a little better.

Watch the Lecture

Kahneman in Stockholm, 2002

Prize Lecture delivered December 8, 2002, at Aula Magna, Stockholm University.

Read the Original

The full lecture is rich with experiments, visual demonstrations, and the precise reasoning behind each idea summarised here.

Nobel Prize PDF — Maps of Bounded Rationality →

Nobel Prize facts page →

Go Deeper

  • Thinking, Fast and Slow (2011) — Kahneman's popular book expanding these ideas
  • Kahneman, Slovic & Tversky (eds.), Judgment Under Uncertainty: Heuristics and Biases (1982) — the foundational collection
  • Kahneman & Tversky, "Prospect Theory: An Analysis of Decision under Risk", Econometrica (1979) — the original paper
  • Tversky & Kahneman, "Judgment under Uncertainty: Heuristics and Biases", Science (1974)