Welcome to The Bertrand Russell Society Online Forum

Author Topic: Thinking, Fast and Slow by Daniel Kahneman  (Read 575 times)

February 01, 2018, 10:44:24 AM
  • Russellian

THINKING, FAST AND SLOW by Daniel Kahneman (2011)

Most thinking goes on below the level of the conscious, reasoning mind.   It couldn't be otherwise.   Human beings couldn't function if they had to think out the reasons for every action.

The philosopher John Dewey said human actions are determined by impulse, habit and reason.  Our habits control our impulses.   It is only when neither our impulses nor our established habits get us what we want that we start reasoning.  This is how things are.

An experimental psychologist named Daniel Kahneman has devoted his life to studying how this works.   In his book, Thinking, Fast and Slow (2011), he summarized what he and other psychologists have discovered about the interplay of intuition and reason in decision-making.

What's noteworthy about the book is that it is based on real science.  Every assertion in it is backed up by a study, many of them by Kahneman himself and his friend,  the late Amos Tversky.

Our default mode of thinking is what Kahneman calls "fast thinking," or System 1.  It consists of the mental processes that enabled our prehistoric ancestors to react quickly, and to survive.

"Slow thinking", or System 2, is the override system, comparable to taking conscious control of your breathing.   It requires continuous concentration and effort.  Doing it is hard work.  Some are better at it than  others, but few people can sustain it for long.

System 1 consists of pattern recognition.  The human mind is constantly monitoring the present state of things and matching it with previous experiences and impressions.

This works well for people with long experience of doing similar things, and receiving immediate feedback.    If a firefighter in a burning building or an anesthesiologist in an operating room says something doesn't seem right, you'd better heed them, because their intuition is grounded in long experience of burning buildings and operating rooms.  Over time, chess players, performing artists and emergency room nurses develop reliable intuition.

The problem is that intuition will give you an answer whether there is any basis for it or not.   Political pundits, stock market analysts and clinical psychologists typically have poor records of predicting results, but this seldom affects their self-confidence.

Human beings would be paralyzed if we had to think of logical reasons for every decision and exercise conscious control over every action.   We need intuition.  But intuition can mislead us.  Kahneman's book is about ways this happens.

Thinking, Fast and Slow is an extremely rich book.  Almost every chapter could be expanded into a self-help book, while some could be textbooks on negotiations, advertising and propaganda.

I've had a hard time getting started on writing about the book, maybe just because there is so much in it.   I've given up on trying to give an overview.  I will just hit a few highlights in the hope that I can spark interest in reading it.

One problem with intuitive thinking is what Kahneman calls the planning fallacy.   Those who plan projects typically try to factor in everything they can foresee that is likely to go wrong.   It is predictable that they can't foresee everything that can go wrong.  That's why home remodeling contractors and military suppliers make most of their money on change orders.

Kahneman, who grew up in Israel, once talked the Israeli Ministry of Education into commissioning a high school textbook on judgment and decision-making.  He assembled a team, did some preliminary work, and then questioned Seymour, his curriculum expert.

Question: What was the failure rate of teams that wrote textbooks from scratch?  Answer: About 40 percent.   Question: How long did it take the others to complete their work?  Answer: Six to ten years.  Question:  Are we better than the other teams?  Answer: No, but we're not that bad.

Nevertheless, Kahneman let the team go ahead.   The textbook took about eight years to complete, and by that time, the Israeli government had lost interest.

The lesson is that, if you are planning a project, you should first look at the success rate of those who have attempted similar projects.   Then you should use that as a reference class, and adjust your estimate only for what genuinely makes your project different from the others.

Most entrepreneurs don't do this, Kahneman said.  This is probably good for society, because the public benefits from their effort, while the entrepreneurs and their backers absorb the loss.   But if you're an entrepreneur yourself, you're better off looking before you leap.

Overconfidence is based on what Kahneman calls the WYSIATI (what you see is all there is) syndrome.   As a young Israeli military officer, Kahneman once had the task of setting up an interview system to determine the fitness of recruits for different branches of military service.  Interviews were then conducted by women soldiers who made recommendations based on their personal judgment.  The results were little better than random.

Kahneman drew up a questionnaire based on factual questions relevant to such characteristics as responsibility, sociability and masculine pride.  These included number of jobs held, work history, participation in sports and so on.   He asked the interviewers to rate the recruits on these individual factors, without regard to an overall recommendation, and then he used the ratings to compute a weighted score.

When the interviewers balked at this, he added another step: after completing the ratings, they were to shut their eyes and imagine how well the recruit would do in the military.   The conclusions based on the questionnaire were imperfect, but better than the previous subjective evaluations.  Surprisingly, so was the "shut your eyes" judgment, perhaps because by then the intuition was informed by the factual ratings.
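The mechanical scoring Kahneman describes can be sketched in a few lines of code.  This is my illustration, not his actual procedure; the factor names and weights below are hypothetical.

```python
# Hypothetical sketch of mechanical interview scoring: rate each factual
# trait separately, then combine with fixed weights, with no holistic
# judgment along the way.  (Factor names and weights are invented.)
WEIGHTS = {"responsibility": 0.4, "sociability": 0.3, "punctuality": 0.3}

def weighted_score(ratings):
    """ratings maps each factor to a 1-5 score, assigned independently."""
    return sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)

recruit = {"responsibility": 5, "sociability": 3, "punctuality": 4}
print(round(weighted_score(recruit), 2))  # 4.1
```

The point of the fixed weights is that no single interviewer's overall impression can swamp the individual factual ratings.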

In general, Kahneman said, decisions based on checklists and algorithms have a better track record than decisions based on personal judgment.   Simple checklists and algorithms work better than complex formulas.

(I have to say I have reservations about checklists and algorithms.  They are only as good as the people drawing them up.   The "garbage in, garbage out" principle applies, as does Goodhart's Law.)

Most human beings are loss averse.  They'd rather have the certainty of a small gain than an uncertain chance to make a big gain.  But on the other hand, if faced with a loss, they'd risk a bigger loss to avoid it.

The average person, according to Kahneman, would only risk a loss of $100 if he had an equal chance of winning $200.
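That 2-to-1 ratio can be illustrated with a toy calculation.  This is my sketch using a simplified linear value function; prospect theory's actual value function is nonlinear.

```python
# Sketch of loss aversion: a loss hurts roughly twice as much as an
# equal gain feels good.  (Kahneman reports loss-aversion ratios of
# roughly 1.5 to 2.5; 2.0 is a round illustrative value.)
LOSS_AVERSION = 2.0

def gamble_value(gain, loss, p_gain=0.5):
    """Subjective value of a gamble: win `gain` with probability p_gain, else lose `loss`."""
    return p_gain * gain - (1 - p_gain) * LOSS_AVERSION * loss

print(gamble_value(200, 100))  # 0.0: the break-even gamble Kahneman describes
print(gamble_value(150, 100))  # -25.0: most people would refuse this one
```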

This makes people vulnerable to framing effects.   You can get people to make different decisions based on the same facts, depending on whether the decision is framed as seeking a gain or risking a loss.

For example, test subjects were asked what they'd do if they were given $1,000, then offered a choice between a sure gain of an added $500 and a 50/50 chance of getting either $1,000 more or nothing.  Most chose the $500.

Other test subjects were asked what they'd do if they were given $2,000, and then offered a choice between a sure loss of $500 or a 50/50 chance of losing either $1,000 or nothing.  Most chose the gamble.

Both sets of choices are the same—just framed differently.   The framing determines the response.
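A quick calculation confirms that the two framings really are identical in terms of final outcomes:

```python
# Final wealth under the two framings of Kahneman's experiment.
# Framing A: start with $1,000; choose sure +$500, or 50/50 of +$1,000 or nothing.
frame_a_sure = 1000 + 500
frame_a_gamble = sorted([1000 + 1000, 1000 + 0])

# Framing B: start with $2,000; choose sure -$500, or 50/50 of -$1,000 or nothing.
frame_b_sure = 2000 - 500
frame_b_gamble = sorted([2000 - 1000, 2000 - 0])

print(frame_a_sure == frame_b_sure)      # True: both sure options end at $1,500
print(frame_a_gamble == frame_b_gamble)  # True: both gambles end at $1,000 or $2,000
```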

Credit card companies understand this.  In some states, when stores charge different prices depending on whether you buy with cash or credit cards, they're required by law to say it is a "cash discount," not a "credit card surcharge."   People are more willing to pass up a cash discount than to pay a credit card surcharge, even though the only difference is the words.

The subconscious, intuitive mind can be swayed by anchoring on irrelevant facts and impressions.  We're at risk of being intentionally primed to think certain ways without our knowing it.

When Barack Obama was thinking about running for President, his supporters wrote many words trying to dispel the misconception that Obama was a Muslim.   But the more they tried to dispel this belief, the more it persisted.   People forgot the argument, and just remembered, subconsciously, the words "Obama" and "Muslim".

Obama supporters instead started writing about Obama's Christian beliefs and his church attendance.   That helped—although it also called attention to the inflammatory sermons of Obama's pastor, the Rev. Jeremiah Wright.

The "Obama-Muslim" link is an example of how unconscious anchors shape our thinking without our realizing it, not only of how we mislead ourselves, but of how we leave ourselves open to manipulation by others.

As Kahneman says, human beings are more inclined to rely on intuition (fast thinking), which operates below the level of consciousness, than on conscious reasoning (slow thinking).

The most disturbing part of the book is how others can intentionally manipulate us by priming our intuitive minds without our realizing it.

Vance Packard wrote about this possibility in The Hidden Persuaders in 1957.   Facebook in 2012 ran an experiment to see if it could change its clients' moods by manipulating its news feed.

In the 2016 election, Facebook worked with the Donald Trump campaign, as it routinely works with advertisers, to micro-target voters based on information they've left on social media.   Facebook would have provided the same service to the Clinton campaign, but that campaign didn't ask.

A company called Cambridge Analytica claimed to have used artificial intelligence to create individual psychological profiles on 220 million registered American voters, and to have used this to support the Trump presidential campaign.  Cambridge Analytica also supported the British campaign to leave the European Union.

None of this is mind control.  People with firm opinions are not likely to change their minds based on subliminal or targeted messages.   The aim is to increase sales of a certain product or votes for a certain candidate by a few percentage points.

But to the degree that mind manipulation is possible, the advertisers and propagandists are going to get better at it.   That's cause for concern.


Here are examples from Thinking, Fast and Slow on how anchors prime our minds in certain ways without our realizing it.

>Votes on an Arizona school bond issue were more favorable in polling places actually located in schools than elsewhere.  People shown images of classrooms and school lockers were also more favorable than those who weren't.  This difference was greater than the average difference between parents of school children and the rest of the public.

>University staff members contributed more to an "honesty box" when there was a poster of eyes looking at them than when the poster only showed flowers.

>Experienced real estate agents gave a higher appraisal to a property when the asking price was high than when it was low, even though they claimed not to be influenced by the asking price.

>Experimental psychologists gave one of two word-association tests to groups of students.  One of the tests primed students with words such as Florida, forgetful, bald, grey or wrinkle, associated with the elderly.  Students who took that test, although not consciously thinking about old age, walked more slowly to another test location down the hall than students who took the other test.

>Students who were told to walk more slowly than normal were quicker afterwards to pick up on words associated with the elderly, although they hadn't been consciously thinking about the elderly.

>In an experiment, students in one group were told to hold pencils in their mouths by the eraser end, with the point sticking out, which sort-of simulates smiling, while another group were told to hold pencils in their mouths sideways, which sort-of simulates frowning.  The smilers found a set of Far Side cartoons funnier than the frowners did.  The frowners had a stronger reaction to pictures of starving children and accident victims.

>People who were primed with images of money became (1) more independent, (2) more persevering, (3) less helpful and (4) less sociable.

>People who were primed with images of death became more receptive to authoritarian ideas.

>People who were told to think about stabbing a co-worker in the back were more likely afterwards to buy soap, disinfectant or detergent than batteries, juice or candy.  If you feel your soul is stained, you are more likely to clean your body.  Psychologists call this the Lady Macbeth effect.

>A study of parole judges in Israel showed they were more likely to give decisions favorable to the prisoner right after they had eaten.

Again, this is not mind control.   It doesn't necessarily work on any particular individual.  If you understand priming and anchoring, you can put yourself on guard.

The danger comes when you are caught unawares: when you receive some message over the Internet or elsewhere that you accept as genuine, when it is really in the service of some propagandist you don't know about.

Daniel Kahneman, although a psychologist, won the Nobel Memorial Prize in Economics in 2002.  That shouldn't be surprising, because economics is a sub-set of psychology.  Economics is the study of decision-making, based on the hypothesis that human action is determined by incentives, especially material incentives.  A section of Thinking, Fast and Slow is devoted to showing why that isn't always so.

He and his friend and collaborator, the late Amos Tversky, were true scientists.  Every assertion in this book is backed up by a study or experiment, many conducted by Kahneman and Tversky.   The appendix includes two of their most important scientific papers.

The business journalist Michael Lewis has written a book, The Undoing Project, about their collaboration, which I haven't read, but I'm pretty sure is good.

I have a version of this review, with links and illustrations, in two parts on my web log.