Kahneman's Fallacies: “Thinking, Fast and Slow”

Daniel Kahneman, a winner of the Nobel Prize in economics, is also one of the subjects, along with his longtime collaborator Amos Tversky, of Michael Lewis’ latest book, “The Undoing Project”, and so his work has drawn even more attention as the way to see through biased behavior and to show how irrational people are in the conduct of their everyday lives. I want to suggest that Kahneman is dead wrong on substance, that people are reasonable rather than overcome by bias, and that his deeply mistaken supposition is the result of a method that boxes his subjects into corners so that they cannot but seem hopelessly irrational. This essay, reprinted from westendejournal.com, is an attempt to bring down what has been offered up as an important icon of contemporary thinking about mental and social life.

It is very easy to show that the theory outlined by Daniel Kahneman in his “Thinking, Fast and Slow” is a very poor theory, if a theory is understood to mean an explanation that will stand up to a number of different examples rather than just a heuristic device to show how convoluted the process of thinking is. The thing is that I can always provide a rational explanation for why the explanations Kahneman describes as biased are, in fact, the best explanations available under the circumstances for the phenomenon described. When he reports that subjects take character attributes such as meekness to suggest someone is a librarian rather than a firefighter, that is a reasonable inference, because Kahneman has set things up so that no other evidence is provided and yet the subject of the experiment or demonstration is required to make a selection. The subject has substituted what evidence is available for what might be more relevant evidence. Kahneman says that such substitutions are the way biases work, when it would be more accurate to say that that is what reason does: draw inferences from whatever is available. Kahneman, however, insists on calling that a bias, which suggests that the practice is a departure from reason rather than reason itself.
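A minimal sketch of the arithmetic makes the point. The figures below are purely hypothetical, and the function is only an illustration of Bayes’ rule: when a base rate is supplied, a trait such as meekness shifts the odds but may not overcome the base rate; when no base rate is supplied, the trait is all the evidence there is to reason from.

```python
# Illustrative only: hypothetical figures, not data from Kahneman's studies.

def posterior(prior, p_trait_if_yes, p_trait_if_no):
    """P(occupation | trait) by Bayes' rule, from a base rate and two likelihoods."""
    joint_yes = prior * p_trait_if_yes
    joint_no = (1 - prior) * p_trait_if_no
    return joint_yes / (joint_yes + joint_no)

# Suppose librarians are rare (a 2% base rate) but usually meek (80%), while the
# comparison group is common (98%) and meek less often (30%). These numbers are made up.
p = posterior(prior=0.02, p_trait_if_yes=0.80, p_trait_if_no=0.30)
print(f"P(librarian | meek) = {p:.2f}")  # about 0.05: the trait raises the odds
# but cannot overcome the base rate -- and if no base rate is given, the trait
# is the only evidence available, so using it is inference, not a lapse of reason.
```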

What has happened is that the subjects were asked to perform a task on the basis of limited information and told to report what they decided. That was the game being played. And so they would guess, applying stereotypes, when in the real world they might not apply those stereotypes at all, or, if they did, they might simply be doing what people do when they reason in the real world on the basis of limited information, which is to say, what always happens. It is only the artificiality of the conditions imposed on the subjects that results in the aura of artificiality surrounding what they give as their opinions. If you set up a fake experiment, then the results will seem fake even if they are not.

Sometimes Kahneman doesn’t even bother with an experiment. He simply comes up with a bias by asserting, as its opposite, something contrary to the way things are. He says people show their bias because they use only clichéd explanations. But in fact it is very difficult to come up with an explanation that is not already available. Few of us are original. And so we are biased because we are not original.

In general, Kahneman loses sight of any distinction between reason and bias. If every bias can be seen as a reason, and every reason can be seen as a bias, then his breakdown of thought processes into System 1, which operates quickly on the basis of biases, and System 2, which operates slowly on the basis of what I take to be algorithms, is no longer valid, however much one is still left with the sneaking suspicion that there is a difference between the way the humanistic and the scientific imaginations work, which is, I suspect, the big game that Kahneman is out to address but never calls by that name. It also must be said that Kahneman is not at all clear about what belongs under System 1 and what belongs under System 2. Remembering the rules of a game falls under System 2, but so do assessments of social situations, when it is clearly the case that most aspects of a social situation are taken for granted even if their particular application is a subject of conscious worry. Somehow, a young man learns how to talk to girls, and becomes better at it, though that doesn’t mean he knows how he got there or why he is now doing well, even if he has spent a lot of time thinking about how to say the right thing. Kahneman is at his weakest when he deals with specific examples, never pausing long enough to give a detailed analysis of how the two kinds of processes work in everyday life, and so the book has a feel of superficiality.

No less a savant than Freeman Dyson, however, in his review of Kahneman’s book in “The New York Review of Books”, endorses Kahneman’s way of handling that distinction between bias and reason, using an experience of his own that he has referred to repeatedly as parallel to what Kahneman discovered when he was doing testing for the then nascent Israel Defense Forces. Kahneman said that creating a protocol of questions to ask recruits was a more successful way of predicting who would perform well in one or another military slot than unstructured interviews were. Dyson cites his time with British Bomber Command in World War II, when he showed that experienced crews were no more likely to survive than inexperienced ones, yet the higher-ups would not remove the gun turrets used in the team effort to defend the plane, because having the turrets made the crews feel more responsible for their fates and thereby raised morale.

The generals, on Dyson’s account, did not understand statistics but relied on their biases, and so more crews were doomed than had to be. But Dyson was wrong then, and continues to be wrong. The bias of the generals was towards keeping up morale among the crews. Their object was to get the crews back up in the air for the next mission, even if that meant the rate of crews surviving was marginally lower than if there had been no gun turrets, though we do not know how to calculate how many crews would have responded half-heartedly to a challenge if they did not believe in their own agency, and so whether the turrets in fact increased the numbers that did get home. The generals decided it was best to give the crews what they took to be a fighting chance. The generals were consulting their biases, which were much more sophisticated and more reasonable, as well as much colder, than the reasoning offered by Dyson.
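The tradeoff the generals were weighing can at least be given an arithmetic form, even if, as said above, the morale term is not something anyone knew how to measure. The sketch below is purely illustrative, with hypothetical numbers; it shows only that a marginal gain in survival odds can be undone by a small loss of effort among crews who no longer believe in their own agency.

```python
# A hedged illustration of the generals' tradeoff. All numbers are hypothetical;
# as the essay notes, the morale effect is not something we know how to calculate.

def expected_returning_crews(crews, survival_rate, effort_penalty=0.0):
    """Expected number of crews that get home, discounting the survival rate
    by whatever fraction of effectiveness is lost to sagging morale."""
    return crews * survival_rate * (1 - effort_penalty)

crews = 1000
keep_turrets = expected_returning_crews(crews, survival_rate=0.95)
# Removing the turrets makes the plane faster (better odds), but crews who feel
# they have no say in their own defense may fly half-heartedly (effort penalty).
remove_turrets = expected_returning_crews(crews, survival_rate=0.96, effort_penalty=0.02)

print(keep_turrets, remove_turrets)  # 950.0 vs 940.8: the "biased" choice can come out ahead
```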

Kahneman thinks that people get statistics wrong, but again all he means by that is that they have to make approximations about where to find evidence and how to make generalizations. Some lawyers in a voir dire once told me that if I joined the jury I should not go to the public building where the accident at issue in the tort trial took place, because the building might have changed and I would therefore become biased. I replied that I would then substitute the courtroom in which I was sitting as the basis for a decision about the state of public buildings in New York City and would judge accordingly. I was dismissed from the jury for stating that I would do what I assume the other jurors also would do, though they kept their own counsel because, for one reason or another, they wanted to stay on the jury.

What is attractive about the Kahneman system is that it rates scientific thinking superior to what Kahneman calls intuitive thinking. There is nothing intuitive about intuitive thinking. Sure, anyone can offer an opinion about a poem or a novel or a movie, and it is not hard to do so, even as it is really difficult to master even basic math and science. But it is only after having read a lot of poems or seen a lot of movies, and having thought about how one’s early responses did not do justice to what was on the page or the screen, that one develops enough of what is called aesthetic judgment to say why a particular line or scene works, how it goes about delivering its emotion, and what is valid or off about that perception. It takes a lot to say that the people in “Mad Men” are not true to life. That is not because the women are portrayed as exploited or because they are shown wearing pointy brassieres, both of which were indeed the case. Rather, “Mad Men” is not true to life because the motives don’t make sense, at least most of the time. Agency heads would be aware of their hypocrisy in dismissing a lush even though they themselves drank so much; they would not be oblivious to the fact that their love affairs had an impact on their marriages. Sometimes that portrait of life in the early Sixties makes sense, as when a woman puts her career before her baby but suffers emotionally for having done so. But all that crying over Marilyn Monroe’s suicide does not make sense. Monroe was a joke, not a role model; she was no Princess Diana. If Grace Kelly had died at the time, it would have been a different story. It takes some knowledge of history and of plotting and of human psychology to come to that conclusion, and it still might be wrong, but what protocol is going to tell me different?

Judgment is hard. Just ask the general who forgets about the morale of his troops. And yet people disparage judgment by making it something universal, something anyone can lay claim to. Why did Herman Cain think he could be President? Because he believed he had judgment. But judgment about what? About fiscal policy, or military policy, or which women to trust? Judgment is treated as the equivalent of what the contestants on “The Apprentice” used to call “business decisions”. You were making a business decision when you decided what to do by the seat of your pants. That is the way for a business to fail. And so Kahneman plays into the hands of scientific intellectuals who like to think that what they do is better than judgment, when what they do might in fact be inferior to what might be done by a well-schooled judgment.

There is a general methodological question about how to do social science at issue here. Intuitive thinking takes more things into account, including things of which a person may not be fully aware. In order to avoid the traps of intuitive thinking, Kahneman and his fellow social psychologists invent thought experiments, and actual experiments, that are a bit outré so as to highlight the variable they wish to study. You get into a “Candid Camera” world where the subjects can’t quite tell what is wrong and are then pounced on for not having understood it, as happened, for example, in the Milgram experiment, where the scientists told the subjects that they were administering pain when in fact they were not, so that the subjects were faulted for what they had not done. Kahneman does not even perform actual experiments. He constructs mental pictures of what people “ordinarily” do and then says that whatever they do is a way of creating bias.

This procedure is very different from what happens in sociological case studies, where the fullness of a situation, whether an urban village, a suburban high school, or an insane asylum, is mined for the ordinary practices of that community, practices that can be taken as created by the kind of community being studied. Such case studies are not able to tell which particular feature of the social structure may account for a particular behavior, but what the study gets right is that there is such a social structure and that the particular behavior, nuances and all, gets reported and fitted into the context of that structure. Whether the information is transferable to other communities or structures is problematic. The reader has to decide whether there are enough resemblances between two settings to think that the behaviors are transferable.

The case study method has a kind of novelistic appeal. The reader is absorbed in the world of the community well enough to sense how the behavior makes sense there, how it is anything but bizarre, while Kahneman’s demonstrations highlight how bizarre and difficult to accept are the behaviors he puts on display, only to explain them away as the result of unsatisfactory patterns of thought. Yet those intuitive patterns of thought, which are better considered just more nuanced and not always well articulated sets of responses, are precisely what comes into play when you say of a case study that, all other things being equal, this is what happens in real life under a certain set of circumstances. Your intuition fills in when things are not right, rather than, as in Kahneman, your algorithm filling in why things are wrong. Kahneman fails not because one procedure is an alternative to the other but because one procedure tries to attend to the truth of matters while the other is an exercise in how to falsify what one, for one reason or another, knows. Kahneman is insufficiently analytic in the old-fashioned sense of looking for how what is makes sense rather than for how what is not does not make sense.