Reading About Thinking (on D. Kahneman’s Ideas on Perceptions of Knowledge)

An article appeared in yesterday's NYT Magazine on the hazards of overconfidence. The Israeli-born psychologist (and epistemologist, I'd dare say), Nobel laureate and author Daniel Kahneman considers how people make decisions based on bits of information that don't adequately represent the subject at hand. He recounts how poorly, yet how firmly, army officers evaluate new recruits' leadership potential, and how brash, rash or naive traders maintain investors' trust while weighing which stocks to buy or sell.

The point, as I understand it, is that individuals, including influential and powerful people, routinely make recommendations without having adequate knowledge to support their decisions. And they do so comfortably.

Men are afflicted by overconfidence more than women, he suggests, although I'm not sure he's right on this point. In the article, he uses reckless investors who rack up stock losses as an example: men are more likely to lose large sums than women, who, in general, are more cautious in their investments and, perhaps, less confident about their predictions.

I'll have to read Kahneman's forthcoming book, Thinking, Fast and Slow, to learn more about his views on differences between men's and women's cognitive biases.

Nearing the end of the magazine piece, Kahneman turns to medical decisions. He suggests that some doctors, perhaps because they receive life-and-death feedback on the outcomes of their decisions, may be distinguished by their capacity to gauge their own judgment skills.

He writes:

We often interact with professionals who exercise their judgment with evident confidence, sometimes priding themselves on the power of their intuition…

And asks:

How do we distinguish the justified confidence of experts from the sincere overconfidence of professionals who do not know they are out of their depth? We can believe an expert who admits uncertainty but cannot take expressions of high confidence at face value…people come up with coherent stories and confident predictions even when they know little or nothing. Overconfidence arises because people are often blind to their own blindness.

And broaches the topic of doctors’ expertise:

True intuitive expertise is learned from prolonged experience with good feedback on mistakes. You are probably an expert in guessing your spouse’s mood from one word on the telephone…true legends of instant diagnoses are common among physicians….Anesthesiologists have a better chance to develop intuitions than radiologists do….

I read this article on the train last evening and found it fascinating, so much so that I hope I can find time to read the full book. Even though Kahneman is just one human, and necessarily biased like the rest of us, he's got some interesting and well-articulated ideas. I'm curious, in particular, whether he'll further dissect the critical-thinking pathways among different types of doctors.

In my experience, we’re a variable bunch. But who knows?


2 Comments

  • A related book that you might enjoy, if you haven’t read it already, is “How Doctors Think” by Jerome Groopman. It addresses the decision-making process that physicians go through and how cognitive errors can lead to misdiagnoses and bad treatment decisions. It was a very good read.

  • Thanks, Solitary. I agree: “How Doctors Think” offers insights and was, for me, a good introduction to cognitive bias in medicine. That said, I wonder if the author’s (and my) perspective is limited by spending so much time with physicians of like-minded values and education.
