In a world where experts are our go-to for solving everything, the COVID-19 pandemic exposed just how fallible these specialists can be. Despite impressive titles and years of experience, many experts were just as clueless as the rest of us, often with unwarranted confidence. A recent study delves into the uncomfortable truth: the more knowledge you think you have, the less aware you might be of your actual ignorance.
We rely on experts and their expertise to help us solve problems, be it our individual health concerns with a physician or issues of policy that impact regional or national populations. The missteps of expertise during the COVID pandemic have generated a tidal wave of articles, ostensibly analyzing outcomes in health or finance but more frequently blaming them on the intentions of experts. But what if these missteps, made at the edge of our knowledge, are generally more in the nature of expertise than in alleged deep-state conspiracies?
A recent study in the Journal of Behavioral Decision Making, which looks at experts and their knowledge, begins with an important but generally overlooked question: What is an expert? Merriam-Webster defines an expert as “one with the special skill or knowledge representing mastery of a particular subject.”
The researchers point out that the definition is too vague for quantitative research, and while the general field has no authoritative definition, expertise is frequently measured by professional titles or degrees, years of experience, and performance on “domain-specific” tasks. They also consider a more philosophical dimension to expertise—the ability to know what you do not know.
“To know what you know and what you do not know, that is true knowledge.”
—Confucius
The researchers suggest that we most desire this true knowledge, what we might call wisdom. Experts, as COVID-19 reveals, can profoundly influence our lives, but they are human, and their judgments are “not always accurate, and they can be overconfident.”
“We seek to understand whether experts, as classified by these criteria, can live up to the idealized conception of an expert from a philosophical standpoint.”
Dunning-Kruger effect
This study looks at a bias in our perceptions, the Dunning-Kruger effect, where people with limited competence overestimate their abilities (Dunning was one of the two authors of this current study). While Dunning-Kruger is often applied to the non-expert, this work zeroes in on how this cognitive bias impacts experts and their opinions. The researchers introduced two measures of "metacognition," knowing what we know and what we do not.
Murphy’s resolution: The ability to distinguish between correct and incorrect responses based on the respondent’s confidence. A high Murphy’s resolution means that the expert’s confidence in their answer is a reliable indicator of its correctness.
Yates' separation: The gap between average confidence for correct versus incorrect responses. A high Yates' separation suggests that the respondent's confidence is aligned with correctness—an individual knows what they do and do not know.
In short, the wise expert will have “a high Murphy’s resolution, a large Yates’ separation, as well as high confidence for correct answers and low confidence for incorrect ones.”
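For readers who want to see how such measures might be computed, here is a minimal sketch based on the standard calibration literature: Yates' separation as the gap in mean confidence between correct and incorrect answers, and Murphy's resolution as the resolution term of the Brier-score decomposition (a weighted variance of accuracy across confidence levels). The exact formulas and data handling in the study may differ; the variable names and example data below are hypothetical.

```python
from collections import defaultdict

def yates_separation(confidence, correct):
    """Mean confidence on correct answers minus mean confidence on
    incorrect ones; a larger gap means confidence tracks correctness."""
    hits = [c for c, ok in zip(confidence, correct) if ok]
    misses = [c for c, ok in zip(confidence, correct) if not ok]
    return sum(hits) / len(hits) - sum(misses) / len(misses)

def murphy_resolution(confidence, correct):
    """Resolution term of Murphy's decomposition of the Brier score:
    a weighted variance of the accuracy rate across confidence levels.
    Higher values mean the respondent's stated confidence discriminates
    between answers they get right and answers they get wrong."""
    bins = defaultdict(list)  # group correctness flags by confidence level
    for c, ok in zip(confidence, correct):
        bins[c].append(1.0 if ok else 0.0)
    n = len(correct)
    base_rate = sum(1.0 for ok in correct if ok) / n  # overall accuracy
    return sum(len(v) / n * (sum(v) / len(v) - base_rate) ** 2
               for v in bins.values())

# Hypothetical data: confidence ratings (0-1) and correctness per question.
conf = [0.9, 0.9, 0.8, 0.6, 0.5, 0.5]
right = [True, True, True, False, False, True]
gap = yates_separation(conf, right)        # confidence gap, correct vs. wrong
res = murphy_resolution(conf, right)       # discrimination across levels
```

A "wise" respondent in the study's sense would score high on both: large `gap` and large `res`, with confidence concentrated on the answers that turn out to be correct.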
To measure an expert’s wisdom, the researchers conducted studies involving climate scientists, psychologists, and investors. Each study identified a group of experts by title or degree and a group of non-experts found through online aggregators. Participants answered a series of domain-specific questions, recording both their answer and their confidence in it.
Unsurprisingly, experts demonstrated greater knowledge and more accurate self-assessments than non-experts. They showed greater confidence in those questions answered correctly than incorrectly. However, “they still made misjudgments that they held with confidence.” Those mistakes reflected unfounded confidence in what they did not know—experts were less aware of gaps in their knowledge than the non-experts, leading the researchers to conclude that “awareness of error appeared blunted by expertise.”
The researchers revised the results once they accounted for the difficulty of the questions, adjusting for the “hard-easy effect”—our tendency to be overconfident on hard questions and underconfident on easy ones. With that adjustment, the experts’ improved calibration disappeared, and they exhibited “greater overconfidence than citizens.” The unfounded confidence may come from their day-to-day experience with easy tasks, unlike those confronted during the COVID pandemic.
Defining expertise by years of experience rather than academic titles and degrees made no difference. While experts were better calibrated to what they knew, the Dunning-Kruger bias toward overconfidence was in full effect for what they did not know. Expertise can generate unfounded confidence. We have seen this in several Nobel Laureates who have opined in areas outside their Nobel expertise; Linus Pauling’s advocacy of vitamin C is a prime example.
“That is, experts had better metaknowledge regarding what they knew but equal or worse metaknowledge regarding what they did not know.”
Why might experts be just as impacted by the Dunning-Kruger effect as mere mortals? One proposed explanation is a positively biased reward system—rewarding those “showcasing” their knowledge with correct answers on tests. Tests that do not penalize wrong answers amplify this bias. It is more challenging to say “I don’t know” or “I’m not sure” than to weave some plausible ideas into a coherent-sounding fabric. No one wants their physician to end the conversation with “I don’t know.”
The answer may lie in another ancient concept, hubris—the quality of “extreme or excessive pride, dangerous overconfidence, and complacency, often in combination with arrogance.”
“In a sense, lacking knowledge itself is not a disaster; lacking the awareness of one’s lack of knowledge is. It hinders one from gaining knowledge, listening to others’ good advice, and making efficient decisions, and this is true for both misinformed bottom performers and experts. In sum, the work herein provides a cautionary tale about guarding against error in judgment and action: Being an expert means making more correct decisions, but it does not mean the eradication of all errors. Those errors may not be anticipated, so one must stay on guard against making errors, whatever one’s level of expertise might be.”
The unanticipated Dunning-Kruger effect seen in experts, as demonstrated in this study, provides a reasonable understanding of why our experts got COVID-19 wrong without positing some nefarious intent or cabal. Experts are human, and humans err. The real message in the COVID “root-cause analysis” is being open to divergent, dare I say diverse, views.
Charles Dinerstein is a surgeon.