Skeptophilia (skep-to-fil-i-a) (n.) - the love of logical thought, skepticism, and thinking critically. Being an exploration of the applications of skeptical thinking to the world at large, with periodic excursions into linguistics, music, politics, cryptozoology, and why people keep seeing the face of Jesus on grilled cheese sandwiches.

Tuesday, April 17, 2018

Superior ignorance

I've written before on the topic of the Dunning-Kruger effect, the tendency for people -- especially the least competent -- to overestimate their own knowledge of a topic (parodied brilliantly by Garrison Keillor in his "News from Lake Wobegon" segment on A Prairie Home Companion, where "all of the children are above average").


A study released last week in the Journal of Experimental Social Psychology gives us another window into this unfortunate tendency of the human brain.  In the paper "Is Belief Superiority Justified by Superior Knowledge?", by Michael P. Hall and Kaitlin T. Raimi, we find out the rather frustrating corollary to the Dunning-Kruger effect: that the people who believe their opinions are superior actually tend to know less about the topic than the people who have a more modest view of their own correctness.

The authors write:
Individuals expressing belief superiority—the belief that one's views are superior to other viewpoints—perceive themselves as better informed about that topic, but no research has verified whether this perception is justified.  The present research examined whether people expressing belief superiority on four political issues demonstrated superior knowledge or superior knowledge-seeking behavior.  Despite perceiving themselves as more knowledgeable, knowledge assessments revealed that the belief superior exhibited the greatest gaps between their perceived and actual knowledge.  
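To make that "gap between perceived and actual knowledge" concrete: it's simply self-rated knowledge minus measured knowledge, computed per respondent and then compared across levels of belief superiority.  Here's a minimal sketch in Python -- the numbers are toy values I invented purely for illustration, not anything from Hall and Raimi's actual data:

# Toy numbers invented for this sketch -- NOT data from Hall and Raimi.
# Each respondent self-rates their knowledge (perceived, 0-100), takes a
# factual quiz (actual, 0-100), and rates belief superiority (1 = "my views
# are no better than others'" ... 5 = "my views are vastly superior").
respondents = [
    {"superiority": 5, "perceived": 90, "actual": 55},
    {"superiority": 5, "perceived": 85, "actual": 60},
    {"superiority": 1, "perceived": 60, "actual": 62},
    {"superiority": 1, "perceived": 55, "actual": 58},
]

# The gap is perceived minus actual; a positive mean gap means the group
# is, on average, overconfident about what it knows.
for level in (1, 5):
    gaps = [r["perceived"] - r["actual"]
            for r in respondents if r["superiority"] == level]
    print(f"superiority {level}: mean gap = {sum(gaps) / len(gaps):+.1f}")

The toy numbers above simply mimic the pattern the abstract reports: the high-superiority group shows the largest positive gaps between what they think they know and what they actually know.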
The problem, of course, is that if you think your beliefs are superior, you're much more likely to go around trying to talk everyone into believing like you do.  If you really are more knowledgeable, that's at least justifiable; but the idea that the less informed you are, the more likely you are to proselytize, is alarming to say the least.

There is at least one somewhat encouraging piece to this study, which indicates that this tendency may be remediable:
When given the opportunity to pursue additional information in that domain, belief-superior individuals frequently favored agreeable over disagreeable information, but also indicated awareness of this bias.  Lastly, experimentally manipulated feedback about one's knowledge had some success in affecting belief superiority and resulting information-seeking behavior.  Specifically, when belief superiority is lowered, people attend to information they may have previously regarded as inferior.  Implications of unjustified belief superiority and biased information pursuit for political discourse are discussed.
So belief-superior people are more likely to fall for confirmation bias (which you'd expect), but if you can somehow punch a hole in the self-congratulation, those people will be more willing to listen to contrary viewpoints.

The problem remains of how to get people to admit that their beliefs are open to challenge.  I'm thinking in particular of Ken Ham, who, in the infamous Ken Ham/Bill Nye debate on evolution and creationism, was asked what, if anything, could change his mind.  Nye answered that a single piece of incontrovertible evidence was all it would take; Ham, on the other hand, said that nothing, nothing whatsoever, could alter his beliefs.

Which highlights brilliantly the difference between the scientific and the religious views of the world.

So the difficulty is that counterfactual viewpoints are often well insulated from challenge, and the people who hold them are resistant to considering even the slightest insinuation that they could be wrong.  I wrote last week about Donald Trump's unwillingness to admit he's wrong about anything, ever, even when presented with unarguable facts and data.  If that doesn't encapsulate the Dunning-Kruger attitude, and the Hall-Raimi corollary to it, I don't know what does.

Doesn't mean we shouldn't try, of course.  After all, if I thought it was hopeless, I wouldn't be here on Skeptophilia six days a week.  The interesting part of the study by Hall and Raimi, however, is the suggestion that we might be going about it all wrong.  The way to fix wrong-headed thinking may not be to present the person with evidence, but to get them to see that they could, in fact, be wrong in a more global sense.  This could open them up to considering other viewpoints and, ultimately, to looking at the facts in a more skeptical, open-minded manner.

On the other hand, I still don't think there's much we can do about Ken Ham and Donald Trump.

*********************
This week's Featured Book on Skeptophilia:

This week I'm featuring a classic: Carl Sagan's The Demon-Haunted World: Science as a Candle in the Dark.  Sagan, famous for his work on the series Cosmos, here addresses the topics of pseudoscience, skepticism, credulity, and why it matters -- even to laypeople.  Lucid, sometimes funny, always fascinating.