The Millenium Project 


Confirmation bias, denialism and Morton's Demon.

This article was recorded for the Diffusion Radio program that went to air on March 21, 2011.

I was going through my record and CD collection and two songs caught my attention. One was "The Boxer" by Simon and Garfunkel which has the line "a man hears what he wants to hear and disregards the rest"; the second was Rod Stewart's "Reason to Believe" which says "I look to find a reason to believe".

These got me thinking about how we have a need to believe things and the lengths we will go to be comfortable with what we believe. I'm saying "we" here because I'm not immune to the problems I'm going to talk about, but with a little caution we can avoid the traps.

Anyone who has ever done any research will be familiar with the problem of confirmation bias. This is hearing what you want to hear. My studies were in cognitive psychology where it is impossible to get directly to the processes underlying observations, and so it's all about statistics and interpretation. Anybody doing research in the social sciences has to be constantly aware of the possibility of confirmation bias, of selecting results and readings that fit the hypothesis and either ignoring or eliminating things that don't quite fit. I don't mean rejecting obvious outliers where the observations are so far from the rest that a mistake can be assumed – I mean shaving the results to suit what the experimenter expects to find. This may not even be a conscious act, because doing it consciously approaches fraud and most people are basically honest.

The classic case in the hard sciences of confirmation bias is cold fusion. Pons and Fleischmann found what they wanted to find and then stopped looking. In the social sciences there was Cyril Burt's just-too-good statistics about separated twins and Margaret Mead's willingness to believe whatever some young girls told her. In medicine there was William McBride's work on Debendox. I don't think any of these people started out to do the wrong thing, but they all did it anyway because what they did confirmed their beliefs. (I am not talking about obvious cases of deliberate fraud like Andrew Wakefield or the Korean investigators into fertility. These people knew exactly what they were doing.)

Confirmation bias is rife in paranormal research, largely because this research is carried out by true believers. While there have been cases of deliberate fraud, the most common problem is testing until some anomalous situation arises and then stopping, claiming evidence of psychic or paranormal powers. I was adjudged the most psychic person in the room at a sceptics' function once, and I did this by correctly calling a coin toss seven times in a row. To a paranormal researcher this could be seen as evidence of my super powers, but as I pointed out to the group, with about 120 people in the room you would expect it to take six or seven tosses to eliminate everyone else.
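The arithmetic behind that party trick is easy to check. Here is a quick sketch in Python (my own illustration, not from the article): since roughly half the room guesses wrong on each toss, a single "winner" is expected after about log2(120) rounds, purely by chance.

```python
import math
import random

# With 120 people each calling a fair coin toss, about half survive
# each round, so one "psychic" is expected after roughly log2(120) tosses.
N = 120
expected_rounds = math.log2(N)  # about 6.9 -- six or seven tosses

def rounds_until_one_left(n, rng):
    """Simulate the contest: everyone calls the toss, wrong callers drop out."""
    rounds = 0
    while n > 1:
        toss = rng.random() < 0.5
        # each remaining person calls heads or tails independently at random
        n = sum(1 for _ in range(n) if (rng.random() < 0.5) == toss)
        rounds += 1
        if n == 0:
            return None  # everyone called it wrong; in real life you'd rerun the round
    return rounds

rng = random.Random(1)
runs = [rounds_until_one_left(N, rng) for _ in range(2000)]
runs = [r for r in runs if r is not None]
average = sum(runs) / len(runs)
print(round(expected_rounds, 1), round(average, 1))
```

The simulated average comes out close to the log2 estimate, with no psychic powers anywhere in the code.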

Confirmation bias is one thing, but the next stage is denial, where results or data which contradict beliefs are rejected. Again, this can be a totally unconscious matter, but to be true denial it has to be deliberate. The driving force behind most scientific denial seems to be political or ideological. The people I most commonly come across practising denial are Holocaust deniers (who turn into anti-Semites within seconds if pushed), climate change deniers (I refuse to call them "skeptics") whose politics are often quite visible, medicine deniers, including AIDS and vaccine deniers (who simply reject all science that doesn't agree with them), creationists (who reject anything that conflicts with their religious beliefs), and generalised conspiracy kooks like 9/11 Truthers who reject anything that comes from a government. These groupings aren't mutually exclusive, and because conspiracy theories are close to the surface in almost all of them it is not unusual to find people who fall into more than one category. I would almost bet money that I could start a web site and attract followers by arguing that the attack on the World Trade Center was done on the direct orders of the President of the USA in order to distract the sheeple's attention from plans by the Jewish-owned banks to print money to fund plans by Big Pharma and the Illuminati to expand their mind control through microchips in vaccines and to spread AIDS in Africa so that Big Oil could control African resources and get everyone to pay more for petrol and electricity by telling them that oil and coal were made millions of years ago (instead of 6,000) so they are running out. Actually, I won't place that bet, because I know of a couple of web sites that already say things like that.

I've observed the resistance to facts in many of these groups at first hand. A few years ago I unwisely entered into a debate with some professional creationists. (Yes, they were paid to do it.) I was able to show that some of the very people I was debating against had had the science explained to them more than twenty years before, but were still making the same claims. No matter how much research is done into the safety or efficacy of vaccines, it is all rejected if it doesn't prove that vaccines cause autism. If you don't believe the "germ theory" of disease then no science is going to convince you that antibiotics work.

One strange aspect of denial is that often people will be presented with evidence that conflicts with what they already believe, but if it still agrees with their general belief system they will accept it and consequently hold two contradictory opinions at the same time. Any 9/11 Truther worth his place in the movement knows that no planes flew into the World Trade Center, but that the planes were flown by Mossad agents who also worked for the CIA. I once challenged a group of alternative medicine believers by offering them five different cures for cancer, each based on a single unique cause of cancer. (In alt med there is just "cancer", and all forms of it are the same.) I pointed out that at most one of these could be correct as they were mutually exclusive and asked them to tell me which one was the correct one. I was universally informed that all of them were correct.

A beautiful example of the way this thinking works comes from alternative medicine, where I have been told that germs and viruses do not cause disease but that AIDS is being deliberately spread in Africa using vaccines contaminated with HIV. This belief is held simultaneously with one that says there is no such disease as AIDS and if there were it would not be caused by the non-existent HIV but by the use of recreational drugs (or in extreme cases by drugs used to treat AIDS). They also hold that HIV has never been proven to exist, despite Luc Montagnier winning the 2008 Nobel Prize for isolating the virus, and that even if it did exist, Robert Gallo didn't prove that it caused AIDS. In a case of total bizarreness, Montagnier has been adopted as some sort of hero by altmed and his Nobel Prize (for discovering something that doesn't exist) is used to boost his endorsement value. I never said that there was any logic to the way these people think. Did I say "think"? Sorry.

I mentioned that I studied cognitive psychology. I often hear this ability to hold two logically contradictory opinions as simultaneously true described as "cognitive dissonance". It isn't, because dissonance implies awareness of the contradiction; cognitive dissonance is the discomfort felt when someone acts in a manner contrary to their beliefs, and it is resolved by rationalisation or justification. The ability to hold two contradictory positions simultaneously and believe them both to be true can't be described by a better word than the one George Orwell invented for 1984: "doublethink", which he defined in the book as "the power of holding two contradictory beliefs in one's mind simultaneously, and accepting both of them".

It must be obvious that anyone who can happily practise doublethink is going to be somewhat resistant to conflicting information, because they can assimilate it without rejecting what they already know provided that it disagrees with what they know to be false.

I want to finish by offering a light-hearted explanation of this phenomenon of resistance to conflicting ideas. It's called "Morton's Demon" and was first described by Glenn Morton in 2002 as a means of explaining why creationists will listen carefully to what you say and then completely ignore it.

But first, some background.

In 1871, James Clerk Maxwell suggested a thought experiment which could show how the second law of thermodynamics could be violated. Given two rooms separated by a molecule-sized door, a demon at the door could allow fast molecules to go from room A to room B and allow slow molecules to pass from B to A. This would eventually cause a temperature difference between the rooms and this difference could be exploited to do useful work. (A second idea was to have the demon only allow molecules to pass in one direction, eventually leading to a difference in pressure.) As the demon used no energy this would be a form of perpetual motion machine and the second law would be proved to be flawed. While this argument might convince someone who didn't know how the universe works, it was soon challenged on the basis that the demon would in fact use energy to observe the molecules. This is an example of how science works – if something is proposed which defies what we know then the first thing to look for is why it might be wrong.
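The sorting the demon does is easy to caricature in code. The toy sketch below is my own illustration, not real physics: molecule "speeds" are just random numbers, and a demon who lets fast molecules pass from room A to room B and slow ones from B to A makes the average speeds of the two rooms drift apart.

```python
import random

rng = random.Random(0)

# Two rooms of "molecules"; each molecule is just a random positive speed.
room_a = [rng.expovariate(1.0) for _ in range(2000)]
room_b = [rng.expovariate(1.0) for _ in range(2000)]
CUTOFF = 1.0  # the demon's arbitrary line between "fast" and "slow"

for _ in range(40000):
    # a random molecule from A reaches the door; the demon passes it only if fast
    if room_a:
        i = rng.randrange(len(room_a))
        if room_a[i] > CUTOFF:
            room_b.append(room_a.pop(i))
    # a random molecule from B reaches the door; the demon passes it only if slow
    if room_b:
        j = rng.randrange(len(room_b))
        if room_b[j] <= CUTOFF:
            room_a.append(room_b.pop(j))

mean = lambda xs: sum(xs) / len(xs)
print(mean(room_a), mean(room_b))  # room A ends up "colder", room B "hotter"
```

Of course the caricature hides exactly what sank the original argument: the demon's observations (here, every comparison against CUTOFF) are not free.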

Glenn Morton expanded the idea of Maxwell's Demon to explain the resilience of nonsensical or wrong beliefs. He was particularly concerned about young Earth creationists (he had been one himself) but his demon applies to a much wider class of people.

His demon sits at the front of the mind and filters incoming ideas, only letting in those with which the person agrees and blocking the rest. This is much more powerful than any system where the ideas are tested for compliance by the mind and then rejected – they don't even get considered in the first place. I have seen people repeat the same faulty arguments within minutes of being informed, with evidence, that they are wrong. And I do mean minutes. As these people appear to otherwise be functioning human beings who can even tie their own shoelaces it seems reasonable to infer that the counterarguments are not even being perceived, let alone being evaluated and rejected.
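The difference between the two mechanisms can be caricatured in a few lines of code (my own sketch, not anything Morton wrote): an evaluate-then-reject mind at least examines each idea, while the demon blocks disagreeable ideas at the door so they are never examined at all.

```python
def evaluate_then_reject(beliefs, incoming):
    """The mind examines every idea, then discards the disagreeable ones."""
    considered = []
    kept = []
    for idea in incoming:
        considered.append(idea)      # the idea is at least looked at
        if idea in beliefs:
            kept.append(idea)
    return kept, considered

def mortons_demon(beliefs, incoming):
    """The demon filters at the gate: disagreeable ideas never enter."""
    kept = [idea for idea in incoming if idea in beliefs]
    return kept, kept                # only agreeable ideas were ever "seen"

beliefs = {"the Earth is young"}
ideas = ["the Earth is young", "radiometric dating says otherwise"]
print(evaluate_then_reject(beliefs, ideas)[1])  # both ideas were considered
print(mortons_demon(beliefs, ideas)[1])         # only the agreeable one was
```

Both filters keep the same ideas; the difference is in what was ever considered, which is why counterarguments leave no trace at all.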

Hmm. I think I see a thesis in cognitive psychology somewhere here.

I'll give the last word to a young Earth creationist, who was commenting on a discussion of Morton's Demon by a group of Christians who were in general agreement with it, and were using it to comment on the unreasonableness of some young earthers. I think it illustrates the problem very well.

"Anyways, the whole demon thing – In my thinking demons certainly do influence people's thought. However, I don't see why a demon would have any interest in leading someone to believe in a young earth, rather than an old one. Satan and his legions have no other interest than to take God's people away from him. They hate god, and they hate his people. But if a christian believes in a young earth, in what way are they farther from Christ than a christian who believes in an old one? We are brothers and sisters in Christ, and if we have a small difference of belief, but we still share the love of Christ, than in no way has the demon accomplished the only thing he wants to do – which is to drive us away from Christ".


