Cognitive Biases and “Doing Your Own Research”, Part 1 – Decision Making, Belief and Behavioural Biases
by Christopher_NW // February 25, 2013
After the seemingly positive reception of my previous article on logical fallacies to look out for when doing your own research online, I wanted to follow up with another source of fallacious reasoning: cognitive biases. Unlike logical fallacies, which can arise from either accident or malice, cognitive biases generally arise by pure accident, so it is important to be aware that we all exhibit them. A cognitive bias can be defined as a situational deviation in critical judgement. It ultimately leads to irrational conclusions that nonetheless appear to us as quite rational.
Cognitive biases affect all areas of our lives and come in numerous types (see the Wikipedia list for a jumping-off point). However, to keep this article to a manageable length, I have accepted the challenge of picking the top five you should be aware of when doing your own research online, complete with real-world examples. Those who want to delve deeper into the evolution of cognitive bias should refer to Haselton et al. in the reference list (or search Google Scholar for “the evolution of cognitive bias”; it is the top result at the time of writing). As this is part 1 of 3, this article focuses on decision-making, belief and behavioural biases; part 2 will cover social biases, and part 3 will delve into memory errors.
Anchoring and Focusing Effects:
Anchoring and focusing effects occur when you place more importance on a single piece of information than you should when making decisions, which can make your predictions about the utility of future outcomes less accurate. Good science involves taking a new piece of information and examining how reliable the experiments behind it were: testing various hypotheses, trying to replicate the findings, and then seeing how the result fits into the current paradigm. This helps us make science-based decisions and avoids the trouble that comes from relying on a single new piece of data (for example, new data may look different simply because regression to the mean has not yet taken effect). An example is proponents of homeopathy basing their beliefs on the infamous 1988 Nature paper by Jacques Benveniste. His work apparently showed that you can still get an antibody response after diluting the antibody away, the water retaining a “memory” of it (which violates basic physics). He never offered a definitive explanation, and to this date no one has been able to replicate the results, so no one should base their beliefs on this faulty pseudoscientific claim. If you find a new claim that goes against “existing dogma”, look into the wealth of evidence around the claim.
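Regression to the mean is easy to see in a toy simulation. The sketch below is my own illustration, not from the article or the Benveniste paper: every “subject” has the same true value, yet the ones whose first noisy measurement looked extreme come back looking ordinary on retest, which is exactly why anchoring on a single striking result is dangerous.

```python
# Toy illustration of regression to the mean (author's own example):
# a striking first measurement tends to look less striking on retest.
import random

random.seed(1)

def measure(true_value, noise=10.0):
    """One noisy measurement of an underlying true value."""
    return true_value + random.gauss(0, noise)

# 1000 subjects who all share the SAME true value of 50.
subjects = [50.0] * 1000
first = [measure(s) for s in subjects]
second = [measure(s) for s in subjects]

# Select the subjects whose FIRST measurement looked extreme (top 5%).
cutoff = sorted(first)[int(0.95 * len(first))]
extreme = [i for i, f in enumerate(first) if f >= cutoff]

avg_first = sum(first[i] for i in extreme) / len(extreme)
avg_second = sum(second[i] for i in extreme) / len(extreme)

print(f"Extreme group, first measurement: {avg_first:.1f}")
print(f"Same group, second measurement:   {avg_second:.1f}")
```

The second average lands much closer to the true value of 50 than the first: nothing changed in the subjects, only in which ones we chose to look at.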
Attentional Bias:
Similar to the above, attentional bias arises when we pay greater attention to emotionally charged information than to more relevant, objective information, particularly when addressing correlation and causation. The difference from anchoring effects is that attentional bias can be based on more than one piece of information (albeit from an emotional source). It removes objectivity from a decision and replaces it with subjective reasoning. A classic example is parents who refuse to vaccinate their children based on emotionally charged stories from internet groups that spread misinformation (such as the Australian Vaccination Network, now legally ordered to change its misleading name). One of many examples on their Facebook page, from the 7th of February 2013, mentions a former nursing professor (a cheeky appeal to authority on their part) who claims her Guillain-Barré syndrome was the result of a flu vaccination, complete with an emotional YouTube video. Several mothers in the comments section then proceeded to say they will not vaccinate their children. This is completely illogical given that, however sad the story, there is no evidence beyond the correlation (which, I have to keep pointing out, does not equal causation, no matter how many tears you add to the mix). The anti-vaccination camp is truly replete with these stories that fly in the face of the statistical and scientific evidence in favour of vaccination.
Availability Cascade and the Bandwagon Effect:
I have placed these two together since they seem to feed off each other. The availability cascade is where a group idea becomes more “true” to someone through constant repetition (self-reinforcing belief). This can lead into the bandwagon effect, whereby people go along with an idea or activity simply because others are participating. I would say this easily equates with the history of Nazi Germany, but then I would risk invoking Godwin’s Law. I could also have given examples of how prevalent this is in the anti-vaccination camp, but to spice things up let’s look at an example from the world of chiropractic. I won’t go into detail debunking the mostly bunkum-filled world of chiropractic (that has been done nicely on the Science-Based Medicine blogs in the references), but a great deal of information given out by some chiropractic groups rests on vertebral subluxation theory. Without going into too much detail, this theory (which is scientifically indefensible) states that correcting misaligned vertebrae will heal organic disease. Thanks to the availability cascade, this belief system has been taught continually to chiropractic students since its inception in the late 19th century, in spite of widespread contradictory evidence. As more and more students fall for the bunkum taught by their “experts”, more of the public go along with it. If this does not seem so bad, just look at the consequences of people using chiropractic therapy for more than mechanical neck and back pain: there are cases demonstrating a risk of stroke and other injury (see Jeret JS and Bluth M in the references). The lesson is really to question why you believe certain things.
Observer Expectancy Effect:
This is something the peer-review system in science is supposed to weed out. The observer expectancy effect arises when a researcher unconsciously (we hope it’s not deliberate) manipulates or misinterprets data or experimental design to fit their expected results. Sometimes, due to human error, the peer-review system fails us, and we get published data such as the recent French study led by Gilles-Eric Séralini, which concluded that consumption of GM maize by rats led to an increase in tumour formation. There were so many flaws in the study design, including the use of rats with a predisposition to tumour development and a sample size that was too small, that France’s six scientific academies called it “a scientific non-event”. In this case, a red flag for readers should have been that the author was an avid anti-GMO campaigner and therefore biased from the start (note, this does not necessarily make it impossible for him to design a good study). This highlights that just because you can find and cite a peer-reviewed scientific article, it still helps to make sure it is valid. For more information that is easy to digest in the stomachs of non-scientists, please see http://www.senseaboutscience.org/resources.php/9/making-sense-of-gm.
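The "sample size too small" objection is not just a slogan, and a toy simulation makes the point. The numbers below are my own illustrative assumptions, not data from the Séralini study: both groups are drawn from the same underlying tumour rate, so any difference between them is pure chance, yet with tiny groups large "differences" appear all the time.

```python
# Toy illustration (author's own numbers, NOT the Séralini data) of why
# tiny sample sizes produce spurious "effects": both groups share the
# SAME underlying tumour rate, yet small groups often differ by a lot.
import random

random.seed(42)

TRUE_RATE = 0.5   # identical in both groups, so any gap is pure noise
TRIALS = 2000     # number of simulated two-group experiments

def group_count(n):
    """Number of 'tumours' observed in a group of n animals."""
    return sum(random.random() < TRUE_RATE for _ in range(n))

def spurious_gap_rate(n, gap=0.3):
    """Fraction of experiments where two IDENTICAL groups of size n
    differ in observed tumour rate by at least `gap`."""
    hits = 0
    for _ in range(TRIALS):
        a, b = group_count(n), group_count(n)
        if abs(a - b) / n >= gap:
            hits += 1
    return hits / TRIALS

print(f"n=10 per group:  {spurious_gap_rate(10):.0%} show a big 'difference'")
print(f"n=200 per group: {spurious_gap_rate(200):.0%} show a big 'difference'")
```

With ten animals per group, a 30-percentage-point gap between identical groups shows up in roughly a quarter of experiments; with two hundred per group it essentially never does. A researcher expecting an effect has plenty of chance noise to "find" it in at small n.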
The Semmelweis Reflex:
This one is named after one of my favourite clinicians in history, Dr Ignaz Semmelweis, who discovered that it is good to wash your hands between contact with infected and uninfected individuals (this was nonsense to his colleagues at the time). The Semmelweis reflex is when we reject something that flies in the face of an accepted paradigm. Funnily enough, this is exactly what proponents of “mainstream medicine” are accused of by the alternative medicine brigade, since most doctors and scientists refuse to accept the benefits of things like naturopathy and homeopathy. The reason the reflex does not apply to us (the scientists and doctors) is the paucity of evidence for the claims made by these groups. I shall look to the naturopathic community for an example of the Semmelweis reflex in the real world. The Textbook of Natural Medicine, 4th Edition (2013) teaches naturopathy students about the four humours: blood, phlegm, black bile and yellow bile. The idea, developed by the ancient Greeks, was that an imbalance of any of these humours causes disease. With the advent of modern medicine we now know this is just plain wrong; however, this seems not to get through to the naturopaths, who are still taught according to the old principles. The humoral theory of disease is their paradigm, and science-based medicine (the field, not the website) is what they reject. This particularly concerns me given that in America so-called “Naturopathic Doctors”, or NDs, are trying to get widely licensed as primary care physicians. How can they hope to provide adequate care to patients when they refuse to accept the thoroughly tested principles of modern medicine? To oppose this licensing, please go to http://www.no-naturopaths.org/.
A picture of Semmelweis’ statue I took during my time at the
University of Heidelberg’s Medical School in December, 2012.
Now, I should point out that I do not believe myself to be any less biased than anyone reading this (thinking you are less biased than others, or better at spotting biases, is called the bias blind spot), and I am well aware that we will all continue to partake in the bias cake. These biases have served a purpose; evolution produced them for a reason (again, see Haselton MG et al. and look deeper). However, I fervently believe that we should try to be more aware of them and to reduce them, especially when they interfere with objective reasoning about our world. When I do my research, I often try to see things from the perspective of any credible opposing view (you do not need to waste your time investigating nonsense theories from moon-landing denialists and other tin-hat warriors) and try to work out what is wrong in my own reasoning. I hope you will do the same, and I hope this article has highlighted the cognitive biases present when making decisions, forming beliefs and analysing our behaviour and that of others. Part 2 (coming soon) will look at how social cognitive biases affect doing your own research. Happy hunting.
Edited by Stephanie Haggarty.
Haselton MG, Nettle D, and Andrews PW. The evolution of cognitive bias. The handbook of evolutionary psychology, 2005:724-746.
Jeret JS and Bluth M. Stroke following chiropractic manipulation. Cerebrovascular Diseases, 2002;13:210-3.