Following on from part 1 of cognitive biases to look out for when doing your own research, I now want to go into the social side of cognitive biases. Social bias, also known as attributional error, occurs when we unwittingly or deliberately give preference to (or, alternatively, look negatively upon) certain individuals, groups, races, sexes etc., due to systematic errors that arise when people try to develop a reason for the behaviour of certain social groups (or what we believe is that group's behaviour). These prejudices are exhibited by all of us at some point, so whilst their elimination is arguably not possible, we should still aim to reduce our social bias. Furthermore, we should be able to identify when a social bias is manifesting itself, particularly when people are using mediums like the internet for their own research into an area. Below are some of the most common social biases (with real examples), which often lead to fallacious thinking.
I am going to start with both my favourite and what I believe to be a very common attributional bias. The Dunning-Kruger effect occurs when incompetent and unqualified people believe their expertise on a topic to be greater than that of appropriately qualified people in positions of genuine expertise. Ehrlinger et al. phrase it perfectly:
“People are typically overly optimistic when evaluating the quality of their performance on social and intellectual tasks. In particular, poor performers grossly overestimate their performances because their incompetence deprives them of the skills needed to recognize their deficits”.
I think the best example of the Dunning-Kruger effect in action is exhibited by Meryl Dorey, the former head of the deceptively named Australian Vaccination Network (recently ordered to change their fallacious name). Despite having absolutely no medical or scientific qualifications, she constantly gives anti-vaccine advice and has claimed to be "Australia's leading expert on vaccination". See the short video in this hyperlink for a hilarious example. The lesson here is twofold: check the qualifications of anyone making odd claims, and even if they are appropriately qualified, check that what they are claiming has been peer-reviewed, tested and verified. If all of your "study" takes place on crank sites and in publications written by tin-hat warriors, then your level of understanding of the topic could end up worse than it was before you started.
There is even an online meme called "Public misinformation Meryl" that demonstrates the invalid claims on public health and vaccines which drive the fringe group known as the AVN.
Zuckerman et al. describe the egocentric bias as the "tendency to see oneself as both cause and target of another person's behaviour". This social bias can also be defined as an individual claiming more responsibility for the results of a joint project or action than an outside observer would give them. A real-life anecdote I can give you on the latter definition occurred when a professor I knew from a very prestigious research institute took most of the credit for a beautiful Nature paper published a few years ago (no hyperlink, to preserve anonymity) despite contributing only ~10% of the work, additionally claiming that the other professor (whom I also knew) was his "post-doc". For those not in science, a post-doc (short for post-doctoral researcher) is a researcher who has recently completed their Ph.D., whereas a professor holds one of the most senior academic positions at an institution such as a university. Calling a professor a post-doc to make yourself look better is the height of disrespect, and claiming a high degree of credit that you are not entitled to is specifically known as a self-serving bias, since it can often help the person committing it (e.g. in their career). A good example of Zuckerman's definition of egocentric bias is the 1977 experiment by Ross, Greene and House at Stanford University, where they asked students to walk around campus with a sign that had "repent" written on it. Of the roughly 50% who agreed to do it, most thought others would also agree to participate, and of the roughly 50% who refused to be a part of the experiment, most thought others would likewise refuse. To avoid re-inventing the wheel, or re-writing the written, I will simply refer you to an excellent PsyBlog article on overcoming the egocentric bias (http://www.spring.org.uk/2012/05/how-to-overcome-the-egocentric-bias.php). In scientific publications, this bias is often avoided when journals provide an author contribution list and description.
The false consensus effect occurs when people think others agree with them more than they actually do. It is a cognitive bias that has developed to help us believe our views and beliefs are deemed normal by the majority, which, of course, is a consensus that can be entirely false. It leads to potentially fallacious thinking because the result is often that you start to think those who don't hold your beliefs are wrong or defective in some way. Extremist religious groups are continually guilty of the full gamut of the false consensus effect, and I cannot think of a better example than the disgusting Westboro Baptist Church in America (I won't attach a link, since I don't want them to have increased web traffic). On the home page of their site they claim their gospel message is your last hope. This "message" includes picketing military funerals, praising God for Hurricane Katrina and telling homosexuals (whom they call "fags") that it is good when they die. They see themselves as the real majority who fully understand religious "truth" (and believe that all real Christians will follow their hate-filled teachings). This fuels their ideals and leads them to believe that those who disagree with them are wrong, since they think they have God on their side. When doing your own research, a good way of avoiding the false consensus effect is to seek out views contrary to your own and put your critical-thinking hat on.
In-group bias occurs when we give preferential treatment to people in our own group, which could include our family, social circle, research group or those of the same religion, to name just a few examples. For science to be successful as a way of making sense of the world around us, we need to be capable of seeing the benefits as well as the limitations in another group's work, as this allows the spread and dissemination of knowledge. In-group bias is partly to blame for the completely preventable deaths of children living in an anti-vaccination environment, including those whose parents believed Andrew Wakefield's fraudulent data. People in dangerous groups such as the AVN, Generation Rescue and the Taliban keep the misinformation within their group flowing, as they give preference to the beliefs held by the group and its members rather than the sound and correct medical advice from outside groups, including the CDC. The result is an increase in preventable childhood illness and subsequent deaths as herd immunity is shattered. So next time you find yourself agreeing with something that comes out of one of your groups, think carefully about why you are in agreement. Is there adequate scientific evidence and reasoning present? For more information on the evolution of in-group bias, please see the hyperlink here.
The self-serving bias is the very human trait of claiming heightened responsibility for successes rather than failures. We all naturally want to be the one responsible for the great scientific breakthrough and not the scientific disaster, for example. This can even take place in a group, where it is called the group-serving bias. Anything positive is attributed to personal or internal factors, and anything negative is blamed on extraneous variables (such as another, competing group). A good example of this is in politics. In 2011 here in Australia, Julia Gillard (our Prime Minister, for now) blamed the opposition leader, Tony Abbott, for an increase in asylum seekers arriving by boat (I want to make it clear that this is not illegal), since he refused to let new migration laws through parliament. This was despite 2011 being a year in which there was a 39% increase in violence in Afghanistan, a rise in civilian deaths in Iraq from the previous year, and over 3 million people in Somalia affected by famine (see Maxwell and Fitzpatrick in the references). These are just a few examples of why there would be an increase in people seeking asylum by various means, including by boat. At around the same time, she claimed success in various areas, including the APEC summit. The lesson here is to always be aware of where a success was truly born before applauding, and to ask whether that success is legitimate. Alternatively, ask and investigate whether claims that another group or person caused the problem are legitimate.
As is often the case, many of these social biases occur organically and not out of malicious intent. You should still be aware of them, as the results can be just as bad (and sometimes worse). I doubt many of the people in fringe groups such as the AVN really want to cause harm to children and others, but that won't stop harm from occurring. Remember the old proverb "the road to hell is paved with good intentions"? Many of the members of these groups truly want to help the wider community, but this will forever be hampered by a combination of their cognitive biases and logical fallacies in overdrive. As for the leaders of such organizations, personal gain may indeed be a driving force, and this is why people looking into these organizations need to be aware of how social biases and other cognitive biases manifest themselves. Next time I will conclude this three-part series with the final type of cognitive bias: memory errors. Happy hunting in your research, all.
Edited by Stephanie Haggarty.