The internet has allowed unprecedented access to information, which, on the whole, has been a marvelous thing. Gone are the days when only those who knew Latin could access and understand information; gone are the days when you had to spend an inordinate amount of time in the library trying to find the right book. Now, at the press of a button, we can usually find what we are after, or at least what we think we are after. However, given the immense bias present in online articles, being able to critically analyze information so that one can sift through the quackery and bunkum is an important skill to develop and refine.
Scientists such as myself are trained to analyze and critique information; however, you don’t need a formal degree to attain a level of erudition that makes you capable of critically reading the information Dr. Google and Prof. Wikipedia give you. Part of critical analysis involves identifying logical fallacies, which philosophy defines as structural flaws in deductive arguments (those reaching conclusions from general statements) that render the argument invalid. However, for our purposes we will define a logical fallacy as any argument that becomes problematic for a given reason. Whilst we have all been guilty of logical fallacies, if we can limit how often we commit them we are one step closer to being able to critically analyze and produce information, and to living a more educated and informed existence.
Some of the most common logical fallacies perpetrated online include appeal to authority, ad hominem, ad ignorantiam, the straw man, kettle logic, nirvana fallacy, cum hoc ergo propter hoc, non-sequitur, plurium interrogationum and the fallacy fallacy. I used Latin because it looks fancy and makes me look smarter (this is a cheeky attempt at showing an informal fallacy). However, fear not, as below are the aforementioned fallacies in more detail with an example for each.
Appeal to authority
An appeal to authority is when someone is agreed with because of their position in society or their level of education. I see this a lot, and have noticed we have a natural attraction to it. It can be quite a complex fallacy to deal with because there are areas in which it appears legitimate. Generally speaking, you accept the medical advice your doctor gives you because you trust in their formal and regulated medical training. However, it would be wrong to assume that the doctor (or whoever) is always right. This is especially true when listening to a highly qualified individual talk about a topic that is not in their field. It’s like I’ve always said: a Nobel Prize in Medicine (or any other field) does not make someone an expert on all of medicine, just their specialization. This leads perfectly into a subsection of this fallacy, argumentum ad verecundiam (irrelevant appeal to authority), and so to show you an example of both fallacies together: Linus Pauling, PhD, was a brilliant scientist who won two unshared Nobel Prizes, one in Chemistry (1954) and the Peace Prize (1962). However, he went on to make large, unsubstantiated claims that mega-doses of vitamin C prevent and cure cancer. Having a PhD and being a brilliant Nobel laureate did not make him an expert on medical science and cancer. It should be noted that while current, large-scale meta-analyses disprove the effectiveness of mega-dosing vitamin C for cancer therapy, even if he had turned out to be right, his reasoning would still have been fallacious, because at the time he made those claims there was no evidence backing them up.
Ad hominem
The Latin translates as “to the man”. The fallacy arises when one attacks the person making the argument rather than the argument itself. This would be like discrediting a claim because the arguer has a criminal conviction. An even more primitive form of this would be simply calling Peter an idiot, which is not actually a logical fallacy unless you said “Peter is incorrect because he is an idiot”. Ad hominem is rampant in politics; one example dates back to 1800, when Thomas Jefferson was deemed “an uncivilized atheist, anti-American, a tool for the godless French” by proponents of John Adams, the opposing presidential candidate. This is quite an easy fallacy to watch out for, but due to human emotions it can be difficult to control (at least for some).
Ad ignorantiam
An argument from ignorance is where one assumes a claim to be true (or false) because it has not been proven otherwise. A red flag to look out for is when the following is written in one form or another: “there is so much we don’t know about X, how can they say it is false!?”. Dan Buzzard’s blog gives a great example, where pushers of homeopathy state that in time quantum physics will likely deliver the proof that homeopathy works. This is going along with something in the absence of proof. It can also be the case that an individual is ignorant of the evidence for something, and therefore, to them, thing X is false because they have seen no proof of it. An example of this is people who claim that vaccines do not work, and germ theory denialists (yes, I was shocked that they exist too); Bill Maher fits perfectly into both groups. Dr. David Gorski pointed out that in 2005 Bill Maher said, “I don’t believe in vaccination either. That’s a… well, that’s a… what? That’s another theory that I think is flawed, that we go by the Louis Pasteur theory, even though Louis Pasteur renounced it on his own deathbed and said that Beauchamp(s) was right: it’s not the invading germs, it’s the terrain. It’s not the mosquitoes, it’s the swamp that they are breeding in”. Bill seems oblivious to the fact that Pasteur’s supposed deathbed renunciation of germ theory is a hoax, and no solid evidence exists that contradicts germ theory or the effectiveness of vaccination. Here Bill is pleading from his own ignorance, which does not make his claim any truer.
Straw man logic
One creates a straw man when they construct a similar but unequivalent position to an original argument, then refute that constructed argument in order to break down the original (often whilst ignoring key points of the original argument). A simple example will help: Ann says antibiotics are good; Peter says, “if all antibiotics were good, we would have no antibiotic resistant bacteria, and because of antibiotic resistant bacteria we have new superbugs!”. Here Peter incorrectly claims Ann wishes to imply that all antibiotic use has had positive effects; Ann never made a comment about antibiotic misuse. Another example, this time from real life, should cement what a straw man fallacy is. In 2005, President George W. Bush stated (in relation to troop withdrawal from Iraq), “We’ve heard some people say, pull them out right now. That’s a huge mistake. It’d be a terrible mistake. It sends a bad message to our troops, and it sends a bad message to our enemy, and it sends a bad message to the Iraqis”. Saying that unnamed people want troops immediately out of Iraq exaggerates the opposition’s viewpoint; here Bush is what some of us, if we are being cheeky, like to call a “straw man warrior”.
Kettle logic
Kettle logic occurs when someone uses a variety of inconsistent and incongruent points to defend their argument. The name comes from a story in Sigmund Freud’s publications, in which a man accuses his neighbour of returning his kettle in poor condition. The accused argues three points: that the kettle was returned undamaged, that it was already damaged before he borrowed it, and that he never borrowed the kettle in the first place. What is obvious is that the argument would be stronger with only one of the three points being used. During George W. Bush’s Iraq war, Vice President Dick Cheney gave a classic example of kettle logic when he argued that the intelligence used to invade Iraq had been sound and accurate; that the faulty intelligence was all Bill Clinton’s fault; that the invasion didn’t do any damage, but rather it was the Iraqis who damaged Iraq; and that any invasion causes horrific things to happen, which just comes with the territory. Identifying kettle logic is generally quite easy, but it can get trickier to spot in lengthy discussions, so keep sharp and try to keep track of the points being made in an argument.
Nirvana fallacy
This has nothing to do with rock ’n’ roll; rather, the nirvana fallacy arises when one rejects solutions to problems because they are not perfect. Whilst it is good to aim for perfection, in the real world you will seldom (if ever) find a perfect solution, and so you should settle for what is practicable. Additionally, it is good to identify shortcomings in research, products, etc., but that does not mean the idea itself should be outright rejected. The nirvana fallacy crops up all the time when reading about advances in biotechnology and pharmaceuticals (including drug and vaccine development). A perfect example comes from proponents of the “Green our Vaccines” movement from 2008, where one commentator said, “I believe in vaccines…..but I don’t believe they are 100% safe. Until they prove it to me, my child is having no more”. The fact of the matter is that while vaccines are not 100% effective and there is a very, very minor risk of an adverse effect (for instance, around 1 in a million who get the MMR vaccine will experience a severe allergic reaction), the risk is offset by the benefits to both yourself and to others via herd immunity. Measles kills 1 per 1,000 infected, and up to 20% of cases involve severe illness. Demanding that things like vaccines and drugs be 100% safe before use is lunacy, just as it would be if I demanded that, unless playgrounds and parks were 100% safe, I would never take children there (in the USA around 200,000 playground injuries occur per year).
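For readers who like to see the numbers, the risk comparison above can be sketched in a couple of lines (a back-of-the-envelope illustration only, using the rough figures quoted in this paragraph, not a clinical analysis):

```python
# Rough rates quoted above (illustrative, not a clinical risk model)
mmr_severe_reaction_rate = 1 / 1_000_000   # severe allergic reaction per MMR dose
measles_death_rate = 1 / 1_000             # deaths per measles infection

# How many times more likely is death from measles than a severe
# allergic reaction to the vaccine, on these figures?
risk_ratio = measles_death_rate / mmr_severe_reaction_rate
print(round(risk_ratio))  # 1000
```

Even this crude comparison shows why "not 100% safe" is a meaningless bar: the imperfect option is orders of magnitude safer than the disease it prevents.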
Cum hoc ergo propter hoc
This is not only fun to say, but an important fallacy. The Latin translates to “with this, therefore because of this”; in other words, assuming that correlation proves causation (which, as anyone who has taken a basic science or statistics course will tell you, is a poor assumption). Just because two variables occur together does not mean there is cause and effect. Statisticians like to use the ice cream analogy to demonstrate this: lots of ice creams are sold at the beach; people who go to the beach are at higher risk of developing skin cancer; therefore ice cream causes skin cancer. In my opinion, the best real-life example of this comes from the anti-vaccine movement (yes, I am being hard on them, but only because they are dangerous to public health), particularly the famous Andrew Wakefield case, in which he fraudulently created data showing that vaccines cause autism. Whilst this has been thoroughly debunked (see Brian Deer’s page in the references), there are pockets of resistance who fail to see, or don’t want to see, that they are partaking in cum hoc ergo propter hoc. As an aside, children tend to exhibit signs of autism around the same time the vaccine schedule occurs, and when looking at non-vaccinated children there are no statistically significant differences in autism rates.
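The ice cream analogy lends itself to a toy simulation (all numbers below are made up, and I have swapped skin cancer for a quicker-onset stand-in, sunburn, purely for illustration): a hidden common cause, hot weather, drives both variables, producing a strong correlation between two things that have no causal link at all.

```python
import random

random.seed(0)

days = []
for _ in range(10_000):
    hot = random.random() < 0.5                      # the hidden common cause
    ice_creams = random.gauss(50 if hot else 10, 5)  # sales driven by heat
    sunburns = random.gauss(20 if hot else 2, 2)     # sunburns driven by heat
    days.append((ice_creams, sunburns))

# Pearson correlation computed by hand (no external libraries)
n = len(days)
mx = sum(x for x, _ in days) / n
my = sum(y for _, y in days) / n
cov = sum((x - mx) * (y - my) for x, y in days) / n
sx = (sum((x - mx) ** 2 for x, _ in days) / n) ** 0.5
sy = (sum((y - my) ** 2 for _, y in days) / n) ** 0.5
r = cov / (sx * sy)
print(round(r, 2))  # strongly positive, yet ice cream causes no sunburns
```

The correlation coefficient comes out well above 0.9, despite the code containing no causal path between ice creams and sunburns; the confounder alone does the work.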
Non sequitur
Simply translated as “it does not follow”, a non sequitur is when an argument’s conclusion does not follow from its premises. This can, like many fallacies, occur quite naturally and without malice. For instance, you may experience some confusion midway through an argument and conclude with something that makes no sense with respect to what you were talking about. This would be like if I told you, “heroin is addictive, addiction is an illness, so when you leave home you are at risk”. In this example, the conclusion (leaving home being risky) has no tie to the start of the argument. I have not said that leaving home as an addict to score heroin is riskier than staying home and not doing heroin (which would be more logical). A good real-life example is in advertising, particularly deodorants. Lynx ads, for example (and I am not saying Lynx deodorant is not nice, I love it), frequently show men applying deodorant, the deodorant smelling nice, and this attracting women to the men. The fallacy is the implication that by using Lynx you will attract women, obviously negating the effects of personality, physical attraction, etc.
Plurium interrogationum
This is the fallacy of many questions (or “loaded” questions). A favorite of politicians and radio shock jocks, a loaded question is one in which the arguer incorporates a presupposition designed to limit responses. I actually experienced this a few weeks ago (with a vaccine denier, no less), when she emailed me asking, “if Big Pharma knows that mercury is toxic why do they put it in vaccines?”. Here she loads the question with the presupposition that so-called “Big Pharma” puts toxic mercury into vaccines, which can limit an uneducated person’s response (I mean uneducated in terms of one’s knowledge about vaccine formulation). In actuality, elemental mercury, which is different from a molecular compound containing mercury, is not put into any vaccines, and in both cases toxicity is dependent on dose (fruit, for example, contains trace amounts of formaldehyde, but the amount is so small it causes no harm). Another example was when my friend contacted me to ask, “why do governments put fluoride in our water when they know it’s a poison?”. After reading the previous example, I am sure you can see where the plurium interrogationum comes into play (the presupposition that an authority “knows” the ingredient it is using is toxic, when in fact this is false). Note: I was happy my mate asked me about this before posting and spreading any nonsense; I gave him all the information debunking the anti-fluoridation movement.
The fallacy fallacy
I thought I’d end on this one because it’s quite amusing. Just because you find an error in the logic of someone’s argument, it does not make their conclusion false. This would be like if I claimed that “Australian Aboriginals cannot absorb vitamin D from sunlight due to their dark skin, so it is important that they maintain healthy levels of this vitamin”. The incorrect information here is two-fold: firstly, you don’t get vitamin D directly from the sun; your body synthesizes it after exposure to the sun’s UVB light. Secondly, dark-skinned individuals can absorb UVB for the aforementioned vitamin D synthesis; due to their pigmentation, the levels are simply lower when compared to Caucasians. However, the conclusion that Aboriginals should maintain healthy levels of vitamin D is still correct; low levels increase the risk of bone dysfunction and the development of atopy and allergy (documented in various studies; see my publication in the references for an overview).
This is not an exhaustive list of the fallacies you will encounter (I would need to write a book to cover them all in detail); however, I have found these to be the most prominent. Enthused readers are encouraged to Google “logical fallacies” and “list of fallacies” (in addition to reading philosophical works, but remember, keep a keen eye out for fallacies!). Also avoid becoming a fallacy champion yourself. Happy hunting.
References
Weir C and Walton S. The interplay between diet and emerging allergy in Indigenous Australians: what can we learn from Indigenous Australians? Int Rev Immunol 2012, 31:184-201.