Thomas Kuhn was among the leading figures in the philosophy of science during the 20th century (alongside other greats such as Karl Popper and Ludwig Wittgenstein). He popularized the use of the word paradigm, though to his dismay many people have misinterpreted his work (he even said that he was “much fonder of my critics than my fans”). His 1962 book The Structure of Scientific Revolutions is among the most cited books of all time and a must-read for philosophers, scientists, historians of science and anyone with an interest in the process of science. To summarize Kuhn’s theory:
1. Interest in something occurs.
2. Many schools of thought emerge to create an immature science.
3. A paradigm emerges; normal science is born and proceeds in a puzzle-solving fashion.
4. A dogma emerges around what fits into the known theory.
5. Discoveries reveal holes in the dogma and anomalies are unveiled.
6. A crisis point for the theory and a potential end for the “normal science” in question emerges.
7. Alternatives are sought out in the face of scrutiny.
8. Evidence mounts to allow for a scientific revolution and a paradigm shift occurs.
It is important to note that the crux of Kuhn’s idea was that science doesn’t just move forward bit by bit, but requires revolutionary ideas and theories to emerge and, after standing up to scrutiny, cause an upheaval of the field altogether. Throughout the history of science many revolutions have occurred to change fields: the Copernican revolution and Galileo uprooting Ptolemy’s school of thought (showing that the earth was not at the center of the universe), quantum mechanics displacing Newtonian mechanics, the establishment of aseptic technique and the germ theory of disease, and discoveries including those of the atom and of DNA. Here I want to explore the top 5 Kuhnian revolutions and paradigm shifts in the history of malaria as identified by Jan Peter Verhave (see reference below). I’m writing about this firstly because it is an interesting topic, secondly because there are historical lessons to be learnt, and finally because I study and research malaria, so I find it cool and wish to share some of this information with you. If you want a rundown of the life cycle of malaria, which in itself is quite remarkable, see my article on it here.
1. The replacement of miasma theory with the malaria parasite.
The word malaria actually comes from the Italian mal’aria, meaning “bad air”. This stemmed from the original idea that malaria was caused by noxious air like that found around swamps (when in fact the link with swamps was due to them being great mosquito breeding grounds). Interestingly, many different diseases that presented with a fever were lumped under the term malaria. The idea of miasmas pervaded much of medical and common thought at the time. However, starting around 1879 change began to occur bit by bit: Klebs and Tommasi-Crudeli found that marsh water contained microbes, which they believed to be the bacteria responsible for malarious diseases, and named the organism Bacillus malariae (see Carter’s book in the references). Doctors also found pigments in the blood of patients who had died from malarious fever; however, all of this was still piecemeal elucidation. The revolution came when Dr. Alphonse Laveran discovered parasites in the blood of patients who had malaria. He worked on many protozoal blood-based parasitic diseases, and for his demonstration that parasites in human blood, rather than “bad air”, caused these diseases, he not only overturned the old way of thinking (the old paradigm) but was also awarded the 1907 Nobel Prize in Medicine. Additionally, he has the honor today of having the subgenus that malaria parasites belong to named after him, Laverania. Kuhn’s idea of a paradigm shift is demonstrated perfectly in this example, whereby the newer way of looking at malaria was completely incompatible with the old way, and paved the way for much advancement in both the study and treatment of the disease.
2. Anopheles mosquitoes recognized as vectors of malaria parasites.
The year 1898 saw the publication of Dr. Ronald Ross’ discovery showing that malaria parasites were transmitted via the bite of mosquitoes (his experiments used birds; yes, they can get malaria too, as can many animals). This was a revolutionary finding, since before Ross’ discovery (one which netted him a knighthood and the 1902 Nobel Prize in Medicine) many had believed that female mosquitoes died after they laid their eggs and simply could not transmit a parasite. To further complement this finding, a zoologist by the name of Giovanni Battista Grassi proved that only Anopheline mosquitoes could transmit human malaria. Together, these findings shifted the paradigm of the time towards a mosquito-human-parasite way of looking at malaria, which allowed the development of the idea that controlling mosquitoes could play an important role in preventing cases of malaria. Which leads us to…
3. Mosquito control, DDT and eradication.
Looking at malaria from an entomological perspective, and not just from a medical point of view, allowed people to consider controlling mosquito populations. However, a widespread program could not be realized until after 1939, when Paul Müller discovered that DDT was potently insecticidal. (Fun side fact: DDT was originally synthesized in 1874 by a student chemist named Zeidler.) Various programs existed, including the use of selective pyrethrum spraying, but arguments over the best techniques and locations for spraying persisted, and piecemeal puzzle-solving involving Anopheles subspecies and types was still being worked out. This is the typical path of most science: solving bits and pieces of the problem along a fairly large timescale, which both proceeds from and precedes a scientific revolution. DDT not only replaced pyrethrum, but also became the centerpiece of the World Health Organization’s idea for global eradication of malaria. This was due not only to DDT’s potency against mosquitoes, but also to its low cost, long-lasting effect and the infrequency of applications required. It began to have such widespread success alongside the use of chloroquine (an anti-malarial drug) that Müller was awarded the 1948 Nobel Prize in Medicine. However, from this story there are three lessons to be learnt. Firstly, as the world approached eradication of malaria and saw fewer and fewer cases, the disease was no longer seen as important, and so a lot of funding for anti-malarial programs was diverted, which allowed for a resurgence all over again (a lesson I hope the government of Sri Lanka appreciates). The lesson here is to not celebrate too early and to see programs through to their actual conclusion, lest it become a costly and harmful mistake. Secondly, it slowly emerged that DDT is environmentally devastating, which carries lessons regarding controlled use of the compound and more research into novel and environmentally friendlier insecticides.
Finally, an ethical issue emerged where there were fears over loss of herd immunity due to DDT house spraying interfering with parasite transmission (natural immunity has been seen in adults in endemic areas so long as they do not leave the area, as the immunity is short-lived). On the other hand, it was argued that lower child mortality was worth the potential price of adult immunity. Whilst this ultimately didn’t seem to develop into a problem, it highlights the lesson that one has to be aware of potentially serious ethical issues when trying to eradicate a disease.
4. The discovery of laboratory-based culturing of malaria parasites.
Until the 1970s, people studied malaria in patients or animal models. These systems had many benefits for developing the normal course of scientific knowledge about malaria (and other diseases), and were also being used to understand the disease, test drugs and pursue efforts towards a malaria vaccine; efforts which ultimately failed. However, in 1976 William Trager and Jim Jensen elucidated the means of culturing Plasmodium falciparum parasites (the most lethal type) in human blood in the laboratory, termed in vitro culturing. This shifted the paradigm at the time away from animal model studies, so that lots of parasite material could be gathered for vaccination research. Ironically, there has now been a shift back to other models to study malaria with an eye for vaccine efforts, since previous efforts failed to yield an efficacious vaccine. However, the shift back has not been as revolutionary as the initial one, because malaria vaccine research is now so multi-disciplinary, utilizing blood-cultured parasites, molecular biology techniques, genetics, bioinformatics, structural biology and animal models. One hopes that during this century a revolutionary idea regarding novel vaccination approaches emerges, or that the puzzle-solving style of most research allows for the genesis of an effective vaccine.
5. The use of combination drug therapy for treating malaria.
The story of antimalarial pharmacotherapy is quite an interesting one, with essential research accidentally giving birth to part of the textile industry. Allow me to explain. Quinine was the active compound derived from the cinchona tree that proved to be an effective antimalarial. However, vast plantations had to be utilized to provide enough of this compound, and so with the emergence of organic and medicinal chemistry in the 19th century, a challenge arose to synthesize quinine. In 1856, William Perkin, at only 18 years of age, took up the challenge. During his attempts at synthesis, he accidentally produced a mauve-coloured product. He went on to patent this dye, and by 1870 the demand for this colour amongst the wealthy was enormous, giving rise to a boom in the textile industry. Also of note is that in 1887 Czeslaw Chęciński found that a dye known as methylene blue and the chemical eosin allowed one to visualize Plasmodium parasites with remarkable clarity in blood smears. Then in 1891 the great Paul Ehrlich (who went on to share the 1908 Nobel Prize in Medicine for research into immunity) found that methylene blue could actually kill Plasmodium parasites. Unfortunately it was not sufficiently better than other anti-malarials at the time, and so never took off (see Krafts et al. for more information). Ultimately the synthetic compound known as chloroquine became the staple of malaria pharmacotherapy; however, problems such as drug resistance became more and more prevalent, followed by parasite resistance to other emerging drugs. Then came the discovery of artemisinin by researchers in the Chinese army in 1972; however, due to political issues, outside scientists were not widely aware of this new anti-malarial until the late 1970s and early 1980s. The revolutionary discovery was the finding that artemisinin (a fast-acting drug) could be combined with other, slower-acting drugs.
The originality came from the idea of a combination-based approach to treatment rather than single drugs; the existing paradigm, even in the development of new drugs, had been to use only one drug as a monotherapy. Artemisinin combination therapies are now the front-line treatment option for malaria around the world, and are also effective in certain cases of complicated malaria.
To conclude, I will take a line from Verhave’s article (which inspired this piece):
“Real paradigm shifts in the Kuhnian sense are upheavals of perceptions of the disease, and consequently of the approach to control, so much so that post-shift scientists have difficulty imagining how they worked before the change”.
I eagerly await the next revolution in my field, and hope to have my paradigm shifted many times in my lifetime, moving further and further towards a more complete understanding of how to eradicate this scourge of humanity. I do think it will involve a massive coordination of skilled people and resources, in addition to engagement with professionals in fields seldom linked to malaria. This includes further engagement with the business community, as they have the resources and power to help with malaria eradication; they just need to be made more aware that said eradication would serve to benefit them, as there are economic benefits to be reaped.