Written by Rebecca Varghese
My bedroom floor was covered in spiders and candies the weekend before Halloween. I was helping my cousin put together classroom goody bags. Despite the festive wrappers and plastic creatures, the collection did not scream “Halloween” to me. There were no Reese’s peanut butter cups, the most valuable candy on the playground for weeks after Halloween when I was her age. She couldn’t bring anything containing peanuts because of the rising rate of peanut allergies. Hearing about all the deprived kids in my cousin’s class got me thinking: why are peanut allergies so common now, when my own class growing up had only one student with the allergy?
We live in a developed world. Our understanding of illness and treatment is far ahead of that in developing countries, yet we seem to have higher rates of allergies and asthma. We have learned to equate sterility with healthiness. Many parents, for example, opt to have fewer children and raise them in sterile bubbles. We sterilize everything from dropped pacifiers to vegetables. We treat almost every ailment with antibiotics, even viral infections that antibiotics cannot touch. While precautionary steps have saved countless lives, has our obsession with sterility gone too far?
The hygiene hypothesis, originally proposed by Professor David Strachan at the University of London in 1989, states that the “lack of early exposure to infectious agents, symbiotic organisms (such as the gut flora or probiotics), and parasites may increase susceptibility to allergic diseases by suppressing the natural development of the immune system”. Strachan was studying the relationship between the risk of developing hay fever and family size when he noticed an inverse relationship between the two variables. He suggested that “infection in early childhood transmitted by unhygienic contact with older siblings” prevented allergies in younger siblings. Although he admitted that this hypothesis was “too vague” to become a new scientific paradigm, many researchers further explored the correlation between early exposure and immunity and found positive results.
When we are raised in this sterile bubble, our body is never given the opportunity to learn to defend itself against harmful agents at the age when adaptation is quickest. Instead, it overreacts to harmless substances like pet dander, grass pollen and peanuts, treating them as threats. Common protective responses include a runny nose and watery eyes, which flush out the perceived pathogen. While many other factors shape immune development, the hygiene hypothesis explains the situation at a basic level, from just one of many perspectives.
So what is the moral of the story? Eat that chip you just dropped on the floor. Actually, don’t. As tempting as it may be to prescribe a daily dose of worm- and microbe-infested dirt, it really isn’t the safest solution. Our body’s microbial flora is only a fraction of what belonged to our pre-medieval ancestors; we are fighting with less immune intelligence, and we can no longer protect ourselves without the help of modern medicine. There may be a better idea. If we can identify the particular proteins from these “bugs” that teach our immune system, we could increase exposure in a safe and controlled manner and possibly gain some ground in the fight against the harmless, like Reese’s peanut butter cups.