We’re So Healthy It’s Killing Us


In the early 1990s, a woman named Tamara visited her doctor in Murfreesboro, Tennessee, with a persistent sinus infection. Her doctor didn’t think it was anything to worry about, but he prescribed the antibiotic erythromycin to help clear it up. Within 24 hours she was experiencing a heart arrhythmia, and shortly afterward she died. She had no history of heart problems and was otherwise healthy. The official cause of death was sudden cardiac death, but the antibiotic was to blame.

Studies[1,2] have shown that certain antibiotics (erythromycin included) carry a two-fold increase in the risk of sudden cardiac death on their own, and a five-fold increase when combined with any of a host of CYP3A inhibitors (including Prozac, Zoloft, Prednisone, and Prilosec).

How senselessly tragic. It’s bad enough to die before your time, but what’s worse is that she died, not from a car wreck or a tornado, but from a common medical prescription—what’s known as iatrogenic illness.


Tamara is not alone. In a report published by the Institute of Medicine in 1999, it was estimated that nearly 100,000 people die each year from errors in hospitals. Add to that adverse drug reactions (106,000), unnecessary procedures (37,136), and surgery complications (32,000), and you have higher mortality than Alzheimer’s, diabetes, influenza, and pneumonia combined. It’s a sad irony, but the third-leading cause of death in America is health care.
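For a sense of scale, here is a back-of-the-envelope tally of the estimates above (rounded; the figures are the ones cited in this paragraph):

\[
\underbrace{100{,}000}_{\text{hospital errors}}
+ \underbrace{106{,}000}_{\text{drug reactions}}
+ \underbrace{37{,}136}_{\text{unnecessary procedures}}
+ \underbrace{32{,}000}_{\text{surgery complications}}
\approx 275{,}000 \ \text{deaths per year}
\]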

The medical mortality rate doesn’t even tell the whole story. Much of what we accept in modern medicine doesn’t kill us but harms us nonetheless. They say that whatever doesn’t kill you makes you stronger, but I disagree. Some components of our daily lives that are meant to be medically beneficial actually make us weaker and dramatically reduce our quality of life. The worst part is that we do these things intentionally, to make ourselves healthier. That’s the paradox of the modern health care system: we’re so healthy it’s killing us. One is left to wonder: are we too healthy?

The Disease

In 1546, the Italian physician and scientist Girolamo Fracastoro published De Contagione et Contagiosis Morbis (On Contagion and Contagious Diseases). In it, he advanced the idea that unseen “spores”—whether chemical or organic—were the cause of disease. The concept was overshadowed by the miasma theory, which held that contagious disease was caused by miasma, or bad air, emanating from rotting organic matter. It wasn’t until French microbiologist Louis Pasteur established the role of bacteria in epidemics 300 years later that Fracastoro’s ideas would evolve into what is now known as modern germ theory.

Avoiding cholera before germ theory was a tricky enterprise

In the meantime, more and more people moved into crowded urban centers and became susceptible to these dangerous microorganisms. The high density of city populations, combined with communal waste and water systems, made cities a literal breeding ground for disease. Epidemics of cholera, typhus, and tuberculosis (TB) were a fact of life; in the 19th century, TB alone killed an estimated one-quarter of the adult population of Europe.

During one such outbreak of cholera in England, John Snow, a student of germ theory, hypothesized that the disease was being spread by infected sewage making its way into the water supply. Despite naysayers in municipal government and the medical establishment of the time, Snow investigated every case in the outbreak and traced the cause to a specific water pump in Soho—the Broad Street pump. Everyone who had died in the outbreak had consumed water from that pump. As it turned out, a woman who lived near the pump had a child who had contracted the disease elsewhere; she had washed the baby’s diapers in water she then dumped into a leaky cesspool mere feet from the pump.

Snow was eventually vindicated, but his recommendation that citizens boil their water before consuming it fell on deaf ears, and the tragedy became, as he said, “the most terrible outbreak of cholera which ever occurred in this kingdom.”

But continuously boiling water wasn’t a long-term solution, and with such dense urban populations, obtaining naturally clean water was increasingly impossible, so municipalities turned to other methods of killing potentially harmful microorganisms. A paper published in 1894 formally proposed adding chlorine to water to render it “germ-free.”

The same idea of sanitation was applied to medical procedures, too. Throughout the Age of Enlightenment, medical science became more sophisticated and more widely accepted, yet certain diseases ironically seemed to increase with increased medical care. In the 1800s, the Hungarian physician Ignaz Semmelweis noticed that women giving birth at home had a much lower incidence of a disease called puerperal fever (childbed fever) than those giving birth in the doctor’s maternity ward. Why would this be? Hospitals were at the forefront of medical technology and innovation; they should have been the safest places on Earth. Semmelweis hypothesized that this type of iatrogenic illness, which killed upwards of one out of every four postpartum mothers in some places, was caused by the doctors themselves.

At the time, it wasn’t uncommon for obstetricians to deliver babies one after another without so much as washing their hands, and they conducted numerous interventions on laboring women with unclean instruments. Semmelweis discovered that doctors who washed their hands in a chlorinated lime antiseptic solution saw a 90 percent reduction in childbed fever fatalities. But Semmelweis’s findings were even less well received than John Snow’s. The scorn and ridicule he received prompted him to leave Vienna, and he eventually died in a mental institution.

Eventually, the veracity of germ theory overcame the arrogance of municipal and medical authorities, and antiseptics were increasingly used in both city infrastructure and hospitals. Along with better nutrition and hygiene, antiseptics in sanitation and medicine contributed to a remarkable decline in infectious disease in the last half of the 19th century. One dramatic display of this was the incidence and mortality of tuberculosis, which declined precipitously throughout the West even before medical practitioners knew what caused it, leading one scholar to assert, “It follows that better living conditions would help to reduce the number of deaths from tuberculosis better than any medical preventive measure.”

Tuberculosis mortality over time in the UK

These measures had a dramatic effect on the incidence and mortality rates of certain infectious diseases. Wherever populations were able to improve sanitation and hygiene, infectious disease and its attendant mortality plummeted. By the beginning of the 20th century, commercial disinfectants like Lysol began to make their way into Western homes. People started using disinfectants to kill microbes on their floors, on their dishes, and in the air. Disinfectant use spread like, well, like a communicable disease.

The Miracle Drug that Wasn’t

Despite their dramatic success, there was one place antiseptics and disinfectants didn’t work: the human body. By design, these chemicals were toxic to living cells, so they could not be used as ingestible medicine. But the efficacy of disinfectants prompted scientists and medical professionals to push for an antimicrobial medical treatment.

In 1922, the inquisitive Scottish World War I veteran Alexander Fleming was trying to find a miracle drug that would kill harmful infectious agents without damaging the human body. While he was working with some bacteria in a Petri dish, his nose leaked and some of the mucus accidentally landed in the dish. He watched to see what would happen and, sure enough, something in the mucus killed the bacteria. That something was an enzyme called lysozyme, which is harmless to humans but destroys bacteria by damaging their cell walls. For Fleming, this was a dramatic finding: it suggested that his miracle drug was possible.

Six years later, after returning to his laboratory from vacation, Fleming would finally discover what he was looking for. Before his vacation, he had placed a stack of Petri dishes in Lysol to clean them, but the stack was so tall that some of the dishes sat above the sanitizing liquid. While complaining to a former assistant about how much work he had, Fleming pointed out the unsanitized dishes, one of which had mold growing in it. Upon closer inspection, he noticed that the moldy dish had none of the Staphylococcus from his experiment left in it. It appeared that the mold had killed the bacteria!

In a complete coincidence, a mycologist had been collecting molds in a laboratory one floor below Fleming’s, and some of one particular mold—of the genus Penicillium—had made its way upstairs and into a Petri dish left idle while Fleming was on vacation. Fleming had accidentally discovered penicillin, a drug that kills certain bacteria yet is harmless to most humans.

Slowly but surely, researchers in the United States and Europe replicated Fleming’s findings and began using penicillin to treat serious bacterial infections, from bacterial meningitis to syphilis. During World War II, penicillin saved countless lives threatened by battle wounds and pneumonia. The drug’s success won Alexander Fleming a Nobel Prize and earned it the name Fleming had been hoping for: the miracle drug.

Today, penicillin and other antibacterials like amoxicillin, ampicillin, and oxacillin are prescribed for conditions as varied as strep throat and urinary tract infections. It’s estimated that over 150 million prescriptions for antibiotics (a term often used broadly to cover antivirals and other antimicrobials along with antibacterials) are written every year—many of them for children. We are officially in the antibiotic era, in which these drugs dominate health care.

It’s important to note, however, that Alexander Fleming didn’t invent penicillin; he discovered it as a natural byproduct of certain types of mold. In fact, Penicillium mold is in the air all around us and is easy to grow yourself—just leave some bread in a dark, damp place for a couple of days. Penicillium molds are used in Roquefort and other blue cheeses and in the rinds of Brie and Camembert.

There are other natural antimicrobials, too. From honey to garlic, dozens of edibles have antimicrobial properties. And like penicillin, some antimicrobials are found in unexpected places. Actinomycetes, for instance, are organisms that live in soil, and approximately half of them are capable of synthesizing antibiotics like the tetracyclines, which have been shown to be effective in treating typhus, strep, cholera, anthrax, syphilis, and even acne. Tetracyclines have even been found in human skeletal remains dating back 1,500 to 1,800 years, indicating that our ancestors consumed the antibiotic, whether knowingly or otherwise.

Helpful antibiotics are all around us, but unfortunately, we’re not taking advantage of these natural cures. In our quest to avoid disease by sanitizing everything so thoroughly, we’ve stifled much of their potential. Remember, the only reason penicillin was discovered was that Alexander Fleming didn’t have a disinfectant pan big enough to fit all of his test dishes. If he’d had his way, every Petri dish would have been thoroughly cleansed before his vacation, and the mold would never have been allowed to grow and produce the antibacterial agent that naturally surrounds us. We got so excited about the prospect of killing all the bad germs that we didn’t notice we were killing some of the good germs too. By this measure, we’re too clean—we’re too healthy.

So, we avoid dirt at all costs, cover all our household surfaces with disinfectant, douse ourselves with hand sanitizer, and throw away any food with a hint of mold on it. And to compensate for the loss of natural antibacterial agents, we lather up with antibacterial soaps and lotions, swish antiseptic mouthwash, and, like the father in My Big Fat Greek Wedding, spray Windex on everything else, including injuries. Antibiotics in general and antibacterials in particular have been so impressive that patients demand prescriptions for them even when there is no clinical benefit, such as for the common cold. According to the CDC, up to half of antibiotic use in humans is unnecessary or inappropriate. That is astounding and dangerous.

But most of the antibiotics in use in America aren’t intended for humans at all. Modern food production also requires heavy use of antibiotics. Some are used to encourage livestock growth, but much of the volume is prophylactic, intended to prevent illness in animals subjected to dangerously dense living conditions. Upwards of 90 percent of antimicrobial use in the US is for non-therapeutic purposes in agricultural production.

All of those antibiotics—synthetic and otherwise—have to go somewhere. Antibacterials such as triclosan can enter the environment and may accumulate in the food chain over time, posing a health threat. Studies have also shown a link between the use of antibacterials in hygiene products and increased allergy incidence in children. More threatening, though, is the fact that overuse of antimicrobials has led to a spike in antimicrobial resistance, in which a microbe evolves to become partially or fully resistant to the antimicrobials that could previously treat it. Resistant microbes are increasingly difficult to treat, requiring alternative medications or higher doses, which may be more costly or more toxic. According to a World Health Organization report, “this serious threat is no longer a prediction for the future, it is happening right now in every region of the world and has the potential to affect anyone, of any age, in any country. Antibiotic resistance is now a major threat to public health.” The CDC reports that upwards of 23,000 deaths a year in the US can be attributed to antibiotic-resistant infections.

Perhaps more startling is that antibiotics used (and overused) to kill harmful bacteria kill off good bacteria as well. We carry some 100 trillion bacteria in and on our bodies (about three to four pounds’ worth), and we depend on them for digestion and gastrointestinal health. When you ingest antibiotics, you necessarily upset the natural balance of those bacteria, and a whole host of problems can result. When enough good bacteria die, the yeasts in our digestive system take over. As Doug Kaufman explains, “Yeasts are opportunistic organisms. This means that, as the intestinal bacteria die, yeasts thrive, especially when their dietary needs are met. They can use their tendrils, or hyphae, to literally poke holes through the lining of your intestinal wall.” The result can be so-called leaky gut syndrome, which involves bloating, gas, cramps, food sensitivities, and even psychological conditions. As neurogastroenterologist Jay Pasricha has said, “For decades, researchers and doctors thought that anxiety and depression contributed to these problems. But our studies and others show that it may also be the other way around.”

The Cure

The word antibiotic comes from anti (meaning against) and biotic (meaning life), so antibiotic literally means anti-life. And while antibiotics are intended to destroy microscopic organisms, they, along with antiseptics, can be destructive to macro-organisms such as human beings too.

The goal was to stop the horrors of infectious disease, and to a large extent these chemicals and drugs have done a remarkable job. There is a reason someone can title a book Love in the Time of Cholera: in much of the world, cholera isn’t a daily reality anymore. The same goes for tuberculosis and the plague. If you’ve never heard of erysipelas, dropsy, or effluvia, it’s because they’re no longer the common killers they were in the 1800s. People used to die from tooth infections; now they’re easily treatable. All of this happened through a concerted campaign against the microbes that wreaked havoc on humanity.

But, as with everything, there can be too much of a good thing. At some point, people started using disinfectants so heavily that they killed many of the healthy microorganisms that protect us against harmful bacteria. Luckily, medical science stepped up and produced artificial antibiotics to do what nature had been doing all along, but, as explained above, that has its risks as well.

The solution isn’t to abandon antibiotics and antiseptics altogether. Modern civilization requires close living quarters, and these prophylactic measures are necessary to prevent constant outbreaks of communicable disease. Nor is the solution for everyone to drop modern life and go live like Tarzan in the jungle, though that does seem appealing sometimes. The solution is to use these medical treatments when they’re helpful and to avoid overuse.

Drinking water that came from a sewage-infested river? Chlorine might be a good idea. Digging shrapnel out of your bleeding guts? You should probably make sure there are no bacteria on the forceps. These measures are absolutely appropriate. But fervently scrubbing down every surface of your house? That’s unnecessary. Preventing your kids from playing in the dirt? That’s most likely doing them a disservice. Taking antibiotics for the common cold? You’re doing it wrong!

We recommend antiseptics in the bathroom, around trash, and in the hospital (or anywhere you treat open wounds). We also recommend washing your hands thoroughly and using hand sanitizer after interacting with the general public and public places like gas stations and public restrooms. These can be health-saving and potentially life-saving measures. But you don’t need to go overboard and sterilize everything all the time. You don’t need to wear the finish off your tile counter or run the sterilize cycle on your dishwasher to ensure a healthy lifestyle. In fact, researchers have found that children in households that hand-wash their dishes are less likely to develop allergic conditions such as eczema, asthma, and allergic rhinoconjunctivitis.

We also recommend medical antimicrobials when a patient is diagnosed with a serious bacterial or viral infection, but that’s it. Avoid them in all other instances to promote natural cures and support your natural immune system. The CDC has launched a campaign to help combat antibiotic-resistant disease. It recommends that patients:

  • Ask if tests will be done to make sure the right antibiotic is prescribed.
  • Take antibiotics exactly as the doctor prescribes. Do not skip doses. Complete the prescribed course of treatment, even when you start feeling better.
  • Only take antibiotics prescribed for you; do not share or use leftover antibiotics. Antibiotics treat specific types of infections. Taking the wrong medicine may delay correct treatment and allow bacteria to multiply.
  • Do not save antibiotics for the next illness. Discard any leftover medication once the prescribed course of treatment is completed.
  • Do not ask for antibiotics when your doctor thinks you do not need them. Remember antibiotics have side effects.
  • Prevent infections by practicing good hand hygiene and getting recommended vaccines.

In Zero to Paleo, I wrote about the !Kung San of Southern Africa, one of the last groups on Earth to practice a traditional hunter-gatherer lifestyle, and they maintain robust health. It seems the only real threat to the !Kung way of life is the government’s forced modernization policy, which has relocated many Bushmen into permanent housing settlements, a plan that many have claimed is a way to clear the area for lucrative diamond mining and tourism. When the Bushmen are relocated to settlements and removed from their natural hunter-gatherer lifestyle, the results are tragic, to say the least. For the 1,800 or so Bushmen who have been relocated into camps over the past decades, quality of life has rapidly deteriorated, and many have contracted HIV/AIDS or tuberculosis, or become dependent on alcohol.

The government insists that it provides drinking water and basic health care at the camps, but the Bushmen of the Kalahari managed to procure enough drinking water in the middle of the desert to survive for thousands of years without government help; wasn’t that enough? And basic medical care, while good, only becomes vital when people are forced to live in cramped quarters where disease tends to thrive. As one relocated !Kung woman reported of the camps, “There are too many people; there’s no food to gather; game is far away and people are dying of tuberculosis. When I was a little girl, we left sickness behind us when we moved.”

This last part is important because it shows that hunter-gatherers are remarkably healthy despite their limited nutrition and their complete lack of modern medicine. Once they are moved into an unnaturally dense population center, communicable diseases become a problem and quality of life suffers. The basic principle, then, is to emulate their lifestyle as best you can even while living in a densely populated area: expose yourself and your children to the kinds of viruses and bacteria that traditional hunter-gatherer societies would encounter in nature, and avoid the pathogens that thrive and become epidemics in densely populated areas. As it turns out, the real miracle drug isn’t a drug at all—it’s nature itself. Emulating nature will give us the best shot at living happy, healthy lives.