Today, one in 20 patients admitted to the hospital incurs a hospital-acquired infection (affecting 1.7 million patients each year). The medical term for this is “nosocomial infection” (from the Greek nosos, “disease,” and komein, “to care for”; Roman military hospital orderlies were called nosocomi). The cost is enormous, adding more than $30 billion to the bottom line for healthcare in the US alone.
Antibiotic-resistant infections due to “superbugs” are on the rise. One superbug, called “MRSA,” affects over 100,000 patients a year and caused the deaths of more than 18,650 patients in 2005. That toll exceeds the death rate for breast cancer, AIDS, and SARS combined.
A HISTORY OF HOSPITAL ACQUIRED INFECTIONS
“He rolled up his shirt sleeves and, in the corridor to the operation room, took an ancient frock from a cupboard; it bore signs of a chequered past, and was utterly stiff with old blood. One of these coats was worn with special pride, indeed joy, as it had belonged to a retired member of the staff. The cuffs were rolled up to only just above the wrists…” Leeds, England, 1884.
This was the state of affairs in surgery, before the introduction and acceptance of the principles and rituals of antisepsis.
Prior to the 1800s, typhus was recognized as a “hospital” infection, running rampant in city hospitals caring for the poor and in military hospitals. Once surgery became more prevalent during the 19th Century, other hospital-acquired bacterial infections became a frequent occurrence. In Munich during the 1870s, the rate of serious infection following even simple surgeries, such as limb amputation, was well over 80%.
At that time, it was far safer to undergo an operation in bed at home (3–5 times safer) than it was to have the same procedure performed in the controlled environment of the hospital.
“Erysipelas” (a streptococcal skin infection causing bright reddening) was considered a part of hospital life, especially after surgery, prior to 1890. In the 1830s, the term for hospital-acquired infections was introduced by James Simpson in Scotland, who called the problem “Hospitalism.”
The prevailing theory for the spread of infection held that poor ventilation and stagnant air (“corruption of the air,” “divine wrath”) were the culprits, not direct contact from infected source to patient. Physicians were therefore adamant in their opinions as to how to prevent infection: open the windows and prevent overcrowding. “Germ theory” was not yet in favor, so personal hygiene was not a consideration. Any intimation that physicians themselves might be spreading disease was taken as a personal affront.
Take the case of Ignaz Semmelweis.
Ignaz Semmelweis was a physician who worked at the obstetrical clinic in Vienna in 1848. He noticed that rates of infection (“puerperal fever”) in the obstetrical clinic run by the physicians were at least twice those of a second clinic run by midwives. The doctors, it turned out, were teaching anatomy in the postmortem area in the mornings, then moving on to their clinical duties thereafter. The midwives, on the other hand, came straight from home.
Semmelweis proposed hand washing and the use of an antiseptic solution for the hands and instruments of physicians moving from the autopsy area to the operating theater. To his colleagues, this was certainly the idea of a crazy person.
He met great resistance. Within two years of proposing the idea, he was forced to quit medicine altogether. In the end, he died in an asylum.
It took another 40 years for the stubborn and persistent Joseph Lister to put forth, and finally have accepted, the same idea. He showed that limb amputations became infected 47% of the time before hand washing and carbolic acid antisepsis, and only 15% of the time after this ritual was introduced.
Separately, Florence Nightingale demonstrated that hygiene made a difference. In 1854, at the military hospital across the Bosporus during the Crimean War, she showed that fresh linens, rat poison, and scrub-brushed floors could reduce death rates among the combat wounded from 40% to 2% within six months. Remarkably, that 2% matches the accepted “risk standard” infection rate for modern surgical procedures today.
Hospital surgical infection rates remained between 2–15% until the advent of production-level penicillin in 1941. Thereafter, post-operative pneumonia death rates fell from over 30% to below 10%, and surgical wound infection rates fell to below 5% toward the end of World War II. With the gradual introduction of an ever-widening array of sulfa drugs and other antibiotics, surgical infection rates fell to as low as 2% by the 1960s.
After the publication of “On the Origin of Species” in 1859, Darwinism became all the rage. Biological organisms, from microbes to mammals to people, compete; only the fit survive. Nietzsche captured the mood: “What does not destroy me makes me stronger.”
It had long been known that poultices containing certain molds and fungi worked better than others to prevent putrefaction of wounds. Once bacterial culture techniques became established, mini-Darwinian battles among competing organisms could be observed firsthand. The observation that some molds thrived to the detriment of bacteria was called “antibiosis” by Paul Vuillemin in 1889.
The impact of this observation was not appreciated until late August 1928, when Alexander Fleming returned from vacation to discover that some of the staphylococcus cultures he had left behind had become contaminated with a mold. He noticed a zone around the mold where the bacteria seemed not to grow. It turned out that a soil fungus, which he identified as Penicillium notatum, had infected the Petri dish and was making a product that leached into the surrounding agar and held the growth of the surrounding bacteria at bay.
While penicillin destroyed bacteria, it appeared to be nontoxic to humans. Penicillin was effective against Streptococci, Gonococci, Meningococci, the Diphtheria bacillus, and Pneumococci, while it was not effective against “Gram-negative” bacteria such as E. coli.
Fleming did not pursue his discovery, however, due to the difficulty of isolating pure penicillin.
In 1943, mass production of penicillin for clinical use began, after researchers rediscovered Fleming’s writings and approached the large pharmaceutical companies. Penicillin soon became readily available to Allied troops during World War II.
Penicillin proved highly effective on D-day, and saved countless lives.
The use of penicillin became widespread, and it was generally believed that the end of bacterial infections was near. The antibiotic was effective, saved lives, and was given to patients undergoing surgery to prevent infection.
As spectacular as the results at the end of World War II were, resistance to the new drugs arose with a rapidity that was, at the time, unappreciated. Once mass production of penicillin was introduced, it took only three years for staphylococcus species to develop the ability to grow in the presence of penicillin.
The first cases were discovered in London in 1943 by Mary Barber. Looking back in 1961 in a review entitled “Hospital Infection Yesterday and Today,” she noted that
“By 1946, however, they [penicillin-resistant staphylococci] were becoming quite frequent, and a few years later, in hospitals all over the world, they outnumber penicillin sensitive strains… Today penicillin resistant staphylococcal infection is common even in hospital outpatients.”
By the late 1940s and early 1950s, antibiotics including streptomycin, tetracycline, and erythromycin had been developed. Methicillin and vancomycin followed around 1960; vancomycin remains a “drug of last resort” for treating severe infections that are unresponsive to other antibiotics.
After the 1960s, the development of new antibiotics slowed, due to the combination of tighter regulations resulting from new safety laws and a desire to hold new antibiotics in reserve. Currently, the “time to resistance” for a newly introduced antibiotic remains around 18 months.
As antibiotics are effective only against bacteria, vaccines were developed to target other pathogens, both viral (smallpox, measles, mumps, rubella, yellow fever, poliomyelitis) and bacterial (typhoid fever, diphtheria, tetanus, pertussis). Advances in antiviral drugs began in the 1970s with the introduction of acyclovir against herpes and cold sores.
Then, to the surprise of many, hospital infection rates began increasing, instead of decreasing, during the late 1960s. The initial explanation was that medical staff, relying increasingly on antibiotics, had relaxed their aseptic techniques.
At the same time new medical technologies began to proliferate (e.g. artificial respirators, the “heart and lung machine” etc.). It was soon recognized that these devices inadvertently served as reservoirs for infection. Aggressive cleaning procedures were soon instituted. Healthcare providers began to realize that “sterile” did not necessarily mean sterile.
Technology standards and protocols were gradually developed as a result. Centralized supply systems were introduced: items were sterilized in a clean location and subsequently distributed in enclosed sterile packs. Surgical trays were sterilized with pressurized steam heat, and then wrapped. Gases such as ethylene oxide and liquid solutions of formaldehyde or alcohol were used to sterilize heat-sensitive equipment. As improvements were made in sterility, infection rates declined.
THE RISE OF SUPERBUGS
As the twentieth century advanced, hospital-acquired infection rates climbed. Staph aureus infections spread rapidly in the 1950s until methicillin was introduced in 1960. Methicillin-resistant Staph aureus (“MRSA”) was described only one year later. Resistance was a natural consequence, once again, of overuse of the drug (methicillin was even sprayed in the air of infant nurseries in an effort to prevent infection).
Patients with chronic diseases such as diabetes, obstructive pulmonary disease, hepatitis, AIDS and tuberculosis survived longer and soon became reservoirs for resistant organisms after having survived multiple infections and antibiotic treatments. Chronic immune suppression, the inevitable concomitant of such long survival, and the overuse of antibiotics added to the problem.
In hospitals, prophylactic antibiotics became the recommended standard for surgical procedures, as codified by the Centers for Disease Control (CDC). Chronic antibiotic usage in the intensive care unit (ICU) setting also became standard. Failure to use antibiotics against any perceived or possible infection soon became a regular cause for medical malpractice actions, even as that same overzealous use of antibiotics was driving the resistant infections of which patients were dying.
The Catch-22 irony of having to live by a standard that demanded the overuse of antibiotics in the face of a growing in-hospital epidemic of antibiotic-resistant infections led to the naming of these organisms as “Superbugs.”
The result is that a growing population of patients in hospitals are surviving while carrying an increasingly varied assortment of resistant infections beyond MRSA. Organisms of growing concern include coagulase-negative Staphylococcus, vancomycin-resistant Enterococcus, Streptococcus, Klebsiella oxytoca, Serratia marcescens, Pseudomonas aeruginosa, Legionella (the cause of Legionnaires’ disease), and resistant Candida albicans.
There were an estimated 94,360 invasive MRSA infections in 2005, with an estimated 18,650 deaths. MRSA now kills more people annually in the U.S. than AIDS and breast cancer combined. Some MRSA infections are resistant even to vancomycin, and for many MRSA superbug infections no effective antibiotic remains at all. Other organisms, such as Enterococcus faecium, carry mortality rates as high as 96% for the patients who suffer them.
“Community Acquired Superbugs”
Antibiotic-resistant organisms were found in the community as early as 1945, four years after the introduction of penicillin. With a growing population of chronically ill people harboring infections such as MRSA, the chances of becoming infected in the community, rather than in the hospital, have been rising over the past 20 years. Over the past 10 years, other hospital-acquired resistant infections have also begun to move into the community, posing an ever more malignant potential threat.
Most recently, a combination of media coverage and a sudden increase in the incidence of what is now called “community-acquired MRSA” has led to a rise in public awareness of what health care professionals have known about for years. Recent deaths have been attributed to exposure in schools, restaurants, gyms, and locker rooms. Some of these superbugs have mutated in the community and are of different strains than those found in hospitals. The reverse threat, of hospital patients and staff being exposed to these community organisms, now presents a new wrinkle to the superbug problem. For instance, patients in England are now tested for resistant Staph in the days prior to a proposed surgical procedure in an effort to prevent post-surgical superbug infections from MRSA.