My ancestor died of a splinter. Wait, what?

Little bumps and blisters used to kill.


Annie Hortense Crawford’s death was a long, dramatic affair. According to her front-page obituary in the California Democrat, the local paper serving California, Missouri, Mrs. Crawford’s decline had begun a full week before her death in 1930. First, there was severe pain in her hand. Slowly, a creeping debility overcame her entire body. “She grew steadily worse throughout the day and evening,” the newspaper reported, “until the end came.”

It’s easy to imagine my great-great-great-grandmother (for that’s who Crawford was) was felled by some larger-than-life illness. But the reality is a little different: Crawford died of a splinter.

Reading the details of her rapid decline almost 90 years later, I was struck by the historic nature of her death—almost unfathomable to Americans today—and set out to discover why, exactly, people don’t die of splinters anymore. In the process, I learned the particulars of her death weren’t, in fact, all that peculiar. Unless we change our relationship with antibiotics, death by splinter could become familiar once again.

The Octagon Ward at Johns Hopkins was part of a hospital-wide effort to stop air from circulating between care units. Wikipedia

When Crawford was born in 1860, many Westerners still attributed disease to miasma—bad air—or an imbalance in the bodily humors like blood and bile. Doctors, just as they had since the days of ancient Greece, treated all manner of illnesses with things like fresh air, rest, and even bloodletting. It’s no surprise, then, that most people in this era died of their infections, and many diseases considered curable today killed thousands.

“We had a gross misunderstanding of things we called blood poisoning and things we now recognize to be infectious disease,” says Duane J. Funk, a physician and sepsis expert at the University of Manitoba. But over the course of Crawford’s life, enterprising researchers drove an incredible shift in the practice of science and, most importantly, how we think about infection.

In the late 1850s, the French scientist and father of microbiology Louis Pasteur set about disproving the common theory of spontaneous generation. At that time, many people believed that agents of decay—the things that molded bread or rotted a peach—magically appeared from within the bread or peach itself. By showing that microorganisms came from elsewhere—that they infected a body—Pasteur established the basic mechanism of infectious disease. He went on to develop the earliest technique for pasteurization, as well as rabies and anthrax vaccines.

Other scientists subsequently sought to validate and expand on Pasteur’s ideas. Though he was ridiculed at first, the inquisitive surgeon Joseph Lister ultimately proved that carbolic acid had a sterilizing effect on open wounds and, when properly applied, saved lives. In 1890, the German physician Robert Koch published his unified “germ theory.” Koch’s postulates displaced miasma theory, and germ theory remains the predominant explanation for infectious disease to this day.

By the time a splinter pierced Crawford’s thumb in March of 1930, scientists knew that small microbes, invisible to the naked eye, could invade a human body and feast until the host recovered or, more often, died. These germs, such as they were, caused everything from waterborne illnesses like cholera to sexually transmitted conditions like syphilis. They were also responsible for the disease that killed Crawford: blood poisoning.

But just because doctors of that day may have understood the biological war raging in my ancestor’s thumb doesn’t mean they could cure what ailed Crawford. It would take one moldy discovery—and more than a decade of subsequent research—before anyone could do a thing about infectious disease.

A World War I-era Red Cross poster. Wikipedia

Splinters aren’t deadly in and of themselves. Neither, really, are blisters, scratches, or other seemingly superficial assaults on our skin. But these nicks allow something much deadlier to enter our otherwise sealed-off selves: harmful bacteria like Staphylococcus aureus or Group A strep. Sometimes, these minuscule invaders can wreak havoc, pushing people to the point of blood poisoning, which today we call sepsis.

While hundreds of thousands of Americans still die from sepsis each year, many people think they are impervious to such diseases. Funk says that may be because, on a statistical level, people aren’t that susceptible to death by splinter and never really have been. “I get cuts from shaving every second or third day,” says Funk. “The question is, why do some of them get infected and some of them don’t?”

The answer, he says, starts in the skin. “As soon as you get the splinter wound or the cut, right off the bat, there’s a battle that begins,” Funk says. First, blood clotting factors swarm to the affected site. This not only stops a person from bleeding out; it also serves as a biological drawbridge, raised against any potential invader. In some cases, there may not be harmful bacteria at the site at all. But if there are, the immune system is ready. It deploys macrophages, white blood cells that act as the body’s Roombas, to slurp up any dirt, bacteria, or other foreign objects. “Bacteria are all around us,” Funk says. “But 99 percent of the time, our immune system works great at preventing infection.”

The very young, very old, and infirm are less likely to fight back effectively, however. Genetic predispositions toward certain illnesses, how aggressive a given bacterium or virus is, and other circumstances also factor into the progression of disease. Crawford, who was 70 at the time of her death, was part of this vulnerable population. The infectious agent—likely a bacterium such as Staphylococcus aureus—was able to push past Crawford’s natural defenses, which had diminished with age, and make its way into her bloodstream.

The infection likely moved quickly from there, Funk says, thanks to the tropical heat of the human body. “Some of these bugs have a doubling time of eight to 20 minutes,” he says. “There’s two [microbes], then there’s four, eight, 16—you do the math. It doesn’t take long to have millions to billions of bacteria floating in your system.” But they didn’t just float. Division made the bacteria hungry, so they eagerly turned Crawford’s heart, lungs, liver, and other organs into food. Her body was soon overwhelmed. Her blood pressure likely dropped suddenly. And in the absence of any suitable medical intervention, she died.
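To actually do that math: here is a minimal sketch, assuming a single founding microbe and an idealized 20-minute doubling time (illustrative numbers, not figures from Crawford's case).

```python
# Back-of-the-envelope sketch of unchecked bacterial doubling.
# Assumes one founding microbe and an idealized 20-minute doubling time;
# real infections are messier, but the arithmetic is the point.

DOUBLING_MINUTES = 20

def population(minutes: int, start: int = 1) -> int:
    """Bacteria present after `minutes` of unchecked doubling."""
    return start * 2 ** (minutes // DOUBLING_MINUTES)

for hours in (1, 5, 10):
    print(f"{hours:>2} h: {population(hours * 60):,} bacteria")

# Prints 8 bacteria after one hour, 32,768 after five,
# and more than a billion after ten.
```

Even at the slow end of Funk's range, a lone microbe clears a billion copies in under half a day.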

Alexander Fleming, the father of penicillin, in the lab. Wikimedia Commons

For thousands of years, the fate of sepsis patients was largely sealed. But that began to shift in 1928 when the Scottish scientist Alexander Fleming discovered penicillin, the world’s first antibiotic. Just two years before Crawford’s death, Fleming was working with a lab culture of Staphylococcus, which just so happens to be one of the two main agents that cause sepsis. He noticed what scientists call an “inhibition halo”—a line the bacteria could not cross—clearly defined on the lab specimen. A blue-green mold had contaminated the sample and inhibited bacterial growth. Fleming’s original experiment was wrecked, but the mishap presented him with an unprecedented opportunity.

Upon isolating the mold, Fleming found he had the relatively common fungus Penicillium notatum on his hands. The mold, which thrives in damp environments, easily infests water-damaged buildings. When airborne, it can cause allergic reactions in humans. But Fleming realized that, refined into a bacteria-fighting drug, the cloudy growth could save thousands of lives. The only problem: no one knew how to produce that drug in quantity.

For a decade, Fleming tried and failed to persuade chemists and manufacturers to help him transform his fungal find into a mass-market product, knowing all the while that lives were being unnecessarily lost to infection. It was not until World War II that penicillin made its debut as a bona fide treatment. In 1941, scientists at the USDA isolated higher-yield strains of the mold, which they used to successfully treat burn victims of the 1942 Cocoanut Grove nightclub fire in Boston. At the same time, researchers at the drug company Pfizer perfected a deep-tank fermentation system that generated high-quality penicillin in industrial quantities, which allowed the drug to finally go mainstream.

In the 70-odd years since its debut, penicillin has saved millions of lives, and subsequent antibiotics have saved millions more. “I think one of the greatest advancements in medicine is the development of antibiotics, which turn[ed] a lot of these diseases into non-events,” says Funk. Today, when an unlucky American finds themselves with an infected splinter or verging on sepsis, these drugs will almost certainly save them from my great-great-great-grandmother’s fate.

But even with modern medicine, Funk says, Crawford might not have recovered. The U.S. Centers for Disease Control and Prevention estimates that 1.5 million Americans get sepsis each year. Whether it’s from a splinter like Crawford’s or, more commonly, a hospital-acquired infection, sepsis continues to kill approximately 250,000 Americans annually. And not only can antibiotics fail—it’s increasingly apparent they can create problems all on their own.

A close-up of MRSA. Pixino

While penicillin was still being hailed as a wonder drug, scientists became aware, by 1942, of a terrifying possibility: antibiotic-resistant superbugs. Mere months after penicillin had finally been produced and deployed at scale, researchers reported the existence of penicillin-resistant bacteria. “By growing the organism in increasing concentrations of penicillin over a long period it was possible to render the organism resistant to penicillin,” Charles H. Rammelkamp and his colleagues wrote at the time.

The biggest fears of these early scientists have since been realized. Today, at least 2 million Americans experience antibiotic-resistant infections annually. Approximately 23,000 die as a result. Antibiotics have saved millions of lives, but they have also slowly selected for ever more powerful bacteria. A round of penicillin might kill 99.9 percent of the harmful bugs in a person’s body, but the few organisms that survive are, by definition, the hardiest, and they’re now free to breed wildly. Given the right environment—like a weakened immune system in an ICU patient—the already-scary Staphylococcus aureus can transform into methicillin-resistant Staphylococcus aureus, more commonly called MRSA.
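To see how fast that selection plays out, consider a toy sketch, not a biological model: it assumes a hypothetical population of a million bacteria in which one in a thousand starts out resistant, each antibiotic course kills 99.9 percent of the susceptible bugs, and the survivors regrow to the original population size.

```python
# Toy sketch of antibiotic selection (illustrative numbers, not a real model).
# Assumes 1,000,000 bacteria, 0.1% resistant at the outset; each course kills
# 99.9% of the susceptible bugs, spares the resistant ones, and the survivors
# then regrow to the original population size.

susceptible, resistant = 999_000.0, 1_000.0

for course in range(1, 4):
    susceptible *= 0.001                 # the drug kills 99.9% of susceptible bugs
    regrow = 1_000_000 / (susceptible + resistant)
    susceptible *= regrow                # survivors multiply back to full size
    resistant *= regrow
    print(f"after course {course}: {resistant / 1_000_000:.1%} resistant")

# Prints roughly 50% after one course, 99.9% after two,
# and essentially 100% after three.
```

In this cartoon version, three courses are enough for the resistant strain to take over completely; real selection is slower and noisier, but it points in the same direction.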

About 720,000 Americans acquired infections while in the hospital in 2011, according to a CDC report. And for every three people who die in the hospital, one dies of sepsis. Doctors are working to institute new protocols that reduce the risk of superbugs, like minimizing the use of ventilators, which can cause pneumonia, and carefully tailoring treatments to the specific bacteria involved. But regaining control over these pathogens has proven difficult.

Despite increasing awareness, doctors continue to overprescribe antibiotics, and patients continue to quit an antibiotic regimen before they’re supposed to. At the same time, the livestock industry consumes 70 percent of the antibiotics in the United States to keep its animals healthy—all the while breeding antibiotic-resistant bacteria in meat, soils, and even farmers. A 2014 report from the United Kingdom predicted 10 million annual deaths due to antibiotic resistance by 2050. While experts still quibble over the size of the impending tsunami of deaths from antibiotic resistance, one thing is clear: People rarely die of splinters in 2018, but 2080 is looking a little different.

Poring over the reports on antibiotic resistance, I’m reminded of a dystopian novel I once read called Station Eleven by Emily St. John Mandel. At one point in the book, the main character describes her brother’s death as “The kind of stupid death that never would’ve happened in the old world. He stepped on a nail and died of infection.” While Station Eleven was a work of fiction, I remember feeling a jolt down my spine when I read that line for the first time. My great-great-great-grandma’s all-too-real obituary (“The injury was so insignificant that she thought nothing of it… until the end came”) gave me the same sensation. Reading the detailed account of Crawford’s death, I can’t help but think this brave new world looks a lot like the old one she left behind.
