The following is an excerpt adapted from The Invention of Surgery by David Schneider.
Amputating a man’s arm is a gut-wrenching and shocking act. Regardless of the clinical justification and no matter the years of practice, severing a limb from a body requires stubborn resolve and intense personal subordination. Perhaps some surgeons grow callous to cutting off a limb—I never have.
I am a surgical intern at Pennsylvania State University and all I want is a couple hours of sleep. I figure, if I can lie down now, I’ll instantly fall asleep, and will get enough shut-eye before 4:00 a.m. to last me through another day of grunt work. But on this winter night, just as my body twitches and shocks itself to slumber, my pager vibrates me to reality. Like all surgical interns, I am taking call while “in-house,” staying in the hospital all night fielding phone calls from the Emergency Room, the hospital floor nurses, and outside patients.
In the darkness, I fumble for my little black Motorola pager on the nightstand next to my head. Checking the four-number code, I dazedly recognize “6550” as one of the extension numbers to the Medical Intermediate Care Unit. We don’t get many calls to that number, and
I hope that I have been paged incorrectly. Without turning on the lights, I prop up on my left elbow and punch the number on the institutional green AT&T office phone. A nurse answers my call, informing me of an urgent surgical consult on a seventy-eight-year-old man with elbow pain. She explains that he had been admitted hours before with heart attack–like symptoms, but that all preliminary tests were ruling out an MI (myocardial infarction, or heart attack). Oftentimes an MI patient complains of crushing chest pain, with associated left arm or jaw pain; alert ER personnel hear these complaints and immediately begin testing the patient for a “cardiac event.” Although the initial tests were ruling out an MI, the severity of his symptoms warranted a hospital admission. As the hours progressed, his arm pain worsened, and by 2:00 a.m., the medical team was getting nervous. The aged patient was developing blisters and “ecchymoses” (bruises) on his left arm. Instead of an MI, they are now considering some ominous issue with his musculoskeletal system.
I am a know-nothing surgical intern, just months removed from medical school graduation, but I agree to come evaluate the patient as a first-line responder for my surgical team. I sit up in bed, take a deep breath, and slide on my day-worn, slightly smelly socks. After fumbling for my shoes, my thoughts become more organized, and I’m already starting to generate a “differential diagnosis,” the list of possible causes behind this man’s presentation. While trying not to wake my bunkmate and fellow intern, I slip out of the night call room and jog up the echoic stairway to the medical floor.
Briskly walking down the darkened hallway, I arrive in the Medical Unit, which is a beehive of activity. Nurses and aides are darting around, and are oddly relieved to see me. Typically, floor nurses in an academic medical center rightfully have disdain for interns, who arrive every July with their new MD degrees but are as helpless as a newly licensed motorist trying to drive a stick shift, uphill, for the first time. These, though, are medical nurses, adept at caring for cardiac patients but greenhorns themselves when dealing with an odd musculoskeletal patient who would normally be a couple of floors down on the orthopedic floor.
A young nurse points to the corner bed, where a seventy-eight-year-old gentleman lies restlessly, his left arm propped up on pillows. I can immediately see the bruises the nurse told me about, and I can also see that his forearm is swollen. I ask, “Mr. Louis, does your arm hurt?”
This aged man is truly sick, and can only mumble a feeble, “yes.” Growing concerned, I approach his bedside, and focus on his arm. There are dark, splotchy patches of bruises the color of grape jelly. I lean over the bed, inspecting the inside aspect of his elbow. There are several raised burgundy-colored blisters above his elbow, and I am starting to feel out of my league. What am I looking at?
I reach for his wrist to lift his arm, and instantly feel the crackle of air under the forearm skin, like squeezing a bag full of wet Rice Krispies. My stomach drops, and while I don’t have much experience or judgment in the practice of surgery, I know this is gas gangrene, the byproduct of “flesh-eating” bacteria. There are classes of bacteria that are infamous for causing rapid infections that result in the death of the body’s soft tissues, so-called “necrotizing fasciitis,” with the occasional byproduct of “subcutaneous emphysema,” or gas underneath the skin. The physical exam finding of subcutaneous emphysema is frightening, to say the least.
I gently place Mr. Louis’s arm back on the pillows, knowing that I am seeing my first case of “nec fasc”—pronounced “neck fash,” in common parlance. (This is how residency works—you can read all about subcutaneous emphysema and necrotizing fasciitis, but until you have someone’s limb in your hands with crunchy air underneath the skin, you have not been properly initiated. Somehow the numbers work out: though the condition is relatively rare, every surgery resident has seen nec fasc.)
I turn to the nurse and say, “Necrotizing fasciitis.” All conversation stops and everyone freezes.
“Really?” she says.
“Yes. I’m going to call Dr. Moulton, my senior resident.”
Connecting with Mark Moulton, I explain the details of the case. Getting to the point, he asks me, “Are we early enough to save his arm or will we have to amputate?” I confess to Mark that I really don’t know, that I don’t have any experience. Mark tells me to get the patient rushed to the operating room immediately. We will try and save Mr. Louis’s life, if not his arm.
A flurry of phone calls to the operating room and the anesthesia team achieves the impossible, and we are rushing to the OR within half an hour. Life is on the line. The rest of the orthopedic team has made its way to the hospital by 3:00 a.m., and my boss, Dr. Spence Reid, quickly concludes that an amputation is mandatory. In the pre-op holding area we get a portable X-ray that reveals air going all the way to the shoulder. Typical for necrotizing fasciitis, the bacteria are on a warlike march, leaving a plume of air in their wake, and before the bugs get to the chest, daring surgery must be performed. Not only do we need to amputate his entire arm, but the collarbone and shoulder blade must also be removed, a so-called “forequarter” amputation (as opposed to hindquarter, or lower limb).
Even before transporting the patient to the OR suite, we had given Mr. Louis a large dose of penicillin, but necrotizing fasciitis is notorious for not responding to antibiotics in the emergency setting. Penicillin helps, but surgical magnificence is demanded if the patient is to live another hour.
Once we urgently transfer the patient to the operating room and the anesthesia team intubates him, we rapidly position him on the surgical table. Racing to save his life, we prop him on his side and swathe his entire left side and arm with greenish-blue surgical drapes. Dr. Reid works very quickly, making a dramatic, football-shaped incision around the shoulder blade and chest. In nonemergency situations, this dissection would likely take ninety minutes, but under the circumstances it is done at lightning speed, in barely a dozen minutes. The collarbone, the shoulder blade, the entire arm, and all the muscles attached to those bones are rapidly cut away. The nerves emanating from the neck and the large blood vessels emerging from the chest cavity must all be tied off and cut.
As a resident at the beginning of my training, I know I would kill this patient if I attempted to do the operation. I just don’t have the skills yet. Dr. Reid is a superb surgeon, a master craftsman with unique understanding, adept hands, supernormal concentration and stamina, and, most important right now, heroic courage. Moments like this will kindle all these attributes in me for the rest of my life, and Dr. Reid’s greatest gift to me will be the gift of confidence, the ability to take on impossible shoulder and elbow cases in the future. Surgeons are criticized for arrogance and brashness; the critique is probably fair, but at this moment, a fearlessness nurtured from deep self-assurance is mandatory. A surgeon can perceive whether he has outflanked a flesh-eating bacterial infection—there is no crackly air in the layers of soft tissue that he is cutting through. A cocktail of life-supporting medicines continues to be injected into Mr. Louis’s IVs as our team completes the final steps of the forequarter amputation. Cutting-edge antibiotics, in addition to penicillin, are pumped into his body even as the team races to detach the limb.
The moment of liberation of the putrefied appendage finally occurs, leaving a gaping wound over the rib cage. There is a simultaneous sense of triumph over the bacterial horde and an acquiescence to the power of microorganisms as the limb is separated from the thorax and dropped into a hazardous-waste trash bag. Aggressive irrigation with antibiotic-laden saline is performed, and a palpable optimism flickers to life in our operating room.
Mr. Louis, although bizarrely disfigured with no arm and no shoulder, will live.
Mr. Louis’s life was saved by surgery and by penicillin. I have posed the question many times to friends and patients: How many years ago was the first dose of penicillin given? In ancient times, or five hundred years ago, or during the Revolutionary War, or after the First World War? Few people realize that the first clinical administration of penicillin in a small English hospital was only seventy-five years ago.
The pioneering work of Pasteur, Lister, and Koch convinced scientists and physicians that germs were real. As Robert Koch microscopically elucidated their life cycles and interactions with humans, the dark veil of ignorance regarding infectious diseases was lifted. Semmelweis and Lister, among others, were able to show the advantages of handwashing and sterilization, and it is not surprising that public health institutions were created in the years after John Snow helped create epidemiology and Florence Nightingale influenced hospital design. Although improved sanitation and cleanliness dramatically decreased epidemics, there was still no answer for acute or chronic infections in individual patients. The advent of modern chemistry coincided with the triumph of germ theory during the 1880s, in no small part because manufacturing dyes provided contrast and color to an otherwise drab and blurry microscopic world. The burgeoning German industrial chemical companies began as dye manufacturers, only later turning to fertilizers, perfumes, photography, and pharmaceuticals. Paul Ehrlich (1854–1915), a Prussian-Jewish physician-scientist, continued the proud German tradition of perfecting the art of histological staining, eventually gaining fame for differentiating the component cells in peripheral blood. A contemporary of Robert Koch, Ehrlich had a breakthrough insight when he considered the chemical processes that were occurring during the staining of tissues and bacteria. There was a primitive understanding that certain dyes had a special affinity for certain cells (and their constitutive parts); further trial-and-error testing with dyes by the Danish physician Hans Christian Gram (1853–1938) yielded the most important finding in the history of bacteriological microscopic analysis—that bacteria could be grouped into two main classes of cells that either stained purple (“Gram-positive”) or red (“Gram-negative”) in response to a series of staining steps with crystal violet and safranin stains.
Paul Ehrlich was intrigued by why different dyes were attracted to particular species of bacteria but, handicapped by primitive research tools, had no way of formulating a scientific answer. However, demonstrating the type of keen insight that geniuses possess, Ehrlich skipped several steps ahead and wondered if the dye materials could be manipulated not just to embellish a slide but to kill bacteria. If a staining material could be identified that targets and binds with a particular class of bacteria, it made sense to the pioneering scientist that a dye could be used as a weapon. Ehrlich traveled to London in 1907 to address Britain’s Royal Institute of Public Health, delivering a lecture for the ages. He dreamed that one day there could be a “targeted drug, one that would attack a disease-causing microbe without harming the host suffering from the disease.” Ehrlich conceived of chemical compounds that would serve as magic bullets, just decades after researchers had finally proven the germ theory. Barely fifty years removed from John Snow’s revolutionary epidemiological research during the cholera outbreak of 1854, Ehrlich returned to the very London neighborhood that had been (literally) awash in diarrhea, stumping for magic bullets.
By the time Paul Ehrlich traveled to London, he was already well on his way in the quest for the magic bullet. Modern chemistry was in full bloom, with Dmitri Mendeleev’s periodic table coming into focus and a developing appreciation of how atoms bind together to form complex molecules. For an extremely insightful researcher like Ehrlich, the mysteries of simple chemical compounds were beginning to dissolve at the turn of the 20th century, and as one of the fathers of histological staining, it’s not a surprise that he turned to synthetic dyes like methylene blue, Congo red, and alizarin yellow in the search for a chemical breakthrough. Since the mid-1880s, Ehrlich had experimented with these dyes as potential therapeutic agents, and although he was inadvertently turning his patients’ eyes and urine various colors of the rainbow, he and his lab partners were able to show a response against malaria.
Azo dyes—synthetic aniline derivatives in the tradition of the mauveine discovered by William Perkin in 1856—are chemically stable and resistant to modification; Ehrlich and his cohorts were hoping to find another substance that acted like a dye (showing a propensity to bind with certain bacteria) but was less chemically stable and easier to manipulate in the lab. Ehrlich knew of a chemical compound named atoxyl that had been shown to kill trypanosomes, the single-celled parasites that cause diseases like African sleeping sickness. He was intrigued by atoxyl, particularly once he realized that it was a chemically unstable arsenic-based molecule and not a true aniline dye.
And so the testing began. Ehrlich and his colleagues Alfred Bertheim and Sahachiro Hata began to chemically modify atoxyl in 1907, feverishly altering the composition of the molecule bit by bit. Different versions were further modified, and a numbering system was generated based upon these modifications. The eighteenth version of the fourth compound (number 418) was effective in curing sleeping sickness, but was causing blindness in some of Hata’s lab animals and was therefore abandoned. By the summer of 1910, in what can only be described as crude experimental processes, Compound 606 had been created and tested. The sixth version of the sixth compound (606, arsphenamine) showed tremendous success in lab animals with various diseases, including syphilis.
Syphilis likely was not present in Europe before explorers brought it back from the New World in 1495, and it raged for four hundred years across the continent with its slow-motion terror of blisters, aching testicles, sore throat, raised skin rash, and in its final stages, facial deformities and brain infections. With no effective treatment, mankind was defenseless against the corkscrew-shaped bacterium. Until Compound 606.
The German chemical company Hoechst AG, located in the Frankfurt area, began marketing Compound 606 in 1910 as “Salvarsan.” Through trial and error, Paul Ehrlich had created a molecule that was part stain, part poison. The dye portion of arsphenamine would bind to the surface of the syphilis bacterium, while the arsenic portion killed it. In so doing, he had developed the world’s first synthetic chemotherapeutic agent. For good measure, Ehrlich coined the term “chemotherapy.”
Salvarsan rapidly became the most prescribed medicine in the world, leading to hopes that it would have broad application among many different types of bacteria. Unfortunately, Salvarsan, and its improved version, Neosalvarsan, had extremely narrow efficacy across the microbial world. This, paired with its significant side effects, made it a qualified success. More significantly, the development of Salvarsan was a false lead, as all future antibiotics (after the sulfonamides) would be “natural” molecules gleaned from nature—from fungi or bacteria—and not synthetically created from dyes or other simple chemical molecules. When sophisticated chemical engineering is performed by pharmaceutical companies in the search for a new antibiotic, it is performed upon naturally occurring chemicals already being produced by living organisms.
World War I (1914–1918) introduced horrific methods of combat, and while there were the predictable medical advances achieved from the theater of war, there was a transitory disruption in the German pharmaceutical industrial machine. The German biochemical revolution was fueled by rigorous academic programs at decentralized universities, a cultural identification with industriousness, and the creation of durable funding that was the envy of Germany’s European neighbors. There was a grand consolidation among German chemical and dye businesses following the conflagration, setting in motion powerful chemical, agricultural, and pharmaceutical manufacturing enterprises. Familiar names like Bayer, Agfa, BASF, and Hoechst combined to form IG Farben in 1925, resulting in the largest chemical company in the world. As will be seen, the German chemical corporations’ involvement in World War II was much more diabolical and vastly more damaging.
In the years leading up to World War II, the Teutonic drive for innovation in chemistry had led to great breakthroughs in fertilizer development, which even today accounts for half of the world’s crop production. Assembly-line manufacturing, pioneered by Henry Ford, was fundamental to the next wave of the Industrial Revolution in the early 20th century, but instead of making vehicles, the German research machine would use mass-production organization to tackle scientific challenges with brute force. The testing of prospective chemical compounds was formalized on a grand scale, exposing huge numbers of potential drugs to various bacteria in what was described as an “endless combination game [utilizing] scientific mass labor.”
Paul Ehrlich, the father of histological staining, immunology (he was the first to grasp antibodies), and chemotherapy, died in 1915, just as World War I was exploding. Wartime disruptions and the vacuum left after his visionary leadership led to a lull in chemotherapy discovery. The formation of IG Farben in 1925 and the arrival of Gerhard Domagk (1895–1964) at Bayer in 1927 set the stage for a muscular approach in the quest for a true antibacterial medicine. “If Ehrlich had tested dozens of different recipes in order to find the antisyphilis treatment, Bayer would try hundreds. Or thousands.” In a foreshadowing of the petrochemical polymer industry, Bayer chemists began producing thousands of chemical compounds from coal tar, the thick liquid that is a by-product of the production of coke and coal gas from coal.
Domagk, as a pathologist and bacteriologist (and a wounded soldier in World War I), had gained a specialized understanding of the microbial enemy, and was critical in constructing the experimental framework, having identified a particularly virulent strain of Streptococcus (Gram-positive cocci that link in twisted chains). Streptococcus, the pathogen famous for throat infections, pneumonia, meningitis, and necrotizing fasciitis, was an ideal test bacterium, not only because it was common, but because it killed laboratory animals so terrifyingly efficiently. Domagk, like his famous German predecessor Robert Koch, intentionally infected laboratory white mice with his test bacteria. Thousands of diseased mice died over the first few years of the project, helplessly succumbing to Strep despite being injected with myriad coal-tar derivatives from the Bayer chemists.
Trudging along, as science demands, the scientists continued tinkering with the azo dyes, chemically modifying the compounds with the addition of chlorine atoms, then arsenic, then iodine. Years of failure and almost no hope demanded a resiliency that was perhaps battle-born, but a breakthrough did finally occur in 1932, when the team began linking the azo dyes with a sulfa-based molecule. The protocol that Domagk had practiced for years yielded a monotonous outcome: injecting live Strep cultures into the abdomen of a mouse would result in death within a day or two. But in late 1932, outside Düsseldorf, Germany, twelve mice were administered a new drug—an azo dye amalgamated with sulfanilamide—shortly after being injected with the deadly bacteria. Concurrently, fourteen mice were injected with the same bacteria but were not given any medicine. All fourteen of these control animals were dead within days, while all twelve that had received the new compound, KL-730, lived. The Bayer scientists had stubbornly forged ahead as the carcasses of rodents piled up, but in 1932, the world’s first antibacterial magic bullet had finally been crafted.
Bayer knew that their new medicine, KL-730, which they would name “Prontosil,” was effective against bacteria because of the unique marriage between the azo dye and sulfanilamide. Except that it wasn’t. What the Germans had never performed was an isolated test of sulfanilamide alone. A group of French scientists at the Institut Pasteur in Paris repeated an experiment with various sulfanilamides on a group of forty mice, including a treatment group with sulfanilamide alone and no azo dye.
After a few days, the Parisian team evaluated the response among the test animals. Almost all of the mice treated with the newer azo-sulfanilamide combinations died, but all of the mice treated with Prontosil, Rubiazol, or sulfanilamide alone lived. The Bayer scientists had assiduously labored to protect their patent rights over Prontosil, sure that it represented a bonanza, but they had never considered that sulfanilamide alone might be the subjugator. At about the same time that the Institut Pasteur scientists made their discovery, the Bayer group was unearthing the same sobering fact. While it was a tremendous moment for mankind, it was a financial catastrophe for Bayer; the sulfanilamide molecule had been discovered (and patented) in 1908 by Viennese chemist Paul Gelmo, and was now in the public domain. The financial goldmine had evaporated before their eyes.
Bayer did profit from sulfanilamide. They marketed it around the world as Prontosil, even after realizing that sulfanilamide alone was the effective agent, without the need for the azo dye. (This also explains why Prontosil was effective only in vivo and not in vitro. In a test tube full of bacteria, Prontosil posed no threat; only animals have the enzyme that separates the dye from the sulfanilamide. If testing had occurred only in test tubes, and not in animals, Prontosil would have appeared to be a failure, and it was this and other drugs that educated the early pharmaceutical manufacturers that “pro-drugs” were genuine. At times, pro-drugs are ideal—a pro-drug is intentionally manufactured so it can survive digestion, turning into the active metabolite once in the bloodstream.) Prontosil and other forms of sulfanilamide hit the world market in 1935, immediately making an impact. “Virtually overnight, mortality from childbed fever [Strep pyogenes] fell from 20 to 30% to 4.7%.” Physicians across the United States and Europe embraced the new drug, but the American public became intimately acquainted with the new sulfa drug in 1936, when Franklin Delano Roosevelt Jr., while a student at Harvard College, contracted a life-threatening streptococcal throat infection. Physicians in Boston administered the new magic bullet, saving his life, and in the process helped propel America into the modern age. The New York Times trumpeted the news on its front page, helping ignite a “sulfa craze” across the country, even leading to patients asking their physicians for the new wonder drug by name (a first). Even at the outset of the antibiotic revolution, overprescribing was a temptation.
The European quest for synthetic chemotherapeutic molecules was in full launch mode as the world tilted toward a second Great War. Chemists were obsessed with a haphazard survey of chemicals, believing that the new man-made molecules could outsmart the bacterial enemy. While the modern pharmaceutical industry has created, de novo, chemicals that lower blood pressure, increase blood flow, and alter cholesterol levels, the source of antibiotics would be mother nature, not the minds of scientists. Unbeknownst to the chemists, several years before sulfanilamide was given to a human, an accidental discovery in London had already opened the vistas of future medical care.
Alexander Fleming was a young Scottish physician working at St. Mary’s Hospital in London, and although he was trained as a physician and surgeon, his talents in laboratory research had led him to an eventual career as a bacteriologist. Small and slight, Fleming had joined the inoculation department at St. Mary’s in 1906, soon turning his attention to Paul Ehrlich’s Salvarsan.
Bacterial researchers have always followed the pioneering example of Robert Koch, studying the lives and sensitivities of microbes by growing colonies of bacteria in Petri dishes in a nurturing environment. Fleming and his colleagues focused on important pathologic bacteria like staphylococcus and streptococcus, culturing the bacteria and evaluating the conditions that altered colony formation. In 1922, Fleming and a lab assistant were cleaning up Petri dishes that had been seeded with bacterial colonies when they noticed an odd pattern. Typically, in a Petri dish of bacterial colonies, there is widespread, even growth of bacteria across the dish; instead, Fleming noticed that there were blank areas with no bacterial colonies. In a victory for everyone who has suffered from the common cold and a drippy nose, Fleming recalled that nasal mucus from his own nose had dripped onto the culture dish days earlier, and he rapidly surmised that his own nasal drippings had somehow hindered the growth of bacteria. The shy and reticent researcher concluded that there must be a substance in the nasal discharge that had inhibitory powers, naming it lysozyme. For the first time in world history, a purely organic substance had been characterized as having antibacterial properties.
Lysozyme became a fascination for Fleming, albeit a research dead-end. In time, researchers were able to show how lysozymes function to weaken the cell walls of bacteria, but more important, the recognition of a molecule that inhibited, or killed, microbes prepared Fleming’s mind for his revolutionary observation in 1928.
As summer turned to fall in 1928, Alexander Fleming returned to London from a holiday by the sea. When he arrived at his petite laboratory at St. Mary’s Hospital on September 3 (preserved today as a memorial to the man and his momentous discovery), a jumbled stack of Petri dishes was on a tabletop, including a dish that had fallen off its perch and lost its lid. The story goes that he glanced at the Petri dish and quickly did a double take—dozens of round spots of staphylococci carpeted the dish, but their spread was limited by a large island of white mold on one side of the dish. He recognized a pattern similar to what he had seen five years earlier: the blotch of mold had a surrounding beltway, a demilitarized zone of sorts, where there were no bacterial colonies and no fungus.
Fleming muttered softly to himself, “That’s odd.”
For thousands of years, humans had unwittingly harnessed fungi to make wine and beer and bacteria to make cheese. Fewer than one hundred years before Fleming’s discovery, Louis Pasteur had solved the riddle of fermentation, and less than half a century before, Koch had demonstrated that bacteria were real. Fleming had already concluded five years earlier that lysozymes from human fluids retained antibacterial properties, and now, perched in his little lab above Praed Street, he began conceptualizing that the mold itself was making a substance that was deadly to the staphylococcus.
The name of the mold? Penicillium. (Read that carefully. It doesn’t say “Penicillin.”)
The Penicillium mold was likely a contaminant from within the building or from air drifting through an open window. There has been much conjecture about the source of the mold—was it from a nearby lab, was its presence a hallmark of sloppy research, did it taint the bacterial culture because Fleming’s assistant was slovenly?—but in the final analysis, Penicillium is a common mold that has been making its own special chemical as a defense, likely for millions of years. How it got into that lab is not important, but the fact that Fleming paused to consider its actions is significant.
Correctly ascertaining that Penicillium was producing a substance that inhibited bacterial encroachment, Fleming and his assistant, Stuart Craddock, initially became obsessed with farming Penicillium and harvesting the resultant “mold juice.” Fleming then tested this concentrate on other bacterial samples and found that it was effective against staphylococci and streptococci, finally settling on “penicillin” as the name of the substance that would make him world famous. In March 1929, Fleming published an article titled “On the Antibacterial Action of Cultures of a Penicillium, with Special Reference to Their Use in the Isolation of B. Influenzae.” This predates, by several years, the German discovery of the antibacterial power of the sulfonamides, but Fleming and his team lost out on the designation as providers of the first antibiotic because they could never adequately cultivate the finicky mold in sufficient quantities to make it clinically significant.
In fact, Penicillium was so persnickety that Fleming gave up. It is hard today to reconcile Fleming’s abandonment of (arguably) the most significant drug ever discovered, but the lack of sophisticated research tools, lab space, manpower, and, most important, intense drive to corral the fungus meant that it would be up to another team, more than a decade later, to harness the power of Penicillium. Amazingly, Alexander Fleming walked away from Penicillium and never published on it again.
Excerpted from The Invention of Surgery by David Schneider, published by Pegasus Books. Reprinted with permission. All other rights reserved.