Last week, a study emerged on the oldest-known healed surgical amputation, found on a young adult in an art-covered cave in Borneo. In that research, the authors argued that because it’s so rare for humans to survive amputations without further treatment, the young adult must have received sophisticated medical care that allowed them to live for years after the surgery.
“We infer that the Late Pleistocene ‘surgeon(s)’ who amputated this individual’s lower left leg must have possessed detailed knowledge of limb anatomy and muscular and vascular systems to prevent fatal blood loss and infection,” the authors wrote. “They must also have understood the necessity to remove the limb for survival. Finally, during surgery, the surrounding tissue including veins, vessels, and nerves were exposed and negotiated in such a way that allowed this individual to not only survive, but also continue living with altered mobility.”
But people have survived amputations more often than we’d think. So does that mean the trauma doesn’t always require sophisticated medical care? Or are treatments that we now think of as simple more effective than we give them credit for?
For Gillian Bentley, a biological anthropologist at Durham University in England, the example that comes to mind is a subsistence hunter she met decades ago in what’s now the Democratic Republic of the Congo. One morning, she was up early having breakfast when an old pygmy man came into the village. “He walked with a stick, and the bottom of his leg and foot were missing,” she says.
His story, relayed to her by a colleague who had conducted demographic research in the area, was that the man, a member of the semi-nomadic Efé ethnic group, had cut off his own leg after being bitten by a venomous snake while looking for food. Evidently, he’d survived despite having to walk through tropical jungle with a healing wound, at constant risk of infection.
That man’s survival suggested to Bentley that amputation “didn’t require a sophisticated surgery.” “He did what he had to do in that circumstance,” she adds. She wasn’t there studying medicine, but she says she never saw evidence of traditional antimicrobial use in the community, where painful bacterial infections called tropical ulcers were common. More often, people approached the missionary station in her village for care.
But was that Efé senior evidence that people are regularly able to survive amputations? And if so, does that mean the Bornean young adult didn’t need such specialized care? “I’d be careful about extrapolating from work with hunter-gatherers today, especially based on a single account,” says Karen Hardy, an archaeologist at the Autonomous University of Barcelona, who studies the history of herbal medicine.
She points out that contemporary people who practice hunter-gatherer lifestyles live in different social and ecological contexts than those of thousands of years ago. Over the past two centuries, the Efé have had their world shaped by slave raiders, colonial plantations, and warfare. More broadly, social transitions since the Paleolithic period have eroded herbal knowledge, Hardy says. Her research suggests that the shift to agriculture and sedentary lifestyles during the Neolithic was associated with the use of fewer medicinal plants.
So the Efé case might not be representative of how people survived amputations historically. “[Paleolithic people] will have known about antibacterials and how to stem blood flow,” Hardy says. “There are lots of plants that do that, and lots of evidence that they were able to use these plants. I have no problem at all accepting that it took expertise to do an amputation in terms of pain relief and antibacterials.”
How you interpret these case studies hinges partly on how you define “sophisticated medical treatment.” While contemporary surgery, with its schematic understanding of arteries, nerves, antiseptics, and anesthesia, has delivered obvious benefits, older treatments might still have been effective without fitting neatly into the framework of modern medicine.
In records of 18th- and 19th-century amputations by European and American armies, significant numbers of patients survived, in an era notorious for deaths from infectious disease. One doctor aboard the warship HMS Victory recorded 102 injuries and 10 amputations after a battle in 1805; only six of the wounded died. “They didn’t know about hygiene or hand-washing, and antiseptics didn’t come along until about 1875,” says Mervyn Singer, a critical care doctor at University College London. In other data Singer has compiled, survival after amputation during the American Civil War ranged from 50 to 90 percent—not good odds, but impressive given the resources available on the battlefields.
A recent analysis of skeletons buried in a hospital cemetery in England during the 18th and 19th centuries found that a surprising number had healed amputations. And that’s just among those who died there. “Those who had a full recovery would have left [the hospital], so it’s a biased sample,” says Rebecca Gowland, an anthropologist who studies skeletal trauma at Durham University.
Singer, who’s gathered military records on amputations as part of his research on how the body survives stress, says that many of the advances in his field in the past few decades involve realizing that sometimes a sick person requires fewer interventions, rather than more. That includes “giving them less fluid, ventilating them less hard, sedation that’s less hard, and not feeding them aggressively,” he says. His theory is that the body has a wider ability to handle trauma than we give it credit for, including, perhaps, an amputated limb.
But battlefield amputees also received non-surgical care, and some of it might be more effective than we realize, Singer says. As far back as the 1500s, surgeons knew they could stop arterial bleeding with forceps. Amputation patients might have had the wound cauterized with hot oil, copper sulfate, or tar, which could both stop bleeding and sterilize the site.
To Hardy, that last treatment is particularly telling, because there’s ample evidence that prehistoric people—even pre-human hominids—gathered and worked with naturally occurring tar, or bitumen. Neanderthals used bitumen and pitch to attach stone tool heads to wooden handles. It’s not a stretch to think that some of them might have incorporated the material into their medical regimens.
There’s even evidence that caregivers applied treatments that are now widespread well before the modern medical system existed. The first defibrillators appeared in the late 1700s, though it took roughly 150 years for them to become standard in hospitals and emergency services.
Hardy gives the example of North African herders who treated saddle sores with a specific mold that turned out to be Penicillium. “It’s just a world out there of stuff that nobody knows yet,” Hardy says. People might be better at surviving a lost limb than we’d expect, but that might also be because we’re better at treating ourselves than we realize.