Mikkael Sekeres is the director of the Leukemia Program at the Cleveland Clinic and the author of When Blood Breaks Down: Life Lessons from Leukemia, from which this article is adapted. This story originally appeared on MIT Press Reader.
My patient, a man in his 70s, sat a few feet away from me in a clinic room at our cancer center. His wife was by his side, both literally and emotionally—she was his touchstone, his connection to the normal life he led before his leukemia diagnosis. I noticed they tended to wear outfits that even complemented each other, as if their sartorial choices had harmonized and become intertwined along with their affection over the 40 years of their marriage. Their choice for the day: grey sweatshirts declaring their allegiance to the hapless Cleveland Browns.
He had weathered the slings and arrows of the chemotherapy we used to treat his cancer during a five-week hospital stay, and now was in a tenuous remission. We talked about next steps in his treatment, which ranged from giving him a break, to more chemotherapy, to considering the most aggressive intervention we could offer—a bone marrow transplant.
The phrase “bone marrow transplant” was a bit of a misnomer, though. While we could wipe out any residual leukemia in his bone marrow with high-dose chemotherapy and replace it with fresh bone marrow from a healthy person, we might not be able to find a good bone marrow “match.” Another potential option: We could use umbilical cord blood from a newborn, which is rich in the stem cells normally found in the bone marrow, and which recent studies have shown may not need to match as closely as is necessary for a marrow donor. Hearing this, my patient’s wife interjected.
“Our daughter is pregnant, and her due date is next month,” she started, glancing at my patient as he nodded his head in agreement. “She wanted us to ask if she should save the baby’s cord blood in case he needs it for a transplant.”
I explained to them that the baby’s cord blood was unlikely to be a close enough match to my patient, as my patient’s daughter would only be a half-match for him, and her baby less than that. My patient then asked me a question I have been hearing more and more over the years: “Should my daughter save the cord blood in case our grandbaby needs it, in case he or she develops cancer?”
Indeed, in the US, the practice of storing umbilical cord blood is steadily on the rise. Banking cord blood in case a bone marrow transplant is needed in the future is appealing on so many levels. The umbilical cord attaching the developing fetus to its mother’s placenta is rich in those juicy bone marrow stem cells that are so effective at making the blood components. Coming from an infant at the time of birth, they should be uncorrupted by cancer (emphasis on the should, as we’ll see in a moment). Cord blood is also easy to collect: At the time of delivery, after the cord is cut, the remaining blood in that cord is milked out into a collection bag. That bag is then kept in a freezer until the time comes, if ever, when it is needed and can be infused as a transplant.
The cost for using commercial cord blood banking companies, however, can be substantial. Upfront charges with what’s called an enrollment fee can range from $1,500 to $3,500. On top of that, a yearly storage fee is assessed, with the total amount for 18 to 20 years of storage cresting $5,000 in some cases.
Brochures for these companies line Plexiglas display cases in obstetrics offices, with pamphlets exhorting nervous, expectant parents to protect their baby from the medical evils that lie ahead. What better source for a transplant than a child’s own, pure stem cells, harvested at a time years before that child ever developed cancer? But cost aside, is the effort even worth it for the risk that a child may one day develop a cancer and need a future transplant?
To answer this question, we need to take a couple of things into consideration. First, what is the likelihood of a child developing a cancer, and then needing a transplant to treat that cancer? A study conducted by the Center for International Blood and Marrow Transplant Research attempted to figure this out. The researchers first identified the cancers for which transplantation could potentially be needed. For people aged 0 to 19 years (the length of time cord blood would be kept banked), leukemia was the most common, followed by lymphoma, neuroblastoma, brain tumors, and sarcomas. Cancers in children and adolescents are rare: all told, the incidence rate in the US for all of these cancers combined is about 12 per 100,000 children per year. It’s horrible if it’s your child who develops cancer, but pediatric cancer is still an uncommon event.
The next estimate rests on the likelihood that these cancers would not be eradicated by chemotherapy and/or radiation therapy and would require an allogeneic transplant (one that uses stem cells taken from a genetically matched donor), along with the assumption that everyone who needed a transplant could identify a sibling or “brother from another mother” donor and was healthy enough to undergo the procedure. The authors estimated that the incidence rate of transplant for children and adolescents in the US was a little over 2 per 100,000 per year during their first two decades of life. Put another way, the probability that a child will need a transplant by the time he or she reaches age 20 is about 0.04 percent.
The lifetime chance of getting struck by lightning is similar, at about 1 in 3,000, or 0.033 percent.
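The comparison rests on simple back-of-envelope arithmetic with the figures quoted above, which can be checked in a few lines (this treats the annual rate times 20 years as an approximation of cumulative risk, not a formal survival model):

```python
# Rough arithmetic behind the transplant-vs-lightning comparison.
# Rates are the figures quoted in the text; multiplying a small
# annual rate by years is a reasonable approximation here.

transplant_rate_per_year = 2 / 100_000   # pediatric transplants per person per year
years = 20                               # first two decades of life

prob_transplant_by_20 = transplant_rate_per_year * years
print(f"{prob_transplant_by_20:.2%}")    # 0.04%

lightning_lifetime_odds = 1 / 3_000
print(f"{lightning_lifetime_odds:.3%}")  # 0.033%
```

The two probabilities land within a hair of each other, which is the point of the lightning analogy.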
Would you pay thousands of dollars for a medication right now, in the event that sometime in your life you may be struck by lightning, and that medication may help you survive the lightning strike?
Seems excessive to me.
A second way of determining the value of cord blood banking in case a child develops cancer is to consider whether that cord blood is really as pure as we think. The most common childhood cancer through age 19 is leukemia, with an annual incidence rate of 4.7 per 100,000 children in the United States. Could it be possible that the leukemia was present at some small level even at birth, years before the child was diagnosed with leukemia?
One approach to studying this would be to screen every newborn for leukemia. Given the incidence rate of childhood leukemia, this would mean subjecting over 21,000 babies to a blood test for every case of future leukemia identified.
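The 21,000 figure is just the reciprocal of the incidence rate; a quick sanity check using the 4.7 per 100,000 rate quoted above:

```python
# How many newborns would need a blood test to catch one future
# case of childhood leukemia, given the quoted annual incidence.

leukemia_rate = 4.7 / 100_000        # annual incidence per child in the US

babies_per_case = 1 / leukemia_rate  # newborns screened per future case found
print(round(babies_per_case))        # 21277
```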
It’s difficult to justify that type of monumental screening effort to answer a research question about the origins of leukemia. A more reasonable approach would be to identify children who have leukemia, and try to determine whether they had it when they were born.
But how to go about obtaining a blood sample from a birth that occurred years earlier? A group of clever scientists from the UK and Germany thought the answer might be found in something called Guthrie cards. Robert Guthrie was a microbiologist working at the Roswell Park Cancer Institute in Buffalo, New York, in the 1950s when his niece was diagnosed with phenylketonuria (PKU), an inherited deficiency in the enzyme necessary to metabolize the amino acid phenylalanine. If caught early enough, an infant’s diet can be modified so that the effects of the deficiency are minimized. If not, the condition can lead to developmental defects and mental disability.
Guthrie’s niece was not so lucky.
This, and having a child of his own with cognitive delays, motivated Guthrie to devote his career to detecting preventable childhood diseases. He developed a test for PKU that could be performed when a drop of blood from a finger prick or heel stick was applied to filter paper on a card. It was successfully piloted in Newark in 1960, and by 1963, 400,000 infants had been tested in 29 states. Testing spread around the country, and across the pond.
And hospital laboratories kept those Guthrie cards for years after a child was born.
The scientists found three children with acute lymphocytic leukemia (more common in children than acute myeloid leukemia, whereas the opposite is true in adults) whose leukemias shared the same chromosome mutation: a translocation of chromosomes 4 and 11. After obtaining permission from the children’s parents, the scientists searched laboratory repositories for the Guthrie cards stored from when the children were born. They ran a PCR-based lab test specific for this translocation on the dried blood still remaining on the cards, and were able to detect the chromosome abnormality in all three children, from a blood drop obtained months or years before the leukemia was diagnosed. In another, similar study, the same group of scientists was able to detect chromosome evidence of leukemia in 9 of the 12 Guthrie cards obtained from children who were diagnosed with leukemia between two and five years later.
The leukemia was there all along, even prior to birth in these children, waiting years in some cases to rear its ugly head. And if the leukemia was measurable on a genetic level in their blood, it was almost certainly present in their cord blood. Banking cord blood from these children would have preserved those juicy, healthy stem cells, but probably also cells already corrupted by the genetic abnormalities that lead to leukemia: cells that would have been re-infused into the child had the cord blood been used as a transplant years later.
Getting back to the question: Is the cost and effort of banking cord blood worth it for the risk that a child may one day develop a cancer and need a future transplant?
I didn’t think so when my three children were born.
But I did have their cord blood collected and I donated it to be stored for use through the Be The Match program, in case a complete stranger needs it. So that one day, my children could be the brothers from another mother, or sister from another mister—me being the mister!
And so that one day, my patients won’t have to forego potentially curative treatments for their leukemias because they can’t find an adequate donor.