Last week, The Wall Street Journal reported that a team of researchers in China was treating terminally ill cancer patients with the gene-editing technique known colloquially as CRISPR. According to the Journal's report, the Chinese researchers are attempting to halt disease progression in patients with esophageal cancer by tweaking a piece of DNA in some of their white blood cells. This adjustment changes the way their immune system fights the cancer.
But how did China edge out the United States to become the first to use CRISPR in humans? American researchers were, after all, the ones who discovered the technique's ability to tweak and alter DNA. Still, as the Wall Street Journal points out, America is right on China's heels. Carl June, a pioneer in immunotherapy cancer treatment at the University of Pennsylvania, is awaiting clearance to begin similar trials of his own, using CRISPR in immunotherapy-based treatments (therapies that harness the immune system to fight cancer) for certain malignancies. He might get clearance from the FDA as early as next month.
According to Nature, the American trials will be similar to the newly approved immunotherapy treatment called Kymriah. That treatment involves drawing a person's blood and isolating their T-cells (a type of white blood cell). Using another form of gene editing, researchers employ a disabled HIV virus to attach a cancer-finding receptor to the T-cells. The re-engineered T-cells then grow and proliferate in the lab. When ready, doctors infuse the newly engineered T-cells back into the cancer patient's body, where they go to work finding cancer cells and killing them. While this treatment can be highly effective (often in people who have exhausted every other medical option available), the process itself is extremely time consuming, and it doesn't work for everyone. The goal of this new immunotherapy trial involving CRISPR is two-fold: to see whether CRISPR-based therapies are indeed safe for humans, and whether CRISPR could help make therapies like Kymriah more efficient and effective.
If approved, the new phase-one human trial would use CRISPR to tweak the DNA in a person's T-cells in three ways: The first edit would do the same job as the HIV virus in the Kymriah approach, attaching a receptor that finds cancer. The second would remove a protein that could interfere with that receptor. The third would prevent a cancer cell from finding the T-cell by removing a protein that acts, so to speak, as a tracking device for the malignant cell. If successful, these changes could make immunotherapy far more effective. And because CRISPR is relatively easy to use in a laboratory setting, it could make treatments that rely on the process far more efficient, potentially increasing availability and decreasing the time (and money) spent producing the drug.
But China is already far along, and in some instances its efforts are showing positive results. If American researchers led the CRISPR discovery and the early race, what is slowing the United States down enough for China to take the lead? It's a little something called the FDA. And the lost race may be worth it. To gain approval for their trial, the Chinese researchers had to present their plan to the hospital's ethics committee. According to the Wall Street Journal, that committee is made up of a handful of the hospital's doctors, a lawyer, and a former cancer patient. The group discussed the issues for a few hours before greenlighting the human trial.
Unlike China, the United States has a far more painstaking and demanding system for bringing a new drug or therapy to market. For this trial in particular, Carl June had to present his plan to a National Institutes of Health (NIH) advisory committee. Then he had to submit the project to the FDA itself, which requested that June perform further lab tests on human cells before attempting the therapy in humans. The main concerns are the same ones at the forefront of other CRISPR trials: Will the DNA-editing tool make unintended cuts? Could those edits cause unforeseen adverse consequences for the patient, either right away or later on? Because CRISPR is such a new discovery, researchers around the world worry that scientists have not yet worked out all of its kinks.
The FDA's caution is high for anything that uses gene editing or CRISPR. But other drugs go through an extremely rigorous process as well, and for good reason: Before the FDA existed, manufacturers could market and sell any drug without saying what was in it and without showing that it could actually treat the condition you were buying it to treat. While it's hard to sit with the idea of losing a medical-breakthrough race, it's important to remember how and why the United States created the FDA in the first place.
Today, we take prescription drugs knowing exactly what the side effects could be and the general probability of their occurring. But imagine taking a new drug without knowing these vital pieces of information. Would you still swallow it? We don't often think about the long and arduous process drugs now go through before they reach our bodies. But these processes exist for an extremely good reason. Consider the fecal transplant. The treatment, which involves collecting stool from a volunteer and administering it to a person with a severe gut infection caused by the bacterium Clostridium difficile, has shown immense success, but it is not yet approved by the FDA.
Part of the reason is that the treatment is, like the CRISPR studies, entering uncharted territory. There's much scientists still don't understand about the human gut microbiome (the collection of bacteria that live in our intestines) and its effects on our health. Transplanting one person's gut microbiome into another could indeed cure them of their infection, but it could also cause unwanted consequences. The microbiome affects both our digestion and our immune systems. Theoretically, the recipient of a fecal transplant could be at higher risk of developing immune-system disorders because of the gut microbes they now carry. At the same time, just like the cancer patients in the CRISPR immunotherapy trials, C. diff patients are often in a life-threatening situation, and the fecal transplant is a last-ditch effort to clear the infection. Researchers admit it's hard to know how cautious to be when the people who could benefit from a treatment are already running out of time. In the immunotherapy studies, most patients have exhausted all other available options.
For a drug to reach your medicine cabinet, it must have gone through three phases of clinical trials. Phase one simply shows that a drug or therapy isn't toxic. If a drug makes it past that point, it moves on to phase two, in which it must again demonstrate that it is non-toxic, but also prove that it's effective, doing the job it's meant to do. In phase three, researchers must test the drug against the currently available treatment (if one exists) for the condition the new drug is attempting to treat. If a drug doesn't work any better than one already on the market, and it has no other redeeming qualities, such as a lower price or milder side effects, then it's much harder for drug companies to gain FDA approval to start selling it.
Each step of this process is long and arduous, but the hurdles exist for two main and super important reasons: to determine the drug's toxicity (in other words, its chances of killing you or causing other forms of short- or long-term damage to your body) and to determine whether it actually does what it says it's supposed to do. If a drug is supposed to reduce the symptoms of heartburn, does it actually do that? It seems obvious that this kind of testing should exist. But the only reason it does is that there was a time when it didn't.
Back in 1906, the United States Congress passed the Pure Food and Drug Act, which laid the groundwork for today's Food and Drug Administration. At that point, the law's main purpose was to prevent the sale of mislabeled or tainted food, drinks, and drugs. Simply put, a product had to contain what its label said it contained.
The current regulations governing the FDA's testing process are a product of our own mistakes. In 1937, shortly after a drug called Elixir Sulfanilamide reached the market, it caused the deaths of 107 people, many of them children. The drug's active ingredient, sulfanilamide, was used at the time as an antibiotic to treat anything from gonorrhea to strep throat. The drug originally came in the form of a pill. But one pharmaceutical company, the S.E. Massengill Company, decided the therapy would be even more popular as a flavored liquid. So it had a chemist mix sulfanilamide with diethylene glycol and water, plus a little raspberry flavoring. Once ready, the company labeled the concoction accordingly and distributed gallons of it across the country, and pharmacies readily purchased it.
Diethylene glycol is an excellent solvent; it readily dissolves other ingredients into a well-mixed liquid. As such, the new formulation was deemed a great success. Until people started actually taking it. It turns out that in addition to being a good solvent, diethylene glycol is also extremely poisonous to humans, causing acute kidney failure. Reports soon came in of deaths among people who had taken the liquid medicine, and it was swiftly pulled from the market. The problem was that the drug had gone straight from the laboratory to the medicine cabinet, with no testing beforehand whatsoever. So the following year, in 1938, the United States passed the Federal Food, Drug, and Cosmetic Act, which required manufacturers to show that new drugs were safe before selling them, essentially what phase one of today's clinical trial approval process accomplishes. This began an entirely new wave of regulations, each bringing us one step closer to the rigorous process we have today. Regrettably, sulfanilamide wasn't the only infamous incident to tweak this regulatory process. Other tragedies over the past century have shaped it as well.
Then there's the now-infamous case of thalidomide. In the 1960s, the drug was a sleeping pill that quickly became widely popular in Germany. Soon after, Australian doctors discovered that it could also alleviate the nausea caused by morning sickness in pregnancy. So doctors started to prescribe the drug off-label (a practice still very much in use today) to pregnant women. But while some testing had been done beforehand to determine the drug was safe for humans to take, no one had studied its effects on a developing fetus. As the world soon learned, thalidomide can cause severe birth defects, specifically the shortening or complete absence of limbs. This, in part, led the U.S. to create much more stringent laws around drug dispensing, requiring drug makers to prove their drug works before it can gain FDA approval.
We have birth control to thank for laws mandating that drugs come with patient package inserts listing every known side effect and the chances of each one occurring. Initially, birth control was almost taken off the market because of its dangerous side effects. But women rightfully pressed for the chance to decide for themselves whether to take a drug given its risks. Today, that's typically how physicians present drugs to their patients: weighing the benefits against the risks, which are clearly and accurately made available, and allowing the patient to choose.
Having learned from our past, we would be wise to remain cautious and trust the regulatory procedures that have been a century in the making.