Of Brain Disease and Belly Ache

In her inaugural post the doctor explains why eating humans is bad, and why eating margarine is barely any better

I’m writing a screenplay for the next big Hollywood blockbuster. The main character is a Harvard-educated doctor who conducts research on a remote South Pacific island in the 1960s. The doctor realizes that the native people on this island are suffering from a devastating epidemic. He notes the symptoms of this mysterious disease: first, the infected victims begin to tremble; they lose the ability to walk and begin to laugh a terrible, demonic laugh; dementia and death soon follow.

Before long, the doctor deduces that this nightmarish affliction is transmitted by . . . eating the brains of dead relatives as part of a funeral ritual. (I haven’t quite figured out how to work this detail discreetly into the script—I’m writing a classy screenplay, after all, not Silence of the Lambs IV.) Coincidentally, the government has recently outlawed funerary cannibalism, and as the native people stop eating human brains, the disease gradually disappears. Doc wins a Nobel Prize for his work. Then he is arrested and spends time in prison. I added that extra bit because the unofficial rules of a Hollywood screenplay state that all geniuses should suffer from serious personal or psychological problems so that the average American viewer doesn’t develop an inferiority complex. That’s why we have big budget movies about John Nash and Bobby Fischer (who struggled with social isolation and mental illness) but not about Einstein (who struggled with quantum theory and bad hair).

Do you think my screenplay will sell? I haven’t quit my day job yet. But all of that bizarre story is true: in 1976 Dr. Daniel Gajdusek won the Nobel Prize for discovering the cannibalistic origins of kuru, a prion disease that was decimating the Fore tribe of New Guinea. (Prions are infectious particles made only of abnormally folded proteins—Mad Cow Disease is another prion disease that has been in the news lately.) Like Mad Cow Disease, kuru can be slow and insidious; people may develop overt symptoms decades after they were first exposed to the infectious particle. There were quite a few articles published about kuru back in the day, before political correctness and sensitivity training were in vogue; a 1968 JAMA article on kuru is wittily titled “On not eating your neighbor.”

But why am I harping on about kuru? According to a recent article in The Lancet, there have been only eleven identified cases of kuru between 1996 and 2004—all of them acquired (presumably) before the Australian government cracked down on cannibalism in the 1950s. I bring it up because the case of kuru sets an interesting precedent for diet, law and disease prevention.

This July was the final deadline for the trans fat ban in New York City restaurants. Now all food service establishments in the city are forbidden from using cooking fats or spreads with more than 0.5 grams of trans fat per serving. When you think about it, this situation is not so different from the banning of cannibalism in New Guinea; both are government interventions that impact people’s eating habits and (hopefully) improve public health. Maybe my comparison is a bit far-fetched—but it still makes for great cocktail party conversation. (I mean, so I suspect—for some reason I never get invited to cocktail parties anymore.)

Another interesting point about the trans fat ban is that most related advertising has only a negative slant: products are labeled “no trans fats.” Other products often advertise both what they contain and what they lack; for example, diet cheese may state that it is “dairy free” but also that it is “made with soy.” Very few foods, however, emphasize the presence of the alternative to trans fats, which are known as “cis fats.” What are cis fats? (I always thought the term “cis fat” sounded like the name of an early 1990s female rapper: “Sis Phat in the house, yo.” But I digress.) To understand cis fats, it helps to first explain a bit about trans fats—a kind of unsaturated fat, meaning they’re made up of hydrocarbon chains that contain at least one double bond. The atoms in trans fats are arranged around the double bond in such a way that the fat molecules are fairly straight and can pack tightly, especially in places you don’t want fat to be, like in your arteries.

Cis fats, which are also unsaturated (i.e. they contain double bonds), have their atoms arranged so as to make the fat molecules fairly bulky. Cis fats thus can’t pack together as tightly and are more fluid than trans fats. (Imagine Sis Phat as a more flexible dancer working the floor and trans fat as the uncool dude sitting behind the table in the corner.) Saturated fats, which have no double bonds, pack together tightly as well. Research suggests that trans fats are even worse than saturated fats because trans fats decrease the level of good cholesterol (HDL) in the body.
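For the chemistry-inclined, the cis/trans distinction above can actually be written down in a line of text. Chemists encode molecules in SMILES notation, where “/” and “\” mark which side of a double bond each neighbor sits on. Below is a small illustrative Python sketch (my own aside, not part of any official nomenclature tool) using the classic pair: oleic acid (the cis fat in olive oil) and elaidic acid (its trans twin, common in partially hydrogenated oils)—same atoms, different shape.

```python
# Oleic and elaidic acid share the formula C18H34O2; only the geometry
# around the single C=C double bond differs. In SMILES, markers on
# opposite sides (/C=C\) mean cis; matching markers (/C=C/) mean trans.

OLEIC_ACID = r"CCCCCCCC/C=C\CCCCCCCC(=O)O"    # cis: chain has a kink
ELAIDIC_ACID = r"CCCCCCCC/C=C/CCCCCCCC(=O)O"  # trans: chain is straighter

def bond_geometry(smiles: str) -> str:
    """Toy classifier (assumes exactly one 'C=C' with direction markers
    immediately on both sides): report cis or trans geometry."""
    before, after = smiles.split("C=C")
    left, right = before[-1], after[0]   # the '/' or '\' markers
    return "trans" if left == right else "cis"

print(bond_geometry(OLEIC_ACID))    # cis  -> the bulky, fluid fat
print(bond_geometry(ELAIDIC_ACID))  # trans -> the straight, tight-packing fat
```

The kinked cis chain is exactly why olive oil is liquid at room temperature while trans-heavy margarine is solid: straight molecules stack; kinked ones don’t.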

Some folks might bring up freedom of choice, and say that, as with smoking, people ought to be allowed to make their own decisions about what they put into their bodies. (Probably the Fore of New Guinea weren’t too happy when the Australian government banned cannibalism at funerals: “What, we can’t eat dead people, but you can eat Vegemite, which, everyone knows, tastes like dead people?!”) On the other hand, trans fats aren’t really such exciting things to eat; they tend to have long, hyphenated, scientific-sounding names. Butter, olive oil and, heck, even natural lard aren’t trans fats, and thus will still be around in all their greasy goodness. As for a ceremonial brain substitute, we’re still on the lookout for that one.

Welcome to the inaugural post of The Doctor Is In. Medicine and the biomedical sciences are chock full of the bizarre, the fantastic, and the downright disgusting. As a medical student with a peculiar sense of humor, I’d like to share some of my favorite examples of weird and wild stories of the human body, health and disease.