In the mid-1920s, Dr. Clara Davis assembled a group of fifteen infants — most of them orphans, many of them undernourished, all under a year old — isolating them from all social contact in an experimental nursery. For months, the babies were given free rein to assemble their own meals from a limited selection of 34 foods. These included fruits and vegetables (bananas, peas, turnips), grains (oatmeal, Ry-Krisp), and meats (chicken, liver, bone marrow), each mushed or minced in a bowl. A nurse was instructed to sit stonily by, spoon-feeding a child only when he or she had shown definite interest in the contents of one of the bowls.
The results were remarkable. In months, haggard, hollow-eyed babies grew cheeky and plump; rickety infants cured themselves through diet alone. Rather than shying from new flavors, the babies tried nearly every dish, developing their own unique preferences along the way while managing to select a diverse and wholesome diet whose balance of calories from protein, fat, and carbohydrates mirrored nutritionists’ ideal ratios.
It’s the kind of experiment that (hopefully) would never make it past any contemporary institutional review board. Davis, worried about the many sickly, undernourished children in her care who stubbornly refused the nutritious foods urged on them, had simply wondered: what would children choose to eat if they were liberated from the pressures and expectations of parents and doctors?
Nearly ninety years later, Davis’s research continues to be widely invoked as evidence that we instinctively know our own nutritional needs. So why do so many of us seem to be so bad at “healthy” eating? The usual story is that the entanglements of modernity — technology, processed foods, advertising, perhaps even culture itself — have alienated us from our innate biological wisdom, or worse, have exploited our hard-wired appetites, imperiling our health to feed corporate treasuries. What is the “paleo diet” if not a stage-crafted return to our primitive, evolutionary nature, when desires were precisely congruent with needs? We are born knowing how to eat, goes the story, and we spend our lives forgetting.
Nonsense! writes Bee Wilson in her compelling new book, First Bite: How We Learn to Eat. The experiment was a rigged game. The carefully structured environment of Davis’s study made it impossible to choose poorly. Although none of the children proved fully omnivorous, their individual predilections mattered little, as all possible food choices were good ones.
Moreover, how much can we conclude about the real world from such an unworldly and artificial scenario? “We cannot arrive at the truth about appetite by removing all social influences,” Wilson concludes. “Appetite is a profoundly social impulse.”
In First Bite, Wilson argues that there’s little about the way humans eat that can rightfully be attributed to basic instinct. It’s (almost) all learned behavior. That is, in order to grasp — and potentially change — how we crave what we crave, neither neuroscience nor endocrinology nor experimental psychology, nor any laboratory science, will provide adequate answers. We have to look to history and culture, family and state, as well as to brain and body.
Wilson is a Cambridge-trained historian whose previous books include Swindled, a history of food adulteration, and the excellent Consider the Fork, which demonstrated how the tools we use to cook and eat have shaped what we cook and how we eat. In this one, she proves to be a clear-eyed and level-headed guide to the fraught and fretful landscape of contemporary dietary research.
Wilson tells us that researchers have documented a developmental window, between four and seven months, when infants are strikingly receptive to new flavors before becoming once again fussy and resistant. Unfortunately, standard pediatric feeding recommendations urge breast- or bottle-feeding during that period, foreclosing the possibility of expanding infant palates at this tender age.
Yet Wilson takes pains to show that biology is not destiny. By understanding that preferences are developed through experience, we can develop better ways of effectively educating children’s palates, building from tiny tastes of loathed vegetables to familiarity, acceptance, and possibly, delight. Even for eating disorders where there seems to be a strong genetic component — such as anorexia nervosa — or cases of extreme food selectivity, Wilson details therapeutic practices that have shown promise, particularly those that place feeding and eating in the broader context of social and family life.
In several chapters, Wilson uses the dynamics of family life at the table to explore the different ways that gender and culture affect how we eat, and in particular, the ways that families preserve cultural practices that fit poorly in conditions of sweet abundance. “The way we reward children with food is based on folk memories of a food supply that has not existed in the West for decades,” she writes — a time when white sugar and refined flour were rare delights rather than readily available snacks.
Some may argue that there’s little practical difference between claiming that modern diets are out of whack with evolutionary biology and claiming that they are out of step with culture. Doesn’t this just substitute one difficult-to-change cause for another? But Wilson reminds us that cultures can, and do, change, and that modernity does not automatically mean a decline from some imagined ideal traditional cuisine that was both nutritionally and gastronomically replete.
In the book’s final chapter, Wilson shows us that Japanese cuisine, lauded both for its salubriousness and for its attention to sensual experience, is not a centuries-old tradition but a modern creation, forged from Meiji-era dietary policies and postwar US food aid. She also details educational initiatives (in European countries such as Finland and France, alas) that have improved children’s health and expanded the diversity of their diets by helping them develop rich, multisensory relationships with food and eating.
Although much of the book focuses on children, Wilson’s central message is aimed squarely at adults. Eating is something we must learn how to do, but it is not something we learn once and for all. The mistake is to assume that our appetites are inborn and indisputable, and our habits immutable. The most important lesson of the book is this: pleasure matters. And what we find pleasurable can change, a process that she calls a “hedonic shift.” We can’t expect to eat better if we don’t like what we eat.
Yet for someone who has written an admirable account of how the tools we use shape cooking and eating, the book’s general lack of engagement with the possibilities of food technology, and the labor of food production, is disappointing. Although Wilson writes with great eloquence and considerable insight about how processed food fuels our cravings not just for salt, sugar, and fat, but also for nostalgic connection with our own histories, offering “a continuity with the past that you just can’t get from other foods,” she dismisses industrial food as a realm of false pleasures, a peril to be taken in careful moderation. Indeed, much of Wilson’s food advice will sound familiar to fans of Michael Pollan, Mark Bittman, Marion Nestle, and others who extoll the benefits of a whole-foods diet. I wished Wilson had acknowledged that eating less processed food means doing more of our own food processing, which can have its own dispiriting spiral of consequences for time, love, and work.
There are other quibbles. Wilson is a lucid and compelling writer, weaving nimbly between historical narrative, scientific research, and personal anecdote. But like a Las Vegas buffet, the book at times groans under the load of its own endless variety. At one point, an anecdote from the life of nineteenth-century French philosopher Charles Fourier appears next to an account of present-day Chinese grandparents indulgently overfeeding their grandchildren, and is followed by an early twentieth-century rhyme from her own grandmother about food waste. It can feel a bit disorienting.
At some point, after a certain number of stories about people in labs being fed soup, milkshakes, or “protein ‘preload’ substances” so that researchers may measure their hunger cravings, the reader may notice that her own appetite for such scientific studies has waned considerably. This reader also wondered why the incisive skepticism that Wilson fruitfully applied to her accounts of early- and mid-twentieth-century food research seems to be absent from her summaries of more recent studies.
As in Wilson’s earlier Consider the Fork, research-dense chapters are interlarded with delightful, illustrated squibs — in this case, moral fables featuring milk, birthday cake, chili peppers, and other foods, meant to dramatize the prescriptions for better eating that Wilson insists are not “advice.” But make no mistake. This book aims to make a difference in the way we think about food and eating, both for ourselves and our families, and for public and educational policy.
Reasonable books making reasonable recommendations risk accusations of self-evidence; some readers may come away from First Bite feeling that its findings are rather self-explanatory. But this does not make this book any less useful or necessary. Wilson writes that people sometimes would “get a little angry” when she described the central argument of the book, that we learn to eat, and that our tastes are culturally and socially constructed. “What about genes?” they would insist.
As a historian who writes about industrialized food, I’ve encountered this reaction myself. But why is this the case? Why do so many Americans (and Britons) seem stubbornly committed to the idea that our tastes are inborn, and thus uneducable? Alas, this is one mystery that First Bite does not clarify.
Nadia Berenstein is a Ph.D. candidate in History & Sociology of Science at the University of Pennsylvania; her dissertation tells the story of the history of synthetic flavors and flavor science in the United States. You can read more about her research on her blog, Flavor Added, or follow her on Twitter @thebirdisgone.