The list of military inventions that have transformed civilian life is long, stretching back at least as far as the Roman Empire’s road system (built for troop transit but foundational to the spread of Christianity as well as the world’s first postal system), if not beyond. More recently, this process of technology transfer has brought us the internet, GPS, and, as Anastacia Marx de Salcedo explains in her new book, Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat, the contents of kids’ lunchboxes as well.
Feeding an army on the battlefield is a perennial logistical challenge, and one that has inspired a variety of ingenious solutions over the last five millennia. On her whistle-stop tour of combat ration prehistory, Marx de Salcedo visits both Mongolian jerky and Aztec cannibalism. The former is still a feature of military cuisine today, while the latter at least makes the U.S. Army’s otherwise reviled MREs seem a little more attractive.
The first big industrial-age innovation in food preservation technology—canning—came about in part thanks to a French government competition to help feed Napoleon’s army. But, as Marx de Salcedo’s narrative makes clear, the conflict that put Goldfish, granola bars, juice pouches, and “homemade” turkey ham sandwiches on sliced wholewheat into her children’s packed lunch was World War II. “In the universe of processed food,” she writes, “World War II was the Big Bang.”
Faced with the challenge of feeding millions of men scattered across the globe, the U.S. Army invested its not insubstantial resources in the development of lighter, longer-lasting rations, either in the army’s own R&D labs in Natick, Massachusetts, or in collaboration with universities. The military shared its findings with American food corporations, who were both eager for lucrative government vendor contracts and loath to invest their own capital in primary research—a dynamic that ensured that the U.S. Army’s combat-feeding objectives underlie an astonishing number of grocery store staples. Wartime innovations in blood plasma transport paved the way for instant coffee, the McRib is descended from military research into “fabricated modules of meat,” and the finger-staining dust on Cheetos can be traced back to a dehydrated, compressed “jungle” cheese invented by government scientists in 1943.
But the path from the Department of Defense to Piggly Wiggly is frequently far from direct. Take the PowerBar, whose origins Marx de Salcedo traces back to the Second World War-era Logan bar, or emergency D ration, a fortified, meal-replacement chocolate bar that was deliberately designed to melt less easily and taste worse, so that soldiers wouldn’t be tempted to eat it before they truly needed it.
Following the war, non-melting chocolate spun off into a separate research project (M&Ms aside, it is still an unsolved problem), while the meal-in-a-bar idea was given a boost by the 1969 development of a mathematical model to map water activity in different foods under different conditions—research that was carried out by MIT scientists but funded by the Natick Center. The amount of available water in food determines, to a large extent, how quickly it spoils, and the ability to accurately model it meant that scientists could begin to reformulate foods to lower their water activity. The result? An explosion of enticingly named “Intermediate Moisture Foods,” including all those cookies, bars, and pastries that combine the previously incompatible ideals of a soft and chewy texture and a near eternal shelf life.
The military hired Pillsbury to make the first IMF bars—Space Food Sticks, introduced in 1970—with the explicit hope that industry would adopt the technology and then invest its own resources to develop improved recipes and production techniques. Within a decade, dieters and outdoor enthusiasts had become the early adopters of the first fortified energy bars, meal replacement bars, and chewy granola bars. Today, the category takes up an entire aisle in most grocery stores, contributing to the comprehensive snackification of the American diet. In turn, the former director of the Combat Feeding Directorate at Natick tells Marx de Salcedo that the military is now considering replacing the outmoded concept of breakfast, lunch, and dinner with “more of a grazing event.”
So far, so interesting, but Anastacia Marx de Salcedo’s writing style may prove an insurmountable challenge for some. Certainly, I found descriptions such as her characterization of life on a Viking longship as “a lot like Beer Pong Night at Phi Sigma Kappa,” as well as her habit of interrupting the text with parenthetical asides such as “(warning: crackpot theory ahead!),” more annoying than amusing.
Meanwhile, although focusing on the particular foods that she, and many other American parents, feed their kids helps give the book a clear narrative structure, it necessarily elides some of the other, equally significant ways in which the Second World War remade the global food system. Lizzie Collingham, in her recent book, The Taste of War: World War Two and the Battle for Food, adds valuable context here, describing how the conversion of munitions and tank production lines to make fertilizer and tractors played an important role in accelerating the post-war Green Revolution in agriculture, while wartime food shortages and substitutes primed a population to value abundance and accept reduced-quality ingredients. U.S. government anxiety about shielding its farmers from a damaging peacetime slump in demand even helped sabotage a Utopian post-war plan to end world hunger, reinforcing inequities that continue to this day.
Anastacia Marx de Salcedo’s larger point, however, is well taken: the needs of the military play an outsized role in shaping the food industry’s research agenda, resulting in the proliferation of products that are optimized for portability, convenience, shelf life, and mass appeal, rather than health, taste, or environmental sustainability. Many contemporary activists place the responsibility for changing America’s food system on the consumer—after all, we do have a choice as to whether or not to buy granola bars and juice pouches. This makes it all the more refreshing to hear someone point out the obvious: when the military is the largest investor in food research, its agenda will, inevitably, shape the American diet.
“What would our food be like if the Natick Center didn’t exist?” Marx de Salcedo wonders, in her final chapter. “The question is unanswerable,” she concludes—but wouldn’t it be nice to find out what might happen if we could untangle taxpayer-funded food R&D from its military goals?