[Image: a bag of chocolate chip cookies with a label reading "Nutrition facts, one cookie, 320 calories." Caption: Calorie counts are built on decades-old science. Popular Science]

This post has been updated. It was originally published on 1/12/2016.

America’s century-old system for counting calories in food comes from chemist Wilbur Atwater. In 1887, he began to research the energy we get from eating by measuring the energy content of different foods and subtracting the amount of energy left in people’s bodily excretions.

Atwater’s research has since been boiled down to the 4-9-4 rule: Each gram of protein, fat, and carbohydrate provides, respectively, 4, 9, and 4 calories of energy. The United States Department of Agriculture (USDA) has used these figures for decades, tweaking them only to account for different qualities—such as the digestibility—of specific foodstuffs.
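
To make the arithmetic concrete, here is a minimal sketch in Python of how a label value falls out of the 4-9-4 rule. The cookie's macronutrient grams here are hypothetical, chosen only to illustrate the calculation; real labels also fold in the USDA's food-specific adjustments.

```python
# Minimal sketch of the Atwater 4-9-4 rule: estimate calories from
# macronutrient grams alone. Real labels also apply the USDA's
# food-specific adjustments (e.g., for digestibility).
ATWATER_CALORIES_PER_GRAM = {"protein": 4, "fat": 9, "carbohydrate": 4}

def estimate_calories(grams_by_nutrient: dict[str, float]) -> float:
    """Sum the calories contributed by each macronutrient."""
    return sum(ATWATER_CALORIES_PER_GRAM[nutrient] * grams
               for nutrient, grams in grams_by_nutrient.items())

# A hypothetical cookie: 4 g protein, 16 g fat, 40 g carbohydrate.
# 4*4 + 16*9 + 40*4 = 320 calories.
print(estimate_calories({"protein": 4, "fat": 16, "carbohydrate": 40}))  # 320
```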


But in the past decade, nutritionists have clamored for a reappraisal of how calorie counts and nutrition information are calculated. For one thing, they say the present system ignores the difference between raw and cooked food. Harvard University researchers assert, based on mouse studies, that processed food is easier for the body to absorb, so it provides more calories. That goes for baked or blended food, too. Even a handful of chopped peanuts gives you more energy than the same handful of whole ones.

In 2011, USDA researchers, with a grant from the nut industry, reported that the caloric value of pistachios had been overstated by 5 percent on the nutrition label. In 2012, they found almonds were overstated by 32 percent, or 40 calories per serving. So you might not want to take labels at face value.
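
As a back-of-the-envelope check on how those two almond figures relate (assuming the percentage is measured against the food's true, measured energy content, which is how such comparisons are usually framed), a 40-calorie gap that equals 32 percent implies roughly 125 measured calories per serving against a label of about 165:

```python
# Back-of-the-envelope check on the almond figures quoted above:
# a 32% overstatement that amounts to 40 calories per serving.
# Assumes the percentage is relative to the measured (true) value.
overstatement_fraction = 0.32
gap_calories = 40

measured = gap_calories / overstatement_fraction  # true energy per serving
label = measured + gap_calories                   # what the package claims
print(f"measured = {measured:.0f}, label = {label:.0f}")  # measured = 125, label = 165
```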

This article was originally published in the November 2015 issue of Popular Science.