About two years ago, give or take, I was interviewing the head of a mobile development firm about the various features and design choices in their run-tracking app. I thought I had just realized something smart.
"Ohhhhhh," I said breathlessly into my Skype headset. "So the reason you're tracking elevation is to help with the calorie burn count. You factor in that, along with speed and distance, and that's how you get a calorie count."
"Well ... kind of," the app guy said.
"Then again," I said, "I never entered my weight into the apps, so ... I guess it figures out the average burn for someone moving my speed, whatever height and weight? Or maybe it's taking into account ..."
"Actually," the app guy interjected, "the calorie formula comes from a third-party source. It's not really robust. It's more of a guess. It's almost always a guess, on any app. It's in there for those who want to see something."
I've thought about that conversation (paraphrased here from best memory) quite often. It never made it into a published post, but I wish it had, because it was refreshingly honest. It comes to mind because of all the enthusiasm around a bunch of phrases one must throw quotes around: "wearables," the "quantified self," Apple's unconfirmed-yet-somehow-certain "Healthbook," and what is almost certainly a false, scam-ish "fitness tracker" crowdfunding campaign.
Here's something everyone should know, especially those with an eye toward improving their health: Apps on your phone and devices on your wrist are about as good at counting your calorie burn as the food labels and restaurant menus are at showing you how many calories you will be eating.
Which is to say: calorie counting, from end to end, is a really, really rough science of estimates. And it hasn't changed much in more than 100 years.
Calorie counts on foods are just best guesses
Wilbur O. Atwater, who lived exactly as long ago as his name sounds, created most of what you might call nutrition science. His bomb calorimeter is much the same today as it was in the mid-1800s: a fireproof container, submerged in water, with that water hooked up to a thermometer. One sets fire to the mashed-up food in the container, and one watches how high the temperature of the surrounding water rises. One food calorie (technically a kilocalorie) is the energy it takes to raise a kilogram of water from 14.5 to 15.5 degrees Celsius.
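The arithmetic behind that measurement is simple heat physics: energy equals the water's mass times its specific heat times the temperature rise. A minimal sketch (the numbers here are illustrative, not from any particular lab):

```python
# Bomb-calorimeter arithmetic: heat absorbed by the water bath equals
# mass x specific heat x temperature rise. One food calorie (kcal) is
# about 4184 joules, enough to warm 1 kg of water by 1 degree Celsius.
SPECIFIC_HEAT_WATER = 4184  # joules per kg per degree C

def kcal_from_burn(water_kg: float, temp_rise_c: float) -> float:
    """Convert a measured water-bath temperature rise into kilocalories."""
    joules = water_kg * SPECIFIC_HEAT_WATER * temp_rise_c
    return joules / 4184  # 1 kcal = 4184 J

# If 2 kg of water warms by 5 degrees C, the sample released 10 kcal
print(kcal_from_burn(2.0, 5.0))  # 10.0
```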
But some foods burn completely in a calorimeter without giving a human anywhere near that much energy. Kale, for example, burns quite nicely, but won't get you through a triathlon. Atwater accounted for this, creating reference tables of the human-relevant energy of various foods: "Beef, loin," "Milk, skimmed," "Whole bread," and so on.
"Oh, man," you think, "Olde-tyme science, amirite?" Except that the U.S. Department of Agriculture still uses conversion tables to figure out calories, and it last updated its Atwater tables in 1973. We freeze-dry and puree foods, set them on fire, watch the water, and convert the results. Which says nothing about how we actually digest our food, or how restaurants do not turn out the same meals each time.
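The conversion step is not complicated. Nutrition labels commonly lean on Atwater's general factors of roughly 4 kcal per gram of protein, 9 per gram of fat, and 4 per gram of carbohydrate (labs adjust these per food; this is the rough version). A sketch, with a made-up granola bar as input:

```python
# Atwater's general conversion factors, in kcal per gram. Real labs use
# food-specific adjustments; labels mostly run on these round numbers.
ATWATER = {"protein": 4, "fat": 9, "carbohydrate": 4}

def label_calories(grams: dict) -> float:
    """Estimate food-label calories from macronutrient grams."""
    return sum(ATWATER[macro] * g for macro, g in grams.items())

# A hypothetical granola bar: 5 g protein, 7 g fat, 25 g carbohydrate
print(label_calories({"protein": 5, "fat": 7, "carbohydrate": 25}))  # 183
```

That single multiplication-and-sum, give or take a per-food correction, is most of what stands behind the number on the package.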
It would be one thing if all food had its calorie counts coming from the same USDA-regulated lab. Food sellers, however, can use their own equipment or labs to get their calorie counts for their label; they might get fined if the government does a spot check and finds the value way off. But look at how many new and re-formulated food products enter your grocery store every year, and it's pretty easy to do the math on the risk.
And that's just the supply side. The part where you use and burn calories? Man.
Calorie burning counts are shots in the dark
As I heard in my conversation with the fitness app maker, calorie counts appear in activity apps more to reward users than to serve as real metrics. Unless you stop runners to ask their height and weight that day and when they last ate, and take a muscle-density measurement once in a while, any number is an average.
But even those averages can be very far apart. Ask Rachel Feltman, who wrote for Quartz about wearing four fitness trackers at once for 10 days. The results were indeed all over the place:
Representatives from FitBit, Jawbone, and Misfit confirmed that their bands estimate resting caloric burn—the calories you burn just by sitting around being your bad self—using your height, weight, age and gender. All three use algorithms informed by their accelerometers to estimate active caloric burn ... You might move your body as much while walking down a flat road as you do while hiking up a mountain, but the caloric expenditure will be different.
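A resting-burn estimate like the one described above has to come from some formula. The vendors' actual algorithms are proprietary, but one published equation of the same shape is Mifflin-St Jeor, which turns weight, height, age, and sex into a daily basal burn. A sketch, purely illustrative:

```python
# Mifflin-St Jeor estimate of resting (basal) daily calorie burn.
# This is one published formula of the kind trackers rely on; the
# bands' real algorithms are proprietary and differ between vendors.
def resting_calories_per_day(weight_kg: float, height_cm: float,
                             age_years: int, is_male: bool) -> float:
    """Estimate basal metabolic rate in kcal per day."""
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age_years
    return bmr + (5 if is_male else -161)

# Two slightly different guesses at the "same" 30-year-old runner:
print(resting_calories_per_day(75, 180, 30, True))  # 1730.0
print(resting_calories_per_day(78, 180, 30, True))  # 1760.0
```

Note how a three-kilogram error in the weight the user typed in (or never typed in at all) shifts the baseline by 30 calories a day, before a single accelerometer reading enters the picture.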
Wired had Sean Downey try out seven fitness monitors and apps in the same workouts. He asked exercise physiology professor Dan Heil why he, too, was seeing such variations. Heil's answer:
“Everyone assumes when a device gives a calorie count that it’s accurate, and therein lies the danger,” he says. Just because the watch on your wrist says you burned 1,000 calories, it isn’t necessarily true. “There’s a huge margin of error and the true calorie burn lies somewhere between 600 and 1,500 calories.”
The hard truth on fitness tracking
Beyond things like "tests" and "evidence" and "analysis by experts," there is some common sense to how good a consumer health app or wearable can really be. As pointed out in Pando's takedown of the magic-world crowdfunded calorie counter Healbe: if any scientist or firm were anywhere close to measuring blood sugar or calorie intake with a non-invasive device, they would be hailed almost immediately as a savior of diabetes care and nutritional science, handed the finest lab equipment at a top university, and never want for money again.
There are devices that can measure calorie burn far more accurately than a Bluetooth band loosely touching your wrist. They are indirect calorimeters, they cost upwards of $30,000, and they are the size of a full medical cart.
None of this is to suggest that tracking your runs, keeping an eye on your steps, or trying to burn off some calories before tonight's pizza and beer session is some terrible thought crime. Just don't put much stock in your phone or a band to beat a science that, for reasons that likely run deep, is not very far beyond the 19th century.