First Rule of Dietary Recommendations? First, do no harm

So what is “health” and what actually constitutes a balanced diet? The answer depends, of course, on whom you ask. The World Health Organization defines health as follows:

Health is a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity.

The Gwi Bushmen of the central Kalahari desert, on the other hand, define health in terms of their relationship with the land, making their environment essential not only for the physical provisioning of food and shelter, but for spiritual and cultural survival. In other words, their perspective on health and its handmaiden, well-being, is one of econutrition with a dose of mysticism thrown in, just in case.

The Food and Nutrition Board (FNB), part of the Institute of Medicine of the National Academy of Sciences, has a different take on health and what constitutes a balanced diet—whatever that is. The FNB has been summarizing dietary advice more or less since the 1940s in the form of recommended daily allowances (RDAs), and its report, updated and reissued every five years, heavily influences the creation of the U.S. Dietary Guidelines for Americans (think Food Pyramid). The most recent report, a massive 900-page affair issued in 2002, described macronutrient intake like this:

To meet the body’s daily energy and nutritional needs while minimizing risk for chronic disease, adults should get 45% to 65% of their calories from carbohydrates, 20% to 35% from fat and 10% to 35% from protein . . .

Bizarrely, they also state:

. . . added sugars should comprise no more than 25% of total calories consumed . . . added sugars are those incorporated into foods and beverages during production [and] major sources include candy, soft drinks, pastries and other sweets.

Yep, none other than the National Academy of Sciences recommends that up to 25% of your daily calories can come from added sugar. Setting aside for the moment which fats, carbs, and proteins should make up the various ranges proposed, you really have to pause a moment and ask yourself what set of morons at the National Academy of Sciences thinks it’s reasonable to get up to 25% of your calories from nutrient-poor, rapidly-digested, insulin-spiking added sugar.

To help you wrap your head around how disastrous these recommendations truly are, a one-day meal plan has been prepared to supply the nutrients in accordance with the FNB’s recommendations. The breakfast, lunch and dinner meal plan (adapted from the China Study) delivers the protein, fat, carbohydrates and sugars within the ranges outlined by this scientific organization on behalf of the American people.

Breakfast

1 cup Froot Loops
1 cup skim milk
1 package M&M’s
Fiber and vitamin supplements

Lunch

Grilled cheddar cheeseburger

Dinner

3 slices pepperoni pizza (from the “other” guys, not nakedpizza)
1 16-oz. soda
1 serving of cookies

Nutrient          Sample Menu    Recommended Range

Total calories    ~1,800         Varies by height/weight
Protein           ~18%           10-35%
Fat               ~31%           20-35%
Carbohydrates     ~51%           45-65%
Added sugars      ~23%           Up to 25%
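To make those percentages concrete, here is a quick back-of-envelope sketch of what ~23% added sugar on an ~1,800-calorie day actually means. The 4 kcal-per-gram and ~4 g-per-teaspoon conversions are standard rules of thumb, not figures from the FNB report.

```python
# Back-of-envelope: how much added sugar the sample menu delivers.
# Conversions are rules of thumb: 4 kcal per gram of sugar, ~4 g per teaspoon.

def added_sugar(total_kcal, sugar_fraction):
    kcal = total_kcal * sugar_fraction
    grams = kcal / 4          # 4 kcal per gram of carbohydrate
    teaspoons = grams / 4     # ~4 g of sugar per teaspoon
    return kcal, grams, teaspoons

kcal, grams, tsp = added_sugar(1800, 0.23)
print(f"{kcal:.0f} kcal = {grams:.0f} g = roughly {tsp:.0f} teaspoons of added sugar")
```

That works out to over 100 grams of table sugar a day, every day, with the National Academy of Sciences’ blessing.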

Even worse, many government-sponsored programs (think the national school lunch program) are constructed using these same ranges. You don’t need to be a nutrition scientist to predict the outcome. Just look around.

Using the FNB’s report as a playbook, the U.S. Department of Agriculture and the Department of Health and Human Services update the Dietary Guidelines for Americans and the accompanying Food Pyramid every five years—and have been doing so since their 1980 Congressional mandate. These guidelines are important because they determine, more or less, the composition of meals served as part of the national school lunch program, form the basis of many official nutrition recommendations, and provide cues for food marketers.

In short, the Food Pyramid is designed to “prevent the diseases of dietary excess.” This is interesting when you consider the current issue of Ode magazine which essentially says fat is back and actually good for you, juxtaposed with the 1984 cover of Time magazine that suggested fatty foods cause heart disease and by implication, dietary fat can also make you fat. Oh, how much can change in just 25 short years.

The 1984 Time cover parroted the 1980 dietary guidelines that “recommended reduced intake of all fats” and marked the beginning of the low-fat craze that swept America through the 80s and 90s, sparking our national obsession with processed carbohydrates to replace the calories lost from those fats.


Twenty-five years later, Ode magazine is essentially stating what the science has said all along: fat does not make you fat and fat does not cause heart disease. As bad as it is to base national dietary recommendations on weak science, the idea that “it can’t hurt to reduce fat intake no matter the science” has possibly—though inadvertently—contributed to our modern obesity calamity. Harvard researchers touched upon this emerging unintended consequence in 2001 when they made the following statement in a published article:

It is now increasingly recognized that the low-fat campaign has been based on little scientific evidence and may have caused unintended health consequences.

What the hell?

What the Harvard researchers were eluding to in 2001, and what has been pointed out by others, is that once the public bought into the low-fat craze, the public assumed that they could consume the newly marketed low-fat or no-fat products flooding the market without concern. As the graph below shows, about the time American females started to heed the less fat is better message per the 1980 dietary guidelines and the accompanying marketing of low-fat products by the food industry, this same group of American females also started to grow in waist size. As a percentage of total calories, fat decreased between 1980 and 2001, but the percentage of overweight and/or obese females skyrocketed.


Interestingly, over this same period, age-adjusted caloric intake for females increased 18 percent, from an average of 1,542 a day in 1971 to 1,877 in 2001. Men show similar patterns. During this same period, females significantly increased carbohydrate intake and, as you may have guessed, most of the carbs were highly processed (easily digested and absorbed). In other words, the majority of the additional calories consumed by females (and males) came in the form of highly processed carbs in those low-fat and no-fat products.

Ironically, given that obesity is a risk factor for heart disease, the dietary guidelines recommending fat restriction may have worsened the obesity rates in America, and in the final analysis will have the undesired effect of actually increasing rates for heart disease as our current overweight and obese generation ages.

Recent consumer behavior toward the popular 100-calorie “mini-packs” suggests that low-fat or no-fat product offerings may have convinced consumers that they can eat these products with abandon, subsequently contributing to the 18% increase in calories noted above for females from 1971 to 2001. Researchers from Arizona State University found that test subjects consumed more calories when presented with mini-packs versus larger, traditional bags of the same snacks. Oops.

The point of all this is that health and a well-balanced diet mean different things to different people. Government-sponsored dietary guidelines are often the result of compromise between Congress and powerful lobbies, not the merits of the best science at hand. Marketers and the food industry just make matters worse as they misappropriate dietary guidelines to push products such as low-fat Doritos fortified with vitamins and organic Oreo cookies. I could go on for pages.

Dietary guidelines aimed at individual dietary components (e.g., fat) for the potential benefit of individual-level dietary modification (you) can have a net effect on America’s population that is either helpful or harmful. Given this, it is reasonable to expect that government entities and other experts issuing dietary recommendations and advice be guided by the dictum, “first, do no harm.” If that standard cannot be met with straightforward and convincing science at hand, do not issue dietary guidelines at all.


Splenda may damage gut bugs

A handful of researchers from Duke University recently split a group of 50 male Sprague-Dawley rats (i.e., your basic lab rat) into five equal groups. One group got water with its standard rat chow. This was the control group. The other four groups had their water spiked with the high-potency artificial sweetener Splenda. The doses were 100, 300, 500, and 1,000 mg of Splenda per kg of body weight for the “volunteer” rats. All groups downed either good old water or their spiked water daily for the next 12 weeks.

According to lead researcher Professor Mohammed Abou-Donia, “The dosage levels were selected because they span the range of values below and above the accepted daily intake (ADI) for sucralose [an ingredient in Splenda] of 5 mg per kg daily, established by the U.S. Food and Drug Administration.” In other words, the researchers were basically trying to mimic what the average citizen might consume daily – or slightly more.

At the end of the 12-week study, researchers noted significant reduction of beneficial bacteria in the Splenda groups. For example, healthy bifidobacterium and lactobacillus bacteria were reduced by 37% and 39% respectively with the lowest dosage of Splenda. These reductions were relative to the control group.

Interestingly, while all the rats gained a little weight hanging out during the 12-week experiment, the groups consuming the diet spiked with Splenda gained the most. The trend – though not a perfect fit – revealed that the more Splenda consumed, the more weight gained. The researchers were also concerned by enhanced expression, in the Splenda groups, of certain genes “which are known to limit the bioavailability of orally administered drugs,” according to the Duke researchers.

The reduction of healthy bacteria in our Splenda rats was also accompanied by a change in the pH of the rat colon; that is, it became less acidic. Both the reduction of beneficial gut bugs and the rise in pH (less acidic) are bad news, skewing the balance of bacteria in the rat gut to the point that pathogenic bacteria may gain some ground – not good.

As you can imagine, the good folks over at McNeil Nutritionals, the part of Johnson & Johnson that markets Splenda, were peeved by the published results. Predictably, they claimed “the study was flawed” and that it was nothing more than a “Sugar Association-funded rat study.” Even though the study was only just published, the Splenda folks got wind of the results prior to publication this summer and sought legal action. And this is where it gets weird.

In July, U.S. District Judge Dale S. Fisher, who is not a scientist, ruled that the results of the rat study cannot be extrapolated to people and are therefore “irrelevant,” according to a statement by the Splenda folks. The court also noted that the researchers’ opinions about the study could be misleading. Take that, big suga!

Not amused by the court’s comments, Professor Mohammed Abou-Donia, lead researcher of the “rat” study, said in an interview: “The Judge accepted our study and conclusions as valid, but decided that because it was done in animals, it should not be extrapolated to humans. Despite the fact that Splenda was approved for use based on animal, mostly rat, studies.”

I guess the judge is unaware that pretty much ALL medical research into human disease has a “rat lab” component.

Take home message: If you have been using Splenda in your morning coffee or tea – or using it in baking – you might want to do a little more research or cut back a tad. If you are one of those creatures of habit and can’t give up your Splenda, you might want to eat a little more of our pizza, as each and every slice contains the prebiotics (and a diversity of fiber) that have been clinically proven to promote the health and growth of the very bacteria Splenda seems to be reducing.

Are Fruits and Veggies Really a Good Source of Fiber?

The message is everywhere—we should eat more fiber. The current Dietary Guidelines for Americans—the latest “food pyramid” concept—recommend that we consume 25 to 38 grams of fiber every day, depending on age, gender and so on. Despite this sage advice, the average American manages to eat only about half that amount. Why can’t (or won’t) we do better? One of the reasons might be that the most widely consumed produce items in the typical American diet have very little fiber, especially when you consider what fiber-rich items our not-so-distant ancestors had available.

The health benefits of fiber are well known. From regulating weight, lowering cholesterol, stabilizing blood glucose levels, and enhancing the immune system, to good ol’ improved regularity, it’s hard to argue against increasing our consumption of fiber. (Click this link to read more about the health benefits of fiber and its role in promoting the health and well-being of our intestinal flora.)

The USDA’s Economic Research Service keeps a snazzy little database on the amount of produce (veggies and fruit) that we grow and consume in this country. When you take into account the amount of produce lost during processing and due to spoilage, the amount that we actually consume is small. This meager amount is reduced further still when you consider how much is left on our plates or tossed from the refrigerator because it has wilted. The average American consumes only about 1.5 cups of vegetables and about a half cup of fruit per day. To put that in perspective, 16 grapes or 4 large strawberries equal a half cup. Not much.

Interestingly, the USDA data also reveal the lack of diversity in the fruits and veggies we consume. Even though there are tens of thousands of edible plants in our environment, Americans eat only about 30 or so vegetable products on a regular basis, and of these, a whopping five plants (potatoes, tomatoes, head lettuce, romaine/leaf lettuce, and onions) account for 66 percent of all veggies consumed. The top 10 plants account for 82 percent. It doesn’t take much research to discover our favorite vegetable in terms of consumption: the potato. As for fruits: apples, bananas, grapes, strawberries, and oranges account for 63 percent of our consumption. (Note that tomatoes are technically a fruit because they contain seeds.)

When you glance over the list of the top fruits and vegetables consumed by Americans, you immediately see the lack of diversity. You also see that, except for potatoes, the produce we eat is predominantly made up of water. This leaves very little “dry matter” or macronutrients such as protein, fat, and of course, fiber.

While we also get a portion of our fiber fix from breads, pastas, and legumes/beans, there is a specific public health message to consume more fruits and veggies for a number of health reasons, much of it built on the assumption that these foods are also a good source of fiber. When viewed from an evolutionary perspective, our modern choices of fruits and veggies (preferences heavily influenced by geography, marketing, shelf life, and culinary, cultural and palate preferences) may not be an effective way to accumulate the quantity and diversity of fiber our bodies need. The shortfall is even more pronounced when you realize that the current recommendations for fiber intake are only a fraction of the 75 to 100 grams a day we should be consuming.

To illustrate the lack of fiber among modern fruits and veggies, it’s useful to compare diets that are more in line with the nutritional landscape upon which we evolved a physiological need for fiber. The chart below shows the fiber content of the typical modern American diet compared to the intake of Australian Aborigines and Hadza foragers of northern Tanzania (who still live a non-westernized lifestyle), and detailed archaeological data from dry caves inhabited by hunter-gatherers for thousands of years in the Lower Pecos region of west Texas.


The data presented above are for dry weights (water subtracted) and are displayed as grams of fiber per 100 grams of dry food weight (left axis). Using published nutritional data from 30 of the most popular veggies and 25 of the most consumed fruits in the American diet, we see that we get approximately 10.70 grams of fiber per 100 grams (about 3 ounces) of dry weight of produce consumed. When compared to fiber intake recorded at various places and times on the landscape represented by our three “ancestral or ancestral-like diets,” we have discordance. It’s interesting to note that the data displayed for the Australian Aborigines are derived from nutritional analyses of the 800-plus plants the Aborigines are known to consume.

Given our current choices of fruits and veggies consumed in America, we would need to double our consumption to meet the fiber levels present in plants that dominated our ancestral diet. Clearly, our current choices in plants are not going to result in any meaningful improvement in fiber consumption unless we increase our diversity to include plants with higher fiber content: high-fiber legumes, beans, and tubers, for example.
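To put rough numbers on that gap, here is a quick sketch using only figures already mentioned in this post: the 25-to-38-gram guideline, the roughly 15 grams a day the average American actually eats (about half the recommended amount), and the 75-to-100-gram ancestral estimate. The 15-gram figure is an approximation, not an official statistic.

```python
# Rough fiber-gap arithmetic using figures cited in the text.

typical_intake = 15          # g/day, ~half the recommended minimum (approximation)
guideline = (25, 38)         # g/day, current Dietary Guidelines range
ancestral = (75, 100)        # g/day, ancestral estimate cited above

print(f"To hit the guideline: {guideline[0]/typical_intake:.1f}x to "
      f"{guideline[1]/typical_intake:.1f}x current intake")
print(f"To match ancestral levels: {ancestral[0]/typical_intake:.0f}x to "
      f"{ancestral[1]/typical_intake:.1f}x current intake")
```

In other words, merely hitting the official guideline means roughly doubling intake, and matching the ancestral landscape means a five- to seven-fold increase.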

With the issue of prevention moving slowly ahead of cure and management in our list of national healthcare priorities, any honest public health strategy will need to consider the underlying problems in the food supply. These issues go beyond the low-hanging fruit of too many calories and too little exercise, toward a better understanding that while market forces and our own preferences shift our food supply, our evolution-determined needs are moving at a much, much slower pace. Eat more fiber.

**Each slice of NAKEDpizza delivers fiber from greater than 10 sources.

Eggs: Are they better for you raw?

Eggs are one of very few animal foods that you can store at room temperature for weeks with absolutely no processing. How perfect. A single chicken egg can supply a variety of proteins in the proportions that you need, all safely delivered in a hard, bacteria-resistant shell. Again, how perfect.

Starting in the 1950s with Steve Reeves—Hollywood’s Hercules—and continuing with Sylvester Stallone’s Rocky Balboa and The Governator, Arnold Schwarzenegger, generations of muscle-seeking citizens have downed large quantities of raw eggs as part of their training regimens. Much of the thinking that raw eggs are the ideal source of calories can be traced back to 1904, when raw-foodists Molly and Eugene Christian wrote that “An egg should never be cooked…and that in its natural state it is easily dissolved and readily taken up by all the organs of digestion.”

Yeah, raw eggs are slimy, and there’s a campaign today to convince you that they’re also dangerous. But what if the raw eggs in front of you are safe and you’re not grossed out by a little slime? Does cooking an egg really make it less nutritious than if it were raw? Some Belgian researchers claim to have the answer.

In a set of experiments, some gastroenterologists analyzed the fate of egg protein after it was consumed by various test subjects. For the most accurate results, the researchers fed hens a diet rich in “labeled” atoms of stable isotopes of carbon, nitrogen, and hydrogen. The researchers could then measure how much of this labeled protein remained in the material collected from the ileum, at the end of the 35 feet or so of the test subject’s small intestine. Any protein that traversed the entire length of the small intestine was not absorbed. This protein was essentially metabolically useless because, from this point on, bacteria in the colon digest the protein for their own selfish needs.

When the eggs were cooked, 91 to 94 percent of the proteins were absorbed in the small intestine, but only 51 to 65 percent of the raw proteins were absorbed. In other words, 35 to 49 percent of the protein from raw eggs was not absorbed and metabolized. In short, the researchers determined that cooking increased the available protein value of eggs by as much as 40 percent. The denaturation of proteins through the application of heat weakens the internal bonds of the proteins, making the three-dimensional structure more accessible to digestive enzymes, which in turn increases the amount of protein absorbed.
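The arithmetic behind that 40 percent figure can be sketched per egg. The ~6 grams of protein per large egg below is a typical value, not a number from the Belgian study itself.

```python
# Per-egg absorption sketch, assuming ~6 g of protein in one large egg
# (the 6 g figure is a common estimate, not from the study).

egg_protein = 6.0                       # grams per large egg (assumption)
cooked_absorbed = 0.91 * egg_protein    # low end of the 91-94% cooked range
raw_absorbed = 0.65 * egg_protein       # high end of the 51-65% raw range

gain = cooked_absorbed / raw_absorbed - 1
print(f"Cooked: {cooked_absorbed:.1f} g absorbed; raw: {raw_absorbed:.1f} g; "
      f"cooking buys about {gain:.0%} more usable protein")
```

Even comparing the *best* raw case against the *worst* cooked case, cooking still delivers roughly 40 percent more usable protein per egg.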

It’s worth pointing out that cooking also takes eggs from an essentially liquid form to a more solid food. As discussed in the earlier post on food texture and calorie burn, the human digestive process then has to take the solid eggs “back to” a liquid form to maximize protein absorption. This process is metabolically expensive and results in increased thermogenesis and increased calorie burn. Hell yeah.

A strict diet of paying attention is what it’s all about. EatNAKED, friends, and live well.

Is food texture more important than calories in preventing weight gain?

Is a calorie just a calorie? Whether from a wheat bagel or a Snickers bar, is 300 calories, regardless of the source, just that: 300 calories? The experts say yes. This thought process then leads to the positive-caloric-balance hypothesis as an explanation of weight gain and obesity. In short, eat more calories than your body uses and you will gain weight. Seems logical.

These same researchers will quickly point to the first law of thermodynamics (the law of energy conservation) to bolster a cause and effect that implies that any change in body weight must equal the difference in the amount consumed versus the amount expended. This energy balance equation looks like this:

Change in energy stores = Energy intake – Energy expenditure

To this day, nearly a century of obesity research has been based on this simple formula. However, most obesity researchers and public health officials rely only on the right side of the equation (Energy intake – Energy expenditure) to explain obesity, conveniently ignoring the left side (Change in energy stores). These experts correctly assume that a positive caloric balance is associated with weight gain, but they assume without justification that positive caloric balance is the cause of obesity. Any adult female can attest to the role of hormones in weight gain—a gain that is unrelated to caloric balance. This is made clearer during pregnancy, when hormone-driven evolutionary forces promote hunger, weight gain, and lethargy—all to ensure that sufficient calories are available for the newborn. This and other misconceptions of weight gain and obesity have led to over a century of misguided obesity research that continues to this day.
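For what it’s worth, the bookkeeping itself is trivial. Here is a toy sketch of the energy-balance identity, converting calories to pounds with the usual (and admittedly imperfect) 3,500 kcal-per-pound rule of thumb. Note that the code is pure accounting; it says nothing about what *causes* the imbalance, which is exactly the point.

```python
# Toy version of the energy-balance identity:
#   change in energy stores = energy intake - energy expenditure
# The 3,500 kcal-per-pound conversion is a rough rule of thumb.

KCAL_PER_POUND = 3500

def weight_change_lb(intake_kcal, expenditure_kcal, days):
    daily_surplus = intake_kcal - expenditure_kcal   # right side of the equation
    stored_kcal = daily_surplus * days               # left side: change in stores
    return stored_kcal / KCAL_PER_POUND              # pounds gained (or lost)

# A persistent 50 kcal/day surplus over a year:
print(f"{weight_change_lb(2550, 2500, 365):.1f} lb")   # -> about 5.2 lb
```

The identity always balances by definition; the research question the formula cannot answer is why intake exceeds expenditure in the first place.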

In a series of blog posts beginning with this one, we will lay out some basic evolutionary, biological, and cultural adaptations (and maladaptations) that may provide some insight into weight gain and overall health and well being.

To cook or not to cook (if so, how long?)

With all due respect to the raw food movement, cooked food just tastes better. And from an evolutionary perspective, the application of heat to our food has played a significant role in the success of our species. But are we cooking our food a little too much and for too long?

Cooking makes a food more digestible than the same food without the benefit of cooking. In carbohydrate-rich foods, the application of heat to a moisture-rich food (e.g., a potato) causes hydrogen bonds in the glucose polymers to weaken, causing the tight crystalline structure to loosen and gelatinize. As long as water is present in the food or the cooking environment, the starch will gelatinize. Once consumed, the gelatinized starches are more easily cleaved by our digestive enzymes, thus more digestible. The same process occurs in meats through denaturation of proteins through the application of heat.

From an evolutionary perspective, we may be cooking some of our food a tad too much for cultural and culinary reasons, and in the process affecting some time-honored physiological requirements of the human body—specifically, the role of the stomach in energy balance, satisfaction, and hunger.

It was not that long ago that all of our foods were minimally processed. In short, more crunchy, more grainy, and definitely less refined. There is no doubt that our ancestors would marvel at the sleek and gelatinized angel hair pasta of today and the pasty softness of a steamed carrot torpedo. While convenient and tasty, our modern processing (in the case of the finely milled flour in the pasta) and cooking techniques (“hyper” steamed veggies) have moved digestion from the stomach to the stovetop. All the extra cooking in our modern lifestyle has slowly eliminated, or at least reduced, the role the human stomach evolved to play in digestion. And herein lies the discordance between our modern lifestyle, with its nifty technological tools and cultural preferences, and our evolution-determined physiology; specifically, energy balance.

The carrot-like tuber your ancestors ate either raw or minimally cooked has been replaced by foods with a texture like baby food. This texture comes from weakened glucose polymers caused by the application of super-efficient cooking techniques. The modern cooked carrot is easily digested and therefore rapidly moved through the stomach. It’s safe to say that modern humans are experiencing some of the fastest rates of gastric emptying in human history. Gone are the days when minimally processed foods stayed in the stomach for two, three, or even four hours. There’s no longer a need for food to stay in the stomach; the stovetop started the digestion well ahead of ingestion, greatly speeding the work of gastric enzymes.

This effect is nicely captured in the widely popular Glycemic Index (GI). The GI ranks foods according to their effect on blood glucose levels. High GI foods, like highly processed donuts and sugary soft drinks, cause a rapid rise in blood glucose and subsequent insulin levels. Not good. Foods that are processed less generally have a lower GI.

But cooking has a significant and often unappreciated effect on the GI of a food. A raw carrot, which takes some crunching to break down, is transferred to the stomach, where some time-honored digestion takes place in a natural and slow way. Thus, a raw carrot has a low GI (about 16). However, a peeled and boiled carrot is easy to chew and rapidly processed in the stomach, as it has been predigested on the stovetop. This cooked carrot has a higher GI, around 60. This translates to rapid gastric emptying and subsequent rapid absorption—resulting in elevated glucose and insulin levels. Not good.

So what does this have to do with weight gain? Aside from some issues related to elevated levels of insulin in the blood—which has a dramatic impact on fat metabolism (something we will cover in a subsequent post)—the processed carrot versus the unprocessed carrot is tinkering with some evolutionary processes related to thermogenesis, and may be playing an unrealized role in our national epidemic of obesity. This is nicely demonstrated in an elegant study recently published by Japanese researchers. (Hang in there, almost to the point!)

In this study, a team of Japanese scientists divided 20 rats into two groups of 10 at four weeks of age. Over the next 22 weeks, both groups ate a nutritionally identical diet of rat chow. However, for one group of rats, the hard-to-chew pellets were injected with a tiny bit of air, making them softer and easier to chew. This is more or less similar to our raw versus steamed carrot discussion above.

The air-injected pellets were more like breakfast cereal and required about half as much force to chew and break down. The hard and the soft pellets were identical in how they were cooked, in their nutrient composition, and in their water content. Based on the “calorie is a calorie” argument and the first law of thermodynamics discussed above, rats reared under identical conditions and consuming the same nutrition should grow at the same rate and to the same size, with the same amount of body fat and overall weight. But they did not.

Even though the rats had identical energy intake throughout the 22-week experiment, the rats that consumed the soft pellets slowly became heavier. The gain was gradual at first, but by the end the soft-pellet eaters weighed about 6% more than the hard-pellet eaters and had 30 percent more abdominal fat—enough to be classified as obese. The difference documented was due to the cost of digestion.

Before and after feeding, the researchers measured the body temperature of each rat. At every meal the rats experienced a rise in body temperature, but the rise was smaller in the soft-pellet group. The difference in temperature between the groups was most significant within an hour of ingesting a meal, when the stomach is churning and secreting. The researchers concluded that the softer diet resulted in obesity simply because it was less costly to digest. Increased heat during digestion burns calories at a faster rate, similar to the weight loss we experience when we’re sick with a fever.

We all know that weight gain does not happen overnight. It’s a slow process that takes place over long periods of time and can fluctuate dramatically. And because this is a slow and gradual process, we also know the body is constantly trying to match energy intake to energy expended. The body strives for balance, not imbalance. This is why you are hungry after vigorous physical exercise; your body wants to replace the calories you just burned. It also explains why a lumberjack needs 5,000 to 8,000 calories a day, but an advertising executive might only need 2,500 or so. If exercise were the answer, then all meter maids would be thin. But they are not.

Weight gain among a given population has more to do with tinkering with evolutionary processes than with sloth or one’s willpower. It’s uniquely biological. If the experts are correct that small changes, like 90 minutes of exercise a week or 100-calorie snacks, are the answer, then paying attention to the level of processing of our food, as discussed in this blog, should have at least as much merit in fighting the obesity epidemic.

We are not advocating a raw food diet—oh hell no! That’s a sure way to guarantee you don’t get sufficient nutrients, and you will surely bore yourself and your loved ones to near death—literally. And raw beef is downright dangerous. We are suggesting that you take a closer look at the amount of processed food you eat. Before you steam rice for 15 minutes, consider that maybe 12 minutes would be enough, leaving it a little crunchier and therefore a tad harder for your stomach to break down. This will, in turn, elevate your body temperature slightly as your stomach churns and burns calories. You might also consider doing the same with steamed broccoli. And don’t cut off and throw away the stalks to eat just the yummy crown; cut that fiber-rich stalk into thin slices and steam away. By increasing the fiber in your meals, you will also give your stomach a chance to do its job.

The multi-grain crust at NAKEDpizza is a great starting point, far better than the over-processed offerings at other pizza places. Live long. EatNAKED, friendos.

Can our cooling bodies be playing a role in obesity?

The human body temperature more or less hovers around 98.6 °F (37.0 °C), although this varies throughout your day according to how active you are, what you eat, and so forth. The weight loss we experience when we exercise is related to, among other things, our increased body temperature. A higher core body temperature increases the rate of moisture evaporation and burns more calories. This is why our appetites increase following prolonged exercise or physical activity—for example, a burly lumberjack working outside all day may burn twice as many daily calories as an accountant, and that lumberjack will need to replace those calories just to maintain his original weight.

We have all lost weight while lying flat on our backs after catching some nasty bug. True, some of this has to do with reduced appetite and loss of fluids, but much of it has to do with our increased core body temperature, which rapidly burns through our stored calories: fat deposits in adipose tissue. Depending on the severity and length of your illness, body temperatures in the range of 99 to 101 degrees can increase the number of daily calories burned by 5 to 20%. Staggering when you think about it.

This "infection" burn rate for calories, when viewed through the lens of our evolutionary past, becomes interesting—or at least it should—for those of us fighting the bulge at home or guiding public health policy.

Before the age of infection-fighting drugs and antibiotics—and you can include antimicrobial soaps as well—our ancestors battled infection and its temperature-raising, calorie-burning effects on a daily basis. When I say ancestors, I'm talking about our pre-agricultural predecessors, not the old-world folks who created a freakish relationship with the microbial world through crowded cities and tainted water and food supplies—all of which resulted in an average life expectancy of little more than 20 years.

If modern medicine has given us anything, it has been lower core body temperatures through reduced infection rates. While high fever and infection get all the attention, it's the "low grade" infections and "slightly" high temperatures that should interest obesity researchers and public health officials. We have all heard that 3,500 or so calories equal a pound of body weight; burn 3,500 calories on the treadmill or running around the park and you will lose one pound. Now think for a minute about the effects of a low-grade infection that raises your core body temperature from the average of 98.6 degrees to, say, between 99.0 and 99.6 degrees on a more or less permanent basis. Without boring you with the math, this modest increase in core body temperature will raise your resting metabolic rate (the amount of energy one burns just sitting on the couch and not eating, as digestion burns calories too) by 2 to 7 percent, depending on a dizzying number of variables.

Imagine for a moment that your body automatically burned (needed) 2 to 7 percent more calories on a daily basis, and that your increased body temperature was not noticeable and did not affect your daily routine. For someone on a 2,500 kcal daily diet, the increased burn rate from a slightly elevated core body temperature (from a constant low-grade infection) might amount to an additional 100 or more calories burned in a day. Using the less-than-perfect assumption that 3,500 calories burned equals one pound lost, you would carve almost a pound from your frame every month just by going about your daily routine. And in a year, well, you get the idea.
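For the skeptics, the back-of-the-envelope math above can be sketched in a few lines. The numbers here (a 2,500 kcal daily diet, a 2-7% metabolic bump with 4% picked as an illustrative midpoint, and the rough 3,500-kcal-per-pound rule the text itself flags as imperfect) are the figures from this post, not physiological constants:

```python
# Extra calories burned from a slightly elevated core body temperature,
# using the illustrative figures from the text above.
DAILY_INTAKE_KCAL = 2500   # example daily diet
METABOLIC_BUMP = 0.04      # within the 2-7% range; 4% as a midpoint
KCAL_PER_POUND = 3500      # the rough (admittedly imperfect) rule of thumb

extra_per_day = DAILY_INTAKE_KCAL * METABOLIC_BUMP        # 100 kcal/day
pounds_per_month = extra_per_day * 30 / KCAL_PER_POUND    # ~0.86 lb/month
pounds_per_year = extra_per_day * 365 / KCAL_PER_POUND    # ~10.4 lb/year

print(f"{extra_per_day:.0f} kcal/day -> {pounds_per_month:.2f} lb/month, "
      f"{pounds_per_year:.1f} lb/year")
```

At the low end of the range (2%) the effect is half that; at 7% it is nearly double. Either way, the compounding over a year is the point.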

So what was the low-grade infection that our ancestors experienced? While there were a number of candidates, the most likely characters were various parasitic worms (helminths) that lived deep in our ancestral gastrointestinal tract. The presence of these parasites sends our immune system into action, causing body temperatures to rise. Because the parasites compete with "us" for nutrients in our intestinal tract and "can" cause a great deal of trouble if they get out of hand, the World Health Organization and modern medicine spend a lot of time trying to eradicate them from our species. These efforts are especially intense in developing countries, where dirty water and minimally processed meats easily transmit parasitic infections. An estimated 40 million Americans have some form of parasitic infection. That said, much of the world's population lives in a symbiotic relationship with these evolutionary hitchhikers. In fact, intestinal parasites are even being used experimentally to combat some autoimmune diseases.

Anyone in the livestock industry will tell you that administering antibiotics to your herd or flock will result in weight gain. In other words, reducing low-grade inflammation and general infection reduces calories burned and thus increases weight. Might our overuse of antibiotics and antimicrobials have reduced our “natural” low-grade inflammation just enough to tip the scales against us? Makes you wonder.

Did our ancestors really live longer than us?

British nutrition researcher Geoffrey Cannon recently restated, in the journal Public Health Nutrition, the widespread assertion that "Paleolithic people usually did not survive into what we call later middle age." His underlying point, widely shared among researchers and the public at large, is that our ancestors did not live long enough to develop cancer, heart disease, and other chronic illnesses. This forms the basis for the near-universal belief that ancient hunter-gatherers (our ancestors) really were not healthier or fitter than us moderns, and that their ancient dietary practices therefore have little relevance to modern health, well-being, and longevity.

On the initial point, Cannon is correct. The average life span of our ancestors was short compared to that of modern humans in developed countries, where one can expect, on "average," to live into one's 60s, 70s, and possibly early 80s. A Neanderthal living in ancient Europe, by contrast, was lucky to live past her teens, and if you reached your mid-thirties in Ancient Egypt you might have been considered old. More recently, the average life expectancy in the United States in 1900 was 47.3 years. By 1935 that figure had risen to 64 years, and today it hovers in the 70s for both women and men (though women can expect to live a few years longer, on average).

The first problem with this line of thinking is that "average life span" math is misleading: it tells us very little about the health and longevity of any individual, giving us instead an average age of death for a given group or population. For example, a couple who lived to the ages of 76 and 71, but had one child who died at birth and another at age two ([76 + 71 + 0 + 2] / 4), would produce an average life span of 37.25 years. Using this methodology, it is easy to see how one could conclude that this group was not very healthy.
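The arithmetic behind that hypothetical family is trivial to check, which is exactly the point: a single early death drags the mean down dramatically without telling us anything about how long the survivors lived.

```python
# Average "life span" for the hypothetical family in the text:
# two parents who lived long lives, two children who died young.
ages_at_death = [76, 71, 0, 2]

average = sum(ages_at_death) / len(ages_at_death)
print(average)  # 37.25 -- a figure that describes no individual in the group
```

Swap in a family where all four lived past 70 and the "average life span" roughly doubles, even though the parents' longevity never changed.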

However, the premise that diet played a significant role in the abbreviated average life span of our ancestors is simply not true. There are few among us who believe our so-called "westernized diet" of highly processed grains and added sugars and fats is an optimal diet for anyone, past or present. Our soaring rates of obesity and an ever-growing list of acute and chronic diseases, occurring with alarming frequency among younger sections of the population, speak to the discordance.

It is useful to point out that our species reached its current anatomical and physiological form nearly 200,000 years ago. That is, while the hallmarks of behaviorally modern human beings, such as language, art, trade networks, and advanced weapons, appeared only within the last 50,000 years, the hardware was in place a full 150,000 years earlier. We may drive around in hybrid cars today, but we do so in very ancient bodies, with a genome that was selected, for the most part, on a nutritional landscape very different from the one on which we find ourselves today.

Before the advent and widespread adoption of agriculture, which occurred between 1,000 and 9,000 years ago depending on where you lived, humans organized in highly mobile groups of dozens or a few hundred individuals. Archaeological data and analysis of burial populations reveal that life was harsh and dominated by warfare, strife, destruction, human trophy taking, and the all-too-often practice of infanticide. All of these facts of ancient life, together with the lack of simple antibiotics and modern surgical practices, resulted in shorter average life spans than many of us enjoy today. As agriculture took hold around the globe and groups settled into more permanent communities and, ultimately, socio-politically complex civilizations, the more homogeneous and centralized food and water supply was easily contaminated by human waste. While war and even larger massacres continued throughout the agricultural revolution, tiny microbial killers took their share of victims, especially among the young and undernourished, further depressing average life spans. And as European ships set sail just a few centuries ago, new ills and evils reduced, in punctuated fashion, the average life spans of the populations they encountered.

Yet as war, contaminated water, killer microbes, and illness pulsed through humanity, our basic underlying physiological and nutritional parameters have changed little over the last few hundred thousand years. Our modern genome is in fact an ancient one, and natural and cultural selection have built it to last. Under optimal nutritional conditions, such as those our genome evolved on, we modern hunter-gatherers can live healthy and long lives. We need only look to the modern Hunza of northern Pakistan or the southernmost Japanese prefecture of Okinawa to witness the longevity our ancient genome was selected for. With the threat of war and violence greatly reduced, and on the sound footing of a safe food supply, our ancient bodies can stay healthy well beyond the "best-before date" Cannon writes about. Sustained by a low-calorie, high-fiber, plant-based diet, a significant portion of these populations enjoys healthy and active lives into their 80s, 90s, and often beyond 100. Incredibly, the aged portions of these populations have lower rates of obesity, heart disease, diabetes, hypertension, high cholesterol, cancer, and other chronic diseases than western populations.

The modern world owes much to the antibiotics and advanced surgical procedures of the last half-century, which have produced dramatic increases in average life span for much of the developed and developing world. Yet horrific events in Darfur and other African regions remind us how easily significant gains in average life span can be erased. In Iraq, a male or female could expect to live to an average age of 66.5 in 1990; today, following years of foreign occupation and endless violence, life expectancy has dropped to a mere 59 for both sexes, and slightly lower for males. The self-confidence that comforts us as we review the average life span of our ancestors is misguided and tenuous when viewed through the captivating haze of modern medicine, which props most of us up into our golden years. I doubt our ancestors would call this living. We may live longer than our ancestors, but we are in fact dying slower. So rather than rest on our perceived cultural and medical success as it pertains to longevity, we should challenge ourselves and our genomes to maximize our health for optimal longevity. For those not trusting of the past and the nutritional landscape upon which we evolved, our genetic cousins, the Hunza and Okinawans, have shown us a way forward.