Some dedicated pet owners are feeding their cats and dogs grain-free and soy-free pet food formulas, on the logic that grains and soy don’t resemble the diets of wild animals, and so might make their pets fat or sick. Others go further and apply the same logic to humans. After all, agriculture was invented only about 10,000 years ago. Before that, we presumably evolved on a diet consisting of only what we could gather with our bare hands or kill with a sharp stick. Thus, the “paleolithic diet”: no grains, potatoes, milk, or legumes; only meat, fish, greens, tubers, fruit, nuts, and honey. (Orthodox Paleo shuns dairy as a “neolithic” food, but some adherents permit butter, which can be seen as an animal-fat delivery vehicle, largely free of the potentially unhealthy milk proteins and sugars.)
The “Standard American Diet” appears to be giving rise to a high prevalence of chronic diseases like type 2 diabetes, obesity, coronary heart disease, high blood pressure, and so on. These maladies are scarce among hunter-gatherers. So the idea of eating like hunter-gatherers has some intuitive appeal.
Many followers of Paleo specifically call for a low-carbohydrate diet, based on the idea that high-carb foods like grain, legumes, honey, and potatoes would not have been a significant source of calories in prehistoric times. Some go so far as to promote carbohydrate intake low enough that one enters the metabolic state of ketosis (à la Atkins or Banting).
There’s an unsolved mystery: How did humans diverge so radically from their evolutionary ancestors and develop such large brains? According to the “Hunting Hypothesis,” learning to hunt and eat meat was the reason: the increased calories and protein from meat made greater intelligence metabolically possible, and the complexity of hunting behavior provided both a motivator and an evolutionary advantage for developing it.
Early humans ate meat; no one credibly disputes this. The question is about the relative importance of meat in the diet and development of early humans. Unfortunately, the Hunting Hypothesis is not really falsifiable. Other possible explanations for the development of modern humans exist, but there is very little conceivable evidence that could turn up tomorrow and show that the Hunting Hypothesis is wrong (or right).
One alternative explanation is that ancient humans lived near the water, learned to walk upright because of wading, and grew big brains because of the omega-3 fatty acids in fish they ate. Another possibility is that early humans stumbled on the innovation of digging for tubers, which provided access to lots of calories in the form of starch, and in this case meat (either hunted or scavenged) would have played only a supplementary role in the diet. But try to imagine what evidence could prove one of these explanations correct, and thereby prove that the Hunting Hypothesis is wrong. Short of using a time machine, there isn’t much. No matter how many ancient grindstones we find with plant food residue on them, no matter how many tooth microwear studies show that plant foods were consumed, proponents of the Hunting Hypothesis can still claim that meat eating was primary.
What would motivate anyone to promote an idea that can’t be proved or disproved? Here is a possible reason: All mammals like meat. Chimpanzees, for example, although long thought to be herbivores, have been found to hunt small mammals and to use the desirable meat to bargain for sex. So, it’s possible that people who like meat may be less than objective when speculating about humanity’s “natural” diet.
Research by paleoanthropologist Dr. Nathaniel Dominy points to starchy tubers as the most likely staple food of ancient humans. He observes that for hunter-gatherers, the majority of calories always comes from plant sources; meat is “just too unpredictable.”
In 2007, DNA research found that humans carry more copies of the gene that produces amylase (the starch-digesting enzyme in saliva) than other primates do; the average human has ~3 times more AMY1 copies than chimpanzees, consistent with the fact that salivary amylase levels are ~6-8 times higher in humans than in chimpanzees, which are predominantly frugivorous. This research even found differences in the number of amylase gene copies between populations that have traditionally relied on a starch staple (Japanese, Europeans) and populations that have not (e.g. rainforest hunter-gatherers who rely on meat, insects, fruit, seeds, and honey). Further, according to this research, the low level of nucleotide sequence divergence among the three AMY1 gene copies found in the human genome reference sequence implies a relatively recent origin. This is strong support for the idea that early humans adapted to starch-eating.
Comparative anatomy provides another clue. Humans have long intestines and flat teeth for grinding, which suggests that, like our simian cousins, we are adapted to a primarily vegetarian diet. Carnivorous animals, by contrast, have sharp teeth for tearing, and short intestines. Any claim that human evolution has adapted us to heavy meat consumption has to include the assumption that human metabolism made this change before human anatomy had a chance to catch up. There are, in fact, some differences between the human gut and that of close relatives like chimpanzees: the human stomach and colon are smaller, and the human small intestine is much longer. But these are not adaptations that would arise in an animal that was eating more meat than its predecessors. Adaptation to eating more meat would probably have added features similar to those found in carnivores; for example, stomach size and acidity would likely have increased, and intestinal length would have decreased. Instead, the increased length of the small intestine relative to the large intestine in the human digestive system is consistent with a change from the primarily low-calorie leafy greens of the ape diet to calorie-dense starchy tubers.
To be fair, there are some areas, particularly in northern latitudes (e.g. the British Isles), where archaeological evidence from the neolithic period suggests that mostly meat or animal products were eaten. Similarly, the Inuit (Eskimo) are well known for their reliance on animal foods, since their cold environment is basically devoid of plant foods for much of the year. However, most of human and pre-human development did not occur at such northern latitudes. Any meaningful evolutionary adaptation likely occurred in Africa, where plant foods were relatively much more abundant.
Humans certainly can live on a low-carb, primarily-meat diet if necessary. And for now, we must acknowledge that we have no way to know the plant-to-animal food ratio of the typical ancient human’s diet. But do we have any other clues about the ancient human diet? And putting aside speculation about an evolutionary diet, does a low-carb diet make sense today for health, practical, or other reasons?
There is a bit of a quandary faced by individuals attempting to follow a meat-based, low-carbohydrate paleolithic diet.
Wild game meat is much lower in fat than meat from modern domesticated food animals. The figures that follow come from H.P. Ledger (1968), cited by Don Matesz of “Primal Wisdom.”
This means that wild game provided only about 30% of calories from fat, while modern domesticated meats can have as much as 65% of calories from fat or more. Note that this is fat content by calories, not by weight: when you buy even the “90% lean” variety of hamburger meat, that 10% of fat by weight contributes 51% of the calories (because fat provides 9 calories per gram vs. protein’s 4).
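The by-weight vs. by-calories distinction can be sanity-checked with a quick sketch. The gram figures below are assumed round numbers for illustration, so the result lands in the same ballpark as (but not exactly on) the 51% quoted above:

```python
# Fat's share of calories in "90% lean" ground beef, per 100 g.
# Assumed, illustrative composition: 10 g fat, 20 g protein.
FAT_KCAL_PER_G = 9
PROTEIN_KCAL_PER_G = 4

fat_g, protein_g = 10, 20
fat_kcal = fat_g * FAT_KCAL_PER_G              # 90 kcal
protein_kcal = protein_g * PROTEIN_KCAL_PER_G  # 80 kcal
fat_share = fat_kcal / (fat_kcal + protein_kcal)
print(f"Fat supplies {fat_share:.0%} of calories")  # roughly half
```

Only 10% of the weight, but about half of the energy: that is the whole trick of fat’s 9-vs-4 calorie density.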
Ledger’s analysis of wild game meat showed that on average a 100 g portion provides 133 kcal, 22 g protein, and 4.3 g fat, of which on average 32% (across the 17 species evaluated) occurred as polyunsaturated fats (ranging from 20-60%, largely as linoleic acid, an omega-6), and well under 40% as saturated fats. For comparison, four types of untrimmed domestic meats (beef, pork, lamb, ham) can supply an average of as much as 386 kcal and 29 g fat per 100 g, averaging 45% saturated fat and only 7% polyunsaturates.
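Ledger’s averages are internally consistent: applying the standard 9 and 4 kcal/g conversion factors (an assumption, but the conventional one) to the reported grams reproduces the roughly-30%-of-calories-from-fat figure:

```python
# Consistency check on Ledger's wild-game averages (per 100 g portion).
protein_g, fat_g, reported_kcal = 22, 4.3, 133

fat_kcal = fat_g * 9          # 38.7 kcal
protein_kcal = protein_g * 4  # 88 kcal
fat_share = fat_kcal / reported_kcal
print(f"{fat_share:.0%} of calories from fat")  # ~29%, i.e. about 30%
```

The computed total (about 127 kcal) also lands close to the reported 133 kcal, with the small gap plausibly due to rounding and trace carbohydrate.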
Indeed, the USDA National Nutrient Database for Standard Reference is even more conservative, reporting that antelope meat yields 15.8% of calories from fat and deer meat 18.2%. We can assume these values are for muscle meat; organs are higher in fat, so including organ meats in the analysis might have yielded values closer to Ledger’s. From reading various hunting how-to guides, the rule of thumb seems to be that you get about half of an animal’s live weight in meat. “If a bull weighs 600 pounds, then the shoulders, hindquarters, and backstraps (without the 50-pound ribcage) will weigh about 250. […] The liver and heart can vary, but generally weigh about 15-20 pounds together.” Based on this, the organs don’t amount to much of the animal, relatively speaking, so they couldn’t boost the fat content of the whole animal by a significant amount.
This suggests that our ancestors would not have been able to obtain much fat. Dairy was not available, game meat was (as we have seen) low in fat, wild nuts were rare and seasonal, and pressed oils from seeds or olives were not possible. Even if we ate nothing but meat on the savannah, we could not have consumed more than about 30% of calories from fat at the extreme, and if plant foods were present in the diet, that percentage would only have gone down. So, if the American Heart Association’s recommendation of 30% or less of total calories from fat counts as a “low fat” diet, then ancient humans were likely eating a low-fat diet. For more discussion on this subject, Don Matesz’s post cited above is worth reading in detail.
Yet if, on a low-carb Paleo diet, you are not supposed to get many calories from carbohydrate, and you are not supposed to eat much fat either, what is left? Protein. If carbohydrate is held to, say, 10% of calories, and fat cannot realistically exceed the roughly 30% that wild game supplies, then protein would have to provide at least 60% of your calories.
The problem is, this type of diet is not practically possible. If a person needs 2500 calories per day, and 60% of those calories come from protein, then at 4 calories per gram, that is 375 grams of protein. But that figure is at the outside limit of what the human liver and kidneys are believed to be able to safely metabolize. In a phenomenon known as protein poisoning or “rabbit starvation,” living only on lean meat such as rabbit, with very little fat or carbohydrate, leads to insatiable hunger and symptoms like diarrhea and fatigue. Indeed, the Inuit (Eskimo), who are well known for their traditional diet of almost entirely animal food, rely mostly on the blubber of arctic animals, and actively avoid excessive protein, which becomes difficult during late winter when animals grow lean. The famous explorer Vilhjalmur Stefansson, following a medically-supervised diet that attempted to imitate that of the Inuit, ate 100-140 grams of protein a day out of a total of 2000-2500 calories per day, or roughly 17-25% of calories from protein, far from the 60% or more that a true low-carb Paleo diet would seem to require. And from an empirical point of view, a high-protein/low-fat/low-carb diet is unpleasant. Try it and you’ll see what I mean.
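The protein arithmetic above works out as follows (a sketch using the standard 4 kcal/g figure for protein):

```python
# Grams of protein implied by 60% of a 2500 kcal diet.
PROTEIN_KCAL_PER_G = 4
daily_kcal = 2500
protein_fraction = 0.60

protein_g = daily_kcal * protein_fraction / PROTEIN_KCAL_PER_G
print(f"{protein_g:.0f} g of protein per day")  # 375 g

# For contrast, Stefansson's 100-140 g/day amounts to only 400-560 kcal
# of protein, a small fraction of his 2000-2500 kcal intake.
stefansson_kcal = [g * PROTEIN_KCAL_PER_G for g in (100, 140)]
```

At 375 grams per day, the hypothetical strict low-carb/low-fat Paleo dieter would be eating nearly three times the protein of Stefansson’s all-meat regimen.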
So the quandary facing a low-carb Paleo adherent is: try to follow the low-fat, high-protein diet that their hypothesis prescribes? Or eat a high-fat, moderate protein diet, and equivocate about the resemblance of this diet to the foods that were actually available to our ancestors? Most Paleo adherents seem to choose the latter.
But since a sustainable low-carbohydrate diet requires high fat intake, and since human ancestors did not have a significant fat source, the diet of human ancestors cannot have been a low-carb diet.
So the answer to the quandary is: neither. High-carbohydrate plant foods (such as tubers) must have formed a significant part of the paleolithic diet.
There is a popular claim about the role of the hormone insulin which is used to support the idea that a low-carbohydrate diet is healthiest for humans. Here is a summary from one of the most visible proponents, Gary Taubes, from the introduction to his book Why We Get Fat:
First, when insulin levels are elevated we accumulate fat in our fat tissue; when these levels fall, we liberate fat from the fat tissue and burn it for fuel. This has been known since the early 1960s and has never been controversial. Second, our insulin levels are effectively determined by the carbohydrates we eat — not entirely, but for all intents and purposes. The more carbohydrates we eat, and the easier they are to digest and the sweeter they are, the more insulin we will ultimately secrete, meaning that the level of it in our bloodstream is greater and so is the fat we retain in our fat cells. “Carbohydrate is driving insulin is driving fat,” is how George Cahill, a former professor of medicine at Harvard Medical School, recently described this to me. […]
[…] These carbohydrates literally make us fat, and by driving us to accumulate fat, they make us hungrier and they make us sedentary.
Unfortunately, this picture is full of misconceptions. I recommend reading one or both of the following:
Insulin: An Undeserved Bad Reputation, and its follow-ups, at Weightology Weekly
The Carbohydrate Hypothesis of Obesity: a Critical Examination by obesity researcher Stephan Guyenet
I’ll summarize the relevant points here:
Insulin does not make you hungry. Actually, it suppresses appetite.
Carbohydrate is not the only thing that elevates insulin. Protein causes the secretion of insulin just as well as carbs, if not more. (Surprise, a diet heavy in meat can cause “insulin spikes” too.)
A high carbohydrate diet does not cause chronically high insulin levels (a hallmark of insulin resistance). In fact, quite the opposite: a high carbohydrate diet increases insulin sensitivity. Conceptually, it’s similar to the way frequent stair-climbing doesn’t make you constantly tired; it makes you stronger and able to tolerate more stair-climbing.
Insulin is not singly responsible for causing energy to be stored. Energy storage takes place even without the presence of insulin.
What about the claim that carbohydrate itself turns into fat on the body? Consider these facts:
It is quite difficult for the body to turn glucose into fat. This is a process called “de novo lipogenesis,” and it is something that humans (unlike cows or bees) are very bad at. Only when there is no other alternative does the body convert glucose into fat, and even when it does, it is such an inefficient process that the metabolic cost is 30% of the calories consumed.
Body fat in humans, when analyzed, closely resembles the fatty acid profile of the dietary fat that the person primarily eats (fish, beef, cheese, vegetable oil, etc.). In other words, human body fat largely comes from fat in the diet, as opposed to being synthesized by the body.
When you put these points together, it is clear that in normal metabolism, when you take in excess calories, your body will eagerly designate as surplus — and store — calories that are already in the form of fat. In other words, no matter how high your insulin gets, if you’re only eating carbohydrate, it will not be meaningfully stored as body fat; and if you’re eating fat, no matter how low your insulin gets, it will continue to be stored as fat if there are too many calories.
Despite this, there are many credible, well-educated individuals, including doctors, who continue to promote the Insulin Myth. One is Dr. Ron Rosedale. In one lecture, he states: “A high-complex-carbohydrate diet is nothing but a high-glucose diet, or a high-sugar diet. Your body is just going to store it as saturated fat, and the body makes it into saturated fat quite readily.” But as we have already seen, humans cannot efficiently convert sugar to saturated fat (or any kind of fat). In that same lecture, Rosedale also states: “So every time you have a surge of sugar and you have a surge of insulin, you get more and more insulin resistant and risk all of the problems we’ve talked about.” In other words, he is claiming that being exposed to sugar and insulin provokes insulin resistance. However, that idea is contradicted by the evidence that a high-carbohydrate, low-fat diet increases insulin sensitivity. Finally, later in that lecture, he has the audacity to state: “No scientist out there is really going to dispute what I’ve said about carbohydrates.”
In The Hunting Hypothesis, we established that the archaeological evidence is inconclusive regarding the claim that prehistoric humans primarily ate meat. In The Fat Content Quandary, we observed that the ancient human diet was low in fat and therefore could not have been low in carbohydrate. In The Insulin Myth, we laid to rest the idea that carbohydrate (glucose) is unhealthy as a calorie source.
The health value of a high-fat diet is unclear at best. It may be true that there is no real connection between saturated fat consumption and heart disease (or maybe there is), but high-fat diets in general have been correlated with insulin resistance, stroke, elevated cortisol and inflammation, and even excessive daytime sleepiness. There are also cautions about low-carb diets for pregnant women and those concerned about fertility. Dr. Atkins himself suffered from heart disease and hypertension before his death.
High protein intake may not be optimal, either. Proposed links between meat consumption and various health issues remain controversial (case in point: The China Study and its various critics), but it is hard to ignore the Blue Zone work. Blue Zones are populations with a documented concentration of centenarians much higher than average. One striking feature of all five Blue Zones is that their populations eat very little meat: one serving per week on average. Along these lines, protein restriction has been proposed as a technique to extend lifespan by stimulating autophagy. In fact, baseline human protein requirements are astonishingly low: human breast milk provides less than 6% of calories from protein, compared to rice at a whopping 8%.
Moreover, the social and environmental impact of meat-heavy diets is significant. Factory farming creates prodigious quantities of manure and methane, and the meat industry employs people in gruesome, demoralizing slaughterhouses. Meat as an energy source is also dreadfully inefficient, considering that it takes somewhere between 2.5 and 8 pounds of grain to produce 1 pound of beef.
What would a high-carbohydrate, moderate-protein Paleo diet look like?
One model might be the Kitavan diet. The Kitavans are a fairly isolated horticultural people in Papua New Guinea. Tubers, fresh fruit, coconut, and fish are the mainstays of the Kitavan diet, with fat intake at about 20% of calories. It is interesting to note that the Kitavans are not particularly physically active.
Another dietary model could be the traditional Okinawan diet. This diet is about as high-carb as it gets, with 85%, 9%, and 6% of total calories from carbohydrate, protein, and fat respectively. Pork was eaten, but mainly reserved for holiday feasts. The calorie density of the Okinawan diet was low, at around one calorie per gram of food. The staple was the nutrient-rich sweet potato, not rice as in the rest of Japan. The Okinawan population has become famous for its longevity and much lower prevalence of many modern diseases. The Okinawans were also quite lean, with a BMI of about 20. Note that contemporary diets on the island of Okinawa may no longer resemble the traditional diet. (Okinawa is, of course, one of the Blue Zones.)
Apparently it’s hard to go very wrong even eating nothing but potatoes.
Eating some amount of meat is well supported by what we know of our evolutionary history, as well as by empirical studies. In fact, it’s possible that hunting activities shaped our upright stance. But almost every known culture, civilization, and hunter-gatherer tribe throughout history has had access to and used some form of starch as a staple: yams, sweet potatoes, cassava, plantains, corn, rice, wheat, even acorns. Starches have long been the most reliable energy source, and our high-amylase saliva and uniquely elongated small intestine suggest that we are best adapted to use them.