
Paleo Fantasies: Debunking The Carnivore Ape Narrative


Every two years or so I notice a cyclical trend in the online “paleo” community: the resurgence of dogmatic carnivory. It has two main themes: plants are “poisons” that cause most of our health problems, and humans “evolved to be” very low carb. Always an undercurrent with some very zealous devotees (“The Bear” of Grateful Dead fame was probably one of its most prominent popularizers), it suddenly finds popularity among normally more moderate people, picking up some non-paleo low-carb followers in the process. Then it goes away again, hilariously with some of its top cheerleaders renouncing it along the way (like Danny Roddy).

It’s been back again lately. A few readers have written me about Anna, who writes the blog Life Extension*. She is a graduate student in archaeology and social anthropology. Anna’s most popular post so far is “Debunking and Deconstructing Some ‘Myths of Paleo’. Part One: Tubers.” Sadly, an opportunity for greater communication to the public from a much-maligned discipline becomes a manifesto for low-carb diets. The tagline is “Glucose restriction represents not only the most crucial component of ancestral diets but is by far the easiest element to emulate.” I think we’ve heard this one before, but this time it is in language that is more authoritative than usual. This is the kind of writing I would have liked Paleofantasy to take on.

Unfortunately she doesn’t refer to sources directly in her text, so I’ve done my best to figure out which sources she is referring to.

Most archaeologists don’t go around promoting diets, because they recognize the limitations of their field. There is so much that is unknown and unknowable. It’s pretty easy for nearly anyone to pigeonhole what we do have to fit their own narratives.

The reduction in size and robusticity of the human skeleton is a clear temporal trend of newly agricultural communities. Diachronic skeletal comparisons reveal large-scale, significant reductions in growth rates.

Yes, of some newly agricultural communities, and that doesn't mean it stayed this way. I’ve written about this more than I would have liked, most recently in my last post about Paleofantasy (which cites this review).

Then a funny thing happened on the way from the preagricultural Mediterranean to the giant farms of today: people, at least some of them, got healthier, presumably as we adapted to the new way of life and food became more evenly distributed. The collection of skeletons from Egypt also shows that by 4,000 years ago, height had returned to its preagricultural levels, and only 20 percent of the population had telltale signs of poor nutrition in their teeth. Those trying to make the point that agriculture is bad for our bodies generally use skeletal material from immediately after the shift to farming as evidence, but a more long-term view is starting to tell a different story. - Marlene Zuk

It also brings up how questionable the use of height is in these narratives. The few hunter-gatherer groups that exist today are very, very short (mostly due to genetics), while the rest of the world has grown taller and taller. Staffan Lindeberg, in his magnum opus, suggests we are too tall from overnutrition. Other markers that extremists use to argue that agricultural humans show a downward health trend suffer from similar limitations.


Instances of porotic hyperostosis brought on by iron deficiency anaemia increased dramatically in agricultural settings.

A perfect example of why archaeology is not the best approach for deciding what is good to eat, as this particular marker has become controversial and has been re-evaluated.

“There is a new appreciation of the adaptability and flexibility of iron metabolism; as a result it has become apparent that diet plays a very minor role in the development of iron deficiency anemia. It is now understood that, rather than being detrimental, hypoferremia (deficiency of iron in the blood) is actually an adaptation to disease and microorganism invasion.” - Porotic hyperostosis: A new perspective

Either way, I’m not sure what the upheaval these communities experienced has to do with whether tubers or any carbohydrates are bad for you. It wasn’t just the food that changed for these people; it was their entire way of life, and it was a transition that changed their biology. And while there are trends, there is no linear health decline. A more systematic database of human remains and health markers is being built right now and should be a great resource in the future. At this point, a lot of papers claiming a decline use inappropriate sample sizes and statistical methods.

Far too little evolutionary time has passed for us to be successfully acclimated to the novel conditions of agricultural life.


Another common thread that begs the question: how long is long enough? How many adaptations are enough?

Speaking of evolutionary time:

Spending most of our human history in glacial conditions, our physiology has consequently been modelled by the climatologic record, with only brief, temperate periods of reprieve that could conceivably allow any significant amount of edible plant life to have grown.

Like Nora Gedgaudas’s paleo book Primal Body, Primal Mind, which she cites for unknown reasons, this sentence implies to her lay readers that glacial conditions = something out of the movie Ice Age. Which is just not true. A glacial maximum left some people in the cold, but Africa was still quite warm, and if we are talking about evolutionary time, that’s where we spent most of it. Outside Africa, most humans seem to have clustered in fairly temperate refugia such as Southern Iberia during the last ice age.

“Many think of the late Pleistocene as the “Ice Age”, a time when continental glaciers covered much of the earth and where the land not under ice was inhabited by giant cold-adapted animals—wooly mammoth, wooly rhinoceros, and cave bears—pursued by hardy human hunters. While this image may be somewhat accurate for part of the world, most of the earth remained unglaciated throughout the Pleistocene.” - In Glacial Environments Beyond Glacial Terrains: Human Eco-Dynamics in Late Pleistocene Mediterranean Iberia

Of course “significant amount” is also going to be a point of contention. Only in the very coldest tip of the Arctic do levels of plants in the human diet fall close to zero. Beyond that, many people might not be aware of the levels of starch and sugar available in an environment because the traditions surrounding them have died out. I have written quite a bit about Northern sources of carbohydrates: “Siberian potatoes” and Alaskan native plant foods.


Further information on the evolution of our diet can be garnered from the genetic data of present populations, which demonstrates the historically-late biological adaptation to less than minimal quantities of starch and to only few and specific starch compounds.

I assume this refers to amylase (AMY1) copy number, the function and history of which is not quite clear, much like lactase persistence. For example, I do not possess lactase persistence even though my ancestors probably raised livestock for dairy; they were diversified pastoralists, so it’s likely there was not enough selective pressure for them to develop this trait. They consumed dairy, but the majority of their diet was not dairy.

It is unlikely the ancestral human diet was as high in starch as some horticulturalist tropical diets are now, where the majority of calories come from starch. But in the end, the differences in AMY1 copy number between humans are small compared to our differences with other primates, indicating that perhaps this was selected for in our own evolution. And in the original paper it is kind of mind-boggling they use the Mbuti as a “low-starch” population given their high starch consumption.

The Mbuti are particularly interesting because they are hunter-gatherers, but they trade their surplus meat for starch and have done so for quite some time (when trade isn't available, forest tubers are utilized as fallbacks). The only time they don’t trade is when honey is in abundance.

Anna’s assertion, based on optimal foraging models, that starch is comparatively “inefficient” next to meat doesn’t mean that humans would have chosen to eat only or mostly meat. That data includes game from South American environments, which is unusually fatty compared to African game. And even in South America, such game is not available in unlimited amounts in the first place, which is why hunter-gatherer cultures with access to it, like the Ache, also extensively gather and process starch and gather honey.

The consequences of limited availability and time investment of edible Palaeolithic plant foods has been analysed by Stiner, who compared food class returns amongst contemporary hunter-gatherer groups. Stiner found the net energy yield of roots and tubers to range from 1,882 kJ/hour to 6,120 kJ/hour (not to mention the additional time needed to dedicate to preparation) compared to 63,398 kJ/hour for large game.

Anna’s assertions stand in stark contrast to the paper she seems to cite:


Staple plant resources present an extreme contrast to large game animals with respect to prevailing economic currencies (Table 11.1). Large animals generally yield high returns per unit foraging time (kJ per hour) but are unpredictable resources. Seeds and nuts give much lower net yields per increment of time (kJ per kilogram acquired), but they have potentially high yields with respect to the volume obtained and the area of land utilized.

Surveys of hunter-gatherers show overwhelmingly that the preferred foods are fatty game and honey, both highly caloric (and delicious), yet these do not make up the majority of the diet because they are not available in high, predictable amounts the way their modern equivalents are.

As Kim Hill, who studies the Ache, says: “High-ranked items may be so rarely encountered that they represent only a very small proportion of the diet; low-ranked items in the optimal set may be encountered with sufficient frequency to contribute the bulk. It is interesting to note that on several occasions, reports of nearby palm fruit (ranked 12) were ignored, something that did not happen with oranges. On several other occasions people discussed the relative merits of hunting monkeys (ranked 11), reaching consensus that monkeys should not be pursued ‘because they are not fat.’”
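
To make the foraging logic concrete, here is a minimal sketch of the classic diet-breadth (prey-choice) model that these return-rate arguments come from. All the encounter rates, energy values, and handling times below are invented for illustration; they are not Stiner’s or Hill’s figures.

```python
# Minimal sketch of the classic diet-breadth (prey-choice) model from
# optimal foraging theory. All figures are invented for illustration;
# they are NOT the values reported by Stiner or Hill.

# Each resource: (name, encounters per hour of search,
#                 kJ per item, handling hours per item)
resources = [
    ("large game",  0.02, 60000, 3.0),  # huge return, rarely encountered
    ("honey",       0.05, 12000, 1.0),
    ("palm starch", 0.50,  4000, 1.5),  # modest return, common
    ("tubers",      0.80,  2500, 1.0),
]

# Rank items by post-encounter return rate (kJ per hour of handling).
resources.sort(key=lambda r: r[2] / r[3], reverse=True)

# Add items in rank order as long as doing so raises the overall
# return rate (energy gained per hour of search plus handling).
diet, best_rate = [], 0.0
for item in resources:
    trial = diet + [item]
    energy = sum(lam * e for _, lam, e, _ in trial)       # kJ per search-hour
    time = 1.0 + sum(lam * h for _, lam, _, h in trial)   # search + handling
    rate = energy / time
    if rate > best_rate:
        diet, best_rate = trial, rate

total = sum(lam * e for _, lam, e, _ in diet)
for name, lam, e, _ in diet:
    print(f"{name}: {lam * e / total:.0%} of calories in the optimal diet")
```

With these made-up numbers, large game tops the ranking but supplies only about a fifth of the calories, while the common, low-ranked starches supply the bulk. That is exactly Hill’s point: an item’s rank and its share of the diet are different things.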

Anthropologists have theorized on the importance of having carbohydrate fallback foods in the event that high-fat game is not available, whether because of seasonality or over-hunting. In these cases, “rabbit starvation” from excess protein is a real danger. Surviving off game alone is a real challenge, which probably accounts for the fact that humans have exploited seemingly tedious-to-gather plant resources in nearly every environment.

Some of Anna’s arguments indicate that she has made up her mind on issues that are actually very controversial in anthropology and archaeology, such as the date of regular fire use (Anna asserts it was much later than many think) and the claim that “However, plants have been preserved in the Lower Palaeolithic, and they are used primarily for functional and material – rather than nutritional – purposes.”

She does say, “I will concede however that absence of evidence is not evidence of absence,” but then goes on to list some sites showing possible non-food-related plant use that aren’t even associated with Homo sapiens; many involve hominin offshoots unlikely to have contributed to our line (except for the possible small amount of Neanderthal ancestry some of us carry). Other sites she mentions aren’t dated to the Lower Paleolithic anyway.

She also dismisses later sites such as Kebara, implying that the legumes found there would have been used as fire starters rather than food, yet she admits that hominids would have supplemented their diet with “low glycemic” foods when meat was scarce.

Firstly, Neanderthals were highly carnivorous and physiologically inept at digesting plant foods. This can be measured using the megadontia quotient of Neanderthal postcanine tooth area in relation to body mass, which reveals that H. neanderthalensis must have consumed a greater than 80% animal diet. Nonetheless, the evidence of phytoliths and grains from Neanderthal skeletons at Shanidar Cave may reveal the rare consumption of starches in this singular context, but not the deleterious costs to the health of those that ate them.

The megadontia quotient, which is controversial in the first place, is not meant to be used this way. Neither is the expensive tissue hypothesis, which she also mentions. Both are meant to analyze the use of uncooked fibrous plant foods and are not particularly enlightening in the case of large-brained hominids with cultural adaptations to food such as cooking. Some of the most recent research reappraising the carnivorous theory of Neanderthals is covered in this recent talk by Neanderthal experts Dr. Margaret J. Schoeninger and Dr. Alison S. Brooks.

Humans show up as carnivores, even when they are known corn-eating agriculturalists, like these people. But what happens when you plot other plants?

Now the data makes more sense (remember, this data shows where the protein in the diet came from; it doesn't tell us how much protein was eaten).

As you can see, the initial isotopic studies that showed Neanderthals as top carnivores came into question when farming populations showed similar values. Researchers realized they needed to analyze plants based on their most nutritious fractions; when was the last time anyone sane ate something like a whole stalk of corn, husk and all? Another great paper, by John D. Speth, also summarizes some of the recent research on Neanderthal diets and debunks hypercarnivory.
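
For intuition about why these plots can make plant-heavy eaters look carnivorous, here is a toy two-source mixing calculation. The δ15N values and the trophic enrichment constant are assumptions of plausible textbook magnitude, not measurements from any of the studies above; the point is simply that collagen δ15N tracks dietary protein, not dietary calories.

```python
# Toy two-source stable-isotope mixing sketch. The d15N values and the
# trophic enrichment below are assumptions of plausible textbook
# magnitude, not data from any study discussed in the post.

TROPHIC_ENRICHMENT = 4.0  # per mil shift from diet protein to collagen (assumed)

# Each food: (name, share of total calories,
#             protein fraction of those calories, d15N of that protein)
diet = [
    ("game meat", 0.30, 0.80, 9.0),  # 30% of calories, protein-rich
    ("tubers",    0.70, 0.05, 3.0),  # 70% of calories, protein-poor
]

# Collagen d15N tracks the protein-weighted mix, not the calorie mix.
protein_total = sum(cal * pf for _, cal, pf, _ in diet)
d15n_collagen = TROPHIC_ENRICHMENT + sum(
    cal * pf * d for _, cal, pf, d in diet
) / protein_total

for name, cal, pf, _ in diet:
    print(f"{name}: {cal:.0%} of calories, {cal * pf / protein_total:.0%} of protein")
print(f"predicted collagen d15N: {d15n_collagen:.1f} per mil")
```

With these invented numbers, someone getting 70 percent of their calories from tubers still draws almost 90 percent of their protein from meat, so their collagen plots near the “carnivore” end of the graph.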

humans were no longer able to transmute fibre into fat – as other primates can (consequently, they eat a high-fat diet) – through fermentation in the large intestine.

This, as anthropologist Dr. Richard Wrangham has pointed out, could also be an adaptation to cooking. And we didn’t lose this ability; it is merely reduced, and no biologist would argue that the SCFAs (short-chain fatty acids) produced in the colon, which can provide calories and also modulate inflammation, are unimportant. SCFA metabolism is not comparable to longer-chain fatty acid metabolism, so it’s not really appropriate to call these diets “high fat.” Furthermore, there are other primates with guts similar to ours, like capuchins, who most certainly do not eat a carnivorous diet: they eat sugary fruit. But it’s very hard to compare our guts to the guts of other animals, since cultural traits like cooking are so important for our food consumption.

I think it’s a bit amusing to read these posts alongside those of PaleoVeganology, written by a graduate student in paleontology who criticizes many popular paleo narratives. However much I disagree with him on the issue of modern diet choices, I commend him for not using his expertise to promote his chosen diet: he is explicit that his dietary choices are built on modern ethics and not the murky past.

The Shanidar skeletons are merely the first of many analyses of starches on teeth, which rule out theories that plants were used only as decorations or fire starters. Since that first paper was published, others using the same method have followed, and more will. But there is no way to use such data to speculate on how often or in what quantities these foods were consumed.

The coprolite “paper” that Nora Gedgaudas frequently cites also comes up; I’ve addressed it here.

Another common thread in carnivore narratives is that plants were used “only” as medicinals. I would not consider this insignificant in any way; in most cultures, the line between food and medicine is a thin one. Many foods we enjoy as foods these days have medicinal roots.

Anna rightly criticizes the use of non-hunter-gatherers as hunter-gatherer proxies in writings about the so-called paleo diet, and then cites a study that does the exact same thing:

In an attempt to reconstruct the diet of ice age hominids, a recent study analysed the macronutrient dietary composition of existing hunter-gatherer groups within latitude intervals from 41° to greater than 60°.

But where did this data come from? Anthropologist Katherine Milton responded quite well to this paper by Cordain:

The hunter-gatherer data used by Cordain et al (4) came from the Ethnographic Atlas (5), a cross-cultural index compiled largely from 20th century sources and written by ethnographers or others with disparate backgrounds, rarely interested in diet per se or trained in dietary collection techniques. By the 20th century, most hunter-gatherers had vanished; many of those who remained had been displaced to marginal environments. Some societies coded as hunter-gatherers in the Atlas probably were not exclusively hunter-gatherers or were displaced agricultural peoples. Because most of the ethnographers were male, they often did not associate with women, who typically collect and process plant resources.- Katherine Milton

The Ethnographic Atlas used in the “study” is available online and quite clearly does not contain 229 pure hunter-gatherer cultures. The 229 cultures Cordain uses include peoples who trade for or cultivate foods.

There is no evidence that mostly carnivorous groups of humans have particularly high longevity. In fact, mummies, whatever their limits as evidence, show that people eating these diets were not in fantastic condition, though, as with the poor condition of some early agriculturalists, that cannot simply be blamed on their diet.

It is awfully convenient to build a narrative to convince people to eat a limited diet based on the murky unknowns of the far past and near-mythical groups of supposedly extremely healthy carnivorous hominids. The carnivore-ape hypothesis is about as credible as the aquatic ape one.

One of the problems with human evolution, as opposed to, say, rocket science, is that everybody feels that their opinion has value irrespective of their prior knowledge (the outraged academic in the encounter above was a scientist, but not a biologist, still less an evolutionary biologist). The reason is obvious – we are all human beings, so we think we know all about it, intuitively. What we think about human evolution "stands to reason". Hardly a month goes by without my receiving, at my desk at Nature, an exegesis on the reasons how or why human beings evolved to be this way or that. They are always nonsense, and for the same reason. They find some quirk of anatomy, extrapolate that into a grand scheme, and then cherry-pick attributes that seem to fit that scheme, ignoring any contrary evidence. Adherence to such schemes become matters of belief, not evidence. That's not science – that's creationism.

I have seen the same story building among vegans, who often craft similar narratives around our lineage’s long plant-eating past. It speaks to a deep desire people have to justify their own choices. What all these dietary narratives have in common is that they confirm that a particular limited diet is our “natural” diet and the one that is best for humans, animals, and the environment. It’s not possible for them all to be right, and that’s because none of them are.

Ancient humans ate a large variety of foods, which is why we are adapted to so many. Human variation is high though, since our lineage has become so populous and geographically wide-ranging. There are many reasons for a modern human to adopt a low-carbohydrate or limited carbohydrate diet either temporarily or permanently. None of those have to do with this being the optimal diet for all humans or with a mostly-carnivorous ancestry.  

