
Facultative Carnivore Reasons


The ingestion of fat in the human diet unlocked the evolutionary process that led to rational thinking and a higher level of cognition.

The purpose of this article is to reconcile the hypotheses that: (1) brain evolution occurred due to a change in diet, and (2) it occurred due to pressures related to understanding more and more about the underlying causes, such as understanding increasingly complex manipulative and cooperative intentions on the part of the other, as well as understanding reality itself (and how to interact with it beyond group issues). I argue that the ingestion of fat, a highly energy-efficient food, would have unlocked the evolutionary process that culminated in the emergence of the practice of reasoning about underlying causes; and that the consolidation of such a practice resulted in a continuous pressure to increase cognition about “whys”; so that many explanations ended up imposing the need for additional ones, and with that came a high level of awareness and the need for the brain to evolve not only in terms of providing a higher level of cognition but also in size.

Reality is permeated by patterns of organization. If an organism identifies one of these patterns so that it can manifest in advance an adequate response pattern to what reality presents (a response pattern that, in some way, also needs to be identified), its chances of survival in the face of the challenges that reality imposes increase substantially (see Bates, 2005; Winter, 1998). That said, we can conceive that the evolution of cognitive structures and processes arises from competition for information. However, such evolution can be impeded by limiting mechanisms, such as the metabolic and energy costs of the brain (Foley, 1995). In this sense, I see that for an organism to experience, compared to others, a significantly higher rate of cognitive evolution, it must first discover and enjoy a superior source of energy. This seems to have happened with the animal species that gave rise to humans. Under the pressure of tremendous food shortages, by accident, a primate lineage appears to have discovered and become adapted to eating food from within the bones of carcasses (e.g., marrow and brain), which contains a macronutrient that is highly efficient at generating energy: fat (Thompson et al., 2019). The use of fat as an energy source would then have enabled the development of a highly effective type of reasoning for increasing the organism’s chances of survival: causal reasoning. In this regard, it is important to mention that non-human animals seem to be capable of a rudimentary kind of causal reasoning (Dickinson & Balleine, 2000; Völter & Call, 2017). More specifically, such animals seem to be able to represent, in some way, the invariant sequences of events they detect in reality (e.g., “A” is usually followed by “B”). However, their reasoning seems to stop there.
That is, they do not reflect on the unobservable, and thus cannot represent an intangible variable that might be mediating the detected sequence (for example, that “A” is followed by “B” because of “C”) (Penn et al., 2008; Povinelli, 2000, p. 299). In this sense, I see that over evolutionary time, as the animal species that gave rise to humans came into contact with the reinforcing consequences of the cognitive effort to extract underlying causes and to act on the basis of such extractions, interest in the “whys” became established in its nature to the point of consolidation. With that came the consolidation of a tendency to construct and use heuristics that go back to the whys about everything possible, especially why “I” should seek a certain end and through a certain means1. This, in turn, would have been responsible for the emergence of a level of awareness seen only in humans: the awareness that we act in virtue of achieving ends through means, that is, the awareness that we are agents2. Reaching this point, or evolutionary stage, would coincide with the emergence of a new way of acting in nature, the “rational” way (see also Boyle, 2012), which we can define as making an informed decision by analyzing whether or not it is worth pursuing an end, and by which means, through a weighting of consequences based on our network of theories. This involves: 1) assessing whether the achievement of an end tends to generate good consequences; and 2) assessing which means are most suitable for the consequence “achievement of an end”, in case it has been decided that it is really worth trying to achieve it (Osmo, 2021; Osmo, Borri, & Falcão, 2022).
The concomitant consolidation of the interest in the whys (in acquiring wisdom), which would have provided the emergence of agency consciousness, and of the interest in acting on the basis of explanations (in acting rationally), would then coincide with the emergence of humans, which we conceive of today as being “sapiens” and “rational”. In fact, we name the current species “sapiens”, but the interest in accumulating explanations of reality is probably something that had already been established in other “homos”. Nevertheless, it is a fact that the brain size of the current “homo” is significantly larger, so this analysis also needs to contemplate why that evolution occurred. In my view, there would have been continuous pressure to increase cognition because another dimension of reality opened up with the interest in causal mechanisms, such that many explanations ended up imposing the need for additional explanations. This is the case, for example, when one member of the group develops a successful theory of how to exploit another; the other needs to build a theory about the exploiter’s theory in order to detect the exploitative attempt in time to avoid it3 (see Byrne, 1997). I also see that the pressure to increase cognition in “homo” was accompanied by a gradual elaboration and consolidation of rational heuristics: shortcuts for evaluating something, and for behaving in the face of that something, that go back to the network of explanations the individual has so far. In material terms, this would be reflected in the gradual emergence and expansion of brain structures responsible for enabling the elaboration of increasingly refined rational heuristics and for accommodating the neuronal networks that allow their use whenever reality, in some way, imposes the need (cf. Pinker, 2021, p. 97; Fowers, 2015, pp. 47–48).
In this regard, it is worth mentioning that the emergence and expansion of brain structures necessarily imply growth in brain size, since evolution does not occur by discarding old structures to make room for new ones (as when we swap a personal computer’s processor for a more efficient one without needing a bigger computer) (Pinker, 2001, p. 182); instead, evolution takes place through the use of old structures, from which new ones develop, and this also applies to structures responsible for cognition (Cosmides & Tooby, 1997; Fowers, 2015, pp. 50–52).

Final Considerations

In this article, I argued that it would have been the adoption of a fat-focused diet that provided the energy support necessary to make reasoning about underlying causes possible and, therefore, to allow the experience of the reinforcing consequences that reaching explanations of reality, and acting on the basis of them, is capable of providing. The ingestion of fat would then have unlocked the evolutionary process that culminated in the emergence of humans. However, I would like to point out that, in my view, it was not the consumption of fat per se that led to the growing interest in underlying causes and in acting on “whys”, but rather the reinforcing consequences experienced in discovering such causes. In other words, the consumption of fat would have allowed initially accidental thinking about explanations to occur; and when it happened, this type of thinking ended up being reinforced by advantageous consequences, becoming, little by little, a consolidated practice. In this process, there would have been continuous pressure to increase cognition, since another dimension of reality opened up with the interest in underlying causal mechanisms; many explanations ended up imposing the need for additional explanations, and with that came the need for the brain to evolve not only in terms of providing a higher level of cognition but also in size.

This study produced the hypothesis that there is an accelerator behind human intelligence that could be as simple as the ingestion of animal fat over millions of years. This has always been one of the fundamental ideas behind the creation of this website. I just don't think it's fair to credit other explanations as more fundamental than the ingestion of animal fat.


Saturated fat doesn't correlate with heart disease, meaning that eating it throughout evolution would have been healthy

Saturated fat: villain and bogeyman in the development of cardiovascular disease?

Cardiovascular disease (CVD) is the leading global cause of death. For decades, the conventional wisdom has been that the consumption of saturated fat (SFA) undermines cardiovascular health, clogs the arteries, increases risk of CVD and leads to heart attacks. It is timely to investigate whether this claim holds up to scientific scrutiny.

The purpose of this paper is to review and discuss recent scientific evidence on the association between dietary SFA and CVD.

PubMed, Google Scholar and Scopus were searched for articles published between 2010 and 2021 on the association between SFA consumption and CVD risk and outcomes. The review examined observational studies and prospective epidemiologic cohort studies, RCTs, systematic reviews and meta-analyses of observational studies and prospective epidemiologic cohort studies, and long-term RCTs.

Collectively, observational studies, prospective epidemiologic cohort studies, RCTs, and systematic reviews and meta-analyses have neither conclusively established a significant association between dietary SFA and subsequent cardiovascular risk, CAD, MI or mortality, nor a benefit of reducing dietary SFA on CVD risk, events and mortality. Beneficial effects of replacing SFA with polyunsaturated or monounsaturated fat or carbohydrates remain elusive.

Findings from the studies reviewed in this paper indicate that the consumption of SFA is not significantly associated with CVD risk, events or mortality. Based on the scientific evidence, there is no scientific ground to demonize SFA as a cause of CVD. SFA naturally occurring in nutrient-dense foods can be safely included in the diet.

Saturated fat may well have been a main part of our diet, given that it doesn't lead to heart disease. We'd expect that an evolutionarily appropriate diet wouldn't lead to chronic disease.


Humans are dependent upon dietary choline, found mostly in meat and animal products like eggs.

Humans are unique in their diet, physiology and socio-reproductive behavior compared to other primates. They are also unique in the ubiquitous adaptation to all biomes and habitats. From an evolutionary perspective, these trends seem to have started about two million years ago, coinciding with the emergence of encephalization, the reduction of the dental apparatus, the adoption of a fully terrestrial lifestyle, resulting in the emergence of the modern anatomical bauplan, the focalization of certain activities in the landscape, the use of stone tools, and the exit from Africa. It is in this period that clear taphonomic evidence of a switch in diet with respect to Pliocene hominins occurred, with the adoption of carnivory. Until now, the degree of carnivorism in early humans remained controversial. A persistent hypothesis is that hominins acquired meat irregularly (potentially as fallback food) and opportunistically through klepto-foraging. Here, we test this hypothesis and show, in contrast, that the butchery practices of early Pleistocene hominins (unveiled through systematic study of the patterning and intensity of cut marks on their prey) could not have resulted from having frequent secondary access to carcasses. We provide evidence of hominin primary access to animal resources and emphasize the role that meat played in their diets, their ecology and their anatomical evolution, ultimately resulting in the ecologically unrestricted terrestrial adaptation of our species. This has major implications to the evolution of human physiology and potentially for the evolution of the human brain.

This dependence on meat could also have triggered important changes in early hominin physiology by adapting to a regular consumption of animal protein and fat. There is evidence that modern human physiology, which makes our species highly dependent on regular intake of cobalamine, may also have its origins in the early Pleistocene63. Choline, an essential nutrient that plays a crucial role in gene expression (through methylation of its oxidized form, S-adenosylmethionine) and in brain and liver function is also most abundant in meat and animal products, with very few plants containing any substantial amount of it64,65. Humans are also more dependent on this essential nutrient than other primates and failure to meet minimum doses leads to serious pathological conditions.

Humans' reliance on choline from animal-source foods indicates that we're at least facultative carnivores, if not obligate ones.


Humans spend far less time feeding than other apes, indicating that meat eating was changing anatomy, including molar size, over the course of evolution.

Phylogenetic rate shifts in feeding time during the evolution of Homo

Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.

Starting with Homo erectus, humans developed smaller molars and also began to spend a lot less time on feeding than would be predicted from body mass and phylogeny with other apes (only 5% instead of a predicted 48% of daily activity in Homo sapiens) [Organ et al. 2011].


In this paper, we have taken advantage of phylogenetic methods to reevaluate existing hypotheses and promote the generation of hypotheses. As in many recent phylogenetically based studies, our analysis made a critical distinction between observable differences (typological) and the evolution of those differences (transformational), with the latter type of question explicitly addressed by phylogenetic comparative methods (29). This type of comparative phylogenetic analysis allows quantitative testing of hypotheses about the evolution of traits, including brain size in hominins (30), body size in animals (31), and differences in promiscuity in birds (32). These studies have provided insights into evolution by analyzing traits for which observable variation had long been known. 

Concerning the work presented here, the question is whether the overall rate of craniodental evolution across primates can explain the decrease in relative tooth size in hominins under a random walk (Brownian motion) model of character change. Our approach moves this question into a broader comparative framework and connects feeding time (a behavior) with the evolution of anatomical characters. If we had found that the evolutionary change in molar size of Homo was predicted from evolutionary rates across primates, we would have concluded that the transformation of tooth size in hominins was not associated with a specific new behavior. With our approach, however, we made the opposite finding; human feeding time and molar size are truly exceptional compared with other primates, and their oddity began around the start of the Pleistocene. 

The evolution of morphology and physiology in animals can be driven by the prior evolution of functionally correlated behaviors. For example, changes in diet for members of Homo relative to other hominins have been inferred from changes in molar size and structure in the fossil record (11–14, 21, 22, 33, 34), with dramatic drops in relative molar size occurring with the evolution of H. erectus (20). The evolutionary shift in dietary habits (including reduced feeding time) likely causally preceded these morphological adaptations, because cooking or nonthermally processing food decreases its toughness, which reduces the need for high bite forces and changes feeding patterns (15–17, 35). The exact biomechanical processes that link jaw and tooth morphology to cooked and processed food are unknown. Experimental work on craniofacial biomechanics has shown that soft food, of the kind resulting from food processing, contributes to changes in facial size and shape during ontogeny of mammals (36). This finding suggests that there is a link between eating soft (cooked) food and evolutionary changes throughout the human face, including smaller teeth and jaws, despite increases in body size (37). 

Changes in body size have important ramifications for feeding, because large animals generally have greater caloric requirements. Large-bodied animals can accommodate this need by ingesting larger food boluses, eating a greater number of food items at a time, and feeding more often throughout the day. Our results show that the amount of the day spent feeding scales with body size in primates, probably to compensate, in part, for the per chew food processing rate, which declines with increased body size (38, 39). The phylogenetic expectation is that human feeding time should be similar to the feeding time of great apes such as chimpanzees. The dramatic difference in feeding time between chimpanzees and humans contrasts sharply with our close phylogenetic distance and indicates that feeding time was substantially reduced on the lineage to modern humans. 

Larger animals typically consume more food each day than might be expected, because large-bodied animals generally eat lower-quality food (40). Humans are able to spend less time feeding because they typically consume higher-quality food than chimpanzees, and because they use cooking and nonthermal processing to render more calories available from food (2, 3). Cooking and nonthermally processing foods also reduces food particle size and increases starch gelatinization, which results in earlier bolus formation and swallowing (41). These facts suggest that a dramatic increase in caloric intake from cooking and nonthermally processing food played an important role in shaping our evolutionary history. 

Previous research has shown that some of the cross-species variation in feeding time is also explained by changes in the number and types of food items consumed (39). For example, our result that humans are evolutionary outliers for the small amount of time spent feeding could be explained by the inclusion of large amounts of meat in the human diet (42), except that feeding time was measured for modern humans whose diets were dominated by plant material. Furthermore, human tooth morphology is clearly not adapted for obligatory carnivory (42), and only extreme high-latitude populations are able to survive solely on animal foods (26). The best explanation for our result is that a shift in consumption (from raw unprocessed foods to soft cooked and nonthermally processed foods) originated somewhere along the line to modern H. sapiens after the human–chimpanzee split.

Early H. erectus (ergaster) lived in southern and eastern Africa from 1.9 to 1.5 Mya (43). Based on reconstructions indicating that it had small molars and a small gut volume, H. erectus has been hypothesized to have cooked its food (1). Our findings support this view by showing that, by the time that H. erectus evolved, the molars in our lineage were so small that changes in body mass cannot account for the change in molar size. Hence, they spent substantially less of their day engaging in feeding activities. Facultative food processing, including cooking, likely originated, therefore, before the appearance of H. erectus, perhaps even in H. habilis or H. rudolfensis. Although distinct morphological correlates of feeding time are difficult to distinguish in these species, inference of feeding time based on body size and phylogenetic position suggests that H. habilis is within the human range (μ = 7.2%, σ = 2.3), whereas H. rudolfensis (μ = 9.5%, σ = 3.3) borders the human range. Outside of the genus Homo, we have no a priori reason to expect species to have had feeding times like modern humans. Our model predicts that Paranthropus spent an average of 43% (σ = 11.4) of its day feeding, which is similar to the time that chimpanzees spend feeding (37%). Nevertheless, our phylogenetic analyses reveal that behavioral, physiological, and other nonfossilizing adaptations related to feeding and now necessary for long-term survival of modern humans evolved by the time of H. erectus and before our lineage left Africa.

We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split.

This order-of-magnitude difference could be due to eating fatty meat instead of fibrous, low-quality plants.
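The size of that gap is easy to verify from the figures quoted above. This is plain arithmetic on the published numbers, not a reproduction of the paper's Bayesian phylogenetic model:

```python
# Feeding-time gap reported by Organ et al. (2011): modern humans observed at
# 4.7% of daily activity vs. 48% predicted from phylogeny and body mass.
predicted_pct = 48.0   # predicted share of daily activity spent feeding
observed_pct = 4.7     # observed share in modern humans

ratio = predicted_pct / observed_pct
print(f"humans feed {ratio:.1f}x less than predicted")  # roughly 10x: an order of magnitude
```

A tenfold shortfall is exactly what the abstract means by "an order of magnitude less time feeding than predicted."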


Eating meat allowed a reduction in the size of the teeth, jawbone, and chewing muscles.

Diet and the evolution of the earliest human ancestors

Over the past decade, discussions of the evolution of the earliest human ancestors have focused on the locomotion of the australopithecines. Recent discoveries in a broad range of disciplines have raised important questions about the influence of ecological factors in early human evolution. Here we trace the cranial and dental traits of the early australopithecines through time, to show that between 4.4 million and 2.3 million years ago, the dietary capabilities of the earliest hominids changed dramatically, leaving them well suited for life in a variety of habitats and able to cope with significant changes in resource availability associated with long-term and short-term climatic fluctuations.

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

The origins of the genus Homo are murky, but by H. erectus, bigger brains and bodies had evolved that, along with larger foraging ranges, would have increased the daily energetic requirements of hominins1,2. Yet H. erectus differs from earlier hominins in having relatively smaller teeth, reduced chewing muscles, weaker maximum bite force capabilities, and a relatively smaller gut3,4,5. This paradoxical combination of increased energy demands along with decreased masticatory and digestive capacities is hypothesized to have been made possible by adding meat to the diet6,7,8, by mechanically processing food using stone tools7,9,10, or by cooking11,12. Cooking, however, was apparently uncommon until 500,000 years ago13,14, and the effects of carnivory and Palaeolithic processing techniques on mastication are unknown. Here we report experiments that tested how Lower Palaeolithic processing technologies affect chewing force production and efficacy in humans consuming meat and underground storage organs (USOs). We find that if meat comprised one-third of the diet, the number of chewing cycles per year would have declined by nearly 2 million (a 13% reduction) and total masticatory force required would have declined by 15%. Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.


The australopithecines exhibited a complex of morphological features related to diet that are unique compared with living hominoids or Miocene apes. These early hominids all had small- to moderate-sized incisors; large, flat molars with little shear potential; a ratio of first to third molar area that was low compared with those of extant apes, but generally higher than those of Miocene apes; thick tooth enamel; and thick mandibular corpora. This suite of traits is distinctive of australopithecines and suggests a dietary shift at or near the stem of hominid evolution. Their thick-enameled, flattened molars would have had great difficulty propagating cracks through tough foods, suggesting that the australopithecines were not well suited for eating tough fruits, leaves, or meat. The dental microwear data agree with this conclusion, as the australopithecine patterns documented to date are most similar to those of modern-day seed predators and soft fruit eaters. Furthermore, given their comparatively small incisors, these hominids probably did not specialize in large, husked fruits or those requiring extensive incisal preparation. Instead, the australopithecines would have easily been able to break down hard, brittle foods. Their large flat molars would have served well for crushing, and their thick enamel would have withstood abrasion and fracture. Their mandibular corpora would probably have conferred an advantage for resisting failure, given high occlusal loads. In essence, for much of their history, the australopithecines had an adaptive package that allowed them ready access to hard objects, plus soft foods that were not particularly tough. The early hominids could also have eaten both abrasive and nonabrasive foods. This ability to eat both hard and soft foods, plus abrasive and nonabrasive foods, would have left the early hominids particularly well suited for life in a variety of habitats, ranging from gallery forest to open savanna.

Fig. 5. Mandibular corpus shape (data from refs. 75, 76, and 85 and M. Leakey, personal communication).

 Does this mean we can talk of a characteristic ‘‘australopithecine’’ dietary pattern? Perhaps to some extent, but although the australopithecines shared many features in common, they also differed from one another, suggesting a change in diet through time. Such morphological changes occurred as a mosaic, much as that seen for locomotor anatomy. Much of the evidence for Ardipithecus ramidus is not yet available, but despite its thin molar enamel and absolutely smaller teeth than those of later hominids, it shows molar size proportions that may hint at dietary changes to come. A. anamensis shows the first indications of thicker molar enamel in a hominid, and its molar teeth were equivalent in size to those of A. afarensis. Still, its mandibular corpus is intermediate in robusticity between those of living great apes and later australopithecines. This combination of features suggests that A. anamensis might have been the first hominid to be able to effectively withstand the functional demands of hard and perhaps abrasive objects in its diet, whether or not such items were frequently eaten or were only an important occasional food source. A. afarensis was similar to A. anamensis in relative tooth sizes and probably enamel thickness, yet it did show a large increase in mandibular robusticity. This increase may be due to changes in peak force magnitude or degree of repetitive loading in mastication. Either way, hard and perhaps abrasive foods may have become even more important components of the diet of A. afarensis. A. africanus shows yet another increase in postcanine tooth size, which by itself would suggest an increase in the sizes and abrasiveness of foods. However, its molar microwear does not show the degree of pitting one might expect from a classic hard-object feeder. Thus, even A. africanus has evidently not begun to specialize in hard objects, but rather has emphasized dietary breadth. 
In contrast, subsequent ‘‘robust’’ australopithecines do show hard-object microwear and craniodental specializations, suggesting a substantial departure in feeding adaptive strategies early in the Pleistocene. In sum, diet was probably an important factor in the origin and early evolution of our family. The earliest australopithecines show a unique suite of diet-related features unlike those of Miocene apes or living hominoids. Such features suggest that the earliest hominids may have begun to experiment with harder, more brittle foods at the expense of softer, tougher ones early on. This does not mean that all of the australopithecines were specialized hard-object feeders. It merely means that, through time, they acquired the ability to feed on hard objects. Many modern primates need to consume critical ‘‘fall-back foods’’ at certain times of the year (6), and it may well be that the earliest australopithecines resorted to the consumption of hard objects only in such situations, whereas the robust australopithecines relied on them far more regularly. Another important aspect of early hominid trophic adaptations is evident from data presented here—the dietary shift from apes to early hominids did not involve an increase in the consumption of tough foods, and so the australopithecines were not preadapted for eating meat. This conclusion runs counter to (i) recent isotope work suggesting that the australopithecines did in fact consume significant amounts of meat (7) and (ii) nutritional work suggesting that meat may have provided critical nutrients for both young and old hominids (77–79). There would seem to be three different ways to reconcile these perspectives. First, the present study has reviewed only craniodental features related to diet. If the australopithecines used other means for ingesting and processing meat (e.g., tools), they might have been able to process meat more efficiently than the craniodental evidence suggests (80, 81). 
Second, the heavy C3 signature found in A. africanus (7) may reflect the consumption of underground storage organs of C3 plants rather than meat (82). Third, the functional analyses of the teeth assume that all meat has the same degree of toughness. This may not be the case. Studies of the physical properties of food have thus far focused on plant remains, with only brief mention of the toughness of materials like skin (40, 46). Variations in toughness between animal tissues might well be due to variations in the arrangement and density of collagen matrix. Furthermore, the physical effects of decomposition might render meat less tough and more readily processed by hominids. If this is so, it could be further evidence in support of scavenging as part of the early hominid way of life. Investigators have tried to relate patterns of hominid evolution to patterns of climatic change for some time (3, 4). The focus of much of the recent work has been on the origin of the genus Homo. Can the dietary shifts in the earliest hominids also be tied to such changes? Whereas there is some evidence of large-scale climatic changes around the Mediterranean (83) and unusual faunal turnover in parts of western Asia (84), there are no large-scale changes evident in sub-Saharan Africa until after the earliest hominids have arrived on the scene (i.e., not until 1.5–2.5 million years ago). There is the slow and inexorable cooling and drying of the Miocene, but perhaps the crucial result of this was an increase in microhabitat variability. Certainly, there are limits to our paleoecological evidence from this period, but as Potts (4) has noted, ‘‘in general, the oldest hominids were associated with a diverse range of habitats.’’ These included lake and river margins, woodland, bushland, and savanna. Potts (4) has emphasized that locomotor versatility was a crucial adaptation of the earliest hominids in the face of such varied environmental conditions. 
We feel that this perspective needs to be extended to the dietary adaptations of the earliest hominids as well. In such a land of variable opportunities, the generalized craniodental toolkit of the earliest hominids may have had a distinct advantage, as it allowed our forbears the flexibility to cope with short-term and long-term climatic variations and the resultant changes in resource availability.

Because the mechanical properties of foods vary depending on many factors such as species and type of portion consumed, further research is necessary to examine additional foods and processing techniques important to human evolution. More research is also needed to quantify the impacts of variations in masticatory morphology on chewing efficiency because dental topography and facial shape affect the relationship between food fracture and chewing effort (for example, sharper cusps increase applied chewing stresses, and relatively shorter jaws increase the mechanical advantage of the adductor muscles). Even so, we speculate that despite the many benefits of cooking for reducing endogenous bacteria and parasites [29], and increasing energy yields [23,24], the reductions in jaw muscle and dental size that evolved by H. erectus did not require cooking and would have been made possible by the combined effects of eating meat and mechanically processing both meat and USOs. Specifically, by eating a diet composed of one-third meat, and slicing the meat and pounding the USOs with stone tools before ingestion, early Homo would have needed to chew 17% less often and 26% less forcefully. We further surmise that meat eating was largely dependent on mechanical processing made possible by the invention of slicing technology. Meat requires less masticatory force to chew per calorie than the sorts of generally tough plant foods available to early hominins, but the ineffectiveness of hominin molars to break raw meat would have limited the benefits of consuming meat before the invention of stone tools approximately 3.3 Ma. Although recent and contemporary hunter–gatherers are less dependent on stone tools than early Homo because they eat mostly cooked meat, many of the oldest tools bear traces of being used to slice meat [9], and the use of tools (now mostly metal knives) to process foods such as meat is well documented ethnographically [30].
This dependency on extra-oral mechanical processing, however, does not apply to other animal-based foods such as marrow, brains and visceral organs that might have been difficult to access without tools, but are easier to chew than muscle. Although it is possible that the masticatory benefits of food processing and carnivory favoured selection for smaller teeth and jaws in Homo, we think it is more likely that tool use and meat-eating reduced selection to maintain robust masticatory anatomy, thus permitting selection to decrease facial and dental size for other functions such as speech production, locomotion, thermoregulation, or perhaps even changes in the size and shape of the brain [16]. Whatever selection pressures favoured these shifts, however, they would not have been possible without increased meat consumption combined with food processing technology.
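As a rough illustration of how the two reductions quoted above combine, suppose total masticatory effort scales with the number of chews times the force per chew. That proportionality is an assumption of this sketch, not a model from the paper; under it, chewing 17% less often and 26% less forcefully amounts to roughly a 39% reduction in overall chewing effort:

```python
# Hedged sketch: if total chewing effort ~ (number of chews) x (force per chew),
# the two reported reductions multiply. This proportionality is an assumption
# made here for illustration only.
chew_frequency_factor = 1 - 0.17   # chew 17% less often
chew_force_factor = 1 - 0.26       # chew 26% less forcefully

effort_factor = chew_frequency_factor * chew_force_factor
reduction = 1 - effort_factor      # fraction of total chewing effort saved

print(f"Remaining effort: {effort_factor:.2%}, saved: {reduction:.2%}")
```

Under this simple multiplicative assumption, the remaining effort is about 61% of the original, i.e. nearly two-fifths of the chewing work is saved.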

The shift from fibrous plants to a diet including animal-source foods, together with the use of tools, paralleled a decrease in tooth and jaw size, a reduction in chewing muscles, and weaker maximum bite force capabilities [Teaford & Ungar 2000; Zink & Lieberman 2016]. Homo molars gained steeper slopes and more relief, also suggestive of an adaptation to meat eating [Ungar 2004].


Are humans evolved specialists for running in the heat? Man vs. horse races provide empirical insights

Many mammals run faster and for longer than humans and have superior cardiovascular physiologies. Yet humans are considered by some scholars to be excellent endurance runners at high ambient temperatures, and in our past to have been persistence hunters capable of running down fleeter quarry over extended periods during the heat of the day. This suggests that human endurance running is less affected by high ambient temperatures than is that of cursorial ungulates. However, there are no investigations of this hypothesis. We took advantage of longitudinal race results available for three annual events that pit human athletes directly against a hyper-adapted ungulate racer, the thoroughbred horse. Regressing running speed against ambient temperature shows race speed deteriorating with hotter temperatures more slowly in humans than in horses. This is the first direct evidence that human running is less inhibited by high ambient temperatures than that of another endurance species, supporting the argument that we are indeed adapted for high temperature endurance running. Nonetheless, it is far from clear that this capacity is explained by an endurance hunting past because in absolute terms humans are slower than horses and indeed many other ungulate species. While some human populations have persistence hunted (and on occasion still do), the success of this unlikely foraging strategy may be best explained by the application of another adaptation – high cognitive capacity. With dedication, experience and discipline, capitalising on their small endurance advantage in high temperatures, humans have a chance of running a more athletic prey to exhaustion.


Although horses are substantially larger animals than humans (approx. 500 vs. 70 kg, respectively), they have comparable stride lengths at endurance running speeds (Figure 4 of Bramble & Lieberman, 2004; Heglund & Taylor, 1988). However, because they have superior cardiovascular systems (Williams et al., 2015), it is not surprising that horses typically traverse MvH courses more quickly than do human competitors (Figure 1). Yet the time gap between the two species closes on hotter days; in the heat, the degree of deterioration in race performance of horses is greater than that of humans. This finding was sometimes subtle but always apparent in each of the three race events we analysed (Figure 1). Being larger, horses have a lower surface area-to-volume ratio and greater thermal inertia, meaning that, all else being equal, they lose heat to a cooler ambient environment more slowly. But even accounting for size, data provided in Lindinger (1999) on human and horse sweat rates (the percentage of sweat used for cooling, and the percentage of heat loss from various routes) indicate that the rate of evaporative heat loss of horses is about half that of humans. Consequently, core temperature rises much more slowly in humans than in horses, and this probably explains why humans experience a relatively slow loss of physical capability compared to horses when running in the heat. Clearly, humans have an exceptional capacity to dump excess heat through sweating (Lindinger, 1999; Schmidt-Nielsen, 1964).
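The slope comparison at the heart of this finding can be sketched with ordinary least squares. The numbers below are invented for illustration (they are not actual MvH race results); the point is only to show how fitting speed against temperature for each species and comparing slopes would reveal a differential heat effect:

```python
# Minimal ordinary least-squares slope fit (no external libraries).
def slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

temps = [10, 15, 20, 25, 30]                  # race-day temperature, degC (invented)
human_speed = [3.60, 3.55, 3.50, 3.45, 3.40]  # m/s, shallow decline (invented)
horse_speed = [5.50, 5.20, 4.90, 4.60, 4.30]  # m/s, steeper decline (invented)

human_slope = slope(temps, human_speed)  # speed lost per degC, humans
horse_slope = slope(temps, horse_speed)  # speed lost per degC, horses

# Horses stay faster in absolute terms, yet lose speed faster as it warms.
assert horse_slope < human_slope < 0
```

With these illustrative values the human slope is -0.01 m/s per degC against -0.06 m/s per degC for the horse: both decline in the heat, but the horse declines six times faster, which is the pattern the race analysis reports.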

Can this subtle advantage be enough to proclaim that humans have a heat adaptation enabling them to out-run prey in hot environments? The very high sweat rates of humans, the fact that human persistence hunters tend to select the hottest time of day to hunt, the heat exhaustion exhibited by their prey, and the reduced detriment of high ambient temperatures to running speed (Figure 1) together suggest that heat tolerance during running is key to the success of human persistence hunters. Yet horses in MvH races are still running more quickly than humans, even on hotter days, all, of course, while carrying a human rider and without it being imperative to survival. And data for wild endurance species such as African wild dogs and grey wolves indicate that they too travel faster while hunting than do humans (Hubel et al., 2016; Mech, 1994), although ambient temperature was not reported. Thus in answer to the question as to whether humans are comparatively well adapted to endurance running at high temperatures, the evidence suggests no – in absolute terms, even in the heat humans are not fast runners.

While horses are not a prey species of human persistence hunters, the MvH comparison highlights that humans will be off the pace in trying to chase down ungulate prey even at high ambient temperatures. During persistence hunts, what are the choices that humans can make that can bridge this performance gap? Human persistence hunting involves a group of communicating individuals, with only the very best completing the final stages (Liebenberg, 2013). After spending time carefully observing a herd, they predominantly target an animal weakened from age, injury or emaciation, or otherwise an individual with large, heavy horns (e.g. Liebenberg, 2013). They know that as they get close to their quarry, it will flee at pace. Although hunter and hunted cover approximately the same distance, the latter is doing so through stop–start bursts of intermittent locomotion, which could be energetically less economical (Kramer & McLaughlin, 2001; Seethapathi & Srinivasan 2015), thus generating more heat. Furthermore, if pauses are too short, intermittent locomotion can result in detrimental high-energy phosphate depletion and lactate accumulation from anaerobic metabolism (Edwards & Gleeson, 2001; Kramer & McLaughlin, 2001; Weinstein & Full, 2000). Persistence hunters assess information on their own physical state and their perceptions of the physical state of their quarry, then attempt to optimise the pace at which they track their prey so that the prey overheats, and they do not. Finally, human persistence hunters have the advantage of hands – some stave off dehydration-induced fatigue by carrying water on hunts, for example in ostrich shells (Liebenberg, 2013).

Our synthesis of the current literature, coupled with our analysis of MvH race times, leads us to the following interpretation, which we put forward for debate and critique. We humans clearly have a superior capacity for endurance than do other primates (Pontzer, 2017), allowing us to expend more energy on foraging – an investment humans pay back with a greater calorie intake (Leonard & Robertson 1997; Pontzer et al., 2016). Nonetheless, compared to a broader range of species, human endurance is not exceptional. Carnivores in general travel further than herbivores in search of food (Carbone, Cowlishaw, Isaac, & Rowcliffe, 2005; Joly et al., 2019) requiring a capacity to locomote for extended periods, and humans appear to fit this mould with anatomical adaptations such as a significant bias toward slow-twitch skeletal muscle fibres (O'Neill, Umberger, Holowka, Larson, & Reiser, 2017). Yet our general running endurance is not as impressive as that of many other mammals. Rather than being the elite heat-endurance athletes of the animal kingdom, humans are instead using their elite intellect to leverage everything they can from their moderate endurance capabilities, optimising their behaviours during a hunt to bridge the gap between their limited athleticism and that of their more physically capable prey. Our capacity for profuse sweating provides a subtle but essential boost to our endurance capabilities in hot environments. This is a slight but critical advantage that our ingenuity magnifies to achieve the seemingly impossible: the running down of a fleeter-footed quarry.

New Findings
What is the central question of this study? Do available comparative data provide empirical evidence that humans are adapted to endurance running at high ambient temperatures?
What is the main finding and its importance? Comparing the results of races that pit man against horse, we find that ambient temperature on race day has less deleterious effects on running speed in humans than it does on their quadrupedal adversary. This is evidence that humans are adapted for endurance running at high ambient temperatures. We debate whether this supports the hypothesis that early man was evolutionarily adapted for persistence hunting.


Human-like Cmah inactivation in mice increases running endurance and decreases muscle fatigability: implications for human evolution

Compared to other primates, humans are exceptional long-distance runners, a feature that emerged in genus Homo approximately 2 Ma and is classically attributed to anatomical and physiological adaptations such as an enlarged gluteus maximus and improved heat dissipation. However, no underlying genetic changes have currently been defined. Two to three million years ago, an exon deletion in the CMP-Neu5Ac hydroxylase (CMAH) gene also became fixed in our ancestral lineage. Cmah loss in mice exacerbates disease severity in multiple mouse models for muscular dystrophy, a finding only partially attributed to differences in immune reactivity. We evaluated the exercise capacity of Cmah−/− mice and observed an increased performance during forced treadmill testing and after 15 days of voluntary wheel running. Cmah−/− hindlimb muscle exhibited more capillaries and a greater fatigue resistance in situ. Maximal coupled respiration was also higher in Cmah null mice ex vivo and relevant differences in metabolic pathways were also noted. Taken together, these data suggest that CMAH loss contributes to an improved skeletal muscle capacity for oxygen use. If translatable to humans, CMAH loss could have provided a selective advantage for ancestral Homo during the transition from forest dwelling to increased resource exploration and hunter/gatherer behaviour in the open savannah.

4. Discussion and conclusion

Hominin evolution is like a bush, with many lineages of hominins coexisting throughout much of the past 6 Myr [45,46]. While hominin bipedalism emerged early and possibly more than once, CMAH loss [19,20] occurred later and roughly coincides with the major biomechanical and environmental changes that took place as hominins probably transitioned to a more carnivorous diet [13,47]. Such a transition could have been greatly facilitated by an increase in the capabilities of ancient hominins to perform persistence hunting and explore a wider range for resources. For this reason, the exact timing of CMAH loss in the fossil record is of interest, and a method to measure a stable Neu5Gc metabolite in 4 Myr-old fossil material has recently been developed [48].

Emulating human CMAH loss in mice generates an increased capability to use oxygen. This is most evident by an increase in endurance running performance, muscle fatigue resistance in situ and myofibre respiration ex vivo. Importantly, these differences appear to be completely independent of differences in biomechanics or eccrine sweat glands already associated with the success of humans for long distance running compared with other vertebrates [5]. At least part of this difference in oxygen use could be owing to a difference in baseline skeletal muscle capillarity. This was observed in the more oxidative soleus but not the more glycolytic plantaris. Notably, the soleus is a highly oxidative slow twitch muscle compared with most muscles in a mouse and more closely resembles the type 1 and type 2A fibre type distribution prevalent in humans and other relatively large mammals [49,50]. Compared to other primates, human muscle also contains a greater proportion of myosin heavy chain I (MHC I) fibres [15,51] and likely associated capillaries, another predictor of human endurance [52]. Increasing the number of capillaries supplying each myofibre increases vital nutrient and oxygen availability to mitochondria during periods of prolonged endurance exercise, providing resistance to muscle fatigue as we measured in situ [38,40,53–56]. In this regard, a comparable performance of muscle fatigability ex vivo further strengthens the hypothesis that greater oxygen availability contributes to the superior muscle fatigue resistance measured in situ in mice with Cmah loss [57–59]. This is supported by the technical limitation in detecting small differences in O2 use ex vivo owing to the known diffusion limitations in isolated muscles externally bathed in an O2-saturating solution [60]. The observed increase in ADP-stimulated OXPHOS of saponin-permeabilized muscle fibre bundles, however, does suggest that Cmah−/− myofibres have a higher capacity to use O2.

The heat map of metabolites illustrates a major effect of exercise adaptation on the muscle metabolite profile. Although there is no significant difference in the citric acid cycle metabolites measured (citrate, malate and succinate) between WT and Cmah−/− exercise-adapted mice, greater increases in anabolic amino acids such as the BCAAs, leucine and isoleucine, were observed in Cmah−/− exercised muscle. In addition to their anabolic effects after physical exercise [61], increased muscle BCAA can also prevent oxidative damage and enhance physical endurance in mice [62]. The higher prevalence of metabolites of the anabolic pentose phosphate pathway in Cmah−/− exercised mice could also help to combat oxidative stress [63–66].

One of the disparate clues that led us to test endurance capacity in Cmah−/− mice was the finding that when crossing this genotype into the human-like Duchenne Muscular Dystrophy mouse model (mdx), Cmah−/−/mdx mice display a much more severe and human-like muscular dystrophy pathology [26,27]. The C/EBP family of transcription factors connect changes associated with metabolism [67] to the inflammatory response [68,69] and muscle wasting [70]. We have previously shown that the family member (C/EBPβ) could be modulated simply by causing uptake and metabolic incorporation of Neu5Gc into macrophages ex vivo [29] and that C/EBPβ was differentially expressed in WT versus Cmah−/− macrophages. Alterations in macrophage C/EBPβ expression or activity during the development and/or polarization of macrophages could be a contributing factor towards the differences in baseline capillary to muscle fibre ratios observed in the soleus muscles of Cmah−/− mice [71]. Previously reported genechip analysis revealed that the expression of another C/EBP family member (C/EBPδ) and the transcriptional activity of CREB1 are upregulated in Cmah−/− gastrocnemius muscle compared to WT controls [26].

The single oxygen atom added to Neu5Ac by CMAH generates Neu5Gc, and this conversion from an acetyl group to a glycolyl group probably alters the amphipathicity and/or charge of the primary sugar molecule as well as the macromolecules carrying them at the membrane surface. We believe that surface Neu5Gc loss could increase membrane surface hydrophobicity, which could facilitate a greater oxygen diffusion rate, but this is difficult to test. On the other hand, the intracellular turnover of these sialic acids would generate acetate and glycolate, respectively, which could intrinsically alter cellular metabolic flux. Sialic acid-binding proteins and sialidases can also differentiate between the two types of chemical structures. Given that the great majority of self-surface and secreted molecules of all cell types (including muscle) express such sialic acids (often at high densities), the loss of Neu5Gc (and the resulting excess of Neu5Ac) in the hominin lineage is likely to have had multiple effects on multiple pathways and systems. Thus, there are many mechanisms possible and we have only begun to explore some of them. Our current work suggests that there were probably complex multilevel effects of Cmah loss on skeletal muscle and vascular physiology during the evolution of hominins. Integrated changes in the O2 transport system provide a greater capability for long distance running in vivo, resistance to muscle fatigability in situ and greater maximal ADP-stimulated OXPHOS in skeletal muscle, despite no measurable difference in fatigue resistance ex vivo. These data suggest a critical role for oxygen delivery and use in the muscle endurance phenotype.

Given that Neu5Gc loss altered the surfaces of almost all cells in the body, it is not surprising that no single mechanism can fully account for the increase in spontaneous exercise and maximal endurance observed in Cmah−/− mice. Further study of all components of the integrated oxygen transport system, including cardiac function, is needed. For the time being, given the timing of the mutation and the potential relevance of its fixation to the emergence of the genus Homo, it is reasonable to speculate that this mutation may have been essential for running faster and further. Thus emerged an endurance phenotype critical to our ancestral lineage: an increased range for resource exploration and the ability to chase down prey over long distances.

If translatable to humans, CMAH loss could have provided a selective advantage for ancestral Homo during the transition from forest dwelling to increased resource exploration and hunter/gatherer behaviour in the open savannah (i.e., hunting animals by persistence hunting, tracking, and chasing).


Rapid changes in the gut microbiome during human evolution

Human lifestyles profoundly influence the communities of microorganisms that inhabit the body, that is, the microbiome; however, how the microbiomes of humans have diverged from those found within wild-living hominids is not clear. To establish how the gut microbiome has changed since the diversification of human and ape species, we characterized the microbial assemblages residing within hundreds of wild chimpanzees, bonobos, and gorillas. Changes in the composition of the microbiome accrued steadily as African apes diversified, but human microbiomes have diverged at an accelerated pace owing to a dramatic loss of ancestral microbial diversity. These results suggest that the human microbiome has undergone a substantial transformation since the human–chimpanzee split.
Humans are ecosystems containing trillions of microorganisms, but the evolutionary history of this microbiome is obscured by a lack of knowledge about microbiomes of African apes. We sequenced the gut communities of hundreds of chimpanzees, bonobos, and gorillas and developed a phylogenetic approach to reconstruct how present-day human microbiomes have diverged from those of ancestral populations. Compositional change in the microbiome was slow and clock-like during African ape diversification, but human microbiomes have deviated from the ancestral state at an accelerated rate. Relative to the microbiomes of wild apes, human microbiomes have lost ancestral microbial diversity while becoming specialized for animal-based diets. Individual wild apes cultivate more phyla, classes, orders, families, genera, and species of bacteria than do individual humans across a range of societies. These results indicate that humanity has experienced a depletion of the gut flora since diverging from Pan.

We identified 35 instances in which the relative abundance of a microbial taxon shifted since the divergences of the extant species of African ape (Fig. 1), 17 of which occurred in humans since the divergence of Homo and Pan. Several of these changes in the composition of the human microbiome have functional implications for host nutrition. The relative abundance of Bacteroides, which has been positively associated with diets rich in animal fat and protein (9), has increased more than fivefold in humans. Conversely, the archaeon Methanobrevibacter, which promotes the degradation of complex plant polysaccharides by using the end products of fermentation for methanogenesis (10), has undergone a more than fivefold reduction within humans. Similarly, the abundance of Fibrobacter, a common plant-fermenting bacterial genus (11) of the microbiomes of wild apes, has been greatly reduced in humans.


Comparisons of the gut microbiomes of populations of humans, chimpanzees, bonobos, and gorillas provide insight into the evolution of hominid microbiomes. In particular, we have reconstructed how human microbiomes have changed since humans diverged from Pan by identifying the features of the microbiome shared across human populations to the exclusion of African apes. Our results demonstrate the utility of incorporating information about the phylogenetic relationships among hosts into analyses of their microbiomes.

It has been proposed that recent lifestyle changes in humans have depleted the human microbiome of microbial diversity that was present in our wild-living ancestors (4); however, this hypothesis has not been tested through comparisons of humans and closely related host species. A previous survey of two humans and 24 wild apes found that the humans contained lower levels of 99% OTU diversity than the wild apes (8), but the small sample sizes precluded both the statistical evaluation of this trend and the ability to identify bacterial taxa that are consistently not recovered from human hosts. We observed that the mean level of microbial diversity within an individual’s gut microbiome differed substantially among ape species, with the microbiomes of humans being the least diverse. This trend does not appear to be the product of any specific cultural practice and is apparent in humans regardless of whether they resided in cities in the United States, small towns in Malawi, or villages in the Amazonas State of Venezuela. This observation confirms the hypothesis that the levels of microbial diversity in the human microbiome have decreased during human evolution. Of the human populations, humans from cities in the United States harbored the lowest levels of diversity, a trend previously observed by Yatsunenko et al. 
(5), suggesting that microbial diversity has been reduced even further in this group.

An alternative, but less parsimonious, explanation for the differences among the observed levels of diversity within hosts of each species is that Pan and Gorilla have experienced independent increases in the levels of microbial diversity since diverging from humans. Providing an explanation for these independent increases is difficult, however, whereas cultural and ecological differences between humans and wild apes provide clear causes for the reduced microbial diversity along the human lineage. Extending sampling to wild-living populations of more distantly related primate species will provide further evaluation of the competing hypotheses that explain the current variation in diversity levels across human and wild ape microbiomes.

Despite marked differences among the microbiomes of human and African ape species, there exists a set of bacterial taxa shared across host populations, potentially representing the ancestral core of the African ape microbiome. Moreover, co-occurrence patterns among many of these taxa are recapitulated across host species (Fig. S2). This result mirrors previous descriptions of “enterotypes” (14–16) or “community types” (17): bacterial assemblages within the gut microbiomes of humans, chimpanzees, and mice defined by differential representation of Prevotella, Bacteroides, Ruminococcus, and Parabacteroides (14–16). Thus, it is possible that these consistent co-occurrence patterns among bacterial taxa result from ecological relationships that predate the diversification of human and African ape species.

Sampling the gut microbiomes of hundreds of individuals from each host species also allows the identification of population-level differences in the mean relative abundances of scores of bacterial taxa. 
Analyzing these differences in a phylogenetic context provides insight into how the composition of the human gut microbiome has been reshaped since humans diverged from other species. Consistent with the known dietary shifts that occurred during human evolution (18), taxa that have been associated with the digestion of animal foodstuffs (9) have risen in relative abundance in the human gut microbiome, whereas taxa that have been associated with the digestion of plant-based diets (9) have become less prominent.

Phylogenetic comparisons of populations of host species can reveal the consistent differences between their microbiomes that arose since the host species diverged; however, the relative roles of genetic divergence and ecological/cultural divergence between host species in generating the differences between their microbiomes remain unclear. The sampling of hosts consuming similar diets in similar environments can reveal the extent to which the contents of microbiomes are attributed to innate differences between the hosts as opposed to differences between the hosts' environments or lifestyles. Some attempts have been made to compare the microbiomes of different host species that co-occur. For example, Ley et al. (19) showed that differences between the gut microbiomes of distantly related mammal taxa were maintained when hosts resided within the same zoo. Similarly, Song et al. (20) found that cohabiting dogs and humans shared more bacterial OTUs compared with hosts from separate households, but the gut microbiomes of dogs remained distinct from those of their cohabiting humans. Likewise, Moeller et al. (21) showed that sympatric chimpanzees and gorillas harbored more similar sets of bacterial species than did the gut microbiomes of allopatric chimpanzees and gorillas, but chimpanzee and gorilla microbiomes can always be differentiated, even when the host species live in sympatry (21). 
These results suggest that, although shared environments might lead to the exchange of some bacterial taxa, many of the differences among the microbiomes of host species are robust to environmental influences.

We analyzed the microbiomes of hundreds of humans and African apes in a phylogenetic framework to reconstruct how microbiomes have diverged over the course of hominid evolution. This approach, which relies on population-level microbiome data from a clade of host species for which the phylogenetic relationships are known, can be applied to interrogate the evolutionary history of the microbiomes of a diversity of host groups. Relative to the microbiomes of wild apes, human microbiomes have experienced a reduction in ancestral microbial diversity and an increase in the frequency of bacterial taxa associated with animal-based diets. The consequences of this reduction of bacterial diversity in the human gut microbiome remain unexplored; however, low levels of bacterial diversity in the microbiome have been associated with gastrointestinal disorders (22), obesity (23), and autoimmune disease (24). Understanding how recent changes in the gut microbiome have influenced human health can benefit from further study of the ancient relationships between wild apes and their resident microbial communities.



A Cross-species Analysis of Carnivore, Primate, and Hominid Behaviour

The traditional assumption that the origin of human behavior is found within the higher primates rather than the social carnivores is based on failure to adequately define primate and carnivore behavior. Phyletic classification takes no account of convergence and divergence; the behavior of a species is not necessarily characteristic of its order. Of 8 behavior variables that distinguish the order primates from the order carnivora, preagricultural man resembles the carnivores on 7 items: food sharing, food storing, cannibalism, surplus killing, interspecies intolerance, feeding of young, and division of labor; resembling the order primates only in group defense. The original form of much human behavior is found within the carnivores.

1. Introduction 

In attempting to deduce the origin and evolution of human behavior, anthropologists have frequently looked to the nonhuman primates for clues. Some evidence has also been offered for the relevance of terrestrial carnivores because of common evolution as a hunter (Dart, 1953; Ardrey, 1961; Schaller & Lowther, 1969). Both approaches have usually provided lists of similarities between human behavior and the behavior of a particular species. But lists of similarities between species, however long, are only a partial answer. Individual species behavior can vary drastically as a result of divergence and convergence. Primate behavior is behavior which characterizes the order primates; carnivore behavior is behavior which characterizes the order carnivora. When comparisons are made at a specific level, ambiguities result. For example, Teleki (1973) compares the predatory behavior of chimpanzees with human hunting as evidence for deriving this behavior from the primates. However, the predatory behavior of chimpanzees is not characteristic of species in the order primates, but is in fact deviant. A more likely theory is that chimpanzees and hominids are both converging with terrestrial carnivores and have acquired their similar behavior through convergence on a common niche. The central thesis of this study is that there is incontrovertible evidence of the convergence of human behavior with carnivore behavior. 

2. Methodology 

The approach outlined above uses the following procedures. A hypothesis is proposed as to which of two paired behaviors is characteristic of carnivores. Each hypothesis is posed in such a manner that one of two variables is presumed to be causal; i.e. one variable is presumed to be independent, one dependent. The independent variable used throughout this study is taxonomic inclusion within the order carnivora and order primates. Since the dependent behavior variable is also a dichotomy, usually A and A′, the relationship of the two variables is expressed in a 2 x 2 contingency table. A hypothesis is considered confirmed if more carnivore species exhibit A′ than A, more primate species exhibit A than A′, and the probability that this relationship could have occurred by chance alone is less than 0.05. Because more primate and carnivore species are known biologically than behaviorally, it is not possible to select a random sample of species within these orders for study. Instead, a representative sample of primate and carnivore species was selected. Table 1 gives these species with their references. There is a noticeable bias in the sample toward higher species in the orders. This is because the study deals with convergent and parallel behaviors and these are more likely to be associated with advanced species than primitive ones. This bias limits the universe to which the findings may be legitimately generalized, which becomes the primate families Cebidae, Cercopithecidae, Hylobatidae, and Pongidae, and the carnivore families Canidae, Hyaenidae, and Felidae. The family Colobidae is probably not adequately represented by its one species, Colobus guereza. The variables selected were ones that were mentioned in the literature dealing with naturalistic behavior of carnivores and primates, were available in the field studies of the sample species, and could be formulated in a manner suitable for dichotomous classification.
Since the statistics used in this study operate most efficiently when variables are dichotomized close to the median, a characteristic was also required to be present in at least 20% of the species in a 2 x 2 table.
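The testing procedure just described can be sketched numerically. The following Python snippet is my own illustration, not part of the paper: it computes the phi coefficient for a 2 x 2 contingency table and the corresponding chi-square statistic via the identity χ² = Nφ², which is the form of test reported throughout the tables below. The cell counts are invented for the example.

```python
import math

def phi_and_chi2(a, b, c, d):
    """phi and chi-square for a 2 x 2 contingency table with cells
    a, b (row 1: e.g. carnivores) and c, d (row 2: e.g. primates),
    columns = behavior present / absent."""
    n = a + b + c + d
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    phi = (a * d - b * c) / denom
    chi2 = n * phi ** 2  # for a 2 x 2 table, chi-square = N * phi**2
    return phi, chi2

# Illustrative counts only: a perfect split of 5 carnivores vs 5 primates.
print(phi_and_chi2(5, 0, 0, 5))  # → (1.0, 10.0)
```

A perfect association gives φ = 1; the reported values of 0.58–0.92 indicate strong but imperfect splits across the orders.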

3. Carnivores and Primates at the First Level of Abstraction 


Meat is a high calorie food relative to vegetal matter, requiring less bulk consumption. Meat eaters often share food among themselves; herbivores do not, since the calorie content is so low that the effort is scarcely justified. Even in carnivores which exist as solitary adults, except for brief periods of mating, food is shared between a mother and her young. Moreover, a carnivorous diet requires a highly developed weapon system to hold and kill prey. A carnivore must capture his food by careful stalk or long chase; the food of a herbivore cannot escape. Sharing of food acquired only with difficulty could be instrumental in maintaining sick or immature animals in carnivora, much less so in primates. Given the practice needed for successful capture of prey, it is doubtful that the young of predators could survive if they were not allowed to feed from the kills of adults well after being weaned. With the exception of the chimpanzee and Savannah baboon, which on occasion kill and eat smaller mammals, the primates are noncarnivorous, and even in the chimpanzee and baboons meat eating represents only an infrequent supplement to a basic vegetarian diet (Teleki, 1973). These observations can be phrased in hypothesis form. Meat both has a higher calorie value than vegetal matter and requires more skill to obtain; therefore it would be adaptive for meat eaters to share food within their social group, whereas primates, being vegetarian, would find that the abundant supply of vegetal matter makes sharing unnecessary and the bulk of comparable vegetal calories makes it very difficult. Table 2 shows that food sharing is significantly positively correlated with carnivores (φ = 0.92, χ² = 17.72, P < 0.001). The only primate to share food is the chimpanzee, and it shares food only when it is meat (Teleki, 1973). Carnivores also diverge from primates in food storing.
Leopards frequently suspend their kills in trees to keep them from hyenas and return for them later; hyenas submerge kills in water; lions drag kills into thickets; and foxes bury them. There are no comparable instances of caching food among primates. A second hypothesis on primate and carnivore behavior is thus offered in food storing. Since meat is difficult to procure and relatively compact, carnivores would evolve methods of storing it; however, for primates to store bulky vegetal matter would be difficult and of doubtful adaptive value given the easy access. Table 3 shows food storing is positively correlated with carnivores (φ = 0.85, χ² = 14.52, P < 0.001). The two exceptions, cheetah and wild dog, share the highest hunting success rate. Among primates, cannibalism is nonexistent. However, it does occur among carnivores, although the factors which trigger it remain unknown and its occurrence is less frequent than pure opportunism predicts. Nevertheless, it can be hypothesized that cannibalism is associated with carnivorism.

Table 4 shows cannibalism is positively correlated with carnivorism (φ = 0.77, χ² = 11.81, P < 0.001).
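Because χ² = Nφ² for a 2 x 2 table, the paired statistics reported for Tables 2–4 imply how many species were scored in each comparison. The back-of-envelope check below is mine, not the paper's; it recovers a consistent sample of roughly 20–21 species per table, matching the representative sample described in the methodology:

```python
# Reported (phi, chi2) pairs from Tables 2-4; implied N = chi2 / phi**2.
reported = {
    "food sharing (Table 2)": (0.92, 17.72),
    "food storing (Table 3)": (0.85, 14.52),
    "cannibalism (Table 4)": (0.77, 11.81),
}
for name, (phi, chi2) in reported.items():
    print(f"{name}: implied N = {chi2 / phi ** 2:.1f}")
```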

Interspecies aggression 

During the Pleistocene, preagricultural man hunted a huge number of large mammals into extinction. Martin (1967), calling the phenomenon “Pleistocene Overkill,” estimated the loss at 200 genera. Some evidence strongly suggests that other predators are capable of hunting prey to extinction. Kruuk (1972), using the term “surplus killing,” described how 19 hyenas took advantage of a dark night with heavy rain to kill or badly injure 109 Thomson’s gazelle, eating only 13 of the 59 dead ones. There was no evidence of selection for healthy or mature animals. In commenting on a fox which had killed 239 gulls during a night storm, Tinbergen & Ennion (1967) stated that “a fox must have an ingrained habit to kill on sight because, clearly, in lean times it cannot afford to lose the chance of a meal by hesitating even for a split second.” The following hypothesis is suggested by these observations. Surplus killing is a characteristic of carnivores.

Table 5 shows surplus killing is positively correlated with carnivores (φ = 0.77, χ² = 11.81, P < 0.001). Among primates there is little evidence, exclusive of natural enemies, of interspecific aggression. Primates are tolerant of other species (nonpredators) and even derive benefits from coexistence by using other species’ communication signals to warn them of approaching predators they have themselves not yet detected (Sarel & DeVore, 1965). Among carnivores there is, in addition to predator-prey aggression, a general animosity toward other carnivores. Lions make unprovoked attacks on leopards and cheetahs and kill them (Schaller, 1972). Wild dogs have attacked lions and tigers and eaten them (Schaller, 1968, 1972). Kruuk (1972) reports that lions account for 55% of hyena mortality. Relations are similar in the New World. Schaller & Lowther (1969) believe this intracarnivore intolerance is the result of competition for the same food resource, particularly the tendency of predators to usurp one another’s kills. The following hypothesis is thus offered. Carnivores manifest interspecific aggression not related to predator-prey relations; primates manifest no aggression other than predator-prey.

Table 6 shows that nonpredator-prey mortal interspecies aggression is positively correlated with carnivores (φ = 0.86, χ² = 14.86, P < 0.001). The chimpanzee and Savannah baboon will prey on other primates, but even this hostility has not caused a complete intolerance. Teleki (1973) reports that even baboon troops that have experienced losses to predatory chimpanzees commonly allow these same chimpanzees to pass among them. Therefore, there is some doubt that the above hypothesis has any exceptions at all. The traditional enmity between cat and dog is as apparent in the largest carnivores as it is between tabby cat and village cur.

Group defense 

In baboons and macaques the males have evolved large canines to defend the troop against predators, while the canines of the females are by comparison much more modest and are not typically used in troop defense. Male-dominant defense occurs in other primates as well, but is scarce in carnivores. Carnivores need to maximize the number of food providers, and for that function a male is as useful as a female, the lion being a notable exception. Nonhuman primates do not face a scarce food supply, and their social system often contains harems, similar to those found in ungulate species which are commonly preyed upon. The following hypothesis is thus suggested. It is adaptive for nonhuman primates to raise the maximum number of young, for which the role of the female is more active than that of the male; it is adaptive for carnivores to maximize the number of food providers, for which there is no sexual difference. Therefore, males are more expendable for group defense in primates, but not in carnivores.

Table 7 shows that males are used more for group defense in primates than in carnivores (φ = 0.58, χ² = 6.43, P < 0.025). The male lion assumes the primary burden for pride defense because the lioness assumes the primary burden for pride economy (Schaller, 1972). Lions are, therefore, a dubious exception.

Feeding of young

The young of solitary carnivores are entirely dependent on their mother. The young of social carnivores and primates get some added care from other adults, except in feeding. Carnivore young are vulnerable to the feast-or-famine economy of a hunter; primate young are not. Consequently, it can be hypothesized that carnivores would minimize the severity of lean seasons by dividing the feeding of young among more than one adult whenever possible, whereas primate young would be dependent wholly on their mother. Since multiple-adult feeding of young depends logically rather than empirically on the presence of two or more adults, only social species will be considered.

Table 8 shows that among social species multiple-adult feeding of young is positively correlated with carnivores (φ = 0.86, P < 0.001, using Fisher’s exact test). For wolves, wild dogs, and jackals, multiple-adult feeding includes both sexes, who will carry food in pieces or regurgitate it to den young. Male lions will not feed pride cubs, but lionesses feed and suckle cubs communally. For hyenas, the cannibalistic tendencies of stranger hyenas are probably responsible for leaving the feeding of young wholly to their mother and, according to Kruuk (1972), for making the females bigger and stronger than the males.
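Fisher's exact test, used for Table 8 where the chi-square test would be unreliable, can be computed directly from the hypergeometric distribution. The sketch below is my own; the paper does not reproduce its cell counts, so the example numbers are illustrative only:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for a 2 x 2 table [[a, b], [c, d]]:
    probability, with margins held fixed, of a table at least as extreme
    as the observed one (cell a as large or larger)."""
    row1, col1 = a + b, a + c
    n = a + b + c + d
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        # Hypergeometric probability of exactly k "successes" in row 1.
        p += comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)
    return p

# Illustrative: 5 social carnivores all show multiple-adult feeding,
# 5 social primates show none.
print(fisher_exact_one_sided(5, 0, 0, 5))  # 1/252 ≈ 0.004
```

For a perfect 5-vs-5 split the one-sided probability is 1/252, comfortably below the paper's 0.001 threshold.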

4. The Carnivore Sophistication Scale and The Evolution of Hominid Behavior 

Seven variables dealing with carnivore and primate behavior have been described in the previous section. For each variable a theoretical argument was derived, based when possible on existing literature on carnivore and primate behavior. The hypothesis resulting from theoretical argument was tested, using taxonomic affinity within the order carnivora and order primates as the independent variable and a behavior dichotomy as the dependent variable. The findings of this analysis are presented in Table 9. The similarity and dissimilarity of hominid behavior to carnivore and primate behavior can now be established by comparison matching. Table 9 shows that preagricultural man resembles the carnivores in food sharing, food storing, cannibalism, surplus killing, interspecies intolerance, and feeding of young. Only the male-dominant group defense of preagricultural man resembles the primates. 

A scale of carnivore sophistication was constructed from the characteristics listed in Table 9. For a behavior variable to be included in this scale it had to meet two criteria: first, the behavior had to be associated with the order carnivora; second, the behavior had to be solely dependent on the independent variable, i.e. a variable was not included if it was confounded with two or more independent variables. Thus, solitary carnivores were not coded for multiple-adult feeding of young because this variable could as likely depend on their social structure, which lacks two or more adults, as on their classification as carnivores.

The degree of carnivore sophistication of any species is computed by totaling the number of characteristics which are listed under carnivore and then dividing this number by the total number of behaviors which are reported for that species. In this analysis social species have seven variables and solitary species six. For the example of preagricultural man this statistic is 6/7, or 0.86. Since each behavior variable used to construct this scale is associated with the order carnivora, it is expected that high scores should belong to carnivores and low scores to primates. This expectation can be formulated as a testable hypothesis. The higher the carnivore sophistication score, the more likely the animal is classified as a carnivore. Table 10 shows that this prediction is strongly confirmed (rpb = 0.94, t = 13.38, P < 0.0005). The validity of this scale as a continuous measure of carnivore characteristics can also be verified by using it as the independent variable to test a hypothesis about carnivore and primate behavior, e.g. one for which the χ² cannot be used owing to low marginal frequency.
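The scoring rule can be made concrete. The snippet below is my own illustration (the variable names and codings are mine, following the verbal summary of Table 9); it recovers the reported score of 6/7 ≈ 0.86 for preagricultural man:

```python
def sophistication_score(codings):
    """Fraction of reported behaviors that take the carnivore-typical form."""
    return sum(codings.values()) / len(codings)

# Hypothetical coding for preagricultural man (a social species: 7 variables):
# carnivore-like on every variable except group defense, which is primate-like.
preagricultural_man = {
    "food sharing": True,
    "food storing": True,
    "cannibalism": True,
    "surplus killing": True,
    "interspecies intolerance": True,
    "multiple-adult feeding of young": True,
    "male-dominant group defense": False,  # resembles the primates here
}
print(round(sophistication_score(preagricultural_man), 2))  # → 0.86
```

A solitary species would be scored over six variables instead, since multiple-adult feeding of young is dropped from its denominator.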

The young of carnivores and primates are born largely helpless and would die without care from their mother. The smaller home ranges and more restricted mobility of primates allow the adults to keep the young with them at all times. The larger home ranges and greater mobility of solitary carnivores often result in leaving the young unprotected while the mother hunts, jeopardizing their safety to no mean extent. Some social carnivores have evolved mechanisms to cope with this dilemma. Schaller (1972) argued that group existence in the lion made possible a division of labor not available to solitary cats. A lioness often serves as a guard for small cubs while others hunt. One pride lion may protect a carcass while others fetch cubs. Kuhme (1965) reports that some wild dogs guard the den while others hunt. The following hypothesis is offered. Lesser mobility allows primates to guard their young at all times; greater mobility means social carnivores can guard their young at all times only by performing two tasks concurrently, calling for a division of labor. Three species manifest such a division of labor: lions 0.86, wild dogs 0.57, and preagricultural man 0.86. The mean carnivore sophistication score of this group is significantly greater than the mean carnivore sophistication score of all social species without a division of labor (rpb = 0.52, t = 2.59, P < 0.01).

The male-dominant group defense has been retained in hominid evolution despite a general divergence from the primates. The reason might be attributed to the characteristics that are associated with it. In preagricultural man, the male-dominant defense is present with the division of labor in which men serve as hunters and women as gatherers. The division of labor is associated with carnivore sophistication; therefore the male-dominant defense may have been retained because it is associated with the adaptive division of labor.
Since behavior itself does not fossilize, it is difficult to know how early in hominid evolution the behavior exhibited by preagricultural man was attained. However, Dart (1959) reports cannibalism and interspecific intolerance (against contemporary carnivores) among fossil australopithecines at the Taung, Sterkfontein, and Makapansgat sites: “with early universal cannibalism . . . this common bloodlust differentiator, this predaceous habit . . . separates man dietetically from his anthropoidal relatives and allies him rather with the deadliest of Carnivora” (Dart, 1953).

I am grateful to Dr J. E. King and Dr C. A. Rogers for their critical reading of the manuscript.



A carnivore diet can provide all essential nutrients

Can a carnivore diet provide all essential nutrients?



Purpose of review: The aim of this study was to summarize current contributions affecting knowledge and predictions about the nutritional adequacy of plant-free diets, contextualized by historical accounts.

Recent findings: As demonstrated in recent experiments, nutrient interactions and metabolic effects of ketogenic diets can impact nutritional needs, sometimes resulting in nutrient-sparing effects. Other studies highlight conflicting hypotheses about the expected effect on metabolic acidosis, and therefore mineral status, of adding alkaline mineral-rich vegetables.

Summary: A carnivore diet is a newly popular, but as yet sparsely studied form of ketogenic diet in which plant foods are eliminated such that all, or almost all, nutrition derives from animal sourced foods. Ketogenic diets are already nutritionally controversial due to their near-complete absence of carbohydrate and high dietary fat content, but most ketogenic diet advocates emphasize the inclusion of plant foods. In this review, we discuss the implications of relying solely on animal sourced foods in terms of essential nutrient status.


Large brains evolved from animal fat and cholesterol in diet.

Animal Fat and Cholesterol May Have Helped Primitive Man Evolve a Large Brain


FRANK D. MANN*

In 1962, James V. Neel published a paper entitled "Diabetes Mellitus: A 'Thrifty' Genotype Rendered Detrimental by 'Progress'?" [1]; 20 years later, Neel revisited his theory, suggesting appropriate revisions because of significant additions to our knowledge of diabetes. During the intervening period, just as predicted by Neel's theory, a high frequency of diabetes was observed in populations emerging from the sparse diet of a primitive economy into the "progress" of a developed economy [2]. A prominent example of such a population, the Pima Indian tribe, has been and continues to be thoroughly studied [3]. Neel himself carried out a difficult field investigation of two South American tribes, showing that, in accordance with his theory, they were not diabetic while remaining in a primitive state [2]. An interest of many years' duration in Neel's original concept of a "thrifty" genotype led me to consider the possibility of an analogous case of a heritage now detrimental but originally advantageous. Overproduction of cholesterol, associated with consumption of animal fat but probably also with evolutionary changes in hepatic physiology, may have been helpful in meeting the increased need for cholesterol in the evolution of the large brain of modern humans. I shall present an analysis of the evidence for this hypothesis.

Leakey's Reasoning: A Rich Meat Diet Was Required for the Large Human Brain

Leakey has been led by his reading of the anthropological language of fossil teeth and skulls to an unequivocal judgment of the importance of a meat diet [4]. Leakey does not attempt to evaluate the individual nutritional components of the meat diet, but considers a nutritionally richer diet to be necessary for the maintenance of the metabolically expensive large brain.
The modern human brain surely is metabolically expensive, requiring 20 percent of the total resting oxygen consumption of the body, although it is only 2 percent of the body weight [5]. Chamberlain also considers the meat diet to be important and focuses on the essential fatty acids. While noting that these substances are present in vegetable sources, he suggests that the ample supply of the essential as well as other unsaturated fatty acids in meat could have conferred a survival advantage during the three-fold increase in the size of the human brain within 3 million years of evolution [6]. To obtain meat as the principal constituent of diet, primitive man would have had to compete with animals such as cats, which had already evolved formidable built-in equipment for killing: fangs and claws. Rapid evolution of a large brain was needed in this deadly competition. A diet rich in animal fat would have markedly increased the intake and hepatic synthesis of cholesterol. Would this have helped the development of the large brain? Evidence regarding the functions of cholesterol and the mechanisms of its supply and distribution is pertinent to this hypothesis.

Cholesterol and Membranes

Cholesterol has long been known to be an essential constituent of cell membranes and has been reported to comprise, on a molecular basis, half of the lipid content of the external cell membrane [7]. Myelinated nerve fibers make up a large part of the total mass of the brain. Their highly specialized covering membranes contain a large quantity of various lipids. Thus, early biochemists found brain to be a convenient source material from which to prepare cholesterol. Fielding has described the complex mechanism of transporters and receptors which has evolved to assure that all membranes have the appropriate individual content of free cholesterol [8].
Tight regulation of the very different amounts of free cholesterol in different membranes is necessary for various vital functions performed by structures located in the membranes. These include the ion pumps necessary for the life of every cell, and adenylate cyclase, which is required for the cell to respond to transmitter substances such as norepinephrine. Schroeder and colleagues have recently reviewed the evidence that highly asymmetric distributions of free cholesterol, within the membranes themselves...


Human brains use much more energy than those of other primates, supported by energy-rich diets high in animal fat

Metabolic correlates of hominid brain evolution

While the brain of an adult primate consumes <10% of the total resting metabolic rate, this amounts to 20-25% in the case of anatomically modern humans [Leonard et al. 2003].



Large brain sizes in humans have important metabolic consequences as humans expend a relatively larger proportion of their resting energy budget on brain metabolism than other primates or non-primate mammals. The high costs of large human brains are supported, in part, by diets that are relatively rich in energy and other nutrients. Among living primates, the relative proportion of metabolic energy allocated to the brain is positively correlated with dietary quality. Humans fall at the positive end of this relationship, having both a very high quality diet and a large brain size. Greater encephalization also appears to have consequences for aspects of body composition. Comparative primate data indicate that humans are ‘under-muscled’, having relatively lower levels of skeletal muscle than other primate species of similar size. Conversely, levels of body fatness are relatively high in humans, particularly in infancy. These greater levels of body fatness and reduced levels of muscle mass allow human infants to accommodate the growth of their large brains in two important ways: (1) by having a ready supply of stored energy to ‘feed the brain’, when intake is limited and (2) by reducing the total energy costs of the rest of the body. Paleontological evidence indicates that the rapid brain evolution observed with the emergence of Homo erectus at approximately 1.8 million years ago was likely associated with important changes in diet and body composition.
