Facultative Carnivore Reasons
A Cross-species Analysis of Carnivore, Primate, and Hominid Behaviour
The traditional assumption that the origin of human behavior is found within the higher primates rather than the social carnivores is based on a failure to adequately define primate and carnivore behavior. Phyletic classification takes no account of convergence and divergence; the behavior of a species is not necessarily characteristic of its order. Of 8 behavior variables that distinguish the order primates from the order carnivora, preagricultural man resembles the carnivores on 7 items: food sharing, food storing, cannibalism, surplus killing, interspecies intolerance, feeding of young, and division of labor; he resembles the order primates only in group defense. The original form of much human behavior is found within the carnivores.
In attempting to deduce the origin and evolution of human behavior, anthropologists have frequently looked to the nonhuman primates for clues. Some evidence has also been offered for the relevance of terrestrial carnivores because of common evolution as a hunter (Dart, 1953; Ardrey, 1961; Schaller & Lowther, 1969). Both approaches have usually provided lists of similarities between human behavior and the behavior of a particular species. But lists of similarities between species, however long, are only a partial answer. Individual species behavior can vary drastically as a result of divergence and convergence. Primate behavior is behavior which characterizes the order primates; carnivore behavior is behavior which characterizes the order carnivora. When comparisons are made at a specific level ambiguities result. For example, Teleki (1973) compares the predatory behavior of chimpanzees with human hunting as evidence for deriving this behavior from the primates. However, the predatory behavior of chimpanzees is not characteristic of species in the order primates, but is in fact deviant. A more likely theory is that chimpanzees and hominids are both converging with terrestrial carnivores and have acquired their similar behavior through convergence on a common niche. The central thesis of this study is that there is incontrovertible evidence of the convergence of human behavior with carnivore behavior.
The approach outlined above uses the following procedures. A hypothesis is proposed as to which of two paired behaviors is characteristic of carnivores. Each hypothesis is posed in such a manner that one of two variables is presumed to be causal; i.e. one variable is presumed to be independent, one dependent. The independent variable used throughout this study is taxonomic inclusion within the order carnivora and order primates. Since the dependent behavior variable is also a dichotomy, usually A and A', the relationship of the two variables is expressed in a 2 x 2 contingency table. A hypothesis is considered confirmed if more carnivore species exhibit A' than A and more primate species exhibit A than A' and the probability that this relationship could have occurred by chance alone is less than 0.05. Because more primate and carnivore species are known biologically than behaviorally, it is not possible to select a random sample of species within these orders for study. Instead, a representative sample of primate and carnivore species was selected. Table 1 gives these species with their references. There is a noticeable bias in the sample toward higher species in the orders. This is because the study deals with convergent and parallel behaviors and these are more likely to be associated with advanced species than primitive ones. This bias limits the universe to which the findings may be legitimately generalized, which becomes the primate families Cebidae, Cercopithecidae, Hylobatidae, and Pongidae, and the carnivore families Canidae, Hyaenidae, and Felidae. The family Colobidae is probably not adequately represented by its one species, Colobus guereza. The variables selected were ones that were mentioned in the literature dealing with naturalistic behavior of carnivores and primates, were available in the field studies of the sample species, and could be formulated in a manner suitable for dichotomous classification.
Since the statistics used in this study operate most efficiently when variables are dichotomized close to the median, a characteristic was also required to be present in at least 20% of the species in a 2 x 2 table.
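The confirmation criterion described above rests on the phi coefficient and the chi-squared statistic of a 2 x 2 species-by-behavior table. A minimal sketch of that computation follows; the cell counts are made up for illustration, since the paper's actual tables are not reproduced here.

```python
from math import sqrt

def phi_chi2(a, b, c, d):
    """Phi coefficient and chi-squared statistic (1 d.f.) for a 2x2
    contingency table laid out as:
                     trait A'   trait A
        carnivores      a          b
        primates        c          d
    For a 2x2 table, chi2 = N * phi**2."""
    n = a + b + c + d
    phi = (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return phi, n * phi ** 2

# Hypothetical counts (NOT the paper's Table 2): 10 of 11 carnivore
# species share food, while only 1 of 10 primate species does.
phi, chi2 = phi_chi2(10, 1, 1, 9)
```

With these illustrative counts the table yields a strong positive association (phi of about 0.81), which would be compared against the 0.05 chi-squared threshold exactly as the text describes.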
3. Carnivores and Primates at the First Level of Abstraction
Meat is a high calorie food relative to vegetal matter, requiring less bulk consumption. Meat eaters often share food among themselves; herbivores do not, since the calorie content is so low the effort is scarcely justified. Even in carnivores which exist as solitary adults, except for brief periods of mating, food is shared between a mother and her young. Moreover, a carnivorous diet requires a highly developed weapon system to hold and kill prey. A carnivore must capture his food by careful stalk or long chase; the food of a herbivore cannot escape. Sharing of food acquired only with difficulty could be instrumental in maintaining sick or immature animals in carnivora, much less so in primates. Given the practice needed for successful capture of prey, it is doubtful that the young of predators could survive if they were not allowed to feed from the kills of adults well after being weaned. With the exception of the chimpanzee and Savannah baboon, which on occasion kill and eat smaller mammals, the primates are noncarnivorous, and even in the chimpanzee and baboons meat eating represents only an infrequent supplement to a basic vegetarian diet (Teleki, 1973). These observations can be phrased into hypothesis form. Meat both has a higher calorie value than vegetal matter and requires more skill to obtain; therefore it would be adaptive for meat eaters to share food within their social group, whereas primates, being vegetarian, would find the abundant supply of vegetal matter makes sharing unnecessary and the bulk of comparable vegetal calories makes it very difficult. Table 2 shows that food sharing is significantly positively correlated with carnivores (φ = 0.92, χ² = 17.72, P < 0.001). The only primate to share food is the chimpanzee, and it shares food only when it is meat (Teleki, 1973). Carnivores also diverge from primates in food storing.
Leopards frequently suspend their kills in trees to keep them from hyenas and return for them later; hyenas submerge kills in water; lions drag kills into thickets; and foxes bury them. There are no comparable instances of caching food among primates. A second hypothesis on primate and carnivore behavior is thus offered in food storing. Since meat is difficult to procure and relatively compact, carnivores would evolve methods of storing it; however, for primates to store bulky vegetal matter would be difficult and of doubtful adaptive value given the easy access. Table 3 shows food storing is positively correlated with carnivores (φ = 0.85, χ² = 14.52, P < 0.001). The two exceptions, the cheetah and the wild dog, have the highest hunting success rates. Among primates, cannibalism is nonexistent. However, it does occur among carnivores, although the factors which trigger it remain unknown and its occurrence is less frequent than pure opportunism would predict. Nevertheless, it can be hypothesized that cannibalism is associated with carnivorism.
Table 4 shows cannibalism is positively correlated with carnivorism (φ = 0.77, χ² = 11.81, P < 0.001).
During the Pleistocene, preagricultural man hunted a huge number of large mammals into extinction. Martin (1967), calling the phenomenon “Pleistocene Overkill,” estimated the loss at 200 genera. Some evidence strongly suggests that other predators are capable of hunting prey to extinction. Kruuk (1972), using the term “surplus killing,” described how 19 hyenas took advantage of a dark night with heavy rain to kill or badly injure 109 Thomson’s gazelle, eating only 13 of 59 dead ones. There was no evidence of selection for healthy or mature animals. In commenting on a fox which had killed 239 gulls during a night storm, Tinbergen & Ennion (1967) stated that “a fox must have an ingrained habit to kill on sight because, clearly, in lean times it cannot afford to lose the chance of a meal by hesitating even for a split second.” The following hypothesis is suggested by these observations. Surplus killing is a characteristic of carnivores.
Table 5 shows surplus killing is positively correlated with carnivores (φ = 0.77, χ² = 11.81, P < 0.001). Among primates there is little evidence, exclusive of natural enemies, of interspecific aggression. Primates are tolerant of other species (nonpredators) and even derive benefits from coexistence by using other species’ communication signals to warn them of approaching predators they have themselves not yet detected (Sarel & DeVore, 1965). Among carnivores there is, in addition to predator-prey aggression, a general animosity toward other carnivores. Lions make unprovoked attacks on leopards and cheetahs and kill them (Schaller, 1972). Wild dogs have attacked lions and tigers and eaten them (Schaller, 1968, 1972). Kruuk (1972) reports that lions account for 55% of hyena mortality. Relations are similar in the New World. Schaller & Lowther (1969) believe this intracarnivore intolerance is the result of competition for the same food resource, particularly the tendency of predators to usurp one another’s kills. The following hypothesis is thus offered. Carnivores manifest interspecific aggression not related to predator-prey relations; primates manifest no aggression other than predator-prey.
Table 6 shows that nonpredator-prey mortal interspecies aggression is positively correlated with carnivores (φ = 0.86, χ² = 14.86, P < 0.001). The chimpanzee and Savannah baboon will prey on other primates, but even this hostility has not caused a complete intolerance. Teleki (1973) reports that even baboon troops that have experienced losses to predatory chimpanzees commonly allow these same chimpanzees to pass among them. Therefore, there is some doubt that the above hypothesis has any exceptions at all. The traditional enmity between cat and dog is as apparent in the largest carnivores as it is between tabby cat and village cur.
In baboons and macaques the males have evolved large canines to defend the troop against predators, while the canines of the females are by comparison much more modest and are not typically used in troop defense. Male-dominant defense occurs in other primates as well, but is scarce in carnivores. Carnivores need to maximize the number of food providers, and for that function a male is as useful as a female, the lion being a notable exception. Nonhuman primates do not face a scarce food supply, and their social system often contains harems, similar to those found in ungulate species which are commonly preyed upon. The following hypothesis is thus suggested. It is adaptive for nonhuman primates to raise the maximum number of young, for which the role of the female is more active than that of the male; it is adaptive for carnivores to maximize the number of food providers, and for that there is no sexual difference. Therefore, males are more expendable for group defense in primates, but not in carnivores.
Table 7 shows that males are used more for group defense in primates than in carnivores (φ = 0.58, χ² = 6.43, P < 0.025). The male lion assumes the primary burden for pride defense because the lioness assumes the primary burden for pride economy (Schaller, 1972). Lions are, therefore, a dubious exception.
Feeding of young
The young of solitary carnivores are entirely dependent on their mother. The young of social carnivores and primates get some added care from other adults, except in feeding. Carnivore young are vulnerable to the feast or famine economy of a hunter; primate young are not. Consequently, it can be hypothesized that carnivores would minimize the severity of lean seasons by dividing the feeding of young among more than one adult whenever possible, whereas primate young would be dependent wholly on their mother. Since multiple-adult feeding of young depends logically rather than empirically on the presence of two or more adults, only social species will be considered.
Table 8 shows that among social species multiple-adult feeding of young is positively correlated with carnivores (φ = 0.86, P < 0.001, using Fisher’s exact test). For wolves, wild dogs, and jackals multiple-adult feeding includes both sexes, who will carry food in pieces or regurgitate it to den young. Male lions will not feed pride cubs, but lionesses feed and suckle cubs communally. For hyenas, cannibalistic tendencies of stranger hyenas are probably responsible for delegating feeding young wholly to their mother and, according to Kruuk (1972), making the females bigger and stronger than the males.
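Fisher's exact test is used here because the chi-squared statistic is unreliable when marginal frequencies are low. It sums hypergeometric probabilities over all tables at least as extreme as the one observed, with the margins held fixed. A one-sided sketch, using illustrative counts rather than the paper's Table 8:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table
        [[a, b],
         [c, d]]
    Returns P(table at least as extreme as observed, in the
    direction of larger a), from the hypergeometric distribution
    with both margins fixed."""
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        if col1 - k > row2:  # infeasible table for these margins
            continue
        p += comb(row1, k) * comb(row2, col1 - k) / comb(n, col1)
    return p

# Hypothetical: all 5 social carnivores show multiple-adult feeding,
# none of 4 social primates do.
p = fisher_exact_one_sided(5, 0, 0, 4)  # about 0.0079
```

Even with only nine species, the most extreme possible table already falls below the 0.05 criterion, which is why the exact test is preferred at these sample sizes.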
4. The Carnivore Sophistication Scale and The Evolution of Hominid Behavior
Seven variables dealing with carnivore and primate behavior have been described in the previous section. For each variable a theoretical argument was derived, based when possible on existing literature on carnivore and primate behavior. The hypothesis resulting from theoretical argument was tested, using taxonomic affinity within the order carnivora and order primates as the independent variable and a behavior dichotomy as the dependent variable. The findings of this analysis are presented in Table 9. The similarity and dissimilarity of hominid behavior to carnivore and primate behavior can now be established by comparison matching. Table 9 shows that preagricultural man resembles the carnivores in food sharing, food storing, cannibalism, surplus killing, interspecies intolerance, and feeding of young. Only the male-dominant group defense of preagricultural man resembles the primates.
A scale of carnivore sophistication was constructed from the characteristics listed in Table 9. For a behavior variable to be included in this scale it had to meet two criteria. First, the behavior had to be associated with the order carnivora. Second, the behavior had to be solely dependent on the independent variable; i.e. a variable was not included if it was confounded with two or more independent variables. Thus, solitary carnivores were not coded for multiple-adult feeding of young because this variable could as likely depend on their social structure, which lacks two or more adults, as on their classification as carnivores.
The degree of carnivore sophistication of any species is computed by totaling the number of characteristics which are listed under carnivore, then dividing this number by the total number of behaviors which are reported for that species. In this analysis social species have seven variables and solitary species six. For the example of preagricultural man this statistic is 6/7, or 0.86. Since each behavior variable used to construct this scale is associated with order carnivora, it is expected that high scores should be carnivores and low scores primates. This expectation can be formulated as a testable hypothesis. The higher the carnivore sophistication score, the more likely the animal is classified as a carnivore. Table 10 shows that this prediction is strongly confirmed (r_pb = 0.94, t = 13.38, P < 0.0005). The validity of this scale as a continuous measure of carnivore characteristics can also be verified by using it as the independent variable to test a hypothesis about carnivore and primate behavior, e.g. one for which the χ² cannot be used owing to low marginal frequency.
The young of carnivores and primates are born largely helpless and would die without care from their mother. The smaller home ranges and more restricted mobility of primates allow the adults to keep the young with them at all times. The larger home ranges and greater mobility of solitary carnivores often result in leaving the young unprotected while the mother hunts, jeopardizing their safety to no mean extent. Some social carnivores have evolved mechanisms to cope with this dilemma. Schaller (1972) argued that group existence in the lion made possible a division of labor not available to solitary cats. A lioness often serves as a guard for small cubs while others hunt. One pride lion may protect a carcass while others fetch cubs. Kühme (1965) reports that some wild dogs guard the den while others hunt. The following hypothesis is offered. Lesser mobility allows primates to guard their young at all times; greater mobility means social carnivores can guard their young at all times only by performing two tasks concurrently, calling for a division of labor. Three species manifest such a division of labor: lions 0.86, wild dogs 0.57, and preagricultural man 0.86. The mean carnivore sophistication score of this group is significantly greater than the mean carnivore sophistication score of all social species without a division of labor (r_pb = 0.52, t = 2.59, P < 0.01). The male-dominant group defense has been retained in hominid evolution despite a general divergence from the primates. The reason might be attributed to the characteristics that are associated with it. In preagricultural man, the male-dominant defense is present with the division of labor in which men serve as hunters and women as gatherers. The division of labor is associated with carnivore sophistication; therefore the male-dominant defense may have been retained because it is associated with the adaptive division of labor.
Since behavior itself does not fossilize, it is difficult to know how early in hominid evolution the behavior exhibited by preagricultural man was attained. However, Dart (1959) reports cannibalism and interspecific intolerance (against contemporary carnivores) among fossil australopithecines at the Taung, Sterkfontein, and Makapansgat sites: “with early universal cannibalism . . . this common bloodlust differentiator, this predacious habit . . . separates man dietetically from his anthropoidal relatives and allies him rather with the deadliest of Carnivora” (Dart, 1953).
I am grateful to Dr J. E. King and Dr C. A. Rogers for their critical reading of the manuscript.
Can a carnivore diet provide all essential nutrients?
Purpose of review: The aim of this study was to summarize current contributions affecting knowledge and predictions about the nutritional adequacy of plant-free diets, contextualized by historical accounts.
Recent findings: As demonstrated in recent experiments, nutrient interactions and metabolic effects of ketogenic diets can impact nutritional needs, sometimes resulting in nutrient-sparing effects. Other studies highlight conflicting hypotheses about the expected effect on metabolic acidosis, and therefore mineral status, of adding alkaline mineral-rich vegetables.
Summary: A carnivore diet is a newly popular, but as yet sparsely studied form of ketogenic diet in which plant foods are eliminated such that all, or almost all, nutrition derives from animal sourced foods. Ketogenic diets are already nutritionally controversial due to their near-complete absence of carbohydrate and high dietary fat content, but most ketogenic diet advocates emphasize the inclusion of plant foods. In this review, we discuss the implications of relying solely on animal sourced foods in terms of essential nutrient status.
Animal Fat and Cholesterol May Have Helped Primitive Man Evolve a Large Brain
Frank D. Mann
In 1962, James V. Neel published a paper entitled "Diabetes Mellitus: A 'Thrifty' Genotype Rendered Detrimental by 'Progress'?" [1]; 20 years later, Neel revisited his theory, suggesting appropriate revisions because of significant additions to our knowledge of diabetes. During the intervening period, just as predicted by Neel's theory, a high frequency of diabetes was observed in populations emerging from the sparse diet of a primitive economy into the "progress" of a developed economy. A prominent example of such a population, the Pima Indian tribe, has been and continues to be thoroughly studied. Neel himself carried out a difficult field investigation of two South American tribes, showing that, in accordance with his theory, they were not diabetic while remaining in a primitive state. An interest of many years' duration in Neel's original concept of a "thrifty" genotype led me to consider the possibility of an analogous case of a heritage now detrimental but originally advantageous. Overproduction of cholesterol, associated with consumption of animal fat but probably also with evolutionary changes in hepatic physiology, may have been helpful in meeting increased need for cholesterol in the evolution of the large brain of modern humans. I shall present an analysis of evidence for this hypothesis.
Leakey's Reasoning: A Rich Meat Diet was Required for the Large Human Brain
Leakey has been led by his reading of the anthropological language of fossil teeth and skulls to an unequivocal judgment of the importance of a meat diet. Leakey does not attempt to evaluate the individual nutritional components of the meat diet, but considers a nutritionally richer diet to be necessary for the maintenance of the metabolically expensive large brain.
The modern human brain surely is metabolically expensive, requiring 20 percent of the total resting oxygen consumption of the body, although it is only 2 percent of the body weight. Chamberlain also considers the meat diet to be important and focuses on the essential fatty acids. While noting that these substances are present in vegetable sources, he suggests that the ample supply of the essential as well as other unsaturated fatty acids in meat could have conferred a survival advantage during the three-fold increase in the size of the human brain within 3 million years of evolution. To obtain meat as the principal constituent of diet, primitive man would have had to compete with animals such as cats, which had already evolved formidable built-in equipment for killing: fangs and claws. Rapid evolution of a large brain was needed in this deadly competition. A diet rich in animal fat would have markedly increased the intake and hepatic synthesis of cholesterol. Would this have helped the development of the large brain? Evidence regarding the functions of cholesterol and the mechanisms of its supply and distribution is pertinent to this hypothesis.
Cholesterol and Membranes
Cholesterol has long been known to be an essential constituent of cell membranes and has been reported to comprise, on a molecular basis, half of the lipid content of the external cell membrane. Myelinated nerve fibers make up a large part of the total mass of the brain. Their highly specialized covering membranes contain a large quantity of various lipids. Thus, early biochemists found brain to be a convenient source material from which to prepare cholesterol. Fielding has described the complex mechanism of transporters and receptors which have evolved to assure that all membranes have the appropriate individual content of free cholesterol.
Tight regulation of the very different amounts of free cholesterol in different membranes is necessary for various vital functions performed by structures located in the membranes. These include the ion pumps necessary for the life of every cell, and adenylate cyclase, which is required for the cell to respond to transmitter substances such as norepinephrine. Schroeder and colleagues have recently reviewed the evidence that highly asymmetric distributions of free cholesterol, within the membranes themselves...
Human brains use much more energy because of rich energy diets high in animal fat
Metabolic correlates of hominid brain evolution
While the brain of an adult primate consumes <10% of the total resting metabolic rate, this amounts to 20-25% in the case of anatomically modern humans [Leonard et al. 2003].
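The figures quoted above imply, by simple arithmetic, how expensive brain tissue is per unit mass relative to the body average. A back-of-envelope sketch (the percentages are the ones quoted in the text, not new data):

```python
# The human brain is ~2% of body mass but accounts for ~20-25% of
# resting metabolic rate. Dividing the energy share by the mass share
# gives the per-kilogram cost of brain tissue relative to the
# body-average tissue.
brain_mass_fraction = 0.02
relative_cost = {
    energy_fraction: energy_fraction / brain_mass_fraction
    for energy_fraction in (0.20, 0.25)
}
# Per kilogram, brain tissue consumes roughly 10-12.5 times the
# body-average resting energy.
```

By the same arithmetic, a primate brain at under 10% of resting metabolic rate still runs several times the body-average cost per kilogram, so the human case is an extreme of a general primate pattern rather than a qualitative break.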
Large brain sizes in humans have important metabolic consequences as humans expend a relatively larger proportion of their resting energy budget on brain metabolism than other primates or non-primate mammals. The high costs of large human brains are supported, in part, by diets that are relatively rich in energy and other nutrients. Among living primates, the relative proportion of metabolic energy allocated to the brain is positively correlated with dietary quality. Humans fall at the positive end of this relationship, having both a very high quality diet and a large brain size. Greater encephalization also appears to have consequences for aspects of body composition. Comparative primate data indicate that humans are ‘under-muscled’, having relatively lower levels of skeletal muscle than other primate species of similar size. Conversely, levels of body fatness are relatively high in humans, particularly in infancy. These greater levels of body fatness and reduced levels of muscle mass allow human infants to accommodate the growth of their large brains in two important ways: (1) by having a ready supply of stored energy to ‘feed the brain’, when intake is limited and (2) by reducing the total energy costs of the rest of the body. Paleontological evidence indicates that the rapid brain evolution observed with the emergence of Homo erectus at approximately 1.8 million years ago was likely associated with important changes in diet and body composition.
Smaller molars and less time spent feeding than predicted by body size show that meat eating was causal in our evolution.
Phylogenetic rate shifts in feeding time during the evolution of Homo
Starting with Homo erectus, humans developed smaller molars and also began to spend a lot less time on feeding than would be predicted from body mass and phylogeny with other apes (only 5% instead of a predicted 48% of daily activity in Homo sapiens) [Organ et al. 2011].
Chris Organ, Charles L. Nunn, Zarin Machanda, and Richard W. Wrangham
Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.
Decreases in tooth and jawbone size, reduced chewing muscles, and weaker bite force indicate a shift to animal source foods
Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans
Furthermore, the shift from fibrous plants to including animal source foods (ASFs), together with the use of tools, paralleled a decrease in the size of teeth and jawbones, a reduction in chewing muscles, and weaker maximum bite force capabilities [Teaford & Ungar 2000; Zink & Lieberman 2016].
Mark F. Teaford and Peter S. Ungar
Over the past decade, discussions of the evolution of the earliest human ancestors have focused on the locomotion of the australopithecines. Recent discoveries in a broad range of disciplines have raised important questions about the influence of ecological factors in early human evolution. Here we trace the cranial and dental traits of the early australopithecines through time, to show that between 4.4 million and 2.3 million years ago, the dietary capabilities of the earliest hominids changed dramatically, leaving them well suited for life in a variety of habitats and able to cope with significant changes in resource availability associated with long-term and short-term climatic fluctuations.
Since the discovery of Australopithecus afarensis, many researchers have emphasized the importance of bipedality in scenarios of human origins (1, 2). Surprisingly, less attention has been focused on the role played by diet in the ecology and evolution of the early hominids. Recent work in a broad range of disciplines, such as paleoenvironmental studies (3, 4), behavioral ecology (5), primatology (6), and isotope analyses (7), has rekindled interest in early hominid diets. Moreover, important new fossils from the early Pliocene raise major questions about the role of dietary changes in the origins and early evolution of the Hominidae (8–10). In short, we need to focus not just on how the earliest hominids moved between food patches, but also on what they ate when they got there.
This paper presents a review of the fossil evidence for the diets of the Pliocene hominids Ardipithecus ramidus, Australopithecus anamensis, Australopithecus afarensis, and Australopithecus africanus. These hominids offer evidence for the first half of human evolution, from our split with prehistoric apes to the earliest members of our own genus, Homo. The taxa considered are viewed as a roughly linear sequence from Ardipithecus to A. africanus, spanning the time from 4.4 million to 2.5 million years ago. As such, they give us a unique opportunity to examine changes in dietary adaptations of our ancestors over nearly 2 million years. We also trace what has been inferred concerning the diets of the Miocene hominoids to put changes in Pliocene hominid diets into a broader temporal perspective. From such a perspective, it becomes clear that the dietary capabilities of the early hominids changed dramatically in the time period between 4.4 million and 2.3 million years ago. Most of the evidence has come from five sources: analyses of tooth size, tooth shape, enamel structure, dental microwear, and jaw biomechanics. Taken together, they suggest a dietary shift in the early australopithecines, to increased dietary flexibility in the face of climatic variability. Moreover, changes in diet-related adaptations from A. anamensis to A. afarensis to A. africanus suggest that hard, abrasive foods became increasingly important through the Pliocene, perhaps as critical items in the diet.
Katherine D. Zink &
Daniel E. Lieberman
The origins of the genus Homo are murky, but by H. erectus, bigger brains and bodies had evolved that, along with larger foraging ranges, would have increased the daily energetic requirements of hominins [1,2]. Yet H. erectus differs from earlier hominins in having relatively smaller teeth, reduced chewing muscles, weaker maximum bite force capabilities, and a relatively smaller gut [3-5]. This paradoxical combination of increased energy demands along with decreased masticatory and digestive capacities is hypothesized to have been made possible by adding meat to the diet [6-8], by mechanically processing food using stone tools [7,9,10], or by cooking [11,12]. Cooking, however, was apparently uncommon until 500,000 years ago [13,14], and the effects of carnivory and Palaeolithic processing techniques on mastication are unknown. Here we report experiments that tested how Lower Palaeolithic processing technologies affect chewing force production and efficacy in humans consuming meat and underground storage organs (USOs). We find that if meat comprised one-third of the diet, the number of chewing cycles per year would have declined by nearly 2 million (a 13% reduction) and total masticatory force required would have declined by 15%. Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.
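The abstract's percentages imply a baseline chewing rate that it never states outright. A quick derivation from the quoted figures (the interpretation that the further 5% is taken against the same baseline is an assumption; the abstract is ambiguous on that point):

```python
# A drop of ~2 million chewing cycles/year is stated to be a 13%
# reduction, so the implied all-plant baseline is reduction / 0.13.
reduction = 2_000_000
baseline = reduction / 0.13           # ~15.4 million chews/year
after_meat = baseline - reduction     # one-third-meat diet

# Slicing meat and pounding USOs saves "another 5%"; we assume this
# is 5% of the same baseline (the abstract does not specify).
after_processing = after_meat - 0.05 * baseline
```

Under these assumptions an unprocessed plant diet implies roughly 15 million chews per year, falling to about 12.6 million with a one-third-meat, tool-processed diet.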
Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics
Hunting for meat was a critical step in all animal and human evolution. A key brain-trophic element in meat is vitamin B3 / nicotinamide. The supply of meat and nicotinamide steadily increased from the Cambrian origin of animal predators ratcheting ever larger brains. This culminated in the 3-million-year evolution of Homo sapiens and our overall demographic success. We view human evolution, recent history, and agricultural and demographic transitions in the light of meat and nicotinamide intake. A biochemical and immunological switch is highlighted that affects fertility in the ‘de novo’ tryptophan-to-kynurenine-nicotinamide ‘immune tolerance’ pathway. Longevity relates to nicotinamide adenine dinucleotide consumer pathways. High meat intake correlates with moderate fertility, high intelligence, good health, and longevity with consequent population stability, whereas low meat/high cereal intake (short of starvation) correlates with high fertility, disease, and population booms and busts. Too high a meat intake and fertility falls below replacement levels. Reducing variances in meat consumption might help stabilise population growth and improve human capital.
Meat, NAD, and Human Evolution
Archaeological and palaeontological evidence indicates that hominins increased meat consumption and developed the necessary fabricated stone tools while their brains and their bodies evolved for a novel foraging niche and hunting range, at least 3 million years ago. This ‘cradle of mankind’ was centred around the Rift Valley in East Africa, where the variable climate and savannah conditions, with reductions in forests and arboreal living for apes, may have required clever and novel foraging in an area where both overall prey availability and predator dangers were high [44–50] (Figure 2). Tools helped hunting and butchery and reduced time and effort spent chewing, as did cooking later [51]. Another crucial step may have been the evolution of a cooperative social unit with divisions of labour, big enough to insure against the risks involved in hunting large game and the right size to succeed as an ambush hunter – with the requisite prosocial and altruistic skills to also share the spoils across sexes and ages [52]. The ambitious transition from prey to predator, hunting the then extensive radiation of megaherbivores so big that they are normally considered immune to carnivores, needed advanced individual and social cognition, as humans do not have the usual physical attributes of a top predator [53–59]. Adult human requirements to run such big brains are impressive enough, but during development they are extraordinarily high, with 80% to 90% of basal metabolic rate necessary in neonates – this is probably not possible after weaning without the use of animal-derived foods [51,60,61].
Long-distance endurance running may have played a role in persistence hunting.
Endurance running and the evolution of Homo
Striding bipedalism is a key derived behaviour of hominids that possibly originated soon after the divergence of the chimpanzee and human lineages. Although bipedal gaits include walking and running, running is generally considered to have played no major role in human evolution because humans, like apes, are poor sprinters compared to most quadrupeds. Here we assess how well humans perform at sustained long-distance running, and review the physiological and anatomical bases of endurance running capabilities in humans and other mammals. Judged by several criteria, humans perform remarkably well at endurance running, thanks to a diverse array of features, many of which leave traces in the skeleton. The fossil evidence of these features suggests that endurance running is a derived capability of the genus Homo, originating about 2 million years ago, and may have been instrumental in the evolution of the human body form.
Nicholas B. Holowka, Daniel E. Lieberman
Journal of Experimental Biology 2018 221: jeb174425 doi: 10.1242/jeb.174425 Published 6 September 2018
Adaptive explanations for modern human foot anatomy have long fascinated evolutionary biologists because of the dramatic differences between our feet and those of our closest living relatives, the great apes. Morphological features, including hallucal opposability, toe length and the longitudinal arch, have traditionally been used to dichotomize human and great ape feet as being adapted for bipedal walking and arboreal locomotion, respectively. However, recent biomechanical models of human foot function and experimental investigations of great ape locomotion have undermined this simple dichotomy. Here, we review this research, focusing on the biomechanics of foot strike, push-off and elastic energy storage in the foot, and show that humans and great apes share some underappreciated, surprising similarities in foot function, such as use of plantigrady and ability to stiffen the midfoot. We also show that several unique features of the human foot, including a spring-like longitudinal arch and short toes, are likely adaptations to long distance running. We use this framework to interpret the fossil record and argue that the human foot passed through three evolutionary stages: first, a great ape-like foot adapted for arboreal locomotion but with some adaptations for bipedal walking; second, a foot adapted for effective bipedal walking but retaining some arboreal grasping adaptations; and third, a human-like foot adapted for enhanced economy during long-distance walking and running that had lost its prehensility. Based on this scenario, we suggest that selection for bipedal running played a major role in the loss of arboreal adaptations.
Eyes - Humans can communicate with gazes, which is useful when hunting.
Unique morphology of the human eye and its adaptive meaning: comparative studies on external morphology of the primate eye
In order to clarify the morphological uniqueness of the human eye and to obtain cues to understanding its adaptive significance, we compared the external morphology of the primate eye by measuring nearly half of all extant primate species. The results clearly showed exceptional features of the human eye: (1) the exposed white sclera is devoid of any pigmentation, (2) humans possess the largest ratio of exposed sclera in the eye outline, and (3) the eye outline is extraordinarily elongated in the horizontal direction. The close correlation of the parameters reflecting (2) and (3) with habitat type or body size of the species examined suggested that these two features are adaptations for extending the visual field by eyeball movement, especially in the horizontal direction. Comparison of eye coloration and facial coloration around the eye suggested that the dark coloration of the exposed sclera of nonhuman primates is an adaptation to camouflage the gaze direction against other individuals and/or predators, and that the white sclera of the human eye is an adaptation to enhance the gaze signal. The uniqueness of human eye morphology among primates illustrates the remarkable difference between humans and other primates in the ability to communicate using gaze signals.
Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant
It is likely that humans preferred large herbivores given the abundance of their biomass, the relative ease of hunting them, net caloric returns, and their higher fat content, which accommodates physiological limits on protein consumption
Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant
The worldwide association of H. erectus with elephants is well documented and so is the preference of humans for fat as a source of energy. We show that rather than a matter of preference, H. erectus in the Levant was dependent on both elephants and fat for his survival. The disappearance of elephants from the Levant some 400 kyr ago coincides with the appearance of a new and innovative local cultural complex – the Levantine Acheulo-Yabrudian and, as is evident from teeth recently found in the Acheulo-Yabrudian 400-200 kyr site of Qesem Cave, the replacement of H. erectus by a new hominin. We employ a bio-energetic model to present a hypothesis that the disappearance of the elephants, which created a need to hunt an increased number of smaller and faster animals while maintaining an adequate fat content in the diet, was the evolutionary drive behind the emergence of the lighter, more agile, and cognitively capable hominins. Qesem Cave thus provides a rare opportunity to study the mechanisms that underlie the emergence of our post-erectus ancestors, the fat hunters.
Hunting large prey, as humans did, is exclusively associated with hypercarnivory.
The impact of large terrestrial carnivores on Pleistocene ecosystems
At very high densities, populations of the largest herbivores, such as elephants, have devastating effects on the environment. What prevented widespread habitat destruction in the Pleistocene, when the ecosystem sustained many species of huge herbivores? We use data on predator–prey body mass relationships to predict the prey size ranges of large extinct mammalian carnivores, which were more diverse and much larger than living species. We then compare these prey size ranges with estimates of young mammoth sizes and show that juvenile mammoths and mastodons were within predicted prey size ranges of many of the Pleistocene carnivores. From this and other fossil evidence we argue that, by limiting population sizes of megaherbivores, large carnivores had a major impact on Pleistocene ecosystems.
Caries Through Time: An Anthropological Overview
Although tooth plaque isn’t suitable for determining the HTL within the vast zooarchaeological landscape [53, 54], it may be marginally accurate for identifying shifts. For instance, Neanderthals were known to rely heavily on animal-sourced foods and showed only six caries (signs of decay) out of 1,250 of their teeth that were examined. Caries started appearing in substantial numbers between 13,700 and 15,000 years ago in Morocco, alongside evidence of increased starch consumption. The low occurrence of caries during most of the Pleistocene corresponds to a low-carbohydrate, high-HTL pattern.
Reconstructing diet is critical to understanding hominin adaptations. Isotopic and functional morphological analyses of early hominins are compatible with consumption of hard foods, such as mechanically-protected seeds, but dental microwear analyses are not. The protective shells surrounding seeds are thought to induce complex enamel surface textures characterized by heavy pitting, but these are absent on the teeth of most early hominins. Here we report nanowear experiments showing that the hardest woody shells – the hardest tissues made by dicotyledonous plants – cause very minor damage to enamel but are themselves heavily abraded (worn) in the process. Thus, hard plant tissues do not regularly create pits on enamel surfaces despite high forces clearly being associated with their oral processing. We conclude that hard plant tissues barely influence microwear textures and the exploitation of seeds from graminoid plants such as grasses and sedges could have formed a critical element in the dietary ecology of hominins.
Non-occlusal, buccal tooth microwear variability has been studied in 68 fossil humans from Europe and the Near East. The microwear patterns observed suggest that a major shift in human dietary habits and food processing techniques might have taken place in the transition from the Middle to the Late Pleistocene populations. Differences in microwear density, average length, and orientation of striations indicate that Middle Pleistocene humans had more abrasive dietary habits than Late Pleistocene populations. Both dietary and cultural factors might be responsible for the differences observed. In addition, the Middle Paleolithic Neanderthal specimens studied show a highly heterogeneous pattern of microwear when compared to the other samples considered, which is inconsistent with a hypothesis of all Neanderthals having a strictly carnivorous diet. The high density of striations observed in the buccal surfaces of several Neanderthal teeth might be indicative of the inclusion of plant foods in their diet. The buccal microwear variability observed in the Neanderthals is compatible with an overall exploitation of both plant and meat foods on the basis of food availability. A preliminary analysis of the relationship between buccal microwear density and climatic conditions prevailing in Europe during the Late Pleistocene has been attempted. Cold climatic conditions, as indicated by oxygen isotope stage data, seem to be responsible for higher densities of microwear features, whereas warmer periods could correspond to a reduced pattern of scratch density. Such a relationship would be indicative of less abrasive dietary habits, perhaps more meat dependent, during warmer periods.
Bioanthropological research carried out in the last few decades has given special emphasis to the study of the relation between disease and social and environmental phenomena, underscoring the already strong connection between lifestyle and health conditions throughout the history of humankind (Cohen & Armelagos, 1984; Katzenberg & Saunders, 2008; Larsen, 1997). Because infectious diseases result from the interaction between host and agent, modulated by ecological and cultural environments, the comparative study of the historic prevalence of diseases in past populations worldwide can provide important data about their related factors and etiology. The study of dental diseases (such as caries) has been given special attention in paleopathology. The tooth, due to its physical features, tends to resist destruction and taphonomic conditions better than any other body tissue and is therefore a valuable element for the study of an individual’s diet, and of the social and cultural factors related to it, from a population perspective. Caries is one of the infectious diseases most easily observable in human remains retrieved from archaeological excavations. Because of their long development time and non-lethal nature, the lesions present at the time of death remain recognizable indefinitely, allowing us to infer, along with other archaeological and ecological data, the types of food that a specific population consumed, the cooking technology they used, the relative frequency of consumption, and the way the food was shared among the group (Hillson, 2001, 2008; Larsen, 1997; Rodríguez, 2003).
We present early evidence linking a high prevalence of caries to a reliance on highly cariogenic wild plant foods in Pleistocene hunter-gatherers from North Africa. This evidence predates other high caries populations and the first signs of food production by several thousand years. We infer that increased reliance on wild plants rich in fermentable carbohydrates caused an early shift toward a disease-associated oral microbiota. Systematic harvesting and processing of wild food resources supported a more sedentary lifestyle during the Iberomaurusian than previously recognized. This research challenges commonly held assumptions that high rates of caries are indicative of agricultural societies.