The ingestion of fat in the human diet unlocked the evolutionary process that led to rational thinking and a higher level of cognition.
The purpose of this article is to reconcile two hypotheses: (1) that brain evolution occurred due to a change in diet, and (2) that it occurred due to pressures to understand more and more about underlying causes, such as increasingly complex manipulative and cooperative intentions on the part of others, as well as reality itself (and how to interact with it beyond group concerns). I argue that the ingestion of fat, a highly energy-efficient food, unlocked the evolutionary process that culminated in the emergence of the practice of reasoning about underlying causes; that the consolidation of this practice created continuous pressure for greater cognition about "whys," since many explanations imposed the need for further ones; and that with this came a high level of awareness and the need for the brain to evolve not only toward a higher level of cognition but also in size.
Saturated fat does not correlate with heart disease, suggesting that eating it throughout evolution would have been compatible with good health.
Saturated fat: villain and bogeyman in the development of cardiovascular disease?
Abstract
Background
Cardiovascular disease (CVD) is the leading global cause of death. For decades, the conventional wisdom has been that the consumption of saturated fat (SFA) undermines cardiovascular health, clogs the arteries, increases risk of CVD and leads to heart attacks. It is timely to investigate whether this claim holds up to scientific scrutiny.
Objectives
The purpose of this paper is to review and discuss recent scientific evidence on the association between dietary SFA and CVD.
Methods
PubMed, Google Scholar and Scopus were searched for articles published between 2010 and 2021 on the association between SFA consumption and CVD risk and outcomes. The review examined observational studies, prospective epidemiologic cohort studies, long-term RCTs, and systematic reviews and meta-analyses of these study types.
Results
Collectively, observational studies, prospective epidemiologic cohort studies, RCTs, systematic reviews and meta-analyses have not conclusively established a significant association between dietary SFA and subsequent cardiovascular risk, CAD, MI or mortality, nor a benefit of reducing dietary SFA on CVD risk, events and mortality. Beneficial effects of replacing SFA with polyunsaturated or monounsaturated fat or carbohydrates remain elusive.
Conclusions
Findings from the studies reviewed in this paper indicate that the consumption of SFA is not significantly associated with CVD risk, events or mortality. Based on the scientific evidence, there is no scientific ground to demonize SFA as a cause of CVD. SFA naturally occurring in nutrient-dense foods can be safely included in the diet.
Humans are dependent on dietary choline, which is found mostly in meat and in animal products such as eggs.
Humans are unique in their diet, physiology and socio-reproductive behavior compared to other primates. They are also unique in their ubiquitous adaptation to all biomes and habitats. From an evolutionary perspective, these trends seem to have started about two million years ago, coinciding with the emergence of encephalization, the reduction of the dental apparatus, the adoption of a fully terrestrial lifestyle resulting in the emergence of the modern anatomical bauplan, the focalization of certain activities in the landscape, the use of stone tools, and the exit from Africa. It is in this period that clear taphonomic evidence of a switch in diet with respect to Pliocene hominins occurred, with the adoption of carnivory. Until now, the degree of carnivorism in early humans has remained controversial. A persistent hypothesis is that hominins acquired meat irregularly (potentially as fallback food) and opportunistically through klepto-foraging. Here, we test this hypothesis and show, in contrast, that the butchery practices of early Pleistocene hominins (unveiled through systematic study of the patterning and intensity of cut marks on their prey) could not have resulted from having frequent secondary access to carcasses. We provide evidence of hominin primary access to animal resources and emphasize the role that meat played in their diets, their ecology and their anatomical evolution, ultimately resulting in the ecologically unrestricted terrestrial adaptation of our species. This has major implications for the evolution of human physiology and potentially for the evolution of the human brain.
Humans spend far less time feeding than other apes, indicating that meat eating drove the evolutionary reduction in molar size.
Phylogenetic rate shifts in feeding time during the evolution of Homo
Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.
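To put the headline figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The 4.7% (observed) and 48% (predicted) shares come from the abstract above; the 12-hour active day is an illustrative assumption, not a figure from the paper.

```python
# Back-of-the-envelope check of the feeding-time gap reported above.
# The 4.7% and 48% shares come from the abstract; the 12-hour active
# day is an illustrative assumption, not a figure from the paper.

ACTIVE_HOURS_PER_DAY = 12  # assumed daily activity budget

observed_share = 0.047   # modern humans: share of daily activity spent feeding
predicted_share = 0.48   # expectation from phylogeny and body mass

observed_hours = observed_share * ACTIVE_HOURS_PER_DAY
predicted_hours = predicted_share * ACTIVE_HOURS_PER_DAY

print(f"Observed feeding time:  {observed_hours:.2f} h/day")
print(f"Predicted feeding time: {predicted_hours:.2f} h/day")
print(f"Predicted/observed ratio: {predicted_share / observed_share:.1f}x")
# The ratio of roughly 10x is the 'order of magnitude' the authors describe.
```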
Eating meat allowed a reduction in the size of the teeth, jaw, and chewing muscles.
Diet and the evolution of the earliest human ancestors
Over the past decade, discussions of the evolution of the earliest human ancestors have focused on the locomotion of the australopithecines. Recent discoveries in a broad range of disciplines have raised important questions about the influence of ecological factors in early human evolution. Here we trace the cranial and dental traits of the early australopithecines through time, to show that between 4.4 million and 2.3 million years ago, the dietary capabilities of the earliest hominids changed dramatically, leaving them well suited for life in a variety of habitats and able to cope with significant changes in resource availability associated with long-term and short-term climatic fluctuations.
Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans
The origins of the genus Homo are murky, but by H. erectus, bigger brains and bodies had evolved that, along with larger foraging ranges, would have increased the daily energetic requirements of hominins [1,2]. Yet H. erectus differs from earlier hominins in having relatively smaller teeth, reduced chewing muscles, weaker maximum bite force capabilities, and a relatively smaller gut [3,4,5]. This paradoxical combination of increased energy demands along with decreased masticatory and digestive capacities is hypothesized to have been made possible by adding meat to the diet [6,7,8], by mechanically processing food using stone tools [7,9,10], or by cooking [11,12]. Cooking, however, was apparently uncommon until 500,000 years ago [13,14], and the effects of carnivory and Palaeolithic processing techniques on mastication are unknown. Here we report experiments that tested how Lower Palaeolithic processing technologies affect chewing force production and efficacy in humans consuming meat and underground storage organs (USOs). We find that if meat comprised one-third of the diet, the number of chewing cycles per year would have declined by nearly 2 million (a 13% reduction) and total masticatory force required would have declined by 15%. Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.
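The chewing figures above can be checked with simple arithmetic. Below is a minimal sketch, assuming the successive reductions compound (the paper may instead express the extra 5% relative to the original baseline); the baseline itself is back-calculated, not a number reported in the abstract.

```python
# Rough arithmetic behind the chewing estimates quoted above. The 13%
# and 5% chew-count reductions come from the abstract; the baseline is
# implied by '2 million chews = 13%', and compounding is an assumption.

decline_chews = 2_000_000   # fewer chews/year with a one-third meat diet
frac_meat = 0.13            # 13% reduction from adding meat
frac_tools = 0.05           # further 5% from slicing meat and pounding USOs

baseline = decline_chews / frac_meat          # implied all-plant baseline
after_meat = baseline * (1 - frac_meat)
after_tools = after_meat * (1 - frac_tools)   # treated as compounding

print(f"Implied baseline:   {baseline:,.0f} chews/year")
print(f"With 1/3 meat diet: {after_meat:,.0f} chews/year")
print(f"Plus stone tools:   {after_tools:,.0f} chews/year")
print(f"Total reduction:    {1 - after_tools / baseline:.1%}")
```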
Are humans evolved specialists for running in the heat? Man vs. horse races provide empirical insights
Many mammals run faster and for longer than humans and have superior cardiovascular physiologies. Yet humans are considered by some scholars to be excellent endurance runners at high ambient temperatures, and in our past to have been persistence hunters capable of running down fleeter quarry over extended periods during the heat of the day. This suggests that human endurance running is less affected by high ambient temperatures than is that of other cursorial species. However, there are no investigations of this hypothesis. We took advantage of longitudinal race results available for three annual events that pit human athletes directly against a hyper-adapted ungulate racer, the thoroughbred horse. Regressing running speed against ambient temperature shows race speed deteriorating with hotter temperatures more slowly in humans than in horses. This is the first direct evidence that human running is less inhibited by high ambient temperatures than that of another endurance species, supporting the argument that we are indeed adapted for high-temperature endurance running. Nonetheless, it is far from clear that this capacity is explained by an endurance hunting past because in absolute terms humans are slower than horses and indeed many other ungulate species. While some human populations have persistence hunted (and on occasion still do), the success of this unlikely foraging strategy may be best explained by the application of another adaptation: high cognitive capacity. With dedication, experience and discipline, capitalising on their small endurance advantage in high temperatures, humans have a chance of running a more athletic prey to exhaustion.
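The core analysis described above (regressing race speed on ambient temperature and comparing how steeply each species slows down) can be sketched in a few lines. The data points below are invented for illustration; only the qualitative pattern, with both slopes negative and the horse's steeper, mirrors the reported result.

```python
# Illustrative regression of race speed on ambient temperature.
# All data values are hypothetical; only the comparison of slopes
# reflects the analysis described in the abstract.
import numpy as np

temps = np.array([10, 14, 18, 22, 26, 30], dtype=float)       # deg C (hypothetical)
human_speed = np.array([3.90, 3.85, 3.80, 3.74, 3.70, 3.63])  # m/s (hypothetical)
horse_speed = np.array([7.20, 7.00, 6.70, 6.40, 6.00, 5.60])  # m/s (hypothetical)

human_slope, _ = np.polyfit(temps, human_speed, 1)  # least-squares line
horse_slope, _ = np.polyfit(temps, horse_speed, 1)

print(f"Human slope: {human_slope:+.3f} m/s per deg C")
print(f"Horse slope: {horse_slope:+.3f} m/s per deg C")
# A shallower (less negative) human slope means human speed deteriorates
# more slowly in the heat, consistent with the heat-endurance hypothesis.
```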
Human-like Cmah inactivation in mice increases running endurance and decreases muscle fatigability: implications for human evolution
Compared to other primates, humans are exceptional long-distance runners, a feature that emerged in genus Homo approximately 2 Ma and is classically attributed to anatomical and physiological adaptations such as an enlarged gluteus maximus and improved heat dissipation. However, no underlying genetic changes have currently been defined. Two to three million years ago, an exon deletion in the CMP-Neu5Ac hydroxylase (CMAH) gene also became fixed in our ancestral lineage. Cmah loss in mice exacerbates disease severity in multiple mouse models for muscular dystrophy, a finding only partially attributed to differences in immune reactivity. We evaluated the exercise capacity of Cmah−/− mice and observed an increased performance during forced treadmill testing and after 15 days of voluntary wheel running. Cmah−/− hindlimb muscle exhibited more capillaries and a greater fatigue resistance in situ. Maximal coupled respiration was also higher in Cmah null mice ex vivo and relevant differences in metabolic pathways were also noted. Taken together, these data suggest that CMAH loss contributes to an improved skeletal muscle capacity for oxygen use. If translatable to humans, CMAH loss could have provided a selective advantage for ancestral Homo during the transition from forest dwelling to increased resource exploration and hunter/gatherer behaviour in the open savannah.
Rapid changes in the gut microbiome during human evolution
Significance
Human lifestyles profoundly influence the communities of microorganisms that inhabit the body, that is, the microbiome; however, how the microbiomes of humans have diverged from those found within wild-living hominids is not clear. To establish how the gut microbiome has changed since the diversification of human and ape species, we characterized the microbial assemblages residing within hundreds of wild chimpanzees, bonobos, and gorillas. Changes in the composition of the microbiome accrued steadily as African apes diversified, but human microbiomes have diverged at an accelerated pace owing to a dramatic loss of ancestral microbial diversity. These results suggest that the human microbiome has undergone a substantial transformation since the human–chimpanzee split.
Abstract
Humans are ecosystems containing trillions of microorganisms, but the evolutionary history of this microbiome is obscured by a lack of knowledge about microbiomes of African apes. We sequenced the gut communities of hundreds of chimpanzees, bonobos, and gorillas and developed a phylogenetic approach to reconstruct how present-day human microbiomes have diverged from those of ancestral populations. Compositional change in the microbiome was slow and clock-like during African ape diversification, but human microbiomes have deviated from the ancestral state at an accelerated rate. Relative to the microbiomes of wild apes, human microbiomes have lost ancestral microbial diversity while becoming specialized for animal-based diets. Individual wild apes cultivate more phyla, classes, orders, families, genera, and species of bacteria than do individual humans across a range of societies. These results indicate that humanity has experienced a depletion of the gut flora since diverging from Pan.
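The "clock-like versus accelerated" contrast in the abstract amounts to comparing per-lineage rates of compositional change. Below is a toy sketch with every number invented for illustration; only the pattern of the human lineage standing out against an otherwise steady trend follows the text.

```python
# Toy rate comparison in the spirit of the analysis described above.
# Distances and divergence times are hypothetical placeholders.

# (lineage, compositional distance from inferred ancestral state,
#  approximate divergence time in millions of years)
lineages = [
    ("gorilla",    0.30, 9.0),
    ("chimpanzee", 0.22, 6.0),
    ("bonobo",     0.21, 6.0),
    ("human",      0.55, 6.0),
]

for name, distance, mya in lineages:
    rate = distance / mya
    print(f"{name:<10} divergence rate: {rate:.3f} per My")
# Under a strictly clock-like model all rates would be similar; the
# elevated human value illustrates 'accelerated' microbiome change.
```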
A Cross-species Analysis of Carnivore, Primate, and Hominid Behaviour
The traditional assumption that the origin of human behavior is found within the higher primates rather than the social carnivores is based on a failure to adequately define primate and carnivore behavior. Phyletic classification takes no account of convergence and divergence; the behavior of a species is not necessarily characteristic of its order. Of 8 behavior variables that distinguish the order Primates from the order Carnivora, preagricultural man resembles the carnivores on 7 items: food sharing, food storing, cannibalism, surplus killing, interspecies intolerance, feeding of young, and division of labor; he resembles the primates only in group defense. The original form of much human behavior is found within the carnivores.