In the late 1980s I met a charming Swiss woman at U.C. Berkeley. She had only recently arrived in the U.S. to attend the University as an undergraduate. Our friendship was largely based on a common interest in nature and outdoor activities. At one point she invited me to dinner at a house she shared with several other Swiss men and women in what seemed a communal arrangement. I found the meal quite strange; it consisted solely of uncooked fruits, vegetables and nuts. She explained to me that they ate all of their food raw: breakfast, lunch and dinner. This, I learned later, is called Raw Foodism (also Rawism).
Raw diets were long popular with ascetic mystics during the early Christian period, but the modern version was formulated, in Switzerland, by a man named Maximilian Bircher-Benner (1867-1939). He came to his ideas about food by way of the Lebensreform movement in Germany and Switzerland, which sought to reverse the perceived corruption of civilization through a return to the natural. The “natural” tenets included holistic medicine, exercise, outdoor activities, free love and nudism. With respect to diet, Bircher-Benner reasoned, quasi-syllogistically:
Humans are animals.
Other animals don’t cook their food.
Ergo, humans shouldn’t cook their food.
Adherents claim cooking reduces the nutrients in food and creates toxins. There is variation in contemporary raw food diets. Some include raw red meats--horse is favored by the French--others restrict animal flesh to fish; still others allow (unpasteurized) dairy products—including fermented forms such as yogurt and kefir. My friend and her friends, like Bircher-Benner, were strict Vegans.
She left academia to help the indigenous peoples of the remote Brazilian state of Rondônia in their battle with illegal loggers, and we lost touch. Five years later we reconnected. By then she had abandoned Raw Foodism, having discovered that you can only be a Raw Foodist under very special enabling conditions, such as those provided in Berkeley, CA. Actually, the diet is not sustainable even in Berkeley. Long-term adherents do not get enough fats and proteins; fruits and nuts are not sufficient. Moreover, the diet is deficient in several micronutrients, including a few essential vitamins. Dietary supplements are required, which is not much in the spirit of Lebensreform. Nonetheless, through the concerted efforts of celebrity “influencers”, Vegan Raw Foodism has recently experienced a revival. There is now also a Paleo form of the diet, a misbegotten marriage of misconceptions.
Food Prep
Our more distant ancestors did indeed adhere to a raw food diet. So too, perhaps, did early archaic humans such as Homo erectus, though, as we shall see, some contend otherwise. Modern humans (Homo sapiens), however, have always cooked, from our inception.
A key ingredient in hominin success was the ability to shape the environment to their own ends, rather than passively react to it. This is called niche construction. Our ancestors were not singular in this respect, but they were exceptionally good at it and became progressively better over time, culminating in modern cities, dams, harbors and beheaded mountaintops. Tools were of course hallmarks of this process. For most of our evolutionary history we used these tools primarily to process foodstuffs—both plant and animal—in ways that rendered them more readily edible. Much of the hominid dietary expansion throughout the Pleistocene was related to innovations in food preparation.
The best evidence for early Pleistocene food preparation comes from after the invention of the first stone tools. Even before the first hand axes were invented, our ancestors used stones to smash tough plant material, such as stems, bark, roots and nuts, thereby exposing some of the softer and easier-to-digest interior stuff. Once hand axes were invented, they were used to cut meat and plants into bite-sized chunks; they also no doubt continued the digestibility-enhancing smashing traditions. Of special note in this regard was the extraction of nutritionally dense bone marrow from scavenged carcasses. Much later, when humans became proficient hunters, meats--primarily the protein component--would have been tenderized in this way.
Untenderized raw mammal muscle is extremely difficult to chew with hominin teeth. Ultra-carnivorous cats—from lions to our domesticated pets—don’t even bother to chew; they just bite off a chunk and swallow. That pushes the earliest digestive stage from the mouth to the stomach. Our ancestors, though, had to chew to deconstruct those muscle fibers. So, the more tenderized the muscle meat the better. Fatty tissue, including that of many organs inside the carcass, would have posed much less of a problem in that regard.
The Raw and the Cooked
Cooking is widely recognized as the most significant food prep innovation. As I discussed in the first post in this series, Claude Lévi-Strauss saw cooking as the central organizing principle for his theory of culture in (then) contemporary human forager and forager-farmer societies. Richard Wrangham considered this cultural invention key to understanding human biological evolution as well, particularly our large brains.
When it comes to brains, size isn’t everything. Einstein proved that, though posthumously. But size is a lot easier to measure from fossil evidence than the way the brain is organized. It may seem obvious that big brains are good to have. (I think the jury is still out, from an evolutionary perspective. In fact, the evidence indicates that animals with large brains have shorter evolutionary life-spans than those with small brains.) Assuming an advantage for large brains, there is still a piper to be paid. Your brain consumes 20-25% of the energy your metabolism generates from the food you eat. That’s a lot. So, it would seem that in order to evolve such large brains, either the quality of the food our ancestors ate must have improved (become more calorically dense), or some other “expensive tissue” had to be sacrificed by way of compensation. Turns out it was probably both.
Let’s begin with the sacrificed tissue. Early speculation focused on the gastrointestinal system as the most likely candidate. Human digestion begins in the mouth and proceeds—through the esophagus—to the stomach, then on to the small intestine and finally the large intestine. The intestines, both small and large, are the expensive parts of the gut in primates. Your intestines contain hundreds of millions of neurons (more than your spinal cord), and they are in constant dialog with your brain. According to Leslie Aiello, a lot of that tissue had to go in order to energetically compensate for large brains. Most of the sacrificed tissue was in the large intestine. As evidence, Aiello noted that the colon (large intestine) of modern humans is considerably smaller than that of a chimp.
Aiello proposed that the gut reduction was made possible by a dietary change to “higher quality”, less digestively demanding foods, for which our small guts are better suited. One candidate food, still favored by many, is meat. On this view, we had to become carnivores to get such big brains. Proponents of this idea note that exclusive carnivores—cats, for example—have short guts, while herbivores—cows, horses, deer, etc.—need larger, more complex guts, because meat requires less digestive work than plant material. The polysaccharide cellulose is particularly hard to digest; it abounds in plants but is absent from animal flesh. But even the nutritious polysaccharides that we call starches are harder to digest than meat, on this view. Ergo, the evolution of our large brains required a shift toward a more meat-based diet, so that we could evolve smaller, cat-like guts, thus compensating for the increase in brain size. This fits quite nicely with the Man-the-Hunter narrative.
But raw meat protein is not, in fact, easy to digest, as discussed earlier. Animal fats are a different matter: energy dense and readily digestible. But neither the proteins nor the fats in meat can meet the energy requirements of the human brain long term. (In true carnivores, though, such as seals or cats, meat proteins and fats are sufficient.) The human brain needs carbohydrates, especially during infancy. Glucose is by far the most important. (When glucose is depleted due to starvation, ketones, produced from fats, can temporarily fill the energy void, as emphasized by advocates of the Keto Diet.)
Fruits are a great source of glucose, and as primates we come from a long line of fruit eaters. Less great, but still good, are starchy plant materials, especially the starches that accumulate in underground storage organs (USOs), such as tubers, bulbs, rhizomes and corms, where many plants, collectively known as geophytes, bank their energy surpluses. These nutritionally dense foods require more digestive effort than fruits but, unlike fruits, are available year-round.
Not all USOs are created equal when it comes to digestibility. Rhizomes (underground extensions of plant stems) can be particularly tough to stomach, as are a lot of tubers. Edible tubers include potatoes, yams and manioc. Corms--such as taro and various sedges--are much easier, and bulbs (including aquatic rushes and onions) are generally the least recalcitrant gut-wise. But a lot depends on the environment. Terrestrial rhizomes, such as those of grasses, are much tougher to digest than the rhizomes of water lilies and other aquatic plants. And the corms of irises are more challenging than the corms of aquatic sedges.
To whatever extent our ancestors’ diets were plant or animal based, unprocessed raw foods were not easy to digest, from mouth to colon. Smashing and cutting increased digestibility, but many believe that without cooking our brains would have remained in the size range of chimpanzees. Cooking is vastly better than smashing as a way to tenderize food, both plant polysaccharides and animal protein. So cooked foods are much easier to chew and digest than their uncooked equivalents. Moreover, the chemical alterations caused by heat increase the available nutrients, contra the Raw Foodist claims. Could cooking provide an explanation for our reduced colons?
The time frame for the invention of cooking is debated. But this much is undisputed: by the time modern humans evolved, 300,000 years ago, most food was cooked. The practice probably began with wildfire. All Pleistocene humans had long experience of wildfire. Burnt areas enhanced foraging for small critters by removing many of their shelters, thereby exposing them. Moreover, some of the flora and fauna were serendipitously cooked. It would not have gone unnoticed that these cooked foods were easier to chew than their uncooked counterparts. Given a choice, captive chimpanzees quickly come to prefer cooked foods, even without much prior experience, and wild chimpanzees have been known to forage in burned areas. But chimpanzees do not cook, and neither did the first hominin consumers of cooked foods. Nonetheless, the experience of fortuitously cooked foods probably inspired later attempts to capture fire and eventually to create fire.
Those who adhere to the Man-the-Hunter narrative emphasize the beneficial effects of cooking animal tissues. Wrangham and others, in contrast, claim that the primary benefit of cooking was the consumption of “root crops” that were previously indigestible. For the first time, our ancestors could consume USOs other than bulbs, including rhizomes, tubers and corms foraged from non-aquatic environments. This, according to Wrangham, was the primary evolutionary enabler of brain size increase; it also enabled gut reduction and a corresponding decrease in tooth size.
Microbiomic Considerations
It was long thought that virtually all nutrient extraction occurred in the small intestine and that the large intestine functions primarily to reabsorb water and a few micronutrients, such as iron, from what remains prior to evacuation. We now know otherwise. A lot of the more complex, “resistant carbohydrates” are processed in the large intestine by fermentative microbial actors. Of the body parts that harbor microbiomes (mouth, skin, armpits, vagina and colon), the colon contains by far the most of these microbes, numbering in the trillions.
The colon microbiome is essential for deriving a host of nutrients that would otherwise be unavailable. Of particular note are several short-chain fatty acids; these chemicals are essential to the function of every bodily organ, from pancreas to brain. Here I will focus on gut health and immune functions. Short-chain fatty acids also modulate multiple neurochemical pathways by means of which the gut and brain communicate with each other, the so-called gut-brain axis. All of the short-chain fatty acids created in the colon are derived from starchy plant materials.
Among the most important short-chain fatty acids are acetate, propionate and butyrate, all produced through the fermentation of starches in the colon. The epithelial cells lining the intestine use 70% of the butyrate. Low butyrate levels are associated with bowel disorders such as Crohn’s disease and irritable bowel syndrome. Acetate is an important element in the gut-brain axis. It is essential for the regulation of serotonin levels, associated with mood. Moreover, acetate-deprived rats are learning-impaired.
Essential amino acids are also produced through microbial metabolism in the colon. Amino acids are the building blocks of all proteins. We, like most creatures, use 20 different amino acids, which combine in diverse ways to make the vast number of proteins we require. We have the metabolic capacity to manufacture 11 of these; the other nine must be obtained from the food we eat. These nine are called essential amino acids. Several essential amino acids are actually manufactured by microbes in the large intestine from various raw materials. Surprisingly, many amino acids—essential or otherwise—are derived from dietary carbohydrates, not from dietary proteins.
The colon microbiome also provides us with vitamins, including up to half of the vitamin K—essential for blood clotting and for binding calcium in bone—that humans require daily. Other microbe-derived vitamins augment those obtained more directly from our food. These include several B vitamins, among them folate. Folate has multiple functions but is especially important for the health of blood cells and for brain development.
Given these vital functions, it seems odd that our colons are so reduced relative to those of chimps. For the Man-the-Hunter advocates, it is clearly due to increased carnivory. Our colons are in the size range of carnivores. But it’s not that simple. Colon transit times are much slower in humans than in true carnivores. Longer transit times mean more time for the gut microbiome to do its magic. Moreover, much of the work done by the colonic microbiomes of typical omnivores—from pigs to bears—may have been outsourced in humans. This is the view of some anthropologists who have taken on board recent research on microbiomes.
In essence the idea is this—external fermentation of food, whether plant or animal, performed much the same function as our gut microbiome and thereby enabled reductions in colon size, which, in turn, enabled the evolution of larger brains. What interests me most is the assumption that fermented foods were part of the hominid diet long before humans evolved, archaic or modern.
Katherine Amato and her associates at Northwestern University propose an early date indeed: as early as 10 million years ago, before the human lineage split from that of the chimps and bonobos. They base this conjecture on two mutations that occurred around that time, one related to alcoholic fermentation and the other to acidic fermentation.