100 Million Years of Food, page 13
  Another potential barrier to the popularity of drinking milk is that it may strike the uninitiated as disgusting. Although the idea of drinking cow’s milk is deeply rooted and celebrated in Western countries, plenty of Westerners today would hesitate before quaffing secretions from the teat of a horse, camel, or water buffalo, which are traditional drinks in other parts of the world. People in East Asia long disdained cow’s milk as a barbarian concoction, and only with active government and industry intervention is the drink now making inroads there. If dog milk were demonstrated to be thoroughly nutritious, how many Western shoppers would toss a carton of it into their grocery basket? People may harbor an innate aversion to bodily fluids from other animals, because such fluids could bear a nasty infectious disease.16 Because cows are so prominent in Western culture and society as sources of milk, the image of cows and their milk has been sanitized, but other potentially nutritious fluids, such as horse milk and pig’s blood, strike us as unpalatable, unless we are acclimated to them from childhood.

  Incidentally, the widespread consumption of dairy today may help to explain why acne is highly prevalent in dairy-consuming countries. A series of large-scale studies (47,355 participants in one study alone) conducted by the Harvard School of Public Health found an association between dairy intake and acne in teenagers. The reported prevalence of acne among adolescents in Westernized societies is between 79 percent and 95 percent, while traditional societies like the Kitavan Islanders in Papua New Guinea and hunter-gatherers in Paraguay have virtually no incidence of pimples. Some researchers contend that high-glycemic Western diets cause spikes in blood sugar and insulin levels, which unleash a cascade of androgen hormones and IGF-1, lower sex-hormone-binding globulin, and increase activity in sebaceous glands, thereby exacerbating acne. Cow’s milk is also known to raise IGF-1 levels among drinkers. However, skim milk, rather than whole milk, seems to be the risk factor for acne. Whole milk contains saturated fat, and researchers have discovered that saturated fats suppress bacterial activity, while monounsaturated fats (also found in vegetable oils and nuts) increase the incidence of acne, perhaps by increasing the permeability of the skin.17

  Despite the popularly discussed link between milk and acne, many people may continue to drink milk for fear of having weak bones. The standard recommendation in America and Canada for adult calcium intake is 1,000 mg per day, compared to 800 mg in most of Europe and 500 mg in Japan. Who is right? As previously mentioned, in countries where people get more calcium in the diet, hip fractures are more common. Moreover, popping calcium supplements seems to increase the risk of hip fracture. Beyond a minimum threshold (around 400 mg of calcium per day), more calcium doesn’t seem to help our bones. When I traveled in Papua New Guinea, there was no dairy intake at all among the villagers I stayed with, apart from mother’s milk during infancy, yet their bodies were remarkably sturdy. New Guineans have among the lowest rates of hip fracture in the world. The highest rates of hip fracture occur among the statuesque milk-loving northern Europeans. Hip fractures are relatively easy to record and thus provide the clearest evidence with respect to calcium and bone health, but studies on osteoporosis generally show a disappointing lack of benefit with calcium supplementation, contrary to popular wisdom.18 (Soy products, on the other hand, are associated with decreased hip fracture rates among women, perhaps due to the effects of phytoestrogens, or the vitamin K found in fermented soy products such as stinky natto in Japan, and the even stinkier doufu ru in China and tuong in Vietnam. Fermented cheeses like aged goat cheese, blue cheese, brie, cheddar, and Parmesan are also rich in hip-preserving vitamin K, unlike nonfermented cheeses such as mozzarella and processed cheese.19)

  The role of dietary calcium, independent of milk intake, in promoting prostate cancer risk has been demonstrated in several studies. To consider one case, the Yoruba in Nigeria have no traditional history of dairying, and 99 percent of them are lactose intolerant. Most Yoruba have a gene variant that makes them more efficient at absorbing calcium (milk was absent in the traditional diet) and likely boosts their bone density. In a modern diet with a heavy daily dose of calcium, that super-efficient calcium absorption is a drawback, because it puts people with these genes at greater risk of developing prostate cancer. (Prostate gland tissues appear to be readily stimulated by calcium.) People of African descent are frequently carriers of this gene (71 percent of African Americans in the American Southwest, compared to 45 percent of Japanese in Tokyo and 20 percent of Utah residents of northwestern European ancestry). African Americans have a higher risk of developing prostate cancer, while the few African Americans who lack this gene appear to be at decreased risk, especially if they consume less calcium.20

  Similar to the Yoruba, the Inuit also had a dairy-free, low-calcium diet. Inuit children consumed perhaps 20 mg of calcium per day from traditional foods. Since the Inuit are genetically adapted to low-calcium diets, Inuit children who eat a high-calcium Canadian-style diet often exhibit dangerous levels of calcium in the blood, a condition that can damage the kidneys.21

  Conversely, in the case of pastoralists who consumed a lot of meat and especially milk, people developed genetic adaptations to deal with high loads of cholesterol from those foods. East African Maasai pastoralists became genetically adapted to a mega-cholesterol diet of cattle milk, blood, and meat, with two-thirds of their calories coming from fat alone. Their daily cholesterol intake was four to six times the average Western cholesterol consumption, but the level of cholesterol in the Maasai blood was far lower than Western levels. The genes of Maasai show evidence of modifications in regions related to cholesterol metabolism and synthesis, atherosclerosis (the thickening of arteries that is related to cholesterol deposits), and lactase persistence. All of these genetic adaptations seem to make the Maasai better suited to a milk-rich, cholesterol-heavy diet.22

  In view of the traditional reliance of the English, Scandinavians, and northern Indians on milk products, it makes sense that most of them possess the enzyme lactase and are able to digest milk into adulthood (known as lactase persistence). Dairy use also left its genetic signature in widespread lactase persistence among groups in East Africa and the Middle East, while in southern India and the eastern Mediterranean, lactase persistence reaches only around 15 percent. Few people were lactase persistent in West Africa, East Asia, and the New World.23 Overall, two out of every three people in the world lack the ability to produce lactase.

  In North America, a confluence of circumstances led to a boom in drinking milk. At the end of the nineteenth century, an abundance of pastureland close to cities plus improvements in milk storage techniques allowed production and consumption to soar. The U.S. Department of Agriculture (USDA), established in 1862 by President Abraham Lincoln, was charged with two mandates: to promote agricultural interests by increasing consumption of American agricultural products and to promote the health of Americans by setting dietary guidelines. The conflict of interest built into the USDA eventually materialized when the growing clout of the dairy industry led to the 1915 formation of the National Dairy Council, whose aim was to promote research on the benefits of dairy consumption. In 1919, facing a glut of milk supplies after the end of World War I, the USDA and the dairy industry began a program to increase milk drinking among schoolchildren. Educational materials featuring the merits of milk, including games and songs, were supplied with government endorsement. In Canada, meanwhile, the dairy industry succeeded in banning nondairy butter substitutes—namely margarine, originally made from beef fat—from Canadian markets in 1886. In 1948, the Supreme Court of Canada ruled that the ban was unconstitutional, after which provinces were free to set their own regulations on margarine production and importation. The final bastion of resistance was Quebec, which in 2008 became the last place in the world to permit the sale of yellow margarine.24

  Although the interference of political and business groups in public health policies related to dairy consumption was regrettable, this interference does not directly address the question of whether dairy consumption is safe, and if so, for whom. As with all traditional food cultures, the cuisines of regions where dairy has a long history, such as northern Europe, pastoralist East Africa, and northern India, performed well in meeting the nutritional requirements of eaters, in accordance with the kinds of animals and plants that could be raised and grown in those environments. Given their several thousand years of exposure to dairy, over generations the people in these regions have evolved the genetic makeup to handle the challenges of digesting lactose, as well as other possible negative effects of dairy consumption. Conversely, in places that had little or no dairy consumption, such as the New World, people’s traditional cuisines were adequate in meeting their nutritional requirements, including calcium intake, and so they may lack the genes to handle high loads of calcium, cholesterol, and other characteristics of dairy. In other regions, such as the Mediterranean with its goat cheeses and southern India with its ghee (clarified butter), dairy products were a useful supplement to the diet and should remain so. If we try to drastically alter traditional cuisines, either by adding a lot of dairy to dairy-less cuisines or by removing dairy from dairy-dependent cuisines, then we risk running into nutrient imbalances, because creating a tasty balanced diet from scratch is a lot tougher than eating something that was tested and savored over hundreds of generations.

  One final aspect of dairy needs to be discussed. It is plausible that consuming a lot of dairy will increase a person’s height, due to the presence of IGF-1 or other as-yet-undiscovered hormonal factors in milk. Height is highly socially desirable across societies, particularly for men, but being tall is also plausibly linked to a greater risk of certain forms of cancer, including prostate and breast cancers. In other words, copious dairy consumption involves a trade-off between health and height. This is not a trivial issue for parents to consider. Since height is a relative measure—for example, being of average height in Canada means being tall in Southeast Asia—the ideal outcome is that successive generations across the industrialized world gradually decrease in height to a level that is closer to optimal for long-term health; this way, no one’s ego needs to be bruised for being small. On the whole, our species, at least in the industrialized regions of the world, has grown too large, beyond the scale that our bodies were built to handle. Small has to become the new beautiful.

  A TRUCE AMONG THIEVES

  [I felt as if] my heart were suspended by a single thread.… My lips were observed to become pale.… A violent palpitation succeeded.

  —J. RIDLEY, “An Account of an Endemic Disease of Ceylon Entitled Berri Berri”

  One curious circumstance in connection with hay fever is that the persons who are most subjected to the action of pollen belong to a class which furnishes the fewest cases of the disorder, namely, the farming class.

  —CHARLES BLACKLEY, Experimental Researches on the Causes and Nature of Catarrhus Æstivus

  Ever since the importance of hunted animals in the diet began to decline, humans have sought suitable replacements. Rough kernels of wheat and rice, tiny cobs of corn, poisonous potatoes, and bitter olives were reborn as aromatic loaves of bread, elegant ribbons of pasta and noodles, bowls of fluffy rice, filling tortillas, hearty potato stew, and nourishing olive oil. The milk issuing from goats, sheep, cattle, camels, and horses was ingeniously fashioned into yogurt, cheese, and butter. Domesticated animals provided much-appreciated meat when they were no longer able to earn their keep through pooping, peeing, tilling, carrying heavy loads, assisting in hunts, laying eggs, or killing rodents and other agricultural pests.1 We tamed wild plants and animals; we multiplied our numbers. Forests and riverbanks that once harbored rich ecosystems were taken over by villages that mushroomed into towns. People could now spend their days indoors, shielded from the harsh elements, the violence inflicted by marauders, and the boredom of talking to the same villagers night after night. The new urbanites traded their expertise in the manufacture of goods and the provision of services for currency that could be redeemed for housing and social stimulation, as well as food that was hauled in from the peripheries of towns.

  Although many changes in diet and lifestyle were beneficial on the surface, rapid changes sometimes carried grave unintended consequences. As we have seen, our ancestors had been undergoing changes in diet and environment for millions of years. Our diets shifted from insects to fruits, meats, and agricultural products like wheat, rice, potatoes, and corn, then added milk and alcohol. By contrast, viewed against this backdrop of gradual transformation, the last thousand years of human history has been a storm of disruption due to technological and scientific breakthroughs. Since biological evolution requires dozens or hundreds of generations to adapt organisms to new environments and foods, new afflictions began to appear. Some of these could be contained and defeated, but other outbreaks took their place, gathered force, and tore through populations with brutal ferocity.

  As recounted in an excellent study by Kenneth John Carpenter, beginning in the seventh century, observers in densely populated areas of East and Southeast Asia periodically described an ominous constellation of symptoms including tremors, numbness, difficulty in walking, swelling of the limbs, wasting, and pervasive weakness; when the heart began to race, death was usually not far away. The patterns of the disease befuddled observers. It was prevalent in southern China but not northern China. Unlike cholera, which vaulted into Japan via seaports linked to China, this disease was not contagious; people who moved from afflicted regions did not carry the disease with them. Japanese doctors tried acupuncture and raising blisters along the spine with heated cylinders but to no avail. Western doctors theorized that noxious airs were the root cause of the affliction, but the disease was common on ships manned by Asian crews. In the 1870s, the rage in medicine was bacteria. Louis Pasteur successfully treated cholera and anthrax through manipulating bacteria; Robert Koch found the bacterial agent responsible for tuberculosis. Perhaps the disease afflicting East and Southeast Asia was another bacterial epidemic? However, experiments on chickens failed to reveal any bacterial agent.

  People who ate barley did not seem to be affected, which focused attention on the possible role of a diet of rice. This led to another puzzle: Poor-quality rice did not seem to increase the risk of acquiring the disease; if anything, those who were privileged to eat better-tasting rice were more susceptible. Adding to the confusion, the disease, known in the Dutch East Indies as beriberi, would also appear in areas of the world where rice was not eaten, such as Brazil and Canada.

  Two surgeons, Japanese and Dutch, working with different navies, found that the disease could be greatly alleviated by adding sources of protein to the diet.2 Although this was a relief for the navy men, rats fed on protein-adequate but otherwise nutritionally deficient diets failed to thrive, which disproved the protein hypothesis. Further experiments revealed that apart from protein, fat, and carbohydrates, rats required two additional substances for survival: a fat-soluble “vitamin A,” which could be obtained from cod liver oil and butter, and a “vitamin B,” which could be obtained from yeast, wheat germ, and nonfat milk powder. Further work isolated a vitamin B2 complex, and a vitamin B1 later named thiamine, which was able to prevent beriberi-like symptoms from appearing in rats and chickens. Purified crystals of thiamine were effective at restoring the health of rats and chickens even when administered in microscopic quantities.

  Thiamine, it turns out, is present in the bran of rice and is removed when rice is milled, heated to very high temperatures, or boiled and then rinsed. Milling boosted the storage life of rice and increased its palatability but inadvertently removed thiamine from the diet. By introducing steam-powered milling to Asia, the colonial powers greatly increased the misery wrought by beriberi, though the Chinese and Japanese also used rice mills and were beset with the disease. (Diets of cassava in Brazil and bread made from white flour and baking powder in isolated ports of Newfoundland also lacked thiamine and hence led to outbreaks of beriberi in those regions.)

  There were several relatively simple ways of resolving the thiamine deficiency. The first method was to parboil rice, a traditional way of cooking rice in parts of South Asia that involves soaking it and then boiling it in its husk. This aids in removal of the husk but also helps the germ to retain nutrients from the husk, including thiamine. During the epidemic of beriberi in Asia, populations that used parboiled rice were not affected. However, parboiled rice had a musty smell and a yellow-brown tinge and was not as fluffy as white rice, which made it unacceptable to East Asian populations.

  The second traditional way of preparing rice was to pound or stamp upon rice kernels, then employ sifting or other means to remove the husks. Because this method only incompletely removes the husk, the rice still retains thiamine in the “silver skin” surrounding the kernel. However, people used to eating white rice also rejected hand-milled rice as unpalatable, defeating the efforts of public health officials. A third method was to combine thiamine-rich beans with white rice, a practice still carried on in parts of Asia. Eventually, the disease was resolved definitively by adding thiamine directly to polished rice, but not before beriberi had inflicted much misery upon populations.3
