*
While beriberi was ravaging urban areas in East Asia, physicians in Europe were coming across a novel disease whose symptoms included blisters, loss of appetite, depression, and a preoccupation with suicide. The disease was dubbed “pellagra,” a term denoting rough skin in the Lombard dialect of Italian. Unlike beriberi, pellagra tended to afflict the poor rather than the rich. The geography of the two diseases was also reversed: most cases of pellagra were recorded in Europe, while beriberi primarily afflicted East and Southeast Asia.
American doctors probably spotted the first cases of pellagra in the nineteenth century, but since the disease was believed to be nonexistent on their side of the Atlantic, they refrained from announcing their observations. In 1902, a physician in Atlanta identified the disease in an impoverished farmer. Reports of pellagra multiplied rapidly. In 1906, there were eighty-eight cases of pellagra at Mount Vernon Hospital for the Colored Insane in Alabama. Eighty of these patients were female, and more than half died. Mysteriously, none of the nurses at the hospital contracted the disease. Other mental institutions reported outbreaks, and the epidemic spread as far west as Illinois. By 1912, around twenty-five thousand cases had been diagnosed, with four in ten victims dying. As with beriberi, expert opinion initially focused on microbial agents. Some believed that pellagra arose from eating spoiled, moldy corn, and several states enacted corn-inspection laws in response. Pellagra was also thought to be infectious, and so its victims, who invariably came from the most economically disadvantaged quarters, were shunned like lepers and denied access to hospitals.4
In February 1914, the U.S. surgeon general invited a talented Jewish Hungarian American epidemiologist, Dr. Joseph Goldberger, to take over the Public Health Service’s faltering pellagra investigations. By the time of his appointment at age forty, he had made a name for himself studying—and surviving—epidemic diseases. Shortly after launching his investigations, Dr. Goldberger surmised that the disease was not communicable, since health workers in close association with pellagra victims never acquired the disease. A more likely explanation lay in the classic “Three M’s” diet of poor southerners: meat (fatty pork), molasses, and meal (cornmeal). Orphans and mental institution patients who ate monotonous meals built around the Three M’s got pellagra, but workers at the same institutions who had access to more varied fare avoided the disease.
Dr. Goldberger carried out an experiment with volunteers at a Mississippi prison, who were offered pardons by the governor in exchange for their participation. Over the course of six months, more than half the volunteers who were fed a diet based on cornbread and cornstarch developed skin lesions (starting with the genitalia), while the remaining subjects developed less striking but still noticeable manifestations of the same disease. Although the experiment was carried out with great meticulousness by Dr. Goldberger, both he and the Mississippi governor were roundly criticized for its unorthodox nature. Moreover, not only did the results run contrary to the conventional line that pellagra was infectious, but the nutritional hypothesis also drew attention to the poverty of the South, which provoked the ire of proud southern politicians and patriots.5
Dr. Goldberger continued to try to convince critics that pellagra was noncommunicable, even going to the extraordinary extent of injecting himself, his wife, and colleagues with blood from pellagra victims, and swallowing skin scales, feces, and dried urine from pellagra sufferers, wrapped up in dough. Consuming this concoction yielded nausea and diarrhea, but no pellagra. Unable to sway his critics, but convinced by his own observations and efforts that amino acid deficiency rather than spoiled corn was key, Dr. Goldberger then tried to locate the missing amino acid. He died of renal cancer in 1929 before he could complete his life’s mission.
In the end, it turned out that his hypothesis was correct: Corn was found to be deficient in tryptophan, which the human body can metabolize into niacin (also known as vitamin B3). By the 1940s, fortification of foods with vitamin B3 eliminated pellagra as a menace to poor Americans, though not before some 3 million had suffered from the disease, resulting in approximately 100,000 deaths. In Italy, pellagra cases peaked in the late nineteenth century among poor rural peasants in the north confined to monotonous diets of corn, then faded as economic conditions improved through emigration (which raised local wages and brought remittances from emigrant workers), industrialization, improvement of crop yields, and falling prices for wheat (which was substituted for niacin-deficient corn). Pellagra disappeared from Italy by the 1930s.6
Industrial milling of American corn began in the early 1900s, which stripped the corn germ and thereby boosted the shelf life of processed corn. Unfortunately, the germ is also where niacin resides. During the epidemic, rates of pellagra were highest in areas adjacent to railroads, where people had ready access to stores and industrially milled cornmeal. In rural areas, people relied instead on traditional processing techniques, such as water-driven stone-milling, which preserved more of the corn germ and reduced the risk of pellagra. Indigenous groups in the Americas who domesticated corn over hundreds to thousands of years knew how to prepare their sacred crop for safe consumption. Through trial and error, and copying neighbors, tribes that relied heavily on corn learned to cook it with an alkaline substance such as lime or wood ashes, which helped to increase the availability of tryptophan and niacin in corn and thus helped them avoid pellagra. Another method of preserving niacin, practiced by the Tohono O’odham and the Hopi Indians, was to roast immature corn, which contains higher concentrations of niacin than mature corn.7
*
While beriberi was crippling swaths of East Asia and pellagra was picking up steam in southern Europe, a different disease was ravaging northern European cities. In 1634, fourteen deaths in England were attributed to a condition that left children with deformed spines and chests and crooked arms and legs. The disease had shown up in the Balkans in 9000 BC, in early Egypt, and in China around 300 BC, but it finally developed into a full-scale epidemic in industrialized urban European locales in the eighteenth century. Nor was this just a disease of children: elderly women in northern European and North American cities and towns also suffered from high rates of bone fractures.8
Since prevailing medical theory centered on “humours,” the condition known as rickets was blamed on a cold distemper. Herring, a rich potential source of vitamin D, was banned as a “cold” food. Peasants found their own cure for the disease by consuming raven livers (the liver is a key organ in vitamin D metabolism). Fishermen around northern Europe had been taking fish liver as a household remedy for centuries. Swallowing cod liver oil was another matter, for it was prepared by allowing livers to spoil until the oil could be skimmed from the surface. The stench, understandably, was nauseating.9
As medical practitioners continued to debate the merits of cod liver oil, sunlight, bloodletting, bone breaking and resetting, and racks and slings designed to stretch out children’s bodies, rickets accompanied settlers migrating to the New World. Between 1910 and 1961, 13,807 deaths in the United States were officially attributed to rickets, mostly in infants less than a year old.10 Dark-skinned children were especially vulnerable, particularly in northern cities. Finally, between 1919 and 1922, a series of experiments conducted by researchers in Vienna verified the efficacy of cod liver oil and sunlight in preventing and treating rickets. Thereafter, supplementation with cod liver oil and vitamin-D-fortified milk, along with deliberate exposure to sunlight, gradually brought rickets under control, although cases continue to occur even to the present day.
*
Although the scourges of beriberi, pellagra, and rickets are largely behind us, there are important lessons to learn from the history of these diseases. In each case, a major rethinking—a paradigm shift—was required before progress could be made. In the case of rickets, the old theory of humors made European medical experts skeptical that herring, a “cold” food, could be beneficial, even though we now recognize that herring is a rich source of vitamin D and would have helped to alleviate rickets—a much better cure than the racks that were used to stretch out the deformed bodies of afflicted children. In the case of beriberi and pellagra, medical opinion clung to the notion that these diseases were caused by infectious germs, thereby delaying the search for nutritional causes by crucial years.
Once the role of vitamins was understood, progress was rapid. Adding vitamins like B1 (thiamine), B3 (niacin), and D to factory-produced foods was straightforward; such measures required no alterations in habits, and because the vitamins could be produced cheaply, no one protested the additions. Moreover, the companies that produced the vitamins profited handsomely, and thus the disasters of beriberi, pellagra, and rickets were overcome in the way in which capitalistic societies operate most comfortably: with the scent of profit, the comfort of cheap goods, and minimal prodding from public authorities. Unfortunately, the scent of profit and the lure of quick vitamin fixes continue to dazzle the public: The U.S. supplement industry registered a strapping $28 billion in sales in 2010 despite a lack of evidence that vitamin and antioxidant supplements benefit today’s well-nourished population.11 The lure of “superfoods” is similarly dubious.
Today, industrialized societies face several new epidemic diseases that are being studied for potential treatments; however, because the basis of these diseases clashes strongly with past medical understanding of how the body works, there is considerable confusion and disagreement among medical experts and the public. We’ll consider two paradigm-breaking clusters of diseases: sunlight-deprivation diseases and allergic diseases. It now appears that myopia (i.e., nearsightedness) and allergic diseases are triggered by radical lifestyle shifts undertaken in recent centuries and decades. Understanding myopia forces us to reconsider the role of sunlight in guiding the development of the eye, and reining in allergic diseases compels fresh thinking on hygiene and bacterial warfare, as well as the influence of sunlight and vitamin D on the immune system.
*
From the point of view of evolution, nearsightedness is a great mystery. Back in hunter-gatherer times, anyone unable to spy a stalking predator or a tasty morsel in the forest would have been at a tremendous disadvantage. Myopia was first described by the ancient Greeks, but in the two thousand years since, no one has come up with a good explanation for why myopia develops in some people and not others. The old theory was that engaging in too many near-work activities, like reading or writing (or these days, using a computer or smartphone or playing with handheld video games), resulted in prolonged tension of the eye muscles and eventually permanent myopia. This theory was proposed at least as far back as 1866 and seemed to make sense, since children first develop myopia during the early school years, and myopia is more common in white-collar occupations and rises with education level. However, empirical studies show mixed results about the alleged effect of near work on vision, and the use of various types of lenses to correct for near-work effects has so far been unable to stop myopia from progressing in children. Meanwhile, the prevalence of myopia is rising in regions like East Asia. For example, in Singapore, the prevalence of myopia nearly doubled over a two-decade span, reaching 43 percent among young men.12
In striking contrast to the muddled empirical results of the near-work theory of myopia, back-to-back studies in three different countries (Australia, the United States, and Singapore) found that children who played outside more frequently were less nearsighted.13 The most solid explanation for this pattern is that sunlight is protective against nearsightedness. The pattern has been replicated in controlled experiments with chickens and monkeys, and also in a study that looked specifically at ultraviolet exposure and myopia. The reason sunlight helps prevent myopia could be the greater depth of focus and clearer retinal image achieved in bright sunlight, or the stimulation of dopamine release from the retina by sunlight. Sunlight’s protective effect may help explain why myopia rates are lower in Europe than in East Asia: Blue eyes have very little melanin in the iris compared to brown eyes and hence may permit greater intensity or different wavelengths of light to enter the eye. Additional studies will be required to develop a complete account of the mechanics underlying myopia, but in the meantime, some people will find ways of increasing lighting inside homes so that it more closely mimics the tremendous intensity of natural sunlight; more simply, they could opt to let their kids play outside more.14
In addition to lowering the risk of nearsightedness, bright light triggers serotonin production in the human brain and combats the misery of seasonal affective disorder (SAD) and depression. Among patients admitted for treatment of depression at a Canadian psychiatric ward, those who by chance received one of the sunny east-facing rooms checked out of the hospital nearly three days sooner than those who were allotted one of the dimmer rooms. The antidepressive effects of sunlight may go beyond shortening hospital stays: Among patients admitted to the cardiac intensive care unit for heart attacks, those who stayed in one of the dim rooms had a higher chance of dying than those who received one of the sunny rooms. Over a four-year stretch, 13.2 percent of the patients who stayed in one of the four dim north-facing rooms died, compared to 7.7 percent of those who received one of the four sunny south-facing rooms.15
Sunlight or geographical latitude effects have also been noted in the incidence of schizophrenia and autism, ailments that have befuddled researchers up until now. Colder northern countries carry the heaviest burden of each of these diseases (and in the case of schizophrenia, children of darker-skinned immigrants are particularly susceptible), leading researchers to investigate whether melatonin disruption, vitamin D deficiency, or some other factor underlies the association of these diseases with a paucity of sunlight.16
Left to its own devices, the skin regulates production of vitamin D from ultraviolet light (UVB, to be specific) to manageable levels, just as our body does with all our other hormones. However, there are two major problems with relying solely on our skin to provide vitamin D. First, there’s the problem of our beautiful birthday suits. Human skin pigmentation evolved over thousands of years to provide the optimal balance of vitamin D production, protection against cancer-inducing ultraviolet light, and protection against damage to folate levels (folate, or vitamin B9, is easily damaged by ultraviolet radiation). When humans migrated out of Africa into Europe and East Asia, skin types in these latter two regions independently evolved to become lighter, powerful evidence that sunlight was an important factor influencing mortality. However, you can’t change skin color like a coat, so when Europeans started to populate sunny colonies in the Americas and Oceania beginning a few hundred years ago, and people from the tropics, like my parents, moved in the opposite direction, to frigid climes, the wonderfully adapted skin color suddenly became a liability. My Caucasian friends get sunburn from the Californian, Australian, and Southeast Asian sun, while my tropical immigrant friends and I languish from sunlight deficiency in northern cities like Ottawa, Umeå, and Sapporo.17
The second problem with relying on skin for our vitamin D is our pattern of sun exposure. Some people are able to tan, which is the skin’s way of adapting to the rise and fall of ultraviolet rays over the seasons. These days, when sun-starved office workers dash outside to play on weekends, then spend the rest of the week working inside, the alternation between scorching and seclusion sets us up for sunburn and an increased risk of cutaneous melanoma, the most aggressive form of skin cancer. To avoid skin cancer, people slap on sunscreen, but it’s unclear whether this practice helps or harms, since sunscreen may give people a false sense of security and encourage them to spend more time outside, and the pattern of sunscreen wearing off and then being reapplied may exacerbate the dangerous intermittency of sun exposure. On top of that, the depletion of the ozone layer may be increasing our exposure to ultraviolet radiation beyond the range that our skin is adapted to handle. (Ozone pollution in big cities may have the opposite effect, screening out UV light before it reaches us.)
Not surprisingly, many people, for cultural or health reasons, decide to forgo the hazards of ultraviolet radiation altogether by seeking refuge under parasols, long-sleeved clothing, or heavy sunscreen, but then they suffer from vitamin D deficiency, and we’re back to square one. Others worry about vitamin D deficiency and pop vitamin D pills, but the problem is that no one knows exactly what dosage of vitamin D is healthy or how vitamin D supplements influence our immune system and our risk for diseases like cancer. In my own case, I love Canada, especially during the quiet, languid summers, but the mismatch between my skinny brown body and the rigors of a Canadian winter is so uncomfortable that I spend as much time in the tropics as my schedule and meager budget allow.18
*
Most allergic diseases were unheard of only decades ago. In the West, there have been two waves of allergies. Asthma surfaced first, about fifty years ago, and reached its highest rate in the 2000s. Hot on its heels, food allergies have besieged Western countries. In one of the most ambitious screening tests for food allergies yet conducted, more than 10 percent of infants in Melbourne were shown to have a food allergy to peanuts, eggs, or sesame seeds, a higher rate than pediatricians and scientists had previously suspected.19 In Asia, rates of asthma are now rapidly rising, and it is expected that food allergies and eczema will follow soon afterward. Curiously, some of the foods that trigger allergic reactions in Asia are turning out to be rather novel. For instance, in Singapore, the most common trigger of anaphylaxis (the rapid onset of severe allergic symptoms, including hives and difficulty breathing) is edible bird’s nest, a Chinese delicacy made from the saliva of cave swifts. Buckwheat is a common allergen in Japan, while the same is true of chestnuts in South Korea and chickpeas in India. As in the West, eggs, milk, and sesame are also commonly reported to induce allergic symptoms.20
