POISON IN THE NEGATIVE

Isaac Asimov

--------------------------------------------------------------------------------

Yesterday I sat down to write my 321st essay for Fantasy and Science Fiction. I called it “How High in the Sky” and it went swimmingly. I was pleased at the ease with which I worked out its construction. It practically wrote itself, and I scarcely had to look anything up. I whistled while I worked.

And then, when I reached the last page and launched into my climactic paragraphs, I thought to myself: Why does this suddenly sound familiar to me? Have I ever written an essay like this before?

As it happens, I am widely noted as a shy and reserved person of extraordinary modesty, but if there is one thing about myself of which I’m just a weentsy bit proud, it is my phenomenal memory. So I punched my recall button and up on my internal display screen came an essay called “The Figure of the Farthest.”

Hoping earnestly that my memory had missed fire, I looked it up. It turned out to be essay #182, published in the December 1973 issue, about 11 1/2 years ago. There it was. That earlier essay was essentially what I had just written.

I promptly tore up what I had spent most of the day writing and fell into disgruntled thought. What else would I write? For a while, I could think of nothing but subjects I had already dealt with. In fact, I was just coming to the horrifying conclusion that I had finally written everything there was to write, when my dear wife, Janet, entered my office with a concerned look on her face.

Goodness, I thought to myself, the sweet woman is so attuned to my moods that she could feel my misery, telepathically, from the other end of the apartment.

“What do you want?” I growled, lovingly.

She held out her hand. “You forgot to take your vitamins today,” she said.

Ordinarily I greet a sentiment like that with an amiable snarl and a few affectionate cursory remarks.
This time, however, I beamed and said, “Thank you so much, darling,” and swallowed the stupid pills with a big grin. You see, it occurred to me that I had never written an essay on vitamins.

I presume that human beings have always suffered from vitamin deficiencies, but this usually happened when they were undernourished or confined to a monotonous diet (or both) — as, for instance, if they were in prison, or in besieged cities, or were totally impoverished. In general, they were then considered to have died of hunger or of one of the many diseases with which human beings were afflicted. Such deaths were endured stoically in the good old days, especially if the dead and dying were varlets, knaves, churls and other members of the lower classes.

But then a brand-new peril began to strike sea-voyagers.

The diet on shipboard was generally monotonous and bad. There was no refrigeration in the good old days, and so there was no use storing anything on shipboard that spoiled or went moldy too easily. Consequently, the standard foods for sailors at sea were items such as hard tack and salt pork, which lasted practically forever, even at room temperature, for the good and sufficient reason that no self-respecting bacterium would touch the stuff.

Such items supplied the sailors with calories and very little else, but sea-voyaging in ancient and medieval times consisted largely of hugging the coast and making frequent stops during which sailors could get real food, so there was no problem.

But then, in the fifteenth century, came the Age of Exploration, and ships began making longer voyages during which they remained at sea for longer intervals. In 1497, the Portuguese explorer Vasco da Gama (1460-1524) circled Africa and completed the first successful voyage by sea from Portugal to India.
The voyage took eleven months and, by the time India was reached, many of the crew were suffering from scurvy, a disease characterized by bleeding gums, loosened teeth, aching joints, weakness and a tendency to bruise. It was not an unknown disease, for it was also suffered by those who were under long siege during wartime, and had been specifically remarked and commented upon since the time of the Crusades at least. This was the first occasion, however, in which the disease had appeared on shipboard.

Naturally, no one knew the cause of scurvy, any more than anyone at the time knew the cause of any disease. Nor did anyone suspect that the trouble might be dietary, since the natural belief was that food was food, and if it stopped the hunger pains, that was it.

Scurvy continued to plague sea-voyagers for two centuries after da Gama, and it was a serious matter. Sailors who were down with scurvy could not do their work, and the ships of early modern times were all too prone to sink in a storm even when the entire crew was able-bodied and hardworking.

And yet there were hints that scurvy could be handled.

The French explorer Jacques Cartier (1491-1557) sailed three times to North America between 1534 and 1542, exploring the Gulf of St. Lawrence and the St. Lawrence River, and laying the foundations for French dominion in what is now the Province of Quebec. On his second voyage, he wintered in Canada in 1535-36. Adding to the poor food on shipboard was the continued lack of anything else during the winter, so that twenty-five of Cartier’s men died of scurvy, and nearly a hundred others were disabled to one degree or another. According to the story, the Indians had the sufferers drink water in which pine-needles had been soaked, and there was a marked improvement as a result.

Then, in 1734, an Austrian botanist, J. G. H. Kramer, was with the Austrian army during the War of the Polish Succession.
He noticed that when scurvy appeared it was almost always among the rank and file of the soldiers, while the officers generally seemed to be immune. He noticed that the common soldiers lived monotonously on bread and beans, while the officers frequently had green vegetables to eat. When an officer didn’t eat his green vegetables, he was liable to get scurvy just as though he were a private. Kramer recommended that fruit and vegetables be included in the diet to prevent scurvy. No one paid attention. Food was food.

Scurvy was a particular problem for Great Britain, which depended on its navy to defend its shores and protect its commerce. Clearly, if its sailors tended to be disabled by scurvy, it was quite possible that the navy might, at some crucial moment, be unable to perform.

A Scottish physician, James Lind (1716-1794), had served in the British navy, first as a surgeon’s mate and then as a surgeon, between 1739 and 1748. That gave him an excellent opportunity to observe the absolutely harrowing conditions on board ships. (Samuel Johnson, in those days, said that no one would serve on board ship who had the wit to get into jail. He said that ships, as compared to jails, had less room, worse food, worse company, and offered the chance of drowning. During wartime in the eighteenth century, the British lost about 88 men to disease and desertion for every one killed in action.)

In 1747, Lind chose twelve men who were disabled with scurvy (there were, of course, plenty to choose from), divided them into groups of two and gave each pair a different dietary supplement. One pair had two oranges and a lemon each day for the six days the supplies held out, and this pair recovered from their illness with astonishing quickness.

Next came the task of convincing the British navy to feed the sailors citrus fruits regularly.
This was almost impossible to do because, as we all know, military men have a rigid quota of one new idea per lifetime, and the British admirals had apparently all had theirs already, at the age of five or thereabouts.

Then, too, Captain Cook (1728-1779), during his voyages of exploration, had lost only one man to scurvy. He obtained fresh vegetables at every opportunity, and he also added sauerkraut and malt to the rations. Somehow it was the sauerkraut and malt that got the credit, though they were not particularly effective, and that confused the issue.

Then the American Revolution came along, followed by the French Revolution, and the sense of crisis grew. In 1780 (the year before the climactic battle of Yorktown, when France, for one crucial moment, seized control of the western Atlantic) 2,400 British sailors, one-seventh of the total, were down with scurvy. In 1797, the British navy was put almost entirely out of action when the sailors, driven to despair by their inhuman treatment, rose in a massive mutiny. One of the demands of the mutineers was that they be given a ration of lemon juice. Apparently, the common sailors, not surprisingly, didn’t really enjoy scurvy and, even less surprisingly, had more brains than the admirals did.

The mutiny was put down by a judicious mixture of barbaric punishment and reluctant giving in. Since lemons from the Mediterranean were expensive, the British Admiralty settled on limes from the West Indies, which were not quite as effective, but were cheaper. British sailors have been called “limeys” ever since.

In this way, scurvy disappeared as a major threat on British vessels, but Lind was dead by then and could not savor the victory. And it was a purely local victory. The use of citrus fruits did not spread, and all through the nineteenth century scurvy flourished on land, especially among children who were no longer breast-fed.
Though enormous advances were made in medicine during that century, that actually worked against the proper treatment of scurvy.

As biochemical knowledge grew, for instance, it became plain that there were three chief classes of organic foodstuffs: carbohydrates, fats and proteins. It was recognized, at last, that food was not necessarily food, but that foods differed in nutritional quality. However, the difference seemed to rest entirely in the amount and type of protein that was present, and scientists tended to look no further.

In addition, the century saw the great discovery of the influence of microorganisms on disease. So important was this “germ theory” and so effectively did it lead to the control of various infectious diseases, that physicians began to think of all disease in terms of germs, and the possibility that diet had something to do with some diseases tended to be brushed aside.

Scurvy wasn’t the only disease that afflicted sailors and that could be countered by diet.

In the second half of the nineteenth century, Japan was westernizing itself and was rising to the status of a great power. To that end, she worked busily to build a modern navy. The Japanese sailors ate white rice, fish, and vegetables and were not troubled by scurvy. However, they fell prey to a disease called “beriberi,” from a Sinhalese word meaning “very weak.” The disease produced damage to the nerves, with the result that a person with beriberi felt weakness in his limbs and great lassitude. In the extreme, the sufferer died.

The Director-General of the Japanese navy was Kanehiro Takaki, and, in the 1880’s, he was greatly concerned over this matter. One-third of all the Japanese sailors were down with beriberi at any one time, but Takaki noted that the officers on board ship generally did not get beriberi, and that they had a less monotonous diet than the ordinary sailors did.
Takaki also noted that British sailors did not suffer from the disease, and there, too, the diet was different. In 1884, Takaki decided to produce greater variety in the diet and to add some British items to it. He replaced part of the polished rice with barley, and added meat and evaporated milk to the rations. Behold, beriberi disappeared in the Japanese navy. Takaki assumed that this was so because he had added more protein to the diet.

Again, as in the case of Lind’s treatment a century before, nothing further happened. Beriberi, like scurvy, was stopped on shipboard, but, again like scurvy, beriberi continued to flourish on land. To be sure, it is comparatively easy to alter the diet of a few sailors, who can be disciplined harshly for disobedience, while it is considerably more difficult to change the diet of millions of people, especially to a more expensive one, when they can barely manage to find enough of anything at all to eat. (Even today, when the cause and cure of beriberi are precisely known, it kills 100,000 people each year.)

Beriberi was endemic in the Dutch East Indies (now called Indonesia) in the nineteenth century, and the Dutch were naturally concerned over the matter. A certain Dutch physician, Christiaan Eijkman (1858-1930), had served in Indonesia and was invalided home with malaria. He had finally recovered and, in 1884, he agreed to return to Indonesia at the head of a team of physicians in order to study beriberi and to determine how best to deal with it.

Eijkman was convinced that beriberi was a germ disease, and so he took with him some chickens. He hoped to breed chickens in numbers and use them as experimental animals. He would infect them with the disease, isolate the germ, form an antitoxin perhaps, and work out an appropriate treatment to try on human patients.

It didn’t work. He could not infect the chickens, and, eventually, the bulk of the medical team returned to the Netherlands.
Eijkman stayed on, however, to serve as the head of a bacteriological laboratory, and continued to work on beriberi. Then, in 1896, quite suddenly, the chickens came down with a paralytic disease. The disease clearly affected the nerves (it was called “fowl polyneuritis” for that reason), and it seemed to the suddenly excited Eijkman that it was quite analogous to the human disease of beriberi, which was also, after all, a polyneuritis. The chickens, Eijkman felt, had finally caught the disease.

Now what he had to do was to locate the polyneuritis germ in the sick chickens, and prove it was infectious by transferring it to those that were yet well, then work out an antitoxin, and so on. Again, nothing worked. He could not locate a germ, he could not transfer the disease, and, worst of all, the chickens got well.

The very puzzled and disappointed Eijkman set about finding what had happened, and he discovered that just before the chickens had recovered, the hospital had received a new cook. The previous cook had at some point taken to feeding the chickens with leavings from the diet fed the patients at the hospital, a diet that was heavy on polished white rice — that is, rice with the outer brownish hulls scraped off. (The reason for the polishing is that the hulls contain oils that can grow rancid on standing. The polished rice, oil-free, remains edible for long periods of time.)

It was while they were being fed on these scraps that the chickens grew ill. Then the new cook arrived and was horrified at the thought of feeding food fit for people to mere chickens. He took to feeding them on unpolished rice, complete with hulls. That’s when they got better.

Eijkman realized, then, that beriberi was caused and cured by diet and was not a germ disease. There had to be something in the rice that caused the disease, and something in the hulls that cured it.
It wasn’t anything that occurred in substantial quantities, since the carbohydrate, fat and protein of rice were in themselves harmless. It had to be some very minute “trace” constituent.

Trace constituents capable of sickening and even killing people were, of course, known. They were called poisons, and Eijkman decided that there was a poison of some sort in the white rice. In the rice hulls, he thought, there was something that neutralized the poison.

This was rather the reverse of the truth, but the notion of trace substances in food that produced or cured sickness proved uncommonly fruitful. Whereas Lind’s and Takaki’s work, important as it was, produced no further consequences, Eijkman’s work produced a blizzard of subsequent experimentation and brought about an enormous revolution in the science of nutrition. It was for this reason that Eijkman was awarded a share in the 1929 Nobel Prize in physiology and medicine, for by that time the seminal nature of his work was abundantly recognized. Unfortunately, he was too ill by that time to go to Stockholm to collect the award in person, and the next year he died; but, unlike Lind, he had lived long enough to witness his own victory.

Eijkman returned to the Netherlands soon after he had made his great discovery, but a co-worker, Gerrit Grijns (1865-1944), remained in Indonesia. It was he who first announced the correct interpretation. In 1901 (the first year of the twentieth century) he presented arguments for believing that something in the rice hulls did not serve to neutralize a toxin, but was itself essential to human life. White rice resulted in disease, in other words, not because it possessed a small quantity of a poison, but because it lacked a small quantity of something vital. Beriberi was not merely a dietary disease; it was a dietary deficiency disease.

This was a revolutionary thought! For thousands of years, people had been well aware that one could die through the presence of a bit of poison.
Now, for the first time, they had to get used to the thought that death could result from the absence of a bit of something. That “something” was the opposite of a poison, and since its absence meant death, it was a poison in the negative, so to speak.

Once this fact was absorbed, it seemed likely that beriberi wasn’t the only dietary deficiency disease. Scurvy was an obvious example of another. In 1906, the English biochemist Frederick Gowland Hopkins (1861-1947) suggested that rickets, too, was a dietary deficiency disease. He was particularly successful in publicizing the concept and persuading the medical profession to accept it, so he shared the 1929 Nobel Prize with Eijkman. In 1912, the Polish biochemist, Casimir Funk (1884-1967), suggested pellagra as a fourth dietary deficiency disease.

Nutritionists naturally grew nervous over the business of some trace substances in food that represented life or death to an organism, including the human being. That was made to order for mysticism. What had to be done was to isolate the materials, determine exactly what they were, and find out how they worked. That would reduce matters to ordinary, prosaic biochemistry. It was not enough, in other words, to work with food and to say, “Lemon juice prevents scurvy and brown rice prevents beriberi.” That might be enough for people who would otherwise get those diseases, but it would not be enough for scientists.

The person who took the first step toward moving beyond the foods themselves was the American biochemist Elmer Verner McCollum (1879-1967). In 1907, he was working on the nutrition of cattle, varying the nature of the diets and analyzing the excreta. There was, however, so much food and excreta involved, and everything was so slow, that McCollum grew frustrated and weary. He decided that one had to work with smaller animals and more of them, so that studies could be made more quickly.
The knowledge thus gained could be applied to larger animals — as Eijkman had done with his chickens. McCollum moved beyond chickens. He established the first colony of white rats intended for nutritional studies, a device the rest of the field was quick to follow.

McCollum, furthermore, tried to break down foods into various components — sugar, starch, fat, protein — and feed these, separately and in combination, to the white rats, observing when their growth proceeded normally and when it slowed, or when abnormal symptoms of any sort appeared.

In 1913, for instance, he showed that when he used certain purified diets, on which rats did not grow normally, normal growth could be resumed if a little butterfat or egg-yolk fat were added. Nor was it the fat alone that did the trick, for when lard or olive oil was added to the diet, growth was not resumed. It had to be some trace substance present in some fats in small quantities but not in others.

The next year, McCollum reported that he could extract the trace substance from butter, by using various chemical procedures, and add it to olive oil. Thereafter, that olive oil could support growth if it were added to the rats’ diet. This offered strong support to the notion of trace substances necessary to life, and deprived it of any mystical aura. Whatever the trace was, it had to be a chemical substance, and one that could be dealt with by chemical methods.

It happens that living tissue is mostly water. In this watery medium, there are solid structures made up of inorganic material (bones, for instance), or large insoluble molecules (cartilage, for instance). In addition, there are small organic molecules, many of which are soluble in water and exist in solution, in consequence. Some tissue molecules, however, are not soluble in water. The chief of these are the various fats and oils, which clump together separately from the water. Certain other molecules which are not soluble in water dissolve in the fat instead.
Thus, we can group the small molecules in living tissue as either “water-soluble” or “fat-soluble.” Water-soluble substances in tissue can be soaked out in more water. Fat-soluble substances in tissue can be soaked out by making use of solvents such as ether or chloroform.

The trace substance essential to growth that was present in some fats and not others was clearly fat-soluble. McCollum could show, on the other hand, that whatever it was in rice hulls that prevented beriberi could be extracted with water and was therefore water-soluble. That was, in itself, conclusive proof that there was not one overall trace substance that permitted normal growth and prevented disease, but that there were at least two.

In the absence of any knowledge of the structure of these substances, McCollum had to use a simple code to distinguish them. By 1915, he was speaking of them as “fat-soluble A” and “water-soluble B” (giving his own discovery priority out of natural egocentrism). That started the fashion of using letters of the alphabet to identify these trace substances, a habit that continued for a quarter of a century until their chemical structure was known well enough for them to receive other names. Even now, however, the letter designations are frequently used not only by the lay public, but even by biochemists and nutritionists.

Meanwhile, though, another attempt at naming had been made. Funk, whom I mentioned earlier, was working in London on these trace substances. His chemical analyses had convinced him, in 1912, that whatever the trace substance was that prevented beriberi, it contained as part of its chemical structure an atom grouping consisting of a nitrogen atom and two hydrogen atoms (NH2). This grouping is chemically related to ammonia (NH3) and is therefore called an “amine” by chemists. Funk turned out to be right in this conclusion.
Funk then went on to speculate that if there were more than one of these trace substances, then all were probably one kind of amine or another. (He was wrong in this.) For that reason he called the trace substances, as a group, “vitamines”; that is (from the Latin) “life amines.”

It didn’t take many years for evidence to accumulate that some trace substances necessary to life did not have an amine group as part of their chemical structure and that “vitamine” was consequently a misnomer. There are many cases of this sort in science, and often the misnomer must remain if it has become too embedded in scientific writing and too ground into customary use to be given up. (“Oxygen” is a misnomer, for instance, and has been known to be such for nearly two centuries, but what can we do?)

In 1920, however, the English biochemist Jack Cecil Drummond (1891-1952) suggested that the final “e” of the word might at least be dropped, so that the “amine” reference need not be so overwhelmingly prominent. The suggestion was quickly adopted, and the trace substances have been known as “vitamins” ever since.

For that reason, “fat-soluble A” and “water-soluble B” came to be known as “vitamin A” and “vitamin B” — and I will carry on the story of what we can now call vitamins next month.