
Chapter 130

Cutaneous Changes in Nutritional Disease
Melinda Jen & Albert C. Yan

MICRONUTRIENTS
Fat-Soluble Vitamins
Historical Background
Bony deformities in children have been depicted in artifacts dating back to the first and second centuries c.e. The earliest classic descriptions of the clinical symptoms of rickets were published separately by Daniel Whistler in 1645 and by Francis Glisson in 1650; both described children in southwest England, where rickets was endemic, though at the time the etiology was not understood. As
early as 1824, cod liver oil was noted to help cure
rickets. In 1861, Trousseau of France hypothesized that rickets was caused by inadequate diet and insufficient sun exposure, but Theobald Palm was the first to study systematically the connection between rickets and lack of sun exposure. During the Industrial Revolution, poor diet, smog, tall buildings that obscured the sun, and inadequate time outdoors all contributed to the persistence of rickets in the United States and
Europe. By 1918, John Howland, Edward Park, and
Paul Shipley had used rat models of rickets to identify
the anti-rachitic molecule in cod liver oil, vitamin D.53
In the past, preterm infants were at risk for rickets because of an inadequate supply of calcium and phosphorus at the time of birth. When fed unsupplemented breast milk, their phosphorus levels would fall over the first 2 weeks of life, with a subsequent increase in alkaline phosphatase activity at 4 to 8 weeks indicating vitamin D deficiency when dietary intake was insufficient to meet daily requirements.
Since the introduction of breast milk fortifiers
and preterm formulas, the incidence of rickets of
prematurity has decreased dramatically. Rickets
also occurs with greater frequency during puberty,
when physical and behavioral changes modulate
vitamin D availability. The pubertal growth spurt
places greater demands on the calcium and phosphorus needed for bone growth, so increased levels of vitamin D are required. In settings where women are expected to cover their skin with the onset of puberty for cultural reasons, the scant opportunities for sun exposure increase the risk for vitamin D deficiency if dietary intake is insufficient to meet daily requirements.

Water-Soluble Vitamins
Historical Background
In 1735, Gaspar Casal noted that poor peasants in northern Spain were particularly affected by a skin disorder then referred to as mal de la rosa, so named because of the reddish, glossy rash on the dorsum of the hands and feet. He noted that these peasants ate mainly maize and rarely ate fresh meat. François Thiéry published the first description of pellagra in 1755, but it was Francesco Frapolli who coined the name pellagra, after the Italian words pelle, meaning skin, and agra, meaning rough.
During the nineteenth century, the cause of many diseases was attributed to infectious agents, and pellagra, too, was thought to be related to some infectious microorganism. In 1922, while working for the United States Public Health Service, Joseph Goldberger first suggested that pellagra might be caused by an amino acid deficiency and that a dietary pellagra-preventative factor existed. Pellagra was endemic in the southern United States in the early 1900s because of a ubiquitous diet consisting principally of corn bread, molasses, and pork fat. Beginning in 1914, Goldberger
worked with two orphanages and one sanitarium
in the South. By increasing the amount of fresh
animal meat and vegetables available at the three
institutions, Goldberger was able to significantly
decrease the incidence of pellagra. He went on to
investigate pellagra among male prisoners. Using
12 prisoners from the Mississippi State Penitentiary and offering pardons as an incentive to participate, he successfully demonstrated that pellagra could be induced by a monotonous, cereal-based diet low in calories and protein. In order to disprove the allegation that pellagra was caused by an infectious agent, he exposed 16 volunteers to the blood, urine, feces, and epidermal scales of pellagrous patients and showed that none of them subsequently developed pellagra.94 Goldberger
died before he identified the pellagra-preventative
factor, but in 1937, Conrad Elvehjem identified
niacin as the antipellagra factor.95



Vitamin B12 (Cobalamin)
Historical Background
Some controversy exists as to who documented
the earliest report of pernicious anemia. Thomas
Addison is often credited with the first published
description in 1855, but others, such as James Combe and Anton Biermer, also deserve some measure of
credit. Pernicious anemia was a recognized entity in
the late 1800s, and strides were made in the early
twentieth century toward a better understanding
of pernicious anemia. As understanding of other
diseases such as pellagra and beriberi began to
emerge, researchers began to wonder if pernicious
anemia was also caused by a dietary deficiency.
George Whipple published results in 1920 from studies he performed on anemic dogs. Whipple induced anemia in dogs by bleeding them; after trials of different foods to restore hemoglobin levels, he observed the greatest improvement with liver. George Minot and William Murphy won the 1934 Nobel Prize in Physiology or Medicine along
with Whipple for their work in documenting that
meat and liver could be employed to treat anemic
patients.107,108
Around the same time, William Castle used controls and patients with pernicious anemia to prove that an essential interaction between meat (extrinsic factor) and a component of normal human gastric secretions (intrinsic factor) was required for resolution of the anemia. Finally, in 1948, Karl Folkers' team successfully crystallized vitamin B12, and in 1964 Philippus Hoedemaeker showed that Castle's intrinsic factor was produced by the gastric parietal cell. The well-known Schilling test to assess intrinsic factor deficiency was described by Robert Schilling in 1953.109
Vitamin C (Ascorbic Acid)
Historical Background
Scurvy, the disease of vitamin C deficiency, has
been documented since antiquity. Ancient Greek,
Roman, and Egyptian texts describe cases of scurvy.
The Ebers papyrus, which dates to about 1552 b.c.e., documents cases of scurvy that were successfully treated with onions. Scurvy plagued sailors for hundreds of years before its cause was fully understood.
One of the earliest reports dates to Vasco da Gama's 1497 expedition to India. On this journey, many of the crew members developed scurvy, but da Gama noted that their symptoms improved after they traded for fresh oranges with locals in East Africa. After the supply of fresh oranges was depleted, da Gama observed that the symptoms returned, so at their next landfall they again sought locals with oranges to cure their disease. Other ships were not as fortunate as da Gama's crew. George Anson's pursuit of Spanish ships in 1740-1744 began with more than 1,400 crew members. By the end of the 4-year journey, he returned with only 145 of his original crew; only four had been killed in enemy action, and over 1,300 had died from scurvy.116
In 1747, James Lind devised one of the earliest
clinical trials to investigate crew members from the
HMS Salisbury afflicted with scurvy. Lind selected
12 seamen with severe scurvy, divided them into six groups of two, and assigned each group a different dietary therapy: hard
apple cider, elixir of vitriol, vinegar, sea water, two
oranges and one lemon daily for 6 days, and a
medicinal paste. Lind published his findings in his Treatise of the Scurvy in 1753, in which he concluded that oranges and lemons were the most effective treatment for scurvy.117 Although Lind's findings were published in 1753, it was not until 1793 that lemon juice became a required daily provision on long sea voyages, on the advice of Gilbert Blane.118
As the incidence of scurvy decreased at sea, several epidemics occurred on land. The Great Potato Famine of 1845-1848, World War I, and World War II were times of nutritional impoverishment. The armies of the Crimean War and the American Civil War, Arctic explorers, and California gold rush communities suffered from scurvy in large numbers.
In the late nineteenth and early twentieth centuries,
an explosion in cases of infantile scurvy occurred
in the United States because of the trend toward
heated milk and proprietary foods. As shown by
James Lind, heating of vitamin C decreased its biological activity. Alfred Hess reported that pasteurization of milk likewise decreased its vitamin C concentration.119 Proprietary food at that time was
of poor nutritional quality. Interestingly, most of
the affected infants were from affluent families who
thought they were providing superior nutrition for
their children.120 Vitamin C was isolated by Albert Szent-Györgyi in 1927, when he purified a compound found in high concentrations in the adrenal cortex, oranges, cabbages, and paprika.

