Pharmaceutical Century
Patents & Potions (1800s to 1919)
We live today in a world of drugs. Drugs for pain, drugs for
disease, drugs for allergies, drugs for pleasure, and drugs for mental
health. Drugs
that have been rationally designed; drugs that have been synthesized in the
factory
or purified from nature. Drugs fermented and drugs engineered. Drugs that
have
been clinically tested. Drugs that, for the most part, actually do what
they are
supposed to. Effectively. Safely.
By no means was it always so.
Before the end of the 19th century, medicines were concocted with a mixture
of
empiricism and prayer. Trial and error, inherited lore, or mystical
theories were the
basis of the world's pharmacopoeias. The technology of making drugs was
crude
at best: Tinctures, poultices, soups, and teas were made with water- or
alcohol-based extracts of freshly ground or dried herbs or animal products
such as bone, fat, or even pearls,
and sometimes from minerals best left in the ground, mercury among the
favored. The line between a poison and a medicine was hazy at best: in the 16th
century, Paracelsus declared that the only difference between a medicine and a
poison was the dose. All medicines
were toxic. It was cure or kill.
Rational treatments and rational drug design of the era were based on
either the doctrine of humors (a
pseudoastrological form of alchemical medicine oriented to the fluids of
the body: blood, phlegm and black and
yellow bile) or the doctrine of signatures. (If a plant looks like a
particular body part, it must be designed by
nature to influence that part. Lungwort, for example, was considered good
for lung complaints by theorists of
the time because of its lobe-shaped leaves.) Neither theory, as might be
expected, guaranteed much chance of
a cure.
Doctors and medicines were popular, despite their failures. As pointed out
by noted medical historian Charles
E. Rosenberg, a good bedside manner and a dose of something soothing (or
even nasty) reassured the patient
that something was being done, that the disease was not being ignored.
Blood and mercury
By the first part of the 19th century, the roots of modern pharmacy had
taken hold with a wave of heroic
medicine. Diseases were identified by symptom, and attacking the symptom as
vigorously as possible was the
high road to health.
Bloodletting dominated the surgeon's art, and dosing patients with powerful
purgatives and cathartics became
the order of the day in an attempt to match the power of the disease with
the power of the drug. Bleed them till
they faint. (It is difficult to sustain a raging fever or pounding pulse
when there is too little blood in the body, so
the symptoms, if not what we would call the disease, seemed to vanish.)
Dose them with calomel till they drool
and vomit. (Animals were thought to naturally expel toxins this way.)
Cleanse both stomach and bowels
violently to remove the poisons there.
Certainly these methods were neither pleasant nor very effective at curing
patients already weakened by
disease. George Washington died in misery from bloodletting; Abraham
Lincoln suffered chronic mercury
poisoning and crippling constipation from his constant doses of "blue
mass." The cure was, all too often,
worse than the disease.
In the second half of the 19th century, things changed remarkably as the
industrial revolution brought
technological development to manufacturing and agriculture and inspired the
development of medical
technology.
Spurred in part by a reaction against doctors and their toxic nostrums,
patent medicines and in particular
homeopathy (which used extreme dilutions of otherwise toxic compounds)
became popular and provided an
antidote to the heroic treatments of the past. Not helpful, but at least
harmless for the most part, these new
drugs became the foundation of a commodity-based medicine industry that
galvanized pharmacist and
consumer alike. Technology entered in the form of pill and powder and
potion making.
Almost by accident, a few authentic drugs based on the wisdom and herbal
lore of the past were developed:
quinine, digitalis, and cocaine. Ultimately, these successes launched the
truly modern era. The century ended
with the development of the first of two synthesized drugs that represent
the triumph of chemistry over folklore
and technology over cookery. The development of antipyrine in 1883 and
aspirin in 1897 set the stage for the
next 10 decades of what we can now look back on as the
Pharmaceutical Century. With new
knowledge of microbial pathogens and the burgeoning wisdom of vaccine
technology, the first tentative steps
were taken toward placing medicines on a truly scientific foundation.
From these scattered seeds, drug technology experienced remarkable if
chaotic growth in the first two
decades of the 20th century, a period that can be likened to a weedy
flowering of quackery and patent
medicines twining about a hardening strand of authentic science and
institutions to protect and nourish it.
Staging the Pharmaceutical Century
In the latter half of the 19th century, numerous beneficent botanicals took
center stage in the world's pharmacopoeias. Cocaine was first extracted from
coca leaves in 1860; salicylic acid, the forerunner of aspirin, was extracted
from willow bark in 1874 for use as a painkiller.
Quinine and other alkaloids had long
been extracted from China bark; but an antifebrile subcomponent, quinoline,
was not synthesized in the lab
until 1883 by Ludwig Knorr. The first truly synthetic pain reliever,
antipyrine, was produced from quinoline
derivatives. Digitalis from foxglove and strophanthin from an African
dogbane were both botanicals purified for
use against heart disease. The opium poppy provided a wealth of pain
relievers: opium, morphine, codeine,
and heroin.
But it was not until the birth of medical microbiology that the true
breakthroughs occurred, and science, rather than empiricism, took center stage
in the development of pharmaceuticals.
Murderous microbes
The hallmark of 19th-century medicine has to be the microbial theory of
disease. The idea that infectious
diseases were caused by microscopic living agents provided an understanding
of the causes and the potential
cures for ills from anthrax to whooping cough.
Technology made the new framework possible. The brilliance of European lens
makers and microscopists,
coupled with the tinkering of laboratory scientists who developed the
technologies of sterilization and the media
and methods for growing and staining microbes, provided the foundation of
the new medical science that
would explode in the 20th century. These technologies offered proof and
intelligence concerning the foe against
which pharmaceuticals, seen thereafter as weapons of war, could be tested
and ultimately designed.
In 1861, the same year that the American Civil War began, Ignaz Semmelweis
published his research on the
transmissible nature of puerperal (childbed) fever. His theories of
antisepsis were at first vilified by doctors who
could not believe their unwashed hands could transfer disease from corpses
or dying patients to healthy
women. But eventually, with the work of Robert Koch, Joseph Lister, and
Louis Pasteur adding proof of the
existence and disease-causing abilities of microorganisms, a worldwide
search for the microbial villains of a
host of historically deadly diseases began.
In 1879, as part of the new technology, Bacterium coli was discovered (it
was renamed Escherichia after
its discoverer, Theodor Escherich, in 1919). It quickly became the
quintessential example of an easily grown,
safe bacterium for laboratory practice. New growth media, new sterile
techniques, and new means of isolating
and staining bacteria rapidly developed. The ability to grow pathogens in
culture proved remarkably useful.
Working with pure cultures of the diphtheria bacillus in Pasteur's laboratory
in 1888, Emile Roux and Alexandre Yersin first isolated the deadly toxin that
causes most of diphtheria's lethal effects.
One by one over the next several decades, various diseases revealed their
microbial culprits to the so-called
microbe-hunters.
Initially, most American physicians were loath to buy into germ theory,
seeing it as a European phenomenon
incompatible with the "truth" of spontaneous generation and as a threat to
the general practitioner from the
growing cadre of scientifically trained laboratory microbiologists and
specialist physicians.
Anti-contagionists such as the flamboyant Colonel George E. Waring Jr.,
pamphleteer, consulting engineer,
and phenomenally effective warrior in the sanitation movement, initially
held sway. Filth was considered the
source of disease. A host of sewage projects, street-cleaning regimens, and
clean water systems swept urban
areas across the United States, with obvious benefits. Ultimately, the germ
theory of infectious diseases had to
be accepted, especially as the theoretical foundation behind the success of
the sanitation movement. And with
the production of vaccines and antitoxins, older medical frameworks fell by
the wayside, though rural
American physicians were still recommending bleeding and purgatives as
cures well into the first few decades
of the 20th century.
Victorious vaccines
The most significant outgrowth of the new germ theory, and the one
that created the greatest demand for new technologies for
implementation, was the identification and production of the new
immunologicals: drugs that are, in essence, partially purified
components or fractions of animal blood. In 1885, Pasteur
developed an attenuated rabies vaccine, a safe source of active
immunity (immunity developed against a form or component of the
disease-causing microorganism by the body's own immune system).
Vaccines would be developed against a variety of microorganisms in
rapid succession over the next several decades.
But active immunity was perhaps not the most impressive result of
the immunologicals. Antitoxins (antibodies isolated against disease
organisms and their toxins from treated animals), when injected into
infected individuals, provided salvation from otherwise fatal diseases.
This technology began in 1890 when Emil von Behring and
Shibasaburo Kitasato isolated the first antibodies against tetanus and,
soon after, diphtheria. In 1892, Hoechst Pharma developed a
tuberculin antitoxin. These vaccines and antitoxins would form the
basis of a new pharmaceutical industry.
Perhaps as important as the development of these new immunologics was the
impetus toward standardization
and testing that a new generation of scientist-practitioners such as Koch
and Pasteur inspired. These scientists' credibility and success rested upon
stringent control (and ultimately, government regulation) of the new
medicines. Several major institutions sprang up in Europe and the United
States to manufacture and/or inspect
in bulk the high volume of vaccines and antitoxins demanded by a desperate
public suddenly promised new
hope against lethal diseases. These early controls helped provide a bulwark
against contamination and abuse.
Such control would not be available to the new synthetics soon to dominate
the scene with the dawn of
scientific chemotherapy.
Medicinal chemistry
Parallel (and eventually linked) to developments in biology, the chemist's
art precipitously entered the
medicinal arena in 1856 when Englishman William Perkin, in an abortive
attempt to synthesize quinine,
stumbled upon mauve, the first synthesized coal tar dye. This discovery led
to the development of many
synthetic dyes but also to the realization that some of these dyes had
therapeutic effects. Synthetic dyes, and
especially their medicinal side effects, helped put Germany and
Switzerland in the forefront of both organic
chemistry and synthesized drugs. The dye-drug connection was a two-way
street: The antifever drug
Antifebrin, for example, was derived from aniline dye in 1886.
The chemical technology of organic synthesis and analysis seemed to offer
for the first time the potential to
scientifically ground the healer's art in a way far different from the
cookery of ancient practitioners. In 1887,
phenacetin, a pain reliever, was developed by Bayer specifically from
synthetic drug discovery research. The
drug eventually fell into disfavor because of its side effect of kidney
damage. Ten years later, also at Bayer,
Felix Hoffmann synthesized acetylsalicylic acid (aspirin). First marketed in
1899, aspirin has remained the most
widely used of all the synthetics.
Many other new technologies also enhanced the possibilities for drug
development and delivery. The advent of
the clinical thermometer in 1870 spearheaded standardized testing and the
development of the antifever drugs.
In 1872, Wyeth invented the rotary tablet press, which was critical to the
mass marketing of drugs. By 1883, a
factory was producing the first commercial drug (antipyrine) in a
ready-dosed, prepackaged form. With the
discovery of X-rays in 1895, the first step was taken toward X-ray
crystallography, which would become the
ultimate arbiter of complex molecular structure, including proteins and
DNA.
The Pharmaceutical Century
Not only did the early 1900s bring the triumph of aspirin as an inexpensive
and universal pain reliever (the first of its kind), but the science of
medicine exploded with a new understanding
of the human body and its
systems. Although not immediately translated into drugs, these discoveries
would rapidly lead to a host of new
pharmaceuticals and a new appreciation of nutrition as a biochemical
process and hence a potential source of
drugs and drug intervention.
Of equal if not more importance to the adoption and implementation of the
new technologies was the rise of
public indignation: a demand for safety in food and medicines that began in
Europe and rapidly spread to the
United States.
Tainted food and public furor
The developing understanding of germ theory and the increasing availability
of immunologics and chemical
nostrums forced recognition that sanitation and standardization were
necessary for public health and safety.
First in Europe, and then in the United States, the new technologies led to
the growth of new public and
semipublic institutions dedicated to producing and/or assaying the
effectiveness and safety of pharmaceuticals
and foods in addition to those dedicated to sanitation and public disease
control. Unfortunately, the prevalence
of disease among the poor created a new line of prejudice against these
presumed unsanitary subclasses.
In the United States, where the popular sanitation movement could now be
grounded in germ theory, this fear
of contagion manifested among the developing middle classes was directed
especially against
immigrants, called "human garbage" by pundits such as American social critic
Herbert George in 1883. This
led to the Immigration Act of 1891, which mandated physical inspection of
immigrants for diseases of mind
and body, any number of which could be considered cause for quarantine or
exclusion. Also in 1891, the
Hygienic Laboratory (founded in 1887 and the forerunner of the National
Institutes of Health) moved from
Staten Island (New York City) to Washington, DC, a sign of its growing
importance.
That same year, the first International Sanitary Convention was
established. Although restricted to efforts to
control and prevent cholera, it would provide a model of things to come in
the public health arena. In 1902, an
International Sanitary Bureau (later renamed the Pan American Sanitary
Bureau and then the Pan American
Sanitary Organization) was established in Washington, DC, and became the
forerunner of today's Pan American Health Organization, which also serves as
the World Health Organization's Regional Office for the
Americas.
Fears of contagion on the one hand and poisoning on the other, resulting
from improperly prepared or stored
medicines, led to the 1902 Biologics Control Act, which regulated the
interstate sale of viruses, serums,
antitoxins, and similar products.
One of the significant outgrowths of the new progressive approach to
solving public health problems with
technological expertise and government intervention was the popularity and
influence of a new class of
journalists known as the Muckrakers. Under their impetus, and as the result
of numerous health scandals, the 1906 U.S. Pure Food and Drugs Act, after
years of planning by U.S. Department of Agriculture (USDA) researchers such as
Harvey Wiley, was passed easily. The act established the USDA's Bureau of
Chemistry as
the regulatory agency. Unfortunately, the act gave the federal government
only limited powers of inspection
and control over the industry. Many patent medicines survived this first
round of regulation.
The American Medical Association (AMA) created a Council on Pharmacy and
Chemistry to examine the
issue and then established a chemistry laboratory to lead the attack on the
trade in patent medicines that the
Pure Food and Drugs Act had failed to curb. The AMA also published New and
Nonofficial Remedies annually
in an effort to control drugs by highlighting serious issues of safety and
inefficacy. This publication prompted
rapid changes in industry standards.
International health procedures continued to be formalized
as well: L'Office International d'Hygiène Publique (OIHP) was
established in Paris in 1907, with a permanent secretariat and a
permanent committee of senior public health officials. Military and
geopolitical concerns would also dominate world health issues. In
1906, the Yellow Fever Commission was established in Panama to
help with U.S. efforts to build the canal; in 1909, the U.S. Army
began mass vaccination against typhoid.
Nongovernmental organizations also rallied to the cause of medical
progress and reform. In 1904, for example, the U.S. National
Tuberculosis Society was founded (based on earlier European
models) to promote research and social change. It was one of many
groups that throughout the 20th century were responsible for much
of the demand for new medical technologies to treat individual
diseases. Grassroots movements such as these flourished. Public
support was often behind the causes. In 1907, Red Cross volunteer
Emily Bissell designed the first U.S. Christmas Seals (the idea started
in Denmark). The successful campaign provided income to the
Tuberculosis Society and a reminder to the general public of the
importance of medical care. Increased public awareness of diseases
and new technologies such as vaccination, antitoxins, and later,
magic bullets, enhanced a general public hunger for new cures.
The movement into medicine of government and semipublic
organizations such as the AMA and the Tuberculosis Society
throughout the latter half of the 19th and beginning of the 20th
centuries set the stage for a new kind of medicine that was regulated,
tested, and public. Combined with developments in technology and
analysis that made regulation possible, public scrutiny slowly forced
medicine to come out from behind the veil of secret nostrums and
alchemical mysteries.
The crowning of chemistry
It was not easy for organized science, especially chemistry, to take
hold in the pharmaceutical realm. Breakthroughs in organic synthesis
and analysis had to be matched with developments in biochemistry,
enzymology, and general biology. Finally, new medicines could be
tested for efficacy in a controlled fashion using new
technologies: laboratory animals, bacterial cultures,
chemical analysis, clinical thermometers, and clinical trials, to name a
few. Old medicines could be debunked
using the same methods, with public and nongovernmental organizations such
as the AMA providing impetus.
At long last, the scientific community began to break through the fog of
invalid information and medical
chicanery to attempt to create a new pharmacopoeia based on
chemistry, not caprice.
The flowering of biochemistry in the early part of the new century was key,
especially as it related to human
nutrition, anatomy, and disease. Some critical breakthroughs in metabolic
medicine had been made in the
1890s, but they were exceptions rather than regular occurrences. In 1891,
myxedema was treated with sheep
thyroid injections. This was the first proof that animal gland solutions
could benefit humans. In 1896, Addison's disease was treated with chopped-up
adrenal glands from a pig. These test
treatments provided the starting
point for all hormone research. Also in 1891, a pair of agricultural
scientists developed the Atwater-Rosa
calorimeter for large animals. Ultimately, it provided critical baselines
for human and animal nutrition studies.
But it wasn't until the turn of the century that metabolic and nutritional
studies truly took off. In 1900, Karl
Landsteiner discovered the first human blood groups: O, A, and B. That same
year, Frederick Hopkins
discovered tryptophan and demonstrated in rat experiments that it was an
essential amino acid, the first
discovered. In 1901, fats were artificially hydrogenated for storage for
the first time (providing a future century
of heart disease risk). Eugene L. Opie discovered the relationship of
islets of Langerhans to diabetes mellitus,
thus providing the necessary prelude to the discovery of insulin. Japanese
chemist Jokichi Takamine isolated
pure epinephrine (adrenaline). And E. Wildiers discovered a new substance
indispensable for the
development of yeast. Growth substances such as this eventually became
known as "vitamines" and later,
vitamins.
In 1902, proteins were first shown to be polypeptides, and the AB blood
group was discovered. In 1904, the
first organic coenzyme, cozymase, was discovered. In 1905, allergies were
first described as a reaction to
foreign proteins by Clemens von Pirquet, and the word hormone was coined.
In 1906, Mikhail Tswett
developed the all-important technique of column chromatography. In 1907,
Ross Harrison developed the first
animal cell culture using frog embryo tissues. In 1908, the first
biological autoradiograph was made, of a frog. In 1909, Harvey Cushing
demonstrated the link of pituitary hormone to gigantism.
Almost immediately after Svante August Arrhenius and Søren
Sørensen demonstrated in 1909 that pH could be measured,
Sørensen pointed out that pH can affect enzymes. This discovery
was a critical step in the development of a biochemical model of
metabolism and kinetics. So many breakthroughs of medical
significance occurred in organic chemistry and biochemistry in the
first decade of the Pharmaceutical Century that no list can do more
than scratch the surface.
Making magic bullets
It was not the nascent field of genetics, but rather a maturing
chemistry that would launch the most significant early triumph of the
Pharmaceutical Century. Paul Ehrlich first came up with the "magic
bullet" concept in 1906. (Significant to the first magic bullet's ultimate
use, in this same year, August von Wassermann developed his syphilis
test only a year after the bacterial cause was determined.) However,
it wasn't until 1910 that Ehrlich's arsenic compound 606, marketed
by Hoechst as Salvarsan, became the first effective treatment for
syphilis. It was the birth of chemotherapy.
With the cure identified and the public increasingly aware of the
subject, it was not surprising that the progressive U.S. government
intervened in the public health issue of venereal disease. The
Chamberlain-Kahn Act of 1918 provided the first
federal funding specifically designated for controlling venereal disease.
It should also not be a surprise that this
attack on venereal disease came in the midst of a major war. Similar
campaigns would be remounted in the
1940s.
The fall of chemotherapy
Salvarsan provided both the promise and the peril of chemotherapy. The
arsenicals, unlike the immunologicals,
were not rigidly controlled and were far more subject to misprescription
and misuse. (They had to be
administered in an era when injection meant opening a vein and percolating
the solution into the bloodstream
through glass or rubber tubes.) The problems were almost insurmountable,
especially for rural practitioners.
The toxicity of these therapeutics and the dangers associated with using
them became their downfall. Most
clinicians of the time thought the future was in immunotherapy rather than
chemotherapy, and it wasn't until the
antibiotic revolution of the 1940s that the balance would shift.
Ultimately, despite the manifold breakthroughs in biochemistry and
medicine, the end of the Teens was not a
particularly good time for medicine. The influenza pandemic of 1918-1920
clearly demonstrated the inability of
medical science to stand up against disease. More than 20 million people
worldwide were killed by a flu that
attacked not the old and frail but the young and strong. This was a disease
that no magic bullet could cure and
no government could stamp out. Both war and pestilence set the stage for
the Roaring Twenties, when many
people were inclined to eat, drink, and make merry as if to celebrate the
optimism of a world ostensibly at
peace.
Still, a burgeoning science of medicine promised a world of wonders
yet to come. Technological optimism and industrial expansion
provided an antidote to the malaise caused by failed promises
revealed in the first two decades of the new century.
But even these promises were suspect as the Progressive Era drew
to a close. Monopoly capitalism and renewed conservatism battled
against government intervention in health care as much as in the
economy, and such opposition became a familiar refrain. The continued explosive
growth of cities undermined many of the earlier gains in sanitation and
hygiene and brought a host of new imported diseases. The already bad
health and nutrition of both the urban and rural poor around the
world grew worse with the economic fallout of the war.
Many people were convinced that things would only get worse
before they got better.
The Pharmaceutical Century had barely begun.
SALVING WITH SCIENCE (1920s & 1930s)
Throughout the 1920s and 1930s, new technologies and new
science intersected as physiology led to the discovery of vitamins and to
increasing knowledge of hormones and body chemistry. New drugs and new
vaccines flowed from developments started in the previous decades. Sulfa
drugs
became the first of the antibacterial wonder drugs promising
broad-spectrum
cures. Penicillin was discovered, but its development had to await new
technology (and World War II, which hastened it). New instruments such as
the
ultracentrifuge and refined techniques of X-ray crystallography paralleled
the
development of virology as a science. Isoelectric precipitation and
electrophoresis
first became important for drug purification and analysis. Overall, the
medical
sciences were on a firmer footing than ever before. This was the period in
which
the Food and Drug Administration (FDA) gained independence as a regulatory
agency. And researchers reveled in the expanding knowledge base and their
new
instruments. They created a fusion of medicine and machines that would
ultimately
be known as molecular biology.
The assault on disease
Just as pharmaceutical chemists sought magic bullets for myriad diseases
in the first two decades of the
century, chemists in the 1920s and 1930s expanded the search for solutions
to the bacterial and viral infections
that continued to plague humankind. Yet, the first great pharmaceutical
discovery of the 1920s addressed not
an infectious disease but a physiological disorder.
Diabetes mellitus is caused by a malfunction of the pancreas, resulting in
the failure of that gland to produce
insulin, the hormone that regulates blood sugar. For most of human history,
this condition meant certain death.
Since the late 19th century, when the connection between diabetes and the
pancreas was first determined,
scientists had attempted to isolate the essential hormone and inject it
into the body to control the disorder.
Using dogs as experimental subjects, numerous researchers had tried and
failed, but in 1921, Canadian
physician Frederick Banting made the necessary breakthrough.
Banting surmised that if he tied off the duct to the pancreas of a living
dog and waited until the gland atrophied
before removing it, there would be no digestive juices left to dissolve the
hormone, which was first called isletin.
Beginning in the late spring of 1921, Banting worked on his project at the
University of Toronto with his
assistant, medical student Charles Best. After many failures, one of the
dogs whose pancreas had been tied off
showed signs of diabetes. Banting and Best removed the pancreas, ground it
up, and dissolved it in a salt
solution to create the long-sought extract. They injected the extract into
the diabetic dog, and within a few
hours the canine's blood sugar returned to normal. The scientists had
created the first effective treatment for
diabetes.
At the insistence of John Macleod, a physiologist at the University of
Toronto who provided the facilities for
Bantings work, biochemists James Collip and E. C. Noble joined the
research team to help purify and
standardize the hormone, which was renamed insulin. Collip purified the
extract for use in human subjects, and
enough successful tests were performed on diabetic patients to determine
that the disorder could be reversed.
The Connaught Laboratories in Canada and the Eli Lilly Co. in the United
States were awarded the rights to
manufacture the drug. Within a few years, enough insulin was being produced
to meet the needs of diabetics
around the world. Although Banting and Best had discovered the solution to
a problem that had troubled
humans for millennia, it was Lilly's technical developments (such as the
use of an isoelectric precipitation step)
that enabled large-scale collection of raw material, extraction and
purification of insulin, and supplying of the
drug in a state suitable for clinical use. Only after proper bulk
production and/or synthesis techniques were
established did insulin and many other hormones discovered in the 1920s and
1930s (such as estrogen, the
corticosteroids, and testosterone) become useful and readily available to
the public. This would continue to be
the case with most pharmaceutical breakthroughs throughout the century.
Although the isolation and development of insulin was a critically
important pharmaceutical event, diabetes was
by no means the greatest killer of the 19th and early 20th centuries. That
sordid honor belonged to infectious
diseases, especially pneumonia, and scientists in the 1920s and 1930s
turned with increasing success to the
treatment of some of the most tenacious pestilences. Paul Ehrlich
introduced the world to chemotherapy in the
early years of the 20th century, and his successful assault on syphilis
inspired other chemists to seek miracle
drugs and magic bullets. But what was needed was a drug that could cure
general bacterial infections such
as pneumonia and septicemia. Bacteriologists began in the 1920s to
experiment with dyes that were used to
stain bacteria to make them more visible under the microscope, and a
breakthrough was achieved in the
mid-1930s.
Sulfa drugs and more
The anti-infective breakthrough occurred at Germany's I. G. Farben, which
had hired Gerhard Domagk in the
late 1920s to direct its experimental pathology laboratory in a drive to
become a world leader in the
production of new drugs. Domagk performed a series of experiments on mice
infected with streptococcus
bacteria. He discovered that some previously successful compounds killed
the bacteria in mice but were too
toxic to give to humans. In 1935, after years of experimentation, Domagk
injected an orange-red azo dye
called Prontosil into a group of infected mice. The dye, which was
primarily used to color animal fibers, killed
the bacteria, and, most importantly, all the mice survived. The first
successful use of Prontosil on humans
occurred weeks later, when Domagk gave the drug to a desperate doctor
treating an infant dying of bacterial
infection. The baby lived, but this did not completely convince the
scientific community of the drug's efficacy.
Only when 26 women similarly afflicted with life-threatening infections
were cured during clinical trials in
London in late 1935 did Prontosil become widely known and celebrated for
its curative powers.
The active part of Prontosil was a substance called sulfanilamide, so
termed by Daniele Bovet of the Pasteur Institute, who determined
that Prontosil broke down in the body and that only a fragment of the
drugs molecule worked against an infection. After the discovery of
the active ingredient, more than 5000 different sulfa drugs were
made and tested, although only about 15 ultimately proved to be of
value. In 1939, Domagk received the Nobel Prize in Physiology or Medicine.
Sulfanilamide was brought to the United States by Perrin H. Long
and Eleanor A. Bliss, who used it in clinical applications at Johns
Hopkins University in 1936. It was later discovered that the sulfa
drugs, or sulfonamides, did not actually kill bacteria outright, like
older antiseptics, but halted the growth and multiplication of the
bacteria, while the body's natural defenses did most of the work.
Certainly the most famous antibacterial discovered in the 1920s and
1930s was penicillin, which was found through almost sheer
serendipity. In the years after World War I, Alexander Fleming was
seeking better antiseptics, and in 1921 he found a substance in
mucus that killed bacteria. After further experimentation, he learned
that the substance was a protein, which he called lysozyme. Although
Fleming never found a way to purify lysozymes or use them to treat
infectious diseases, the discovery had implications for his later
encounter with penicillin because it demonstrated the existence of
substances that are lethal to certain microbes and harmless to human
tissue.
Flemings major discovery came almost seven years later. While
cleaning his laboratory one afternoon, he noticed large yellow
colonies of mold overgrowing a culture of staphylococcus bacteria
on an agar plate. Fleming realized that something was killing the
bacteria, and he proceeded to experiment with juice extracted from the mold
by spreading it on agar plates
covered with more bacteria. He found that even when the juice was highly
diluted, it destroyed the bacteria.
Calling the new antiseptic penicillin, after the Latin term for brush,
Fleming had two assistants purify the mold
juice, but he performed no tests on infected animal subjects. He published
a paper in 1929 discussing the
potential use of penicillin in surgical dressings but went no further. It
wasn't until the 1940s that penicillin was
taken up by the medical community.
Another important achievement in antibacterial research occurred in the
late 1930s, when Rene Dubos and
colleagues at the Rockefeller Institute for Medical Research inaugurated a
search for soil microorganisms
whose enzymes could destroy lethal bacteria. The hope was that the enzymes
could be adapted for use in
humans. In 1939, Dubos discovered a substance extracted from a soil
bacillus that cured mice infected with
pneumococci. He named it tyrothricin, and it is regarded as the first
antibiotic to be established as a therapeutic
substance. The 1920s and 1930s were also interesting times in malaria
research. The popular antimalarial drug
chloroquine was not formally recognized in the United States until 1946,
but it had been synthesized 12 years
before at Germany's Bayer Laboratories under the name Resochin.
While much of pharmaceutical science concentrated on finding answers to the
problems posed by bacterial
infections, there was some significant work done on viruses as well.
Viruses were identified in the late 19th
century by Dutch botanist Martinus Beijerinck, but a virus was not
crystallized until 1935, when biochemist
Wendell Stanley processed a ton of infected tobacco leaves down to one
tablespoon of crystalline
powder: tobacco mosaic virus. Unlike bacteria, viruses proved to be highly
resistant to assault by
chemotherapy, and thus antiviral research during the era did not yield the
successes of antibiotic research.
Most clinical research was dedicated to the search for vaccines and
prophylactics rather than treatments. Polio
was one of the most feared scourges threatening the world's children in the
1920s and 1930s, but no real
breakthroughs came until the late 1940s. Scientifically, the greatest
progress in antiviral research was probably
made in investigations of the yellow fever virus, which were underwritten
by the Rockefeller Foundation in the
1930s. But as in the case of polio, years passed before a vaccine was
developed. In the meantime, efforts by
the public health services of many nations, including the United States,
promoted vaccination as part of the
battle against several deadly diseases, from smallpox to typhoid fever.
Special efforts were often made to
attack the diseases in rural areas, where few doctors were available.
Vitamins and deficiency diseases
Whereas the medical fight against pernicious bacteria and viruses brings to
mind a military battle against
invading forces, some diseases are caused by internal treachery and
metabolic deficiency rather than external
assault. This notion is now commonplace, yet in the early 20th century it
was hotly contested.
For the first time, scientists of the era isolated and purified food
factors, or "vitamines," and understood that
the absence of vitamins has various detrimental effects on the body,
depending on which ones are in short
supply. Vitamins occur in the body in very small concentrations; thus, in
these early years, determining which
foods contained which vitamins, and analyzing their structure and effects
on health, was complex and
time-intensive. Ultimately, scientists discerned that vitamins are
essential for converting food into energy and
are critical to human growth. As a result of this research, by the late
1930s, several vitamins and vitamin
mixtures were used for therapeutic purposes.
The isolation of specific vitamins began in earnest in the second decade of
the 20th century and continued into
the 1920s and 1930s. Experiments in 1916 showed that fat-soluble vitamin A
was necessary for normal
growth in young rats; in 1919, Harry Steenbock, then an agricultural
chemist at the University of Wisconsin,
observed that the vitamin A content of vegetables varies with the degree of
vegetable pigmentation. It was later
determined that vitamin A is derived from the plant pigment carotene. Also
in 1919, Edward Mellanby proved
that rickets is caused by a dietary deficiency. His research indicated that
the deficiency could be
overcomeand rickets prevented or curedby adding certain fats to the diet,
particularly cod-liver oil. At
first, Mellanby thought that vitamin A was the critical factor, but further
experimentation did not support this
hypothesis. Three years later, Elmer McCollum and associates at Johns
Hopkins University offered clear proof
that vitamin A did not prevent rickets and that the antirachitic factor in
cod-liver oil was the fat-soluble vitamin
D. The research team soon developed a method for estimating the vitamin D
content in foods.
Experiments on vitamin D continued into the mid-1920s. The most significant
were the projects of Steenbock
and Alfred Hess, working in Wisconsin and New York, respectively, who
reported that antirachitic potency
could be conveyed to some biological materials by exposing them to a
mercury-vapor lamp. The substance in
food that was activated by ultraviolet radiation was not fat, but a
compound associated with fat called
ergosterol, which is also present in human skin. The scientists surmised
that the explanation of the antirachitic
effect of sunlight is that ultraviolet rays form vitamin D from ergosterol
in the skin, which then passes into the
blood. The term vitamin D-1 was applied to the first antirachitic substance
to be isolated from irradiated
ergosterol, and we now know that there are several forms of vitamin D.
Other important vitamin studies took place in the 1920s and 1930s that had
implications for the future of
pharmaceutical chemistry. In 1929, Henrik Dam, a Danish biochemist,
discovered that chicks fed a diet that
contained no fat developed a tendency toward hemorrhage. Five years later,
Dam and colleagues discovered
that if hemp seeds were added to the chicks' diet, bleeding did not occur.
The substance in the seeds that
protected against hemorrhage was named vitamin K, for koagulation vitamin.
In 1935, Armand Quick and
colleagues at Marquette University reported that the bleeding often
associated with jaundiced patients was
caused by a decrease in the blood coagulation factor prothrombin. This
study was complemented by a report
by H. R. Butt and E. D. Warner that stated that a combination of bile salts
and vitamin K effectively relieved
the hemorrhagic tendency in jaundiced patients. All of these scientists'
work pointed to the conclusion that
vitamin K was linked to the clotting of blood and was necessary for the
prevention of hemorrhage, and that
vitamin K was essential for the formation of prothrombin.
Dam and the Swiss chemist Paul Karrer reported in 1939 that they had
prepared pure vitamin K from green
leaves. In the same year, Edward Doisy, a biochemist at Saint Louis
University, isolated vitamin K from alfalfa,
determined its chemical composition, and synthesized it in the laboratory.
Vitamin K was now available for
treating patients who suffered from blood clotting problems. Dam and Doisy
received the 1943 Nobel Prize in
Physiology or Medicine for their work.
With the advent of successful research on vitamins came greater commercial
exploitation of these substances.
In 1933, Tadeus Reichstein synthesized ascorbic acid (vitamin C), making it
readily available thereafter. The
consumption of vitamins increased in the 1930s, and popular belief held
them to be almost magical.
Manufacturers, of course, did not hesitate to take advantage of this
credulity. There was no informed public
regulation of the sale and use of vitamins, and, as some vitamins were
dangerous in excess quantities, this had
drastic results in isolated cases. Water-soluble vitamins such as vitamin C
easily flow out of the body through
the kidneys, but the fat-soluble vitamins, such as A, D, and K, could not
so easily be disposed of and might
therefore prove especially dangerous. Many physicians of the era did
nothing to discourage popular
misconceptions about vitamins, or harbored relatively uncritical beliefs
themselves.
The FDA and federal regulation
The threat posed by unregulated vitamins was not nearly as
dangerous as the potential consequences of unregulated drugs. Yet
legislation was enacted only when drug tragedies incensed the public
and forced Congress to act. After the 1906 Pure Food and Drug
Act, there was no federal legislation dealing with drugs for decades,
although the American Medical Association (AMA) did attempt to
educate physicians and the public about pharmaceuticals. The AMA
published books exposing quack medicines, gradually adopted
standards for advertisements in medical journals, and in 1929,
initiated a program of testing drugs and granting a Seal of
Acceptance to those meeting its standards. Only drugs that received
the seal were eligible to advertise in AMA journals.
Dangerous drugs were still sold legally, however, because safety
testing was not required before marketing. Well into the 1930s,
pharmaceutical companies still manufactured many 19th and early
20th century drugs that were sold in bulk to pharmacists, who then
compounded them into physicians' prescriptions. But newer drugs,
such as many biologicals and sulfa drugs (after 1935), were
packaged for sale directly to consumers and seemed to represent the
future of drug manufacturing.
In 1937, an American pharmaceutical company produced a liquid
sulfa drug. Attempting to make sulfanilamide useful for injections, the
company mixed it with diethylene glycol, the toxic chemical now
used in automobile antifreeze. Ultimately sold as a syrup called Elixir
of Sulfanilamide, the drug concoction was on the market for two
months, in which time it killed more than 100 people, including many
children, who drank it.
Under existing federal legislation, the manufacturer could be held
liable only for mislabeling the product. In response to this tragedy
and a series of other scandals, Congress passed the Food, Drug, and
Cosmetic Act of 1938, which banned drugs that were dangerous
when used as directed, and required drug labels to include directions
for use and appropriate warnings. The act also required new drugs to
be tested for safety before being granted federal government
approval and created a new category of drugs that could be dispensed to a
patient only at the request of a
physician. Before the act was passed, patients could purchase any drug,
except narcotics, from pharmacists.
The Food and Drug Administration (the regulatory division established in
1927 from the former Bureau of
Chemistry) was given responsibility for implementing these laws.
The 1938 legislation is the basic law that still regulates the
pharmaceutical industry. New manufacturing and
mass-marketing methods demanded changes in federal oversight, because a
single compounding error can
cause hundreds or even thousands of deaths. Yet before the 1940s, the law
did not require drugs to be
effective, only safe when used as directed. It was not until the 1940s that
the Federal Trade Commission
forced drug manufacturers to substantiate claims made about their products,
at least those sold in interstate
commerce.
Instrumentation
Although the 1920s and 1930s were especially fruitful for "soft"
technologies such as antibiotics and vitamin
production, these decades also produced several significant "hard"
technologies: scientific instruments that
transformed pharmaceutical R&D. Initially, one might think of the electron
microscope, which was developed
in 1931 in Germany. This early transmission electron microscope, invented
by Max Knoll and Ernst Ruska,
was essential for the future of pharmaceutical and biomedical research, but
many other critical instruments
came out of this era. Instrument production often brought people from
disparate disciplines together on
research teams, as physical chemists and physicists collaborated with
biochemists and physiologists. New
research fields were created along the boundary between chemistry and
physics, and in the process, many new
instruments were invented or adapted to molecular phenomena and
biomolecular problems.
One of the scientists who worked under the aegis of Warren
Weaver, research administrator for the natural sciences at the
Rockefeller Foundation, was Swedish chemist Theodor
Svedberg. Svedberg's early research was on colloids, and he
began to develop high-speed centrifuges in the hope that they
might provide an exact method for measuring the distribution of
particle size in solutions. In 1924, he developed the first
ultracentrifuge, which generated a centrifugal force up to 5000
times the force of gravity. Later versions generated forces
hundreds of thousands of times the force of gravity. Svedberg
precisely determined the molecular weights of highly complex
proteins, including hemoglobin. In later years, he performed
studies in nuclear chemistry, contributed to the development of
the cyclotron, and helped his student, Arne Tiselius, develop
electrophoresis to separate and analyze proteins.
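(A rough illustration, not from the original text and with an assumed rotor radius: the relative centrifugal force of a spinning rotor follows from elementary mechanics,

\[ \mathrm{RCF} = \frac{\omega^{2} r}{g} = \frac{(2\pi N/60)^{2}\, r}{9.81\ \mathrm{m\,s^{-2}}}, \]

so at an assumed radius of r = 5 cm, a rotor turning at roughly N = 9500 rpm already develops about 5000 times gravity, and the later rotors reaching hundreds of thousands of g required correspondingly higher speeds.)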
Another essential instrument developed in this era was the pH
meter with a glass electrode. Kenneth Goode first used a vacuum
triode to measure pH in 1921, but this potentiometer was not
coupled to a glass electrode until 1928, when two groups (at
New York University and the University of Illinois) measured pH
by using this combination. Rapid and inexpensive pH
measurement was not a reality until 1934, however, when Arnold
Beckman of the California Institute of Technology and corporate
chemist Glen Joseph substituted a vacuum tube voltmeter for a galvanometer
and assembled a sturdy
measuring device with two vacuum tubes and a milliammeter. The portable pH
meter was marketed in 1935
for $195. In the world of medical applications, the electrometer dosimeter
was developed in the mid-1920s to
assess exposure to ionizing radiation for medical treatment, radiation
protection, and industrial exposure
control. For clinical dosimetry and treatment planning, an ionization
chamber connected to an electrometer was
valued for its convenience, versatility, sensitivity, and reproducibility.
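(A short aside using standard textbook values, not drawn from the original text: the reason a sensitive voltmeter sits at the heart of such a pH meter is that a glass electrode develops a potential that varies with pH according to the Nernst relation,

\[ E = E_{0} - \frac{2.303\,RT}{F}\,\mathrm{pH} \approx E_{0} - (0.059\ \mathrm{V})\,\mathrm{pH} \quad \text{at } 25\ ^{\circ}\mathrm{C}, \]

only about 59 mV per pH unit, delivered through the enormous resistance of the glass membrane; hence the need for vacuum-tube electrometer circuits rather than a simple galvanometer.)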
It was not simply the invention of instruments, but the way research was
organized around them, that made the
1920s and 1930s so fertile for biochemistry. Weaver was involved in
encouraging and funding much of this
activity, whether it was Svedberg's work on molecular evolution or Linus
Pauling's use of X-ray diffraction to
measure bond lengths and bond angles, and his development of the method of
electron diffraction to measure
the architecture of organic compounds. Inventing and developing new
instruments allowed scientists to
combine physics and chemistry and advance the field of pharmaceutical
science.
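(As a brief aside built on standard physics rather than on the original text, diffraction measurements of this kind rest on Bragg's law,

\[ n\lambda = 2d\sin\theta, \]

so, for example, copper K-alpha radiation with \(\lambda \approx 1.54\) Å reflected in first order at \(\theta = 22.5^{\circ}\) implies a spacing \(d = \lambda/(2\sin\theta) \approx 2.0\) Å, right at the scale of chemical bond lengths, which is why X rays can resolve molecular architecture.)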
Radioisotopes
Another powerful technological development that was refined during the
1920s and 1930s was the use of
radioactive forms of elements (radioisotopes) in research. Hungarian chemist
Georg von Hevesy introduced
radioisotopes into experimental use in 1913, tracing the behavior of
nonradioactive forms of selected elements;
he later used a radioisotope of lead to trace the movement of lead from
soil into bean plants.
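(A hedged, illustrative calculation, not from the original text: the sensitivity of the tracer method follows from the decay law. A sample containing N atoms of a radioisotope with half-life \(t_{1/2}\) has activity

\[ A = \lambda N, \qquad \lambda = \frac{\ln 2}{t_{1/2}}, \]

so a single nanomole of \(^{32}\mathrm{P}\) (\(t_{1/2} \approx 14.3\) days, about \(6 \times 10^{14}\) atoms) emits on the order of \(10^{8}\) detectable decays per second; vanishingly small chemical quantities of a tracer can therefore be followed through soil, plants, or the bloodstream.)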
The radioactive tracer was an alternative to more arduous methods of
measurement and study. By the late
1920s, researchers applied the tracer technique to humans by injecting
dissolved radon into the bloodstream to
measure the rate of blood circulation. Yet there were limits to the use of
radioisotopes, because
some important elements in living organisms do not possess naturally
occurring radioisotopes.
This difficulty was overcome in the early 1930s, when medical researchers
realized that the cyclotron, or atom
smasher, invented by physicist Ernest Lawrence, could be used to create
radioisotopes for treatment and
research. Radiosodium was first used in 1936 to treat several leukemia
patients; the following year,
Lawrence's brother, John, used radiophosphorus to treat the same disease. A
similar method was used to treat
another blood disease, polycythemia vera, and soon it became a standard
treatment for that malady. Joseph
Hamilton and Robert Stone at the University of California, Berkeley,
pioneered the use of cyclotron-produced
radioisotopes for treating cancer in 1938; and one year later, Ernest
Lawrence constructed an even larger
atom smasher, known as the medical cyclotron, which would create
additional radioisotopes in the hopes of
treating cancer and other diseases.
Thus began the age of nuclear medicine, in which the skills of
physicists were necessary to produce materials critical to biochemical
research. The new use of radioisotopes was a far cry from the quack
medicines of the period that used the mystique of radioactivity to
peddle radium pills and elixirs for human consumption, although in
their ignorance, many legitimate doctors also did far more harm than
good. The cost of producing radioisotopes was high, as cyclotrons
often operated continuously at full power, requiring the attention of
physicists around the clock. The human and financial resources of
physics departments were strained in these early years, which made
the contributions of foundations essential to the continuation of these
innovative projects. Ultimately, as the century wore on, radioisotopes
came into routine use, the federal government's role increased
immensely, and radioisotopes were mass-produced in the reactors of
the Atomic Energy Commission. After World War II, nuclear
medicine occupied a permanent place in pharmaceutical science.
By the end of the 1930s, the work of scientists such as Svedberg, Tiselius,
Banting, Dubos, and Domagk, along
with the vision of administrators such as Weaver, set the stage for the
unparalleled developments of the
antibiotic era to come. In addition, World War II, already beginning in
Europe, spurred a wealth of research
into perfecting known technologies and developing new ones: instruments and
processes that would have a
profound impact on the direction that biology and pharmacology would
ultimately take.
ANTIBIOTICS AND ISOTOPES (1940s)
As the American public danced to the beat of the Big Band
era, so did pharmacology swing into action with the upbeat tone of the
dawning antibiotic era.
The latter is the nickname most commonly used for the 1940s among
scientists
in the world of biotechnology and pharmaceuticals. The nickname is more
than
justified, given the numerous impressive molecules developed during the
decade. Many firsts were accomplished in the drug discovery industry of
the
Forties, but no longer in a serendipitous fashion as before. Researchers
were
actually looking for drugs and finding them.
To appreciate the events that paved the way for the advancements of the
1940s, one need only look back to 1939, when Rene Dubos of the
Rockefeller Institute for Medical Research discovered and isolated an
antibacterial compound, tyrothricin, from the soil microbe Bacillus brevis,
capable of destroying Gram-positive bacteria. Before this discovery,
penicillin and the sulfa drugs had been discovered accidentally. Dubos
planned his experiment to search soil for microorganisms that could destroy
organisms related to diseases. "It was a planned experiment. It wasn't a
chance observation," explains H. Boyd Woodruff, who worked on the
penicillin project at Merck in the early 1940s, "[and because the experiment]
had been successful, it sort of opened the field in terms of looking at soil
for microorganisms that kill disease organisms."
Fleming's serendipity
The most famous example of serendipity in the 20th century has to be the
discovery of penicillin by Alexander
Fleming as discussed in the previous chapter. Although Fleming observed the
antibiotic properties of the mold
Penicillium notatum in 1928, it was another 12 years before the active
ingredient, penicillin, was isolated and
refined.
Of course, as in most of history, there are some inconsistencies. Fleming
was not the first scientist to observe
the antibacterial action of penicillin. In 1896 in Lyon, France, Ernest
Augustin Duchesne studied the survival
and growth of bacteria and molds, separately and together. He observed that
the mold Penicillium glaucum
had antibacterial properties against strains of both Escherichia coli and
typhoid bacilli. The antibacterial
properties of penicillin serendipitously surfaced at least three times
during the course of scientific history before
scientists used its power. And the third time was definitely a charm.
Finding a magic bullet
An Oxford University student of pathology, Howard Walter Florey developed
early research interests in mucus
secretion and lysozyme, an antibacterial enzyme originally discovered by
Fleming. The more he learned about
the antibacterial properties of lysozyme and intestinal mucus, the more
interested he became in understanding
the actual chemistry behind the enzymatic reactions. However, he did not
have the opportunity to work with
chemists until 1935, when Florey hired Ernst Boris Chain to set up a
biochemistry section in the department of
pathology at the Sir William Dunn School of Pathology at Oxford. Because
Chain was a chemist, Florey
encouraged him to study the molecular action of lysozyme. Florey wanted to
find out whether lysozyme played
a role in duodenal ulcers and was less interested in its antibacterial
properties.
During a scientific literature search on bacteriolysis, Chain came upon
Fleming's published report of penicillin,
which had, as Chain describes, "sunk into oblivion in the literature."
Chain thought that the active substance
inducing staphylococcus lysis might be similar to lysozyme, and that their
modes of action might also be similar.
He set out to isolate penicillin to satisfy his own scientific curiosity
and to answer a biological problem (what reaction lysozyme catalyzes), not to
find a drug.
The scientific collaborations and discussions between Chain and Florey
eventually laid the foundation for their
1939 funding application to study the antimicrobial products of
microorganisms. It never crossed their minds
that one of these antimicrobial products would be the next magic bullet.
The timing of their funded research is
also significant: it occurred within months of Great Britain's declaration
of war with Germany and the
beginning of World War II.
Because of its activity against staphylococcus, Fleming's penicillin was
one of the first compounds chosen for
the study. The first step toward isolating penicillin came in March 1940 at
the suggestion of colleague Norman
G. Heatley. The team extracted the acidified culture filtrate into organic
solution and then re-extracted penicillin
into a neutral aqueous solution. In May, Florey examined the
chemotherapeutic effects of penicillin by treating
four of eight mice infected with Streptococcus pyogenes. The mice treated
with penicillin survived, whereas the other four died within 15 hours. In
September, Henry Dawson and colleagues
confirmed the antibiotic properties
of penicillin by taking a bold step and injecting penicillin into a patient
at Columbia Presbyterian Hospital (New
York).
With the help of Chain, Heatley, Edward P. Abraham, and other Dunn School
chemists, Florey was able to scrape up enough penicillin to perform clinical
trials at the Radcliffe Infirmary in Oxford in February 1941. The first
patient treated was dying of S. aureus and S. pyogenes infections. Treatment
with penicillin resulted in an amazing recovery, but because of insufficient
quantities of the drug, the patient eventually died after a relapse.
Over the next three months, five other patients responded well when treated
with penicillin. All of these
patients were seriously ill with staphylococcal or streptococcal infections
that could not be treated with
sulfonamide. These trials proved the effectiveness of penicillin when
compared to the sulfa drugs, which at the
time were considered the gold standard for treating infections.
Producing penicillin
Florey had difficulties isolating the quantities of penicillin required to
prove its value. In the early years, the
Oxford team grew the mold by surface culture in anything they could lay
their hands on. Because of the war,
they couldn't get the glass flasks they wanted, so they used bedpans until
Florey convinced a manufacturer to
make porcelain pots, which incidentally resembled bedpans. Britain was deep
into the war, and the British
pharmaceutical industry did not have the personnel, material, or funds to
help Florey produce penicillin.
Florey and Heatley came to the United States in June 1941 to seek
assistance from the American
pharmaceutical industry. They traveled around the country but could not
garner interest for the project.
Because of the as yet ill-defined growing conditions for P. notatum and the
instability of the active compound,
the yield of penicillin was low and it was not economically feasible to
produce. Florey and Heatley ended up
working with the U.S. Department of Agriculture's Northern Regional
Research Laboratory in Peoria, IL.
The agricultural research center had excellent fermentation facilities, but
more importantly, unlike any other facility in the country, it used corn
steep liquor in the medium when faced
with problematic cultures. This
liquor yielded remarkable results for the penicillin culture. The
production of penicillin increased by more than
10-fold, and the resulting penicillin was stable. It turns out that the
penicillin (penicillin G) produced at the
Peoria site was an entirely different compound from the penicillin
(penicillin F) produced in Britain. Fortunately
for all parties involved, penicillin G demonstrated the same antibacterial
properties against infections as
penicillin F. With these new developments, Merck, Pfizer, and Squibb agreed
to collaborate on the
development of penicillin.
By this time, the United States had entered the war, and the U.S.
government was encouraging pharmaceutical
companies to collaborate and successfully produce enough penicillin to
treat war-related injuries. By 1943,
several U.S. pharmaceutical companies were mass-producing purified
penicillin G (~21 billion dosage units per
month), and it became readily available to treat bacterial infections
contracted by soldiers. In fact, by 1944,
there was sufficient penicillin to treat all of the severe battle wounds
incurred on D-day at Normandy. Also,
diseases like syphilis and gonorrhea could suddenly be treated more easily
than with earlier treatments, which
included urethra cleaning and doses of noxious chemicals such as mercury or
Salvarsan. The Americans
continued to produce penicillin at a phenomenal rate, reaching nearly 7
trillion units per month in 1945.
Fleming, Florey, and Chain were recognized "for the discovery of penicillin
and its curative effect in various infectious diseases" in 1945, when they
received the Nobel Prize in
Physiology or Medicine.
But all magic bullets lose their luster, and penicillin was no different.
Dubos had the foresight to understand the
unfortunate potential of antibiotic-resistant bacteria and encouraged
prudent use of antibiotics. As a result of
this fear, Dubos stopped searching for naturally occurring compounds with
antibacterial properties.
As early as 1940, Abraham and Chain identified a strain of S.
aureus that could not be treated with penicillin. This seemingly small,
almost insignificant event foreshadowed the wave of
antibiotic-resistant microorganisms that became such a problem
throughout the medical field toward the end of the century.
Malaria and quinine
Although penicillin was valuable against the battle-wound infections
and venereal diseases that have always afflicted soldiers, it was not
effective against the malaria that was killing off the troops in the
mosquito-ridden South Pacific. The Americans entered Guadalcanal
in June 1942, and by August there were 900 cases of malaria; in
September, there were 1724, and in October, 2630. By December
1942, more than 8500 U.S. soldiers were hospitalized with malaria.
Ninety percent of the men had contracted the disease, and in one
hospital, as many as eight of every 10 soldiers had malaria rather
than combat-related injuries.
The only available treatment, however, was the justifiably unpopular
drug Atabrine. Besides tasting bitter, the yellow Atabrine pills caused
headaches, nausea, vomiting, and in some cases, temporary
psychosis. It also seemed to leave a sickly hue to the skin and was
falsely rumored to cause impotence. Nevertheless, it was effective
and saved lives. Firms such as Abbott, Lilly, Merck, and Frederick
Stearns assured a steady supply of Atabrine, producing 3.5 billion
tablets in 1944 alone.
But Atabrine lacked the efficacy of quinine, which is isolated from
cinchona, an evergreen tree native to the mountains of South and
Central America. Unfortunately, the United States did not have a
sufficient supply of quinine in reserve when the war broke out. As a
result, the U.S. government established the
Cinchona Mission in 1942. Teams of botanists, foresters, and assistants
went to South America to find and
collect quinine-rich strains of the tree, a costly, strenuous, and time-consuming task.
Out of desperation, research to develop antimalarials intensified. As an
unfortunate example of this
desperation, prison doctors in the Chicago area experimentally infected
nearly 400 inmates with malaria during
their search for a therapeutic. Although aware that they were helping the
war effort, the prisoners were not
given sufficient information about the details and risks of the clinical
experiments. After the war, Nazi doctors
on trial for war crimes in Nuremberg referred to this incident as part of
their defense for their criminal treatment
of prisoners while aiding the German war effort.
In 1944, William E. Doering and Robert B. Woodward synthesized quinine (a
complex molecular structure) from coal tar. Woodward's achievements in the art of organic
synthesis earned him the 1965
Nobel Prize in Chemistry. Chloroquine, another important antimalarial, was
synthesized and studied under the
name of Resochin by the German company Bayer in 1934 and rediscovered in
the mid-1940s. Even though
chloroquine-resistant parasites cause illness throughout the world, the
drug is still the primary treatment for
malaria.
Streptomycin and tuberculosis
When Dubos presented his results with tyrothricin at the Third
International Congress for Microbiology in New York in 1939,
Selman A. Waksman was there to see it. The successful
development of penicillin and the discovery of tyrothricin made
Waksman realize the enormous potential of soil as a source of
druglike compounds. He immediately decided to focus on the
medicinal uses of antibacterial soil microbes.
In 1940, Woodruff and Waksman isolated and purified actinomycin
from Actinomyces griseus (later named Streptomyces griseus),
which led to the discovery of many other antibiotics from that same
group of microorganisms. Actinomycin attacks Gram-negative
bacteria responsible for diseases like typhoid, dysentery, cholera,
and undulant fever and was the first antibiotic purified from an
actinomycete. Considered too toxic for the treatment of diseases in
animals or humans, actinomycin is primarily used as an investigative
tool in cell biology. In 1942, the two researchers isolated and
purified streptothricin, which prevents the proliferation of
Mycobacterium tuberculosis but is also too toxic for human use.
A couple of years later, in 1944, Waksman, with Albert Schatz and
Elizabeth Bugie, isolated the first aminoglycoside, streptomycin, from
S. griseus. Aminoglycosides such as streptomycin block protein synthesis in
bacterial cells and, unlike penicillin, act against Gram-negative organisms
rather than only Gram-positives. Waksman studied the value of
Waksman studied the value of
streptomycin in treating bacterial infections, especially tuberculosis. In
1942, several hundred thousand deaths
resulted from tuberculosis in Europe, and another 5 to 10 million people
suffered from the disease. Although
sulfa drugs and penicillin were readily available, they had no effect.
Merck immediately started manufacturing streptomycin with the help of
Woodruff. A consultant for Merck,
Waksman sent Woodruff to Merck to help with the penicillin project, and
after finishing his thesis, Woodruff
continued working there. Simultaneously, studies by W. H. Feldman and H. C.
Hinshaw at the Mayo Clinic
confirmed streptomycins efficacy and relatively low toxicity against
tuberculosis in guinea pigs. On November
20, 1944, doctors administered streptomycin for the first time to a
seriously ill tuberculosis patient and
observed a rapid, impressive recovery. No longer unconquerable,
tuberculosis could be tamed and beaten into
retreat. In 1952, Waksman was awarded the Nobel Prize in Physiology or
Medicine for his discovery of streptomycin (1 of 18 antibiotics discovered
under his guidance) and its therapeutic effects in patients
suffering from tuberculosis.
Merck had just developed streptomycin and moved it into the marketplace
when the company stumbled upon
another great discovery. At the time, doctors treated patients with
pernicious anemia by injecting them with
liver extracts, which contained a factor required for curing and
controlling the disease. When patients stopped
receiving injections, the disease redeveloped. The Merck chemists had been
working on isolating what was
called the pernicious anemia factor from liver extracts, and they decided
to look at the cultures grown by
Woodruff and other microbiologists at Merck, to see if one of the cultures
might produce the pernicious
anemia factor. They found a strain of S. griseus similar to the
streptomycin-producing strain that made the
pernicious anemia factor.
With the help of Mary Shorb's Lactobacillus lactis assay to guide the
purification and crystallization of the
factor, Merck scientists were able to manufacture and market the factor as
a cure for pernicious anemia. The
factor turned out to be a vitamin, and it was later named vitamin B12. As
Woodruff describes the period, "So, we jumped from penicillin to streptomycin
to vitamin B12. We got them 1-2-3, bang-bang-bang." Merck struck gold three
times in a row. The United Kingdom's only female Nobel laureate in science,
Dorothy Crowfoot Hodgkin, solved the molecular structure of vitamin B12 in
1956, just as she had for penicillin in the 1940s, a discovery that was
withheld from publication until World War II was over.
The continuing search
After developing penicillin, U.S. pharmaceutical companies
continued to search for antibiotics, a term coined by P. Vuillemin in
1889 but later defined by Waksman in 1947 as "those chemical substances
produced by microbes that inhibit the growth of and even destroy other
microbes."
In 1948, Benjamin M. Duggar, a professor at the University of
Wisconsin and a consultant to Lederle, isolated chlortetracycline
from Streptomyces aureofaciens. Chlortetracycline, also called
aureomycin, was the first tetracycline antibiotic and the first
broad-spectrum antibiotic. Active against an estimated 50 disease
organisms, aureomycin works by inhibiting protein synthesis. The
discovery of the tetracycline ring system also enabled further
development of other important antibiotics.
Other antibiotics with inhibitory effects on cell wall synthesis were
also discovered in the 1940s and include cephalosporin and
bacitracin. Another β-lactam antibiotic, cephalosporin, was first isolated
from Cephalosporium acremonium in 1948 by Giuseppe Brotzu at the University
of Cagliari in Italy. Bacitracin, first derived
from a strain of Bacillus subtilis, is active against Gram-positive
bacteria and is used topically to treat skin infections.
Nonantibiotic therapeutics
Lederle Laboratories, a blood processing plant during World War II, evolved
into a manufacturer of vitamins and nutritional products, including folic acid.
Sidney Farber, a cancer scientist at
Boston's Children's Hospital, was testing the effects of folic acid on
cancer. Some of his results, which now look dubious, suggested that folic
acid worsened cancer conditions, inspiring chemists at Lederle to make
antimetabolites (structural mimics of essential metabolites that interfere
with any biosynthetic reaction involving the intermediates) resembling folic
acid to block its action. These events
led to the 1948 development of
methotrexate, one of the earliest anticancer agents and the mainstay of
leukemia chemotherapy.
But the pioneer of designing and synthesizing antimetabolites that could
destroy cancer cells was George
Hitchings, head of the department of biochemistry at Burroughs Wellcome Co.
In 1942, Hitchings initiated his
DNA-based antimetabolite program, and in 1948, he and Gertrude Elion
synthesized and demonstrated the
anticancer activity of 2,6-diaminopurine. By fine-tuning the structure of
the toxic compound, Elion synthesized
6-mercaptopurine, a successful therapeutic for treating acute leukemia.
Hitchings, Elion, and Sir James W.
Black won the Nobel Prize in Physiology or Medicine in 1988 for their
"discoveries of important principles for drug treatment," which constituted
the groundwork for rational drug design.
The discovery of corticosteroids as a therapeutic can be linked to Thomas
Addison, who made the connection
between the adrenal glands and the rare Addison's disease in 1855. But it
wasn't until Edward Calvin Kendall
at the Mayo Clinic and Tadeus Reichstein at the University of Basel
independently isolated several hormones
from the adrenal cortex that corticosteroids were used to treat a more
widespread malady. In 1948, Kendall
and Philip S. Hench demonstrated the successful treatment of patients with
rheumatoid arthritis using cortisone.
Kendall, Reichstein, and Hench received the 1950 Nobel Prize in Physiology
or Medicine for determining the
structure and biological effects of adrenal cortex hormones.
One of the first therapeutic drugs to prevent cardiovascular disease also
came from this period. While
investigating the mysterious deaths of farm cows, Karl Paul Link at the
University of Wisconsin proved that the
loss of clotting ability in cattle was linked to the intake of sweet
clover. He and his colleagues then isolated the anticoagulant dicoumarol, a
forerunner of the blood thinner warfarin, derived from coumarin, a substance
found in sweet clover, in 1940.
Many other advances
The synthesis, isolation, and therapeutic applications of miracle drugs may
be the most well-remembered
discoveries of the 1940s for medical chemists and biochemists, but advances
in experimental genetics, biology,
and virology were also happening. These advances include isolating the
influenza B virus in 1940, by Thomas
Francis at New York University and, independently, by Thomas Pleines
Magill. Also in 1940, at the New
York Hospital-Cornell University Medical Center, Mary Loveless succeeded in
blocking the generation of
immunotherapy-induced antibodies using pollen extracts. Routine use of the
electron microscope in virology
followed the first photos of tobacco-mosaic virus by Helmut Ruska, an
intern at the Charité Medical School of
Berlin University, in 1939; and the 1940s also saw numerous breakthroughs
in immunology, including the first
description of phagocytosis by a neutrophil.
In 1926, Hermann J. Muller, a professor at the University of Texas at
Austin, reported the identification of
several irradiation-induced genetic alterations, or mutations, in
Drosophila that resulted in readily observed
traits. This work, which earned Muller the Nobel Prize in Physiology or
Medicine in 1946, enabled scientists
to recognize mutations in genes as the cause of specific phenotypes, but it
was still unclear how mutated genes
led to the observed phenotypes.
In 1935, George Wells Beadle began studying the development of eye pigment
in Drosophila with Boris
Ephrussi at the Institut de Biologie Physico-Chimique in Paris. Beadle then
collaborated with Edward Lawrie
Tatum when they both joined Stanford in 1937Beadle as a professor of
biology (genetics) and Tatum as a
research associate in the department of biological sciences. Tatum, who had
a background in chemistry and
biochemistry, handled the chemical aspects of the Drosophila eye-color
study. Beadle and Tatum eventually
switched to the fungus Neurospora crassa, a bread mold. After producing
mutants of Neurospora by
irradiation and searching for interesting phenotypes, they found several
auxotrophs, strains that grow normally
on rich media but cannot grow on minimal medium. Each mutant required its
own specific nutritional
supplement, and each requirement correlated to the loss of a compound
normally synthesized by the organism.
By determining that each mutant had a deficiency in a specific metabolic
pathway, which was known to be controlled by enzymes, Beadle and Tatum
concluded in a 1941 report that each gene produced a single enzyme, also
called the single gene-single enzyme concept. The two
scientists shared the Nobel Prize in
Physiology or Medicine in 1958 for discovering that genes regulate the
function of enzymes and that each gene
controls a specific enzyme.
Also recognized with the same prize in 1958 was Joshua Lederberg. As a
graduate student in Tatum's laboratory in 1946, Lederberg found that some
plasmids enable bacteria to transfer genetic material to each other by
forming direct cell-cell contact in a process called conjugation.
He also showed that F (fertility)
factors allowed conjugation to occur. In addition, Lederberg defined the
concepts of generalized and
specialized transduction, collaborated with other scientists to develop the
selection theory of antibody
formation, and demonstrated that penicillin-susceptible bacteria could be
grown in the antibiotic's presence if a
hypotonic medium was used.
In the field of virology, John Franklin Enders, Thomas H. Weller, and
Frederick Chapman Robbins at the
Children's Hospital Medical Center in Boston figured out in 1949 how to grow
poliovirus in test-tube cultures of human tissues, a technique enabling the
isolation and study of viruses.
Polio, often referred to as infantile
paralysis, was one of the most feared diseases of the era. These
researchers received the Nobel Prize in
Physiology or Medicine in 1954.
Salvador Luria, at Indiana University, and Alfred Day Hershey, at
Washington University's School of
Medicine, demonstrated that the mutation of bacteriophages makes it
difficult for a host to develop immunity
against viruses. In 1942, Thomas Anderson and Luria photographed and
characterized E. coli T2
bacteriophages using an electron microscope. Luria and Max Delbrück, at
Vanderbilt University, used
statistical methods to demonstrate that inheritance in bacteria follows
Darwinian principles. Luria, Hershey, and
Delbrück were awarded the Nobel Prize in Physiology or Medicine in 1969 for
elucidating the replication
mechanism and genetic structure of viruses.
Although these discoveries were made outside of the pharmaceutical
industry, their applications contributed
enormously to understanding the mechanisms of diseases and therapeutic
drugs.
Biological and chemical warfare
Biological warfare, the use of disease to harm or kill an adversary's
military forces, population, food, and livestock, can involve any living
microorganism, nonliving virus, or bioactive substance deliverable by
conventional artillery. The history of biological warfare can be traced
back to the Romans, who used dead
animals to infect their enemies' water supplies. The United States started a
biological warfare program in 1942
after obtaining Japanese data about the destructive use of chemical and
biological agents from pardoned war
criminals. Japan sprayed bubonic plague over parts of mainland China on
five occasions in 1941. Despite the
fact that the spraying was ineffective, the attempts prompted the United
States to develop its biological warfare
program. Later, the developing Cold War further stimulated this research in
the United Statesand in the
Soviet Union.
Ironically, the first chemotherapeutic agent for cancer came from an early
instance of chemical warfare. Initially
used as a weapon in World War I, mustard gas proved useful in treating mice
and a person with lymphoma in
1942, when Alfred Gilman and Fred Phillips experimentally administered the
chemical weapon as a
therapeutic. Because the patient showed some improvement, chemical
derivatives of mustard gas were
developed and used to treat various cancers.
The Nuremberg Code
Not only did World War II encourage the discovery and development of
antibiotics and antidisease drugs, it
also instigated the need to define what constitutes permissible medical
experiments on human subjects. The
Nazis performed cruel and criminal medical experiments on Jews and other
prisoners during the war. In
1949, the Nuremberg Code was established in an effort to prevent medical
crimes against humanity. The Code
requires that individuals enrolled in clinical trials give voluntary
consent. The experiment must be expected to yield useful results for the
good of society, be performed by
scientifically qualified persons, and be derived
from experiments on animal models that suggest the anticipated outcome will
justify human clinical experiments.
The code also emphasizes that all physical and mental suffering must be
avoided and that precautions must be
taken to protect the human subject if injury or disability results from the
experiment. In achieving its goals, the
Nuremberg Code necessarily empowers the human subject and holds the
researcher responsible for inflicting
unnecessary pain and suffering on the human subject.
On the practical level, it was not until the 1960s that institutionalized
protections for subjects in clinical trials and human experimentation
were put into place.
Swing time
The 1940s ended with the antibiotic era in full swing and with a host
of wartime advancements in fermentation and purification
technologies changing the drug development process. Penicillin and
DDT became the chemical markers of the age, promising to heal the
world: curing the plagues and killing the plague carriers. The
radioisotopes now easily produced through advances in technology
promoted by the war were becoming routinely available for health
research, as the era of computer-aided drug analysis began. The
baby boom launched by postwar U.S. prosperity produced the first
generation born with the expectation of health through drugs and
medical intervention.
Because of these new possibilities, health became a political as well
as a social issue. The leading role science played in the Allied victory
gave way in the postwar 1940s to its new role as medical savior.
The new technologies that became available in the 1940s, including partition
chromatography, infrared and mass spectrometry, and nuclear magnetic
resonance (NMR), would
eventually become critical to
pharmaceutical progress.
Prescriptions & polio (1950s)
The 1950s began amid a continuing wave of international
paranoia. The Cold War intensified in the late 1940s as the West responded
to the
ideological loss of China to communism and the very real loss of atom bomb
exclusivity to the Soviets. The first few years of the 1950s heated up
again with the
Korean conflict.
On the home front, World War II-related science, now declassified, together
with America's factories turned to peace, shaped a postwar economic boom.
Driven by an unprecedented baby boom, the era's mass
consumerism focused on housing, appliances, automobiles, and luxury goods.
Technologies applied to civilian life included silicone products, microwave
ovens,
radar, plastics, nylon stockings, long-playing vinyl records, and computing
devices.
New medicines abounded, driven by new research possibilities and the
momentum
of the previous decade's Antibiotic Era. A wave of government spending
was
spawned by two seminal influences: a comprehensive new federal science and
technology policy and the anti-Red sentiment that dollars spent for
science were
dollars spent for democracy.
While improved mechanization streamlined production in drug factories, the
DNA era dawned. James Watson and Francis Crick determined the structure of
the genetic material in 1953.
Prescription and nonprescription
drugs were legally distinguished from one another for the first time in the
United States as the pharmaceutical
industry matured. Human cell culture and radioimmunoassays developed as key
research technologies; protein
sequencing and synthesis burgeoned, promising the development of protein
drugs.
In part because of Cold War politics, in part because the world was
becoming a smaller place, global health
issues took center stage. Fast foods and food additives became commonplace
in the West. The Pill was
developed and first tested in Puerto Rico. Ultrasound was adapted to fetal
monitoring. Gas chromatography
(GC), mass spectrometry, and polyacrylamide gel electrophoresis began
transforming drug research, as did the
growth of the National Institutes of Health (NIH) and the National Science
Foundation (NSF). The
foundations of modern immunology were laid as the pharmaceutical industry
moved ever forward in
mass-marketing through radio and the still-novel format of television.
But above all, through the lens of this time in the Western world, loomed
the heroic-scientist image of Jonas Salk, savior of children through the
conquest of polio by vaccination.
Antibiotics redux
The phenomenal success of antibiotics in the 1940s spurred the continuing
pursuit of more and better
antibacterial compounds from a variety of sources, especially from natural
products in soils and synthetic
modifications of compounds discovered earlier. In 1950, the antibiotic
Nystatin was isolated from
Streptomyces noursei obtained from soil in Virginia. In 1952, erythromycin
was first isolated from S.
erythreus from soil in the Philippines. Other antibiotics included
Novobiocin (1955), isolated from S.
spheroides from Vermont; Vancomycin (1956) from S. orientalis from soils in
Borneo and Indiana; and
Kanamycin (1957) from S. kanamyceticus from Japan. The search for
antibiotic compounds also led
researchers in Great Britain to discover, in 1957, an animal glycoprotein
(interferon) with antiviral activity.
Not only did the development of antibiotics that began in the 1940s lead to
the control of bacterial infections, it
also permitted remarkable breakthroughs in the growth of tissue culture in
the 1950s. These breakthroughs
enabled the growth of polio and other viruses in animal cell cultures
rather than in whole animals, and permitted
a host of sophisticated physiological studies that had never before been
possible. Scientists were familiar with
the concepts of tissue culture since the first decades of the century, but
routine application was still too difficult.
After antibiotics were discovered, such research no longer required an
"artist in biological technique" to maintain the requisite sterile conditions
in the isolation, maintenance, and use of animal cells, according to virus
researcher Kingsley F. Sanders. In 1957, he noted that the use of antibiotics
in tissue culture made the process "so easy that even an amateur in his
kitchen can do it."
Funding medicine
In the 1950s, general science research, particularly biological research,
expanded in the United States to a
great extent because of the influence of Vannevar Bush, presidential
science advisor during and immediately
after World War II. His model, presented in the 1945 report to the
President, Science: The Endless Frontier,
set the stage for the next 50 years of science funding. Bush articulated a
linear model of science in which basic
research leads to applied uses. He insisted that science and government
should continue the partnership forged
in the 1940s with the Manhattan Project (in which he was a key participant)
and other war-related research.
As early as 1946, Bush argued for creating a national science funding
body. The heating up of the Cold War, as much as anything else,
precipitated the 1950 implementation of his idea in the form of the
National Science Foundation, a major funnel for government
funding of basic research, primarily for the university sector. It was a
federal version of the phenomenally successful Rockefeller
Foundation. The new foundation had a Division of Biological and
Medical Sciences, but its mission was limited to supporting basic
research so that it wouldn't compete with the more clinically oriented
research of the NIH.
The NIH rode high throughout the 1950s, with Congress regularly
adding $8 million to $15 million to the NIH budget proposed by the
first Eisenhower administration. By 1956, the NIH budget had risen
to almost $100 million. By the end of the decade, the NIH was
supporting some 10,000 research projects at 200 universities and
medical schools at a cost of $250 million.
Other areas of government also expanded basic medical research
under the Bush vision. In 1950, for example, the Atomic Energy
Commission received a $5 million allocation from Congress
specifically to relate atomic research to cancer treatment. In this
same vein, in 1956, Oak Ridge National Laboratory established a
medical instruments group to help promote the development of
technology for disease diagnostics and treatment that would lead, in
conjunction with advances in radioisotope technology, to a new era
of physiologically driven medicine.
Part of this research funding was under the auspices of the Atoms for
Peace program and led to the proliferation of human experiments
using radioactive isotopes, often in a manner that would horrify a
later generation of Americans with its cavalier disregard for
participants' rights.
Science funding, including medical research, received an additional
boost with the 1957 launch of the first orbital satellite. The Soviet
Sputnik capped the era with a wave of science and technology fervor
in industry, government, and even the public schools. The perceived
science gap between the United States and the Soviet Union led to
the 1958 National Defense Education Act. The act continued the momentum of
government-led education that
started with the GI Bill to provide a new, highly trained, and competent
workforce that would transform
industry. This focus on the importance of technology fostered increased
reliance on mechanized
mass-production techniques. During World War II, the pharmaceutical
industry had learned its lesson, that bigger was better in manufacturing
methods, as it responded to the high
demand for penicillin.
Private funds were also increasingly available throughout the 1950s, and not
just from philanthropic
institutions. The American Cancer Society and the National Foundation for
Infantile Paralysis were two of the
largest public disease advocacy groups that collected money from the
general public and directed the
significant funds to scientific research. This link between the public and
scientific research created, in some
small fashion, a sense of investment in curing disease, just as investing
in savings bonds and stamps in the
previous decade had created a sense of helping to win World War II.
The war on polio
Perhaps the most meaningful medical story to people of the time was that of
Jonas Salk and his conquest of
polio. Salk biographer Richard Carter described the response to the April
12, 1955, vaccine announcement:
"More than a scientific achievement, the vaccine was a folk victory, an
occasion for pride and jubilation. People observed moments of silence, rang
bells, honked horns, closed their schools or convoked fervid assemblies
therein, drank toasts, hugged children, attended church." The public felt a
strong tie to Salk's research in part because he was funded by the National
Foundation for Infantile Paralysis. Since 1938, the organization's March of
Dimes had collected small change from the general
public to fund polio research. The
group managed to raise more money than was collected for heart disease or
even cancer.
The Salk vaccine relied on the new technology of growing viruses in cell
cultures, specifically in monkey kidney
cells (first available in 1949). Later, the human HeLa cell line was used
as well. The techniques were
developed by John F. Enders (Harvard Medical School), Thomas H. Weller
(Childrens Medical Center,
Boston), and Frederick C. Robbins (Case Western Reserve University,
Cleveland), who received the 1954
Nobel Prize in Physiology or Medicine for their achievement.
Salk began preliminary testing of his polio vaccine in 1952, with a massive
field trial in the United States in
1954. According to Richard Carter, a May 1954 Gallup Poll found that more
Americans were aware of the
polio field trial than knew the full name of the President of the United
States. Salk's vaccine was a killed-virus
vaccine that was capable of causing the disease only when mismanaged in
production. This unfortunately
happened within two weeks of the vaccine's release. The CDC Poliomyelitis
Surveillance Unit was quickly established; its popularly known "disease
detectives" tracked down the problem almost immediately, and the guilty
vaccine lot was withdrawn. It turned out that
Cutter Laboratories had released a
batch of vaccine with live contaminants that tragically resulted in at
least 260 cases of vaccine-induced polio.
Ironically, these safety problems helped promote an alternative
vaccine that used live rather than killed virus. In 1957, the attenuated
oral polio vaccine was finally developed by Albert Sabin and
became the basis for mass vaccinations in the 1960s. The live
vaccine can infect a small percentage of those inoculated against the
disease with active polio, primarily those with compromised immune
systems, and is also a danger to immunocompromised individuals
who have early contact with the feces of vaccinated individuals. In its
favor, it provides longer lasting immunity and protection against
gastrointestinal reinfection, eliminating the reservoir of polio in the
population.
The debate that raged in the 1950s over Salk versus Sabin (fueled at
the time by a history of scientific disputes between the two men)
continues today: Some countries primarily use the injected vaccine,
others use the oral, and still others use one or the other, depending
on the individual patient.
Instruments and assays
The 1950s saw a wave of new instrumentation, some of which,
although not originally used for medical purposes, was eventually
used in the medical field. In 1951, image-analyzing microscopy
began. By 1952, thin sectioning and fixation methods were being
perfected for electron microscopy of intracellular structures,
especially mitochondria. In 1953, the first successful open-heart
surgery was performed using the heart-lung machine developed by
John H. Gibbon Jr. in Philadelphia.
Of particular value to medical microbiology and ultimately to the
development of biotechnology was the production of an automated
bacterial colony counter. This type of research was first
commissioned by the U.S. Army Chemical Corps. Then the Office
of Naval Research and the NIH gave a significant grant for the
development of the Coulter counter, commercially introduced as the
Model A.
A.J.P. Martin in Britain developed gas-liquid partition chromatography in
1952. The first commercial devices
became available three years later, providing a powerful new technology for
chemical analysis. In 1954, Texas
Instruments introduced silicon transistors, a technology encompassing
everything from transistorized analytical
instruments to improved computers and, for the mass market, miniaturized
radios.
The principle for electromagnetic microbalances was developed near the
middle of the decade, and a
prototype CT scanner was unveiled. In 1958, amniocentesis was developed,
and Scottish physician Ian Donald pioneered the use of ultrasound for
diagnostics and therapeutics.
Radiometer micro pH electrodes
were developed by Danish chemists for bedside blood analysis.
In a further improvement in computing technology, Jack Kilby at Texas
Instruments developed the integrated
circuit in 1958.
In 1959, the critical technique of polyacrylamide gel electrophoresis
(PAGE) was in place, making much of the
coming biotechnological analysis of nucleic acids and proteins feasible.
Strides in the use of atomic energy continued apace with heavy government
funding. In 1951, Brookhaven
National Laboratory opened its first hospital devoted to nuclear medicine,
followed seven years later by a
Medical Research Center dedicated to the quest for new technologies and
instruments. By 1959, the
Brookhaven Medical Research Reactor was inaugurated, making medical
isotopes significantly cheaper and
more available for a variety of research and therapeutic purposes.
In one of the most significant breakthroughs in using isotopes for research
purposes, in 1952, Rosalyn
Sussman Yalow, working at the Veterans Hospital in the Bronx in association
with Solomon A. Berson,
developed the radioimmunoassay (RIA) for detecting and following antibodies
and other proteins and
hormones in the body.
Physiology explodes
The development of new instruments, radioisotopes, and assay techniques
found rapid application in the realm
of medicine as research into general physiology and therapeutics prospered.
In 1950, for example, Konrad
Emil Bloch at Harvard University used carbon-13 and carbon-14 as tracers in
cholesterol buildup in the body.
Also in 1950, Albert Claude of the Université Catholique de Louvain in
Belgium discovered the endoplasmic
reticulum using electron microscopy. That same year, influenza type C was
discovered.
New compounds and structures were identified in the human body throughout
the decade. In 1950, GABA
(gamma-aminobutyric acid) was identified in the brain. Soon after that,
Italian biologist Rita Levi-Montalcini demonstrated the existence of nerve
growth factor. In Germany, F.F.K. Lynen isolated the critical enzyme cofactor
acetyl-CoA in 1955. Human growth hormone was isolated for the
first time in 1956. That same
year, William C. Boyd of the Boston University Medical School identified 13
races of humans based on
blood groups.
Breakthroughs were made that ultimately found their way into the
development of biotechnology. By 1952,
Robert Briggs and Thomas King, developmental biologists at the Institute
for Cancer Research in Philadelphia,
successfully transplanted frog nuclei from one egg to another, the ultimate
forerunner of modern cloning
techniques. Of tremendous significance to the concepts of gene therapy and
specific drug targeting, sickle cell
anemia was shown to be caused by one amino acid difference between normal
and sickle hemoglobin
(1956-1958). Although from today's perspective it seems to have occurred
surprisingly late, in 1956 the
human chromosome number was finally revised from the 1898 estimate of 24
pairs to the correct 23 pairs. By
1959, examination of chromosome abnormalities in shape and number had
become an important diagnostic
technique. That year, it was determined that Down syndrome patients had
47 chromosomes instead of 46.
As a forerunner of the rapid development of immunological sciences, in 1959
Australian virologist Frank
Macfarlane Burnet proposed his clonal selection theory of antibody
production, which stated that antibodies
were selected and amplified from preexisting rather than instructionally
designed templates.
A rash of new drugs
New knowledge (and, as always, occasional serendipity) led to new drugs.
The decade began with the Mayo
Clinic's discovery of cortisone in 1950, a tremendous boon to the treatment
of arthritis. But more
importantly, it saw the first effective remedy for tuberculosis. In 1950,
British physician Austin Bradford Hill
demonstrated that a combination of streptomycin and para-aminosalicylic
acid (PAS) could cure the disease,
although the toxicity of streptomycin was still a problem. By 1951, an even
more potent antituberculosis drug
was developed simultaneously and independently by the Squibb Co. and
Hoffmann-LaRoche. Purportedly
after the death of more than 50,000 mice (part of a new rapid screening
method developed by Squibb to
replace the proverbial guinea pigs as test animals) and the examination of
more than 5000 compounds,
isonicotinic acid hydrazide proved able to protect against a lethal dose of
tubercle bacteria. It was marketed
ultimately as isoniazid and proved especially effective in mixed dosage
with streptomycin or PAS.
In 1951, monoamine oxidase (MAO) inhibitors were introduced to treat
psychosis. In 1952, reserpine was
isolated from rauwolfia and eventually was used for treating essential
hypertension. But in 1953, the rauwolfia
alkaloid was used as the first of the tranquilizer drugs. The source plant
came from India, where it had long
been used as a folk medicine. The thiazide drugs were also developed in
this period as diuretics for treating
high blood pressure. In 1956, halothane was introduced as a general
anesthetic. In 1954, the highly touted
chlorpromazine (Thorazine) was approved as an antipsychotic in the United
States. It had started as an allergy drug developed by the French chemical
firm Rhône-Poulenc and was noticed to slow down bodily processes.
Also in 1954, the FDA approved BHA (butylated hydroxyanisole) as a food
preservative; coincidentally,
McDonald's was franchised that same year. It soon became the largest fast
food chain. Although not really a new drug (despite numerous fast food
"addicts"), the arrival and
popularity of national fast food chains (and
ready-made meals such as the new TV dinners in the supermarket) were the
beginning of a massive change in
public nutrition and thus, public health.
Perhaps the most dramatic change in the popularization of drugs came with
the 1955 marketing of
meprobamate (first developed by Czech scientist Frank A. Berger) as Miltown
(by Wallace Laboratories) and
Equanil (by Wyeth). This was the first of the major tranquilizers, or
anti-anxiety compounds, that set the stage for the 1960s drug era. The drug
was so popular that it became iconic in American life. (The most popular TV
comedian of the time once referred to himself as "Miltown" Berle.)
Unfortunately, meprobamate also
proved addictive.
In 1957, British researcher Alick Isaacs and J. Lindenmann of the National
Institute for Medical Research, Mill Hill, London, discovered interferon, a
naturally occurring antiviral protein, although not until the 1970s (with
the advent of gene-cloning technology) would it become routinely available
for drug use. In 1958, a
saccharin-based artificial sweetener was introduced to the American public.
That year also marked the
beginning of the thalidomide tragedy (in which the use of a new
tranquilizer in pregnant women caused severe
birth defects), although it would not become apparent until the 1960s. In
1959, Haldol (haloperidol) was first
synthesized for treating psychotic disorders.
Blood products also became important therapeutics in this decade, in large
part because of the 1950
development of methods for fractionating blood plasma by Edwin J. Cohn and
colleagues. This allowed the
production of numerous blood-based drugs, including factor X (1956), a
protein common to both the intrinsic and extrinsic pathways of blood
clotting, and factor VIII (1957), a blood-clotting protein used for
treating hemophilia.
The birth of birth control
Perhaps no contribution of chemistry in the second half of the 20th century
had a greater impact on social
customs than the development of oral contraceptives. Several people were
important in its development, among them Margaret Sanger, Katharine
McCormick, Russell
Marker, Gregory Pincus, and
Carl Djerassi.
Sanger was a trained nurse who was a supporter of radical, left-wing
causes. McCormick was the
daughter-in-law of Cyrus McCormick, founder of International Harvester,
whose fortune she inherited when
her husband died. Both were determined advocates of birth control as the
means to solving the world's overpopulation problem. Pincus (who founded the
Worcester Foundation for
Experimental Biology) was a physiologist
whose research interests focused on the sexual physiology of rabbits. He
managed to fertilize rabbit eggs in a
test tube and got the resulting embryos to grow for a short time. The feat
earned him considerable notoriety,
and he continued to gain a reputation for his work in mammalian
reproductive biology.
Sanger and McCormick approached Pincus and asked him to produce a
physiological contraceptive. He
agreed to the challenge, and McCormick agreed to fund the project. Pincus
was certain that the key was the
use of a female sex hormone such as progesterone. It was known that
progesterone prevented ovulation and
thus was a pregnancy-preventing hormone. The problem was finding suitable,
inexpensive sources of the
scarce compound to do the necessary research. Enter American chemist
Russell Marker. Marker's research
centered on converting sapogenin steroids found in plants into
progesterone. His source for the sapogenins
was a yam grown in Mexico. Marker and colleagues formed a company (Syntex)
to produce progesterone. In
1949, he left the company over financial disputes and destroyed his notes
and records. However, a young
scientist hired that same year by Syntex ultimately figured prominently in
further development of the Pill.
The new hire, Djerassi, first worked on the synthesis of cortisone
from diosgenin. He later turned his attention to synthesizing an
improved progesterone, one that could be taken orally. In 1951,
his group developed a progesterone-like compound called
norethindrone.
Pincus had been experimenting with the use of progesterone in
rabbits to prevent fertility. He ran into an old acquaintance in 1952,
John Rock, a gynecologist, who had been using progesterone to
enhance fertility in patients who were unable to conceive. Rock
theorized that if ovulation were turned off for a short time, the
reproductive system would rebound. Rock had essentially proved in
humans that progesterone did prevent ovulation. Once Pincus and
Rock learned of norethindrone, the stage was set for wider clinical
trials that eventually led to FDA approval of an oral contraceptive in 1960.
However, many groups opposed this approval on
moral, ethical, legal, and religious grounds. Despite such opposition,
the Pill was widely used and came to have a profound impact on
society.
DNA et al.
Nucleic acid chemistry and biology were especially fruitful in the
1950s as not only the structure of DNA but also the steps in its
replication, transcription, and translation were revealed. In 1952,
Rosalind E. Franklin at King's College in England began producing
the X-ray diffraction images that were ultimately used by James
Watson and Francis Crick in their elucidation of the structure of DNA,
published in Nature in 1953.
Two years later, Severo Ochoa at New York University School of
Medicine discovered an enzyme capable of synthesizing RNA (polynucleotide
phosphorylase). In 1956, electron microscopy was used to
determine that the cellular structures called microsomes contained
RNA (they were thus renamed ribosomes). That same year, Arthur
Kornberg at Washington University Medical School (St. Louis, MO)
discovered DNA polymerase. Soon after that, DNA replication as a
semiconservative process was worked out separately by
autoradiography in 1957 and then by using density centrifugation in
1958.
With the discovery of transfer RNA (tRNA) in 1957 by Mahlon
Bush Hoagland at Harvard Medical School, all of the pieces were in
place for Francis Crick to postulate in 1958 the central dogma of
DNA: that genetic information is maintained and transferred in a
one-way process, moving from nucleic acids to proteins. The path
was set for the elucidation of the genetic code the following decade.
On a related note, in 1958, bacterial transduction was discovered by Joshua
Lederberg at the University of Wisconsin, a critical step toward future
genetic engineering.
Probing proteins
Behind the hoopla surrounding the discovery of the structure of DNA, the
blossoming of protein chemistry in
the 1950s is often ignored. Fundamental breakthroughs occurred in the
analysis of protein structure and the
elucidation of protein functions.
In the field of nutrition, in 1950 the protein-building role of the
essential amino acids was demonstrated. Linus
Pauling, at the California Institute of Technology, proposed that protein
structures are based on a primary
alpha-helix (a structure that served as inspiration for helical models of
DNA). Frederick Sanger at the Medical
Research Council (MRC) Unit for Molecular Biology at Cambridge and Pehr
Victor Edman developed
methods for identifying N-terminal peptide residues, an important
breakthrough in improved protein
sequencing. In 1952, Sanger used paper chromatography to sequence the amino
acids in insulin. In 1953, Max Perutz and John Kendrew, cofounders of the
MRC Unit for Molecular Biology, made decisive progress toward determining
the structure of hemoglobin using X-ray diffraction.
In 1954, Vincent du Vigneaud at Cornell University synthesized the
hormone oxytocin, the first naturally occurring protein made with
the exact makeup it has in the body. The same year, ribosomes were
identified as the site of protein synthesis. In 1956, the
three-dimensional structure of proteins was linked to the sequence of
its amino acids, so that by 1957, John Kendrew was able to solve
the first three-dimensional structure of a protein (myoglobin); this
was followed in 1959 with Max Perutz's determination of the
three-dimensional structure of hemoglobin. Ultimately, linking protein
sequences with subsequent structures permitted development of
structure-activity models, which allowed scientists to determine the
nature of ligand binding sites. These developments proved critical to
functional analysis in basic physiological research and to drug
discovery, through specific targeting.
On to the Sixties
By the end of the 1950s, all of pharmaceutical science had been
transformed by a concatenation of new instruments and new
technologies, from GCs to X-ray diffraction, from computers to tissue
culture, coupled, perhaps most importantly, to a new understanding of
the way things (meaning cells,
meaning bodies) worked. The understanding of DNA's structure and function
(how proteins are designed and how they can cause disease) provided windows
of opportunity for drug
development that had never
before been possible. It was a paradigm shift toward physiology-based
medicine, born with the hormone and
vitamin work in the 1920s and 1930s, catalyzed by the excitement of the
antibiotic era of the 1940s, that
continued throughout the rest of the century with the full-blown
development of biotechnology-based medicine.
The decade that began by randomly searching for antibiotics in dirt ended
with all the tools in place to begin
searching for drugs with a knowledge of where they should fit in the
chemical world of cells, proteins, and
DNA.
Anodynes & estrogens (1960s)
Mention the Sixties and there are varied hot-button responses: JFK, LBJ,
civil rights, and Vietnam; or sex, drugs, and rock 'n' roll. But it was all
of a piece. Politics and culture mixed like the
colors of a badly
tie-dyed t-shirt. But in this narrative, drugs are the hallmark of the era:
making them, taking them, and dealing with their good and bad consequences.
Ever after this era, the world would be continually conscious of pills,
pills, pills: for life, for
leisure, and for love. In many ways, the Sixties was the Pharmaceutical
Decade of
the Pharmaceutical Century.
A plethora of new drugs was suddenly available: the Pill was first
marketed;
Valium and Librium debuted to soothe the nerves of housewives and
businessmen;
blood-pressure drugs and other heart-helping medications were developed.
Another emblem of the 1960s was the development of worldwide drug abuse,
including the popularization of psychotropic drugs such as LSD by gurus
like
Timothy Leary. The social expansion of drugs for use and abuse in the 1960s
forever changed not only the nature of medicine but also the politics of
nations.
The technology of drug discovery, analysis, and manufacture also
proliferated. New forms of chromatography
became available, including HPLC, capillary GC, GC/MS, and the rapid
expansion of thin-layer
chromatography techniques. Proton NMR was developed to analyze complex
biomolecules. By the end of the
decade, amino acid analyzers were commonplace, and the ultracentrifuge was
fully adapted to biomedical
uses. Analytical chemistry and biology joined as never before in the search
for new drugs and analysis of old
ones.
And of equal importance, a new progressivism took the stage, especially in
the United States, where increasing
demand for access to health care and protection from unsafe and fraudulent
medications once more led to an
increased federal presence in the process of drug development, manufacture,
and sale.
Popping the Pill
If there was any single thing that foreshadowed the tenor of the decade,
one medical advance was paramount:
In 1960 the first oral contraceptive was mass-marketed. Sex and drugs were
suddenly commingled in a single
pill, awaiting only the ascendance of rock 'n' roll to stamp the image of
the decade forever. Born of the Pill,
the sexual revolution seemed to be in good measure a pharmaceutical one.
The earlier achievements of women's suffrage and the growing presence of
women in the labor force were
somewhat blocked from further development by the demands of pregnancy and
child rearing in a
male-dominated society. Feminist historians recount how hopes were suddenly
energized by the ability of
women to control their own bodies chemically and thus socially. Chemical
equality, at least for those who
could afford it and, in the beginning, find the right doctors to prescribe
it, was at last available.
However, it was not a uniformly smooth process. First and foremost, despite
its popularity, the technology for
tinkering with women's reproductive systems had not been fully worked out.
In Britain in 1961, there were problems with birth control pills containing
excess estrogen. Similar problems
also occurred in the United States.
Dosage changes were required for many women; side effects debilitated a
few.
But still the sexual revolution marched on, as was documented in the 1966
Masters and Johnson report
Human Sexual Response, which showed a transformation of female sexuality,
new freedoms, and new
attitudes in both sexes. Furthermore, technology was not done fiddling with
reproduction by any means. In
1969, the first test-tube fertilization was performed.
Valium of the dolls
In an era that would be far from sedate, the demand for sedatives was
profound, and the drug marketplace
responded rapidly. Although Miltown (meprobamate), the first of the major "tranks," was called the "Wonder Drug of 1954," sedatives weren't widely used until 1961, when Librium (a benzodiazepine) was discovered and marketed as a treatment for tension. Librium proved a phenomenal success. Then Valium (diazepam), discovered in 1960, was marketed by Roche Laboratories in 1963 and rapidly
became the most prescribed
drug in history.
These drugs were touted to the general population and mass-marketed and
prescribed by doctors with what
many claimed was blithe abandon. While the youth of America were smoking
joints and tripping on acid, their
parents' generation of businessmen and housewives was downing an
unprecedented number of sedatives.
According to the Canadian Government Commission of Inquiry into the
Nonmedical Use of Drugs (1972),
"In 1965 in the USA, some 58 million new prescriptions and 108 million refills were written for psychotropes (sedatives, tranquilizers, and stimulants), and these 166 million prescriptions accounted for 14% of the total prescriptions of all kinds written in the United States." Physical and
psychological addiction followed for many.
Drug taking became the subject of books and movies. In the 1966 runaway
best-seller Valley of the Dolls by
Jacqueline Susann, the "dolls" were the pills popped by glamorous
upper-class women in California.
Eventually, the pills trapped them in a world of drug dependence that
contributed to ruining their lives.
Drug wars
It was only a matter of time before the intensive testing of LSD by the
military in the 1950s and early
1960s, as part of the CIA's Project MKULTRA, spread into the consciousness of the civilian population. By 1966, the chairman of the New Jersey Narcotic Drug Study Commission declared that LSD was "the greatest threat facing the country today, more dangerous than the Vietnam War." In the United States, at
least at the federal level, the battle against hallucinogens and marijuana
use was as intense as, if not more
intense than, the fight against narcotic drugs.
According to some liberal critics, this was because these
recreational drugs were a problem in the middle and the upper
classes, whereas narcotic addiction was the purview of the poor.
From the start of the decade, in the United States and around the
world, regions with large populations of urban poor were more
concerned about the growing problem of narcotic drug addiction. In
1961, with the passage of the UN Single Convention on Narcotic
Drugs, signatory nations agreed to processes for mandatory
commitment of drug users to nursing homes. In 1967, New York
State established a Narcotics Addiction Control Program that,
following the UN Convention, empowered judges to commit addicts
into compulsory treatment for up to five years. The program cost
$400 million over just three years but was hailed by Governor
Nelson Rockefeller as the start of an "unending war." Such was the measure of the authorities' concern with seemingly out-of-control
drug abuse. The blatant narcotics addictions of many rock stars and
other celebrities simultaneously horrified some Americans and
glamorized the use of hard drugs among others, particularly young people.
By 1968, the Food and Drug Administration (FDA) Bureau of Drug Abuse
Control and the Treasury
Department's Bureau of Narcotics were fused and transferred to the
Department of Justice to form the Bureau
of Narcotics and Dangerous Drugs in a direct attempt to consolidate the
policing of traffic in illegal drugs. Also
in 1968, Britain passed the Dangerous Drug Act to regulate opiates. As
these efforts to stop drug use
proliferated, technology for mass-producing many of these same drugs
continued to improve in factories
around the world. By 1968, Canada was producing nearly 56 million doses of
amphetamines; by 1969, the
United States was producing more than 800,000 pounds of barbiturates.
Forging a great society
The 1960 election of a Democratic administration in the United States
created a new demand for government
intervention in broader areas of society. John F. Kennedy and his
successor, Lyndon B. Johnson, expanded
federal intervention in a host of previously unregulated areas, both civil
and economic, including food and
medicine.
Around the world, a new social agenda demanding rights for
minorities (especially apparent in the civil rights struggles in the
United States) and women (made possible by freedoms associated
with the Pill) fostered a new focus on protecting individuals and
ending, or at least ameliorating, some of the damage done to hitherto
exploited social classes. Health and medicine were a prime example, leading to what many disparagingly called "government paternalism." This same liberal agenda, in its darker moments, moved to protect less-developed nations from the perceived global communist threat; hence the Vietnam War. Ultimately, there was neither the
money nor the will to finance internal social and external political
agendas. General inflation resulted, with specific increases in the cost
of medical care.
Perhaps one of the most significant long-term changes in drug
development procedures of the era, especially regarding the role of
governments, came with a new desire to protect human "guinea pigs." General advocacy for the poor, women, and minorities led to a reexamination of the role of paternalistic (and generally white male) clinicians in the morally repugnant treatment of human subjects
throughout the century, before and after the Nuremberg trials of Nazi
doctors. It was a profound catalyst for the modern bioethics movement when
health groups worldwide
established new regulations regarding informed consent and human
experimentation in response to specific
outrages.
In 1962, the Kefauver-Harris amendments to the Federal Food, Drug, and Cosmetic Act of 1938 were passed to expand the FDA's control over the pharmaceutical and food industries. The
Kefauver amendments were
originally the outgrowth of Senate hearings begun in 1959 to examine the
conduct of pharmaceutical
companies. According to testimony during those hearings, it was common
practice for these companies to
provide experimental drugs whose safety and efficacy had not been
established to physicians who were then
paid to collect data on their patients taking these drugs. Physicians
throughout the country prescribed these
drugs to patients without their knowledge or consent as part of this loosely
controlled research. That the
amendments were not passed until 1962 was in part because of the profound
battle against allowing additional
government control of the industry. The 1958 Delaney proviso and the 1960
Color Additive Amendment led
to industry and conservative resentment and complaints that the FDA was
gaining too much power.
However, with the 1961 birth defects tragedy involving the popular European
sedative thalidomide (prevented
from being marketed in the United States by FDA researcher Frances Kelsey),
public demand for greater
protections against experimental agents was overwhelming. Thalidomide had
been prescribed to treat morning
sickness in countless pregnant women in Europe and Canada since 1957, but
its connection to missing and
deformed limbs in newborns whose mothers had used it was not realized until
the early 1960s. In 1961,
televised images of deformed "thalidomide babies" galvanized support for the FDA, demonstrating the power of this new medium to transform public opinion, as it had in the 1960 Nixon-Kennedy debates and would
again in the Vietnam War years.
Building on the newfound public fears of synthetic substances, Rachel
Carson's 1962 book Silent Spring
precipitated the populist environmental movement. As with the 1958 Delaney
clause and the Federal
Hazardous Substances Labeling Act of 1960, it was all part of increased
public awareness of the potential
dangers of the rapid proliferation of industrial chemicals and drugs.
According to the 1962 amendments to the
1938 Food, Drug, and Cosmetic Act, new drugs had to be shown to be
effective, prior to marketing, by
means to be determined by the FDA. Ultimately, this requirement translated
to defined clinical trials. However,
Congress still vested excessive faith in the ethics of physicians by eliminating the need for consent when it was "not feasible" or was deemed "not in the best interest of the patient"; these decisions could be made according to the "best judgment" of the doctors involved.
Despite the fact that President Kennedy proclaimed a Consumer Bill of Rights, including the rights to safety and to be informed, more stringent federal guidelines to protect research
subjects were not instituted until
1963. Then the NIH responded to two egregious cases: At Tulane University,
a chimpanzee kidney was
unsuccessfully transplanted into a human with the patient's consent but
without medical review. At the
Brooklyn Jewish Chronic Disease Hospital, in association with the
Sloan-Kettering Cancer Research Institute,
live cancer cells were injected into indigent, cancer-free elderly
patients.
Particularly important to the public debate was the disclosure of 22
examples of potentially serious ethical
violations found in research published in recent medical journals. The
information was presented by Henry
Beecher of Harvard Medical School to science journalists at a 1965
conference sponsored by the Upjohn
pharmaceutical company and was later published in the New England Journal
of Medicine.
In 1964, the World Medical Association issued its Declaration of Helsinki,
which set standards for clinical
research and demanded that subjects be given informed consent before
enrolling in an experiment. By 1966, in
the United States, the requirement for informed consent and peer review of
proposed research was written
into the guidelines for Public Health Service-sponsored research on human
subjects. These regulations and the
debate surrounding human experimentation continued to evolve and be applied
more broadly.
And in an attempt to protect consumers from unintentional waste or fraud,
in 1966, wielding its new power
from the Kefauver amendments, the FDA contracted with the National Academy
of Sciences/National
Research Council to evaluate the effectiveness of 4000 drugs approved on
the basis of safety alone between
1938 and 1962.
The U.S. government faced other health issues with less enthusiasm. In
1964, the Surgeon General's Report on Smoking was issued, inspired in part by the demand for a response to the Royal College of Physicians report issued in Britain two years earlier. Although the Surgeon General's
report led to increased public
awareness and mandated warning labels on tobacco products, significant
government tobacco subsidies, tax
revenues, smoking, and smoking-related deaths continued in the United
States. The debate and associated
litigation go on to this day.
Heart heroics and drugs
The lungs were not the only vital organs to attract research and publicity
in the Sixties. From heart transplants
and blood-pressure medications to blood-based drugs, a new kind of "heroic medicine" focused technology
on the bloodstream and its potential for helping or hindering health.
Heart surgery captivated the public with its bravado. In 1960, the
transistorized self-contained pacemaker was
introduced. In the 1960s, organ transplantation became routinely possible
with the development of the first
immune suppressant drugs. In 1967, the coronary bypass operation was
developed by Michael DeBakey; in
that same year, physician (and some would say showman) Christiaan Barnard
performed the first heart
transplant. In 1968, angioplasty was introduced for arterial treatment and
diagnosis.
But for all the glamour of surgery, chemicals for controlling heart
disease provided far more widespread effects. Blood-pressure drugs
based on new knowledge of human hormone systems became
available for the first time. In 1960, guanethidine (a noradrenaline
release inhibitor) was developed for high blood pressure; in rapid
succession the first beta-adrenergic blocker appeared in Britain
(1962), and alpha-methyldopa, discovered in 1954, was first used
clinically in the early 1960s for treating high blood pressure by
interfering not with noradrenaline release, but with its synthesis. In
1964, methods were perfected to nourish individuals through the
bloodstream. This ability to feed intravenously and provide total
caloric intake for debilitated patients forever changed the nature of
coma and the ethics of dealing with the dying. The use of
blood-thinning and clot-dissolving compounds for heart disease was
also pioneered in the early 1960s: Streptokinase and aspirin reduced
deaths by 40% when taken within a few hours of a heart attack.
In the late Sixties, as methods of fractionation using centrifugation
and filtration improved dramatically, concentrated blood factors
became readily available for the first time.
Plasmapheresis (centrifuging the red blood cells from plasma and then returning the whole cells to the donor) allowed donations to occur twice weekly instead of every few months. Blood proteins created a booming industry in plasma products: anti-D ("Big D") Rh antibodies used to immunize women immediately after the birth of their first Rh-positive child, albumin, gamma globulins, blood-typing sera, and various clotting factors such as factor VIII. But it also made blood all the more valuable as a commodity.
Abuses increased in collecting, distributing, and manufacturing blood
products, as recounted by Douglas Starr in his 1999 book Blood. In
1968, in a move that foreshadowed the AIDS era yet to come, the United
States revoked all licenses for
consumer sales of whole plasma prepared from multiple donors because of
fears that viral hepatitis would
spread. Fears were exacerbated by scandalous revelations about the health
status (or lack thereof) of many
paid blood donors. This donor problem became even more newsworthy in the
early 1970s as malnourished
street hippies, drug abusers, unhealthy indigents, and prisoners were
revealed as sources often contaminating
the world blood supply.
High tech/new mech
In the 1960s, analytical chemists increasingly turned their attention to
drug discovery and analysis and to
fundamental questions of biological significance in physiology and
genetics. New technologies were developed,
and previously designed instruments were adapted to biomedical
applications.
The development of high-pressure (later known as high-performance) liquid
chromatography (HPLC)
heralded a new era of biotechnology and allowed advanced separations of
fragile macromolecules in fractions
of the time previously required. The radioimmunoassay, first developed in 1959 by Rosalyn Yalow and Solomon Berson, was perfected in 1960. Tissue culture advances proliferated, allowing more and better in vitro testing of drugs and, when coupled with radiotracer and radioimmunoassay experiments, leading to unprecedented breakthroughs in all areas of mammalian physiology. In 1964, for example, Keith Porter and Thomas F. Roth discovered the first cell membrane receptor. Further
developments in analytical chemistry
came as gas chromatography (GC) was first linked with mass spectrometry
(MS), providing a quantum leap in
the ability to perform structural analysis of molecules. And laboratory
automation, including primitive robotics,
became a powerful trend.
But perhaps the most important breakthrough in tissue culture, and one that
created a direct path to the Human
Genome Project, was the invention of somatic-cell hybridization by Mary
Weiss and Howard Green in 1965.
By fusing mouse and human cells together via the "molecular glue" of Sendai virus, these researchers and others quickly developed a series of cell lines containing mostly mouse
chromosomes but with different single
human ones, all expressing unique proteins. For the first time, human
proteins could be assigned to individual
human chromosomes (and later chromosome fragments) to the degree that gene
mapping was finally possible
in humans. Of comparable importance, new improvements in fermentation
technology allowed continuous
cycling and easy sterilization along with mass-produced instrumentation.
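The chromosome-assignment logic behind such hybrid panels can be sketched in a few lines. The Python fragment below is purely illustrative, with invented cell line names, chromosome assignments, and assay results rather than any historical data; it simply shows how a human protein is assigned to the one chromosome whose presence across the panel exactly tracks expression of that protein.

# Illustrative sketch only: a hypothetical hybrid-cell panel, not historical data.
# Each mouse-human hybrid line retains a different single human chromosome.
panel = {"line_A": 1, "line_B": 7, "line_C": 17, "line_D": 21}

# Hypothetical assay results: which lines express the human protein of interest?
expresses_protein = {"line_A": False, "line_B": True, "line_C": False, "line_D": False}

# Candidate chromosomes are those present in every expressing line...
candidates = {chrom for line, chrom in panel.items() if expresses_protein[line]}
# ...minus any chromosome also present in a line that does not express the protein.
candidates -= {chrom for line, chrom in panel.items() if not expresses_protein[line]}

print("Gene for the protein maps to human chromosome(s):", sorted(candidates))
# Prints: Gene for the protein maps to human chromosome(s): [7]

With real panels, the same presence/absence comparison was simply carried out across many cell lines and many proteins, which is what made chromosome-by-chromosome assignment practical.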
Fundamental breakthroughs in the etiology of disease transformed the
understanding of infection. In 1961, the
varying polio virus receptors were correlated to pathogenicity of known
isolates; in 1967, diphtheria toxin's mode of action was finally determined and provided the first molecular
definition of a bacterial protein virulence
factor.
Structural biology proceeded at an enthusiastic pace. In 1960, John Kendrew
reported the first high-resolution
X-ray analysis of the three-dimensional structure of a protein: sperm whale
myoglobin. In the 1960s, image
analyzers were linked to television screens for the first time, enhancing
the use and interpretation of complex
images. And in 1967, Max Perutz and Hilary Muirhead built the first
high-resolution model of the atomic
structure of a protein (oxyhemoglobin), which promoted a wave of protein
structural analysis. Computer
systems became increasingly powerful and quickly indispensable to all
laboratory processes, but especially to
molecular analysis.
Hardly falling under the category of "miscellaneous" discoveries, at least in terms of the development of biotechnology and the pharmaceutical industry, was the invention of agarose gel electrophoresis in 1961. This technique was critical for separating and purifying high molecular weight compounds, especially DNA;
in 1963, to the benefit of a host of laboratory workers, the first film
badge dosimeter was introduced in the
United Kingdom; and in 1966, the disc-diffusion standardized test was
developed for evaluating antibiotics,
which was a boon to the exploding pharmaceutical development of such
compounds.
Dancing with DNA
Finally, the understanding of the structure of DNA and acceptance that it
was indeed genetic material yielded
intelligible and practical results.
At the beginning of the decade, researchers A. Tsugita and Heinz
Fraenkel-Conrat demonstrated the link
between mutation and a change in the protein produced by the gene. Also in
1960, Francois Jacob and
Jacques Monod proposed their operon model. This was the birth of gene
regulation models, which launched a
continuing quest for gene promoters and triggering agents, such that by
1967, Walter Gilbert and co-workers
identified the first gene control (repressor) substance.
Perhaps most significantly, the prelude to biotechnology was established
when restriction enzymes were
discovered, the first cell-free DNA synthesis was accomplished by
biochemist Arthur Kornberg in 1961, and
the DNA-amino acid code was deciphered.
Deciphering the genetic code was no minor feat. Finding the mechanism for
going from gene to protein was a
combination of brilliant theorizing and technological expertise.
The technology necessary for the dawn of biotechnology proliferated as
well. Throughout the 1960s,
automated systems for peptide and nucleic acid analysis became commonplace.
In 1964, Bruce Merrifield
invented a simplified technique for protein and nucleic acid synthesis,
which was the basis for the first such
machines. (In the 1980s, this technique would be mass-automated for gene
synthesis.) And in 1967, the first
specific gene transfer was accomplished; the lac operon was functionally
transferred from E. coli to another
bacterial species. And, although its importance was not realized at the
time, in 1967, Thermus aquaticus was
discovered in a hot spring in Yellowstone National Park. This extreme thermophile was the ultimate source of heat-stable Taq polymerase, the enabling enzyme for modern PCR.
By the late 1960s, not only had the first complete gene been isolated from
an organism, but biologists were
already debating the ethics of human and animal genetic engineering.
Generation of drug-takers
In the 1960s, the post-World War II generation of baby boomers entered
their teenage years. These were the
children of vitamins, antibiotics, hormones, vaccines, and fortified foods
such as Wonder Bread and Tang. The
technology that won the war had also transformed the peace and raised high
expectations on all social fronts,
including, and perhaps especially, health. The unparalleled prosperity of
the 1950s was largely driven by new
technologies and a profusion of consumer goods. This prosperity, coupled
with a new political progressivism,
created a euphoric sense of possibility early in the decade, especially
with regard to health care. The prevailing
belief was that medicine would save and society would provide. Although the
dream ultimately proved elusive,
its promise permeated the following decades: Medicare, Medicaid, and a host of new regulations were the lingering aftereffects in government; a generation of drug-takers was the result in society at large.
The Sabin oral polio vaccine was approved in the United States in 1960
after trials involving 100 million
people overseas and promised salvation to a new generation from this former
scourge. In 1961, the sweetener
cyclamate was introduced in the first low-calorie soft drink, Diet Rite,
and it created demand for consumption
without cost, or at least without weight gain. In 1964, a suitable, routine
vaccine for measles was developed
that was much better than its predecessor vaccine first produced in 1960.
In 1967, a live-virus mumps vaccine
was developed. Faith that childhood diseases could be stamped out grew
rapidly. The Surgeon General of the
United States even went so far as to state that we were coming near, for the first time in history, to finally closing the books on infectious diseases.
From today's perspective, these seem like naive hopes. But they were not
without some foundation.
Antibiotics had yet to lose effectiveness, and more were being discovered
or synthesized. In 1966, the first
antiviral drug, amantadine-HCl, was licensed in the United States for use
against influenza. In 1967, the WHO
began efforts to eradicate smallpox. The rhetoric of nongovernmental
antipolio and antituberculosis groups
provided additional reason for optimism.
No wonder a generation of baby boomers was poised to demand a
pill or vaccine for any and every ill that affected
humankind: pharmaceutical protection from unwanted pregnancies,
from mental upset, from disease. Rising expectations were reflected
in the science and business arenas, where research was promoted
and industry driven to attack a greater variety of human ills with the
weapons of science. Technological fixes in the form of pills and
chemicals seemed inevitable. How else could the idea of a "war on cancer" in the next decade be initiated with the same earnestness and
optimism as the quest for a man on the moon? The 1969 success of
Apollo 11 was the paradigm for the capabilities of technology.
On a darker note, the decade ended with a glimpse of the more
frightening aspects of what biomedical technology could do, or at
least what militarists wanted it to do. In 1969, the U.S. Department
of Defense requested $10 million from Congress to develop a
synthetic biological agent to which no natural immunity existed.
Funding soon followed under the supervision of the CIA at Fort
Detrick, MD.
Ultimately, although a new battery of technologies had become
available in the 1960s, including HPLC, GC/MS, and machines to
synthesize DNA and proteins, and new knowledge bases were
developed, from cracking the genetic code to the discovery of restriction enzymes, the majority of these breakthroughs would not
bear fruit until the 1970s and 1980s. The 1960s would instead be
remembered primarily for the changes wrought by new pharmaceuticals and new
social paradigms. As the
decade ended, it was an open question as to whether the coming decade would
bring the dawn of massive
biological warfare or the long-feared nuclear holocaust (heightened in
the world psyche by the Cuban
Missile Crisis of 1962). Others speculated that the future would bring
massive social collapse arising from the
intergenerational breakdown in the West that many blamed on the success of
pharmaceutical technology in
developing and producing new and dangerous drugs.
Chemistry, cancer & ecology (1970s)
As the 1970s opened, new chemistries and the war on cancer seized center stage. U.S. President Richard Nixon (taking a moment off from his pursuit of the Vietnam War) established the National Cancer Program, popularly known as the "war on cancer," with an initial half-billion dollars of new funding.
Carcinogens were one of the concerns in the controversy surrounding the polluted Love Canal. And cancer was especially prominent in the emotional issue of the "DES daughters," women at risk for cancer solely because of diethylstilbestrol
(DES), the medication prescribed to their mothers during pregnancy. New
cancer
treatments were developed; chemotherapy joined the ranks of routine
treatments,
especially for breast cancer.
New drugs appeared. Cyclosporin provided a long-sought breakthrough with
its
ability to prevent immune rejection of tissue grafts and organ transplants.
Rifampicin proved its worth for treating tuberculosis; cimetidine (Tagamet), the first histamine H2-receptor blocker, became available for treating peptic ulcers.
Throughout the
decade, improvements in analytical instrumentation, including high-pressure
liquid
chromatography (HPLC) and mass spectrometry, made drug purification and
analysis easier than ever before. In this period, NMR was transformed into the medical imaging technique MRI.
The popular environmental movement that took root in the ideology of the
previous decade blossomed
politically in 1970 as the first Earth Day was celebrated, the U.S. Clean
Air Act was passed, and the U.S.
Environmental Protection Agency (EPA) was established.
Some of the optimism of the Sixties faded as emerging plagues, such as Lyme
and Legionnaires' disease in the
United States and Ebola and Lassa fever in Africa, reopened the book on
infectious diseases. The World
Health Organization continued its smallpox eradication campaign, but as DDT
was gradually withdrawn
because of its detrimental effect on the environment, efforts to eradicate
malaria and sleeping sickness were
imperiled.
Ultimately, the 1970s saw the start of another kind of infection, genetic engineering fever, as recombinant DNA chemistry dawned. In 1976, in a move to capitalize on the new discoveries, Genentech Inc. (San Francisco) was founded and became the prototypical entrepreneurial biotech company. The company's very existence forever transformed the nature of technology investments and the
pharmaceutical industry.
Cancer wars
The first major salvo against cancer in the United States was in 1937, when
the National Cancer Institute
(NCI) was created by congressional mandate. A sign of the times, one of the
key provisions of the act was to
enable the institute to procure, use, and lend radium. Such an early
interest in cancer was by no means
unique to the United States. Throughout the period, Nazi Germany led the
world in cancer research, including
early demonstrations of the carcinogenic effects of smoking tobacco. Cancer
remained of interest to
researchers throughout the world in the decades to follow.
In the 1950s, a major move was made to develop chemotherapies for various
cancers. By 1965, the NCI had
instituted a program specifically for drug development with participation
from the NIH, industry, and
universities. The program screened 15,000 new chemicals and natural
products each year for potential
effectiveness.
Still, by the 1970s, there seemed to be a harsh contrast between medical successes against infectious diseases and the lack of such success against cancer. The 1971 report of the National Panel of Consultants on the
Conquest of Cancer (called the
Yarborough Commission) formed the basis of the 1971 National Cancer Act
signed by President Nixon. The
aim of the act was to make the conquest of cancer a national crusade with
an initial financial boost of $500
million (which was allocated under the direction of the long-standing NCI).
The Biomedical Research and
Research Training Amendment of 1978 added basic research and prevention to
the mandate for the continuing
program.
Daughters and sons
The story of the synthetic estrogen, DES, framed cancer as a complex
societal problem and not just an issue
dealing with a particular new source of cancer. The DES story not only crossed generations but also involved interactions among patients, the drug industry, uninformed or wrongly informed physicians, and the political
interests of Congress and the FDA.
DES was prescribed from the early 1940s until 1971 to help prevent certain
complications of pregnancy,
especially those that led to miscarriages. By the 1960s, DES use was
decreasing because of evidence that the
drug lacked effectiveness and might indeed have damaging side effects,
although no ban or general warning to
physicians was issued. According to the University of Pennsylvania Cancer
Center (Philadelphia), there are
few reliable estimates of the number of women who took DES, although one source estimates that 5-10 million women either took the drug during pregnancy or were exposed to it in utero.
In 1970, a study in the journal Cancer described a rare form of
vaginal cancer, clear cell adenocarcinoma (CCAC). The following
year, a study in The New England Journal of Medicine
documented the association between in utero DES exposure and
the development of CCAC. By the end of that year, the FDA
issued a drug bulletin warning of potential problems with DES and
advised against its use during pregnancy. So-called DES daughters
experienced a wide variety of effects including infertility,
reproductive tract abnormalities, and increased risks of vaginal
cancer. More recently, a number of DES sons were also found to
have increased levels of reproductive tract abnormalities. In 1977,
inspired by the tragedies caused by giving thalidomide and DES to
pregnant women, the FDA recommended against including women
of child-bearing potential in the early phases of drug testing except
for life-threatening illnesses.
The discovery of DES in beef from hormone-treated cattle after the
FDA drug warning led in 1979 to what many complained was a
long-delayed ban on its use by farmers. The DES issue was one of
several that helped focus part of the war against cancer as a fight
against environmental carcinogens (see below).
Cancer research/cancer cures
New evidence at the beginning of the decade seemed to promote
an infectious model of cancer development. In 1970, Howard
Martin Temin (at the University of Wisconsin-Madison) and David
Baltimore (at the Massachusetts Institute of Technology; MIT)
independently discovered viral reverse transcriptase, showing that
some RNA viruses (the retroviruses), in their own version of
genetic engineering, were capable of creating DNA copies of
themselves. The viral DNA was able to integrate into the infected
host cell, which then transformed into a cancer cell. Reverse
transcriptase eventually became critical to the study of the AIDS virus
in following decades. Temin and Baltimore shared the 1975 Nobel
Prize in Physiology or Medicine with Renato Dulbecco of the Salk
Institute. Many claims of cancer virus discoveries based on animal
studies came and went early in the decade, but they proved
inapplicable to humans. Hopes of treating the class of diseases
known as cancers with traditional vaccination techniques declined.
Other research developments helped expand knowledge of the
mechanics and causes of cancer. In 1978, for example, the tumor suppressor gene p53 was first observed by David Lane at the University of Dundee. By 1979, it was possible to use DNA from malignant cells to transform cultured mouse cells into tumors, creating an entirely new tool
for cancer study.
Although many treatments for cancer existed at the beginning of the 1970s,
few real cures were available.
Surgical intervention was the treatment of choice for apparently defined
tumors in readily accessible locations.
In other cases, surgery was combined with or replaced by chemotherapy
and/or radiation therapy. Oncology
remained, however, as much an art as a science in terms of actual cures.
Too much depended on too many
variables for treatments to be uniformly applicable or uniformly
beneficial.
There were, however, some obvious successes in the 1970s. Donald Pinkel of
St. Jude's Hospital (Memphis)
developed the first cure for acute lymphoblastic leukemia, a childhood
cancer, by combining chemotherapy
with radiotherapy.
The advent of allogeneic (foreign donor) bone marrow transplants in 1968
made such treatments possible, but
the real breakthrough in using powerful radiation and chemotherapy came
with the development of autologous
marrow transplantation. The method was first used in 1977 to cure patients
with lymphoma. Autologous
transplantation involves removing and usually cryopreserving a patients
own marrow and reinfusing that
marrow after the administration of high-dosage drug or radiation therapy.
Because autologous marrow can
contain contaminating tumor cells, a variety of methods have been
established to attempt to remove or
deactivate them, including antibodies, toxins, and even in vitro
chemotherapy. E. Donnall Thomas of the Fred
Hutchinson Cancer Research Center (Seattle) was instrumental in developing
bone marrow transplants and
received the 1990 Nobel Prize in Physiology or Medicine for his work.
Although bone marrow transplants were originally used primarily to treat
leukemias, by the end of the century,
they were used successfully as part of high-dose chemotherapy regimes for
Hodgkins disease, multiple
myeloma, neuroblastoma, testicular cancer, and some breast cancers.
In 1975, a WHO survey showed that death rates from breast cancer had not
declined since 1900. Radical
mastectomy was ineffective in many cases because of late diagnosis and the
prevalence of undetected
metastases. The search for alternative and supplemental treatments became a
high research priority. In 1975, a
large cooperative American study demonstrated the benefits of using
phenylalanine mustard following surgical
removal of the cancerous breast. Combination therapies rapidly proved even
more effective; and by 1976,
CMF (cyclophosphamide, methotrexate, and 5-fluorouracil) therapy was
developed at the Istituto Nazionale
Tumori in Milan, Italy. It proved to be a radical improvement over surgery
alone and rapidly became the
chemotherapy of choice for this disease.
A new environment
Launched in part by Rachel Carson's book, Silent Spring, in the
previous decade, the environmental movement in the West became
ever more prominent. The first Earth Day was held on April 22,
1970, to raise environmental awareness. The EPA was launched on
December 2, and Nixon closed out the year by signing the Clean Air
Act on December 31.
The concept of carcinogens entered the popular consciousness.
Ultimately, the combination of government regulations and public
fears of toxic pollutants in food, air, and water inspired improved
technologies for monitoring extremely small amounts of chemical
contaminants. Advances were made in gas chromatography, ion
chromatography, and especially the EPA-approved combination of
GC/MS. The 1972 Clean Water Act and the Federal Insecticide, Fungicide, and Rodenticide Act added impetus to the need for instrumentation and
analysis standards. By the 1980s, many of these improvements in
analytical instrumentation had profound effects on the scientific
capabilities of the pharmaceutical industry. One example is atomic
absorption spectroscopy, which in the 1970s made it possible to
assay trace metals in foods to the parts-per-billion range. The new
power of such technologies enabled nutritional researchers to
determine, for the first time, that several trace elements (most usually
considered pollutants) were actually necessary to human health.
These included tin (1970), vanadium (1971), and nickel (1973).
In 1974, the issue of chemical-induced cancer became even broader
when F. Sherwood Rowland of the University of California, Irvine, and Mario Molina (then a postdoctoral researcher in Rowland's laboratory, later of MIT) demonstrated that chlorofluorocarbons
(CFCs) such as Freon could erode the UV-absorbing ozone layer.
The predicted results were increased skin cancer and cataracts,
along with a host of adverse effects on the environment. This
research led to a ban of CFCs in aerosol spray cans in the United
States. Rowland and Molina shared the 1995 Nobel Prize in
Chemistry for their ozone work with Paul Crutzen of the Max Planck
Institute for Chemistry (Mainz, Germany).
By 1977, asbestos toxicity had become a critical issue. Researchers
at the Mount Sinai School of Medicine (New York) discovered that asbestos
inhalation could cause cancer
after a latency period of 20 years or more. This discovery helped lead to
the passage of the Toxic Substances
Control Act, which mandated that the EPA inventory the safety of all
chemicals marketed in the United States
before July 1977 and required manufacturers to provide safety data 90 days
before marketing any chemicals
produced after that date. Animal testing increased where questions existed,
and the issue of chemical
carcinogenicity became prominent in the public mind and in the commercial
sector.
Also in the 1970s, DDT was gradually withdrawn from vector eradication
programs around the world because
of the growing environmental movement that resulted in an outright ban on the product in the United States in 1972. This created a continuing controversy, especially with regard to WHO attempts to eliminate malaria and
sleeping sickness in the developing world. In many areas, however, the
emergence of DDT-resistant insects
already pointed to the eventual futility of such efforts. Although DDT was
never banned completely except by
industrialized nations, its use declined dramatically for these reasons.
Recombinant DNA and more
In 1970, two years before the birth of recombinant DNA (rDNA) technology,
cytogeneticist Robert John
Cecil Harris coined the term "genetic engineering."
But more importantly, in 1970, Werner Arber of the Biozentrum der
Universität Basel (Switzerland) discovered restriction enzymes. Hamilton O. Smith at Johns Hopkins University (Baltimore) verified Arber's
hypothesis with a purified bacterial restriction enzyme and showed that
this enzyme cuts DNA in the middle of
a specific symmetrical sequence. Daniel Nathans, also at Johns Hopkins,
demonstrated the use of restriction
enzymes in the construction of genetic maps. He also developed and applied
new methods of using restriction
enzymes to solve various problems in genetics. The three scientists shared
the 1978 Nobel Prize in Physiology
or Medicine for their work in producing the first genetic map (of the SV40
virus).
In 1972, rDNA was born when Paul Berg of Stanford University demonstrated
the ability to splice together
blunt-end fragments of widely disparate sources of DNA. That same year, Stanley Cohen of Stanford and Herbert Boyer of the University of California, San Francisco, met at a Waikiki Beach delicatessen where they discussed ways to combine plasmid isolation with DNA splicing. They had the idea to combine the use of the restriction enzyme EcoRI (which Boyer had discovered in 1970 and found capable of creating "sticky ends") with DNA ligase (discovered in the late 1960s) to form engineered plasmids capable of producing foreign proteins in bacteria, the basis for
the modern biotechnology industry. By 1973, Cohen and Boyer had produced
their first recombinant plasmids.
They received a patent on this technology for Stanford University that
would become one of the biggest
money-makers in pharmaceutical history.
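The mechanism behind those "sticky ends" can be sketched briefly. The Python fragment below is only an illustration under simplifying assumptions: the plasmid sequence is invented, only the top strand is modeled, and no real construct is implied. EcoRI recognizes the palindrome GAATTC and cuts each strand just after the G, so every cut end carries the same four-base AATT overhang, and ends from entirely different DNA sources can therefore anneal and be sealed by DNA ligase.

# Illustrative sketch of sticky-end cutting; the sequence below is invented.
SITE = "GAATTC"   # EcoRI recognition palindrome
CUT_OFFSET = 1    # EcoRI cuts between G and AATTC on the top strand

def ecori_fragments(seq):
    """Return the top-strand fragments produced by cutting at every GAATTC site."""
    fragments = []
    start = 0
    pos = seq.find(SITE)
    while pos != -1:
        fragments.append(seq[start:pos + CUT_OFFSET])  # fragment ending in ...G
        start = pos + CUT_OFFSET                        # next fragment begins AATTC...
        pos = seq.find(SITE, start)
    fragments.append(seq[start:])
    return fragments

plasmid = "TTACGAATTCGGCATGCATTAGAATTCTTAA"
print(ecori_fragments(plasmid))
# Prints: ['TTACG', 'AATTCGGCATGCATTAG', 'AATTCTTAA']

Because the bottom strand is cut symmetrically within the same palindrome, each fragment end in the double-stranded molecule exposes a complementary AATT overhang; that complementarity is what let engineered plasmids accept foreign DNA and have ligase close the circle.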
The year 1975 was the year of DNA sequencing. Walter Gilbert and Allan
Maxam of Harvard University and
Fred Sanger of Cambridge University simultaneously developed different
methods for determining the
sequence of bases in DNA with relative ease and efficiency. For this accomplishment, Gilbert and Sanger shared the 1980 Nobel Prize in Chemistry.
By 1976, Silicon Valley venture capitalist Robert Swanson teamed up with
Herbert Boyer to form Genentech
Inc. (short for "genetic engineering technology"). It was the harbinger of a wild proliferation of biotechnology companies over the next decades. Genentech's goal of cloning human insulin
in Escherichia coli was achieved
in 1978, and the technology was licensed to Eli Lilly.
In 1977, the first mammalian gene (the rat insulin gene) was cloned into a
bacterial plasmid by Axel Ullrich of
the Max Planck Institute. In 1978, somatostatin was produced using rDNA
techniques.
The recombinant DNA era grew from these beginnings and had a major impact
on pharmaceutical production
and research in the 1980s and 1990s.
High tech/new mech
In June 1970, Raymond V. Damadian at the State University of
New York discovered that cancerous tissue in rats exhibited
dramatically prolonged NMR relaxation times. He also found that the relaxation times of normal tissues vary significantly, although less dramatically than those of cancer tissue. Damadian's March 1971 Science article, "Tumor Detection by Nuclear Magnetic Resonance," became the basis for the pioneering magnetic resonance imaging (MRI) patent issued to him in 1974, which included a
description of a three-dimensional in vivo method for obtaining
diagnostic NMR signals from humans. In 1977, Damadian and
colleagues achieved the first NMR image of a human in a
whole-body MRI scanner. In 1988, Damadian was awarded the
National Medal of Technology. MRI became one of the most
sensitive and useful tools for disease diagnosis, and the basis of
MR spectroscopy, itself one of the most sophisticated in vivo
physiological research tools available by the end of the century.
In 1971, the first computerized axial tomography scanner was installed in England. By 1972, the first whole-body computed tomography
(CT) scanner was marketed by Pfizer. That same year, the
Brookhaven Linac Isotope Producer went on line, helping to
increase the availability of isotopes for medical purposes and basic
research. In 1977, the first use of positron emission tomography
(PET) for obtaining brain images was demonstrated.
The 1970s saw a revolution in computing with regard to speed, size, and
availability. In 1970, Ted Hoff at Intel
invented the first microprocessor. In 1975, the first personal computer,
the Altair, was put on the market by
American inventor Ed Roberts. Also in 1975, William Henry Gates III and
Paul Gardner Allen founded
Microsoft. And in 1976, the prototype for the first Apple Computer
(marketed as the Apple II in 1977) was
developed by Stephen Wozniak and Steven Jobs. It signaled the movement of
personal computing from the
hobbyist to the general public and, more importantly, into pharmaceutical
laboratories where scientists used
PCs to augment their research instruments.
In 1975, Edwin Mellor Southern of the University of Oxford invented a
blotting technique for analyzing
restriction enzyme digest fragments of DNA separated by electrophoresis.
This technique became one of the
most powerful technologies for DNA analysis and manipulation and was the
conceptual template for the
development of northern blotting (for RNA analysis) and western blotting
(for proteins).
Although HPLC systems had been commercially available from ISCO since 1963,
they were not widely used
until the 1970s when, under license to Waters Associates and Varian Associates, the demands of
biotechnology and clinical practice made such systems seem a vibrant new
technology. By 1979,
Hewlett-Packard was offering the first microprocessor-controlled HPLC, a
technology that represented the
move to computerized systems throughout the life sciences and
instrumentation in general. Throughout the
decade, GC and MS became routine parts of life science research, and the
first linkage of LC/MS was offered
by Finnigan. These instruments would have increasing impact throughout the
rest of the century.
(Re)emerging diseases
In 1969, U.S. Surgeon General William Stewart claimed in testifying before
Congress that "the time has come to close the book on infectious diseases." He believed that it was
especially time to reinvest money to treat
killers such as heart disease and cancer, since, in his opinion, it was
only a matter of time before the war
against infection would be won by a combination of antibiotics and
vaccines. The traditional diseases were
indeed on the run before the onslaught of modern medicines. What he could
not foresee was the emergence of
new diseases and the reemergence of old plagues in the guise of
drug-resistant strains. The 1970s helped
throw cold water on this kind of optimism, perhaps signaled by the shift in
emphasis implied by the 1970 name
change of the Communicable Disease Center to the Centers for Disease
Control.
There were certainly enough new diseases and old friends to control. For
example, in 1972, the first cases of
recurrent polyarthritis (Lyme disease) were recorded in Old Lyme and Lyme, CT; the tick-borne disease ultimately spread throughout the hemisphere.
The rodent-borne arenaviruses were identified in the 1960s and shown to be
the causes of numerous diseases
seen since 1934 in both developed and less-developed countries.
Particularly deadly was the newly
discovered Lassa fever virus, first identified in Africa in 1969 and
responsible for small outbreaks in 1970 and
1972 in Nigeria, Liberia, and Sierra Leone, with a mortality rate of some 36-38%.
Then, in 1976, epidemics of a different hemorrhagic fever occurred
simultaneously in Zaire and Sudan.
Fatalities reached 88% in Zaire (now known as the Democratic Republic of
the Congo) and 53% in Sudan,
resulting in a total of 430 deaths. Ebola virus, named after a small river
in northwest Zaire, was isolated from
both epidemics. In 1977, a fatality was attributed to Ebola in a different
area of Zaire. The investigation of this
death led to the discovery that there were probably two previous fatal
cases. A missionary physician
contracted the disease in 1972 while conducting an autopsy on a patient
thought to have died of yellow fever.
In 1979, the original outbreak site in Sudan generated a new episode of
Ebola hemorrhagic fever that resulted
in 34 cases with 22 fatalities. Investigators were unable to discover the
source of the initial infections. The
dreaded nature of the disease (the copious bleeding, the pain, and the lack of a cure) sent ripples of concern throughout the world's medical community.
In 1976, the previously unknown Legionnaires' disease appeared at a convention in Philadelphia, killing 29 American Legion attendees. The cause was identified as the newly
discovered bacterium Legionella. Also in
1976, a Nobel Prize in Physiology or Medicine was awarded to Baruch
Blumberg (originally at the NIH, then
at the University of Pennsylvania) for the discovery of a new disease agent
in 1963: hepatitis B, for which he
helped to develop a blood test in 1971.
Nonetheless, one bit of excellent news at the end of the decade made such
minor outbreaks of new diseases
seem trivial in the scope of human history. From 1350 B.C., when the first
recorded smallpox epidemic
occurred during the Egyptian-Hittite war, to A.D. 180, when a large-scale
epidemic killed between 3.5 and 7
million people (coinciding with the first stages of the decline of the
Roman Empire), through the millions of
Native Americans killed in the 16th century, smallpox was a quintessential
scourge.
But in 1979, a WHO global commission was able to certify the worldwide
eradication of smallpox, achieved
by a combination of quarantines and vaccination. The last known natural
case of the disease occurred in 1977
in Somalia. Government stocks of the virus remain a biological warfare
threat, but the achievement may still, perhaps, be considered the singular event of the Pharmaceutical Century: the disease-control equivalent of landing a man on the moon. By 1982, vaccine production had ceased. By the
1990s, a controversy arose
between those who wanted to maintain stocks in laboratories for medical and
genetic research purposes (and
possibly as protection against clandestine biowarfare) and those who hoped
to destroy the virus forever.
On a lesser but still important note, the first leprosy vaccine using the
nine-banded armadillo as a source was
developed in 1979 by British physician Richard Rees at the National
Institute for Medical Research (Mill Hill,
London).
Toward a healthier world
The elimination of smallpox was just part of a large-scale move in the
1970s to deal with the issue of global
health.
In 1974, the WHO launched an ambitious Expanded Program on Immunization to
protect children from poliomyelitis, measles, diphtheria, whooping cough, tetanus, and tuberculosis.
Throughout the 1970s and the rest of the century, the role of DDT in vector
control continued to be a
controversial issue, especially for the eradication or control of malaria
and sleeping sickness. The WHO would
prove to be an inconsistent ally of environmental groups that urged a ban
of the pesticide. The organization's
recommendations deemphasized the use of the compound at the same time that
its reports emphasized its
profound effectiveness.
Of direct interest to the world pharmaceutical industry, in 1977 the WHO
published the first Model List of
Essential Drugs: 208 individual drugs that could together provide safe, effective treatment for the majority
of communicable and noncommunicable diseases. This formed the basis of a
global movement toward
improved health provision as individual nations adapted and adopted this
list of drugs as part of a program for
obtaining these universal pharmacological desiderata.
That '70s showdown
The decade lurched from the OPEC oil embargoes to Watergate,
through stagflation, world famine, and hostage crises to a final
realization, in 1979, that it all might have started with a comet or
asteroid that crashed into the earth 65 million years before, killing off
the dinosaurs. And that perhaps it might end the same way.
Medical marvels and new technologies promised excitement at the
same time that they revealed more problems to be solved. A new
wave of computing arose in Silicon Valley. Oak Ridge National
Laboratory detected a single atom for the first time (one atom of cesium in the presence of 10^19 argon atoms and 10^18 methane molecules) using lasers. And biotechnology was born as both a
research program and a big business. The environmental movement
contributed not only a new awareness of the dangers of carcinogens,
but a demand for more and better analytical instruments capable of
extending the range of chemical monitoring. These would make their
way into the biomedical field with the demands of the new
biotechnologies.
The pharmaceutical industry would enter the 1980s with one eye on its
pocketbook, to be sure, feeling harried
by economics and a host of new regulations, both safety and environmental.
But the other eye looked to a
world of unimagined possibilities transformed by the new DNA chemistries
and by new technologies for
analysis and computation.
Arteries, AIDS, and engineering (1980s)
Whether the changes to the pharmaceutical industry and the world in the 1980s will prove most notable for the rise of and reaction to a new disease, AIDS, or the flowering of entrepreneurial biotechnology and genetic engineering, it is too soon to say. These changes, along with advances in
immunology, automation, and computers, the development of new paradigms of cardiovascular and other diseases, and restructured social mores, all competed for attention in the transformational 1980s.
AIDS: A new plague
It struck the big cities first, and within those cities, at first, it only
affected certain
segments of the population, primarily homosexual men. The first published
reports
of the new disease seemed like no more than medical curiosities. On June 5,
1981,
the Atlanta-based Centers for Disease Control and Prevention (CDC), a
federal
agency charged with keeping tabs on disease, published an unusual notice in
its
Morbidity and Mortality Weekly Report: the occurrence of Pneumocystis
carinii
pneumonia (PCP) among gay men. In New York, a dermatologist encountered cases of a rare cancer, Kaposi's sarcoma (KS), a disease so obscure he recognized it only from descriptions in antiquated textbooks. By the end of 1981, PCP and KS were recognized as harbingers of a new and deadly disease. The disease was initially called Gay-Related Immune Deficiency. Within a year, similar
symptoms appeared in other
demographic groups, primarily hemophiliacs and users of intravenous drugs.
The CDC renamed the disease
Acquired Immune Deficiency Syndrome (AIDS). By the end of 1983, the CDC had
recorded some 3000
cases of this new plague. The prospects for AIDS patients were not good:
almost half had already died.
AIDS did not follow normal patterns of disease and infection. It produced
no visible symptoms, at least not until the advanced stages of infection. Instead of triggering an immune response, it insidiously destroyed the body's own mechanisms for fighting off infection. People stricken with the
syndrome died of a host of
opportunistic infections such as rare viruses, fungal infections, and
cancers. When the disease began to appear
among heterosexuals, panic and fear increased. Physicians and scientists
eventually mitigated some of the
hysteria when they were able to explain the methods of transmission. As
AIDS was studied, it became clear
that the disease was spread through intimate contact such as sex and
sharing syringes, as well as transfusions
and other exposure to contaminated blood. It could not be spread through
casual contact such as shaking
hands, coughing, or sneezing. In 1984, Robert Gallo of the National Cancer
Institute (NCI) and Luc
Montagnier of the Institut Pasteur proved that AIDS was caused by a virus.
There is still a controversy over
priority of discovery.
However, knowledge of the diseases cause did not mean readiness to combat
the plague. Homophobia and
racism, combined with nationalism and fears of creating panic in the blood
supply market, contributed to
deadly delays before action was taken by any government. The relatively low
number of sufferers skyrocketed
around the world and created an uncontrollable epidemic.
Immunology comes of age
The 1980s were a decade of worldwide interest in immunology, an interest
unmatched since the development
of the vaccine era at the beginning of the century. In 1980, the Nobel
Prize in Physiology or Medicine went to
three scientists for their work in elucidating the genetic basis of the
immune system. Baruj Benacerraf at
Harvard University (Cambridge, MA), George Snell of the Jackson Laboratory
(Bar Harbor, ME), and Jean
Dausset of the University of Paris explored the genetic basis of the immune
response. Their work demonstrated
that histocompatibility antigens (called H-factors or H-antigens)
determined the interaction of the myriad cells
responsible for an immunological response.
Early efforts to study immunology were aimed at understanding the structure
and function of the immune
system, but some scientists looked to immunology to try to understand
diseases that lacked a clear outside
agent. In some diseases, some part of the body appears to be attacked not
by an infectious agent but by the
immune system. Physicians and researchers wondered if the immune system
could cause, as well as defend
against, disease.
By the mid-1980s, it was clear that numerous diseases, including lupus and
rheumatoid arthritis, were
connected to an immune system malfunction. These were called autoimmune
diseases because they were
caused by a patient's own immune system. Late in 1983, juvenile-onset diabetes was shown to be an autoimmune disease in which the body's immune system attacks
insulin-producing cells in the pancreas.
Allergies were also linked to overreactions of the immune system.
By 1984, researchers had discovered an important piece of the puzzle of
immune system functioning. Professor
Susumu Tonegawa and his colleagues discovered how the immune system
recognizes self versus
not-self, a key to immune system function. Tonegawa elucidated the
complete structure of the cellular T cell
receptor and the genetics governing its production. It was already known
that T cells were the keystone of the
entire immune system. Not only do they recognize self and not-self and so
determine what the immune system
will attack, they also regulate the production of B cells, which produce
antibodies. Immunologists regarded this
as a major breakthrough, in large part because the human immunodeficiency
virus (HIV) that causes AIDS was
known to attack T cells. The T cell response is also implicated in other
autoimmune diseases and many cancers
in which T cells fail to recognize not-self cells.
The question remained, however, how the body could possibly contain enough
genes to account for the
bewildering number of immune responses. In 1987, for the third time in the
decade, the Nobel Prize in
Physiology or Medicine went to scientists working on the immune system. As
Tonegawa had demonstrated in
1976, the immune system can produce an almost infinite number of responses,
each of which is tailored to suit
a specific invader. Tonegawa showed that rather than containing a vast array of genes for every possible
pathogen, the immune system reshuffles a small set of genetic elements. Thus a small amount
of genetic information could account for many antibodies.
The immune system relies on the interaction of numerous kinds of cells
circulating throughout the body.
Unfortunately, AIDS was known to target those very cells. There are two principal types of immune cells,
B cells and T cells. Helper T cells direct the production of B cells, which mount an immune response targeted
to a single type of biological or chemical invader. There are also suppressor cells that keep the immune
response in check. In healthy individuals, helpers outnumber suppressors by about two to one. In
immunocompromised individuals, however, helper T cell counts are exceedingly low and, accordingly, the
proportion of suppressors is extremely high. This imbalance appears capable of shutting down
the body's immune response, leaving it vulnerable to infections a healthy body wards off with ease.
Eventually, scientists came to understand the precise mechanism of this process.
Even before 1983, when the viral cause of the disease was determined, the
first diagnostic tests were
developed to detect antibodies related to the disease. Initially, because
people at risk for AIDS were
statistically associated with hepatitis, scientists used the hepatitis core
antibody test to identify people with
hepatitis and, therefore, at risk for AIDS. By 1985, a diagnostic method
had been specifically designed to detect antibodies produced against the low-titer virus (HIV) itself.
Diagnosing infected individuals and protecting the valuable world blood supply spurred the diagnostic effort.
By the late 1980s, under the impetus and fear associated with AIDS,
both immunology and virology received huge increases in research
funding, especially from the U.S. government.
Pharmaceutical technology and AIDS
Knowing the cause of a disease and how to diagnose it is quite a
different matter from knowing how to cure it. Ironically, although by no means a cure,
the pharmaceutical technology initially used to
combat HIV had been discovered some 20 years before AIDS appeared.
Azidothymidine (AZT) was developed in 1964 as an anticancer drug
by Jerome Horowitz of the Michigan Cancer Foundation (Detroit).
But because AZT was ineffective against cancer, Horowitz never
filed a patent.
In 1987, the ultimate approval of AZT as an antiviral treatment for
AIDS was the result of both the hard technology of the laboratory
and the soft technologies of personal outrage and determination (and
deft use of the 1983 Orphan Drug Act). Initially, the discovery of the
viral nature of AIDS resulted in little, if any, R&D in corporate
circles. The number of infected people was considered too small to
justify the cost of new drug development, and most scientists thought
retroviruses were untreatable. However, Sam Broder, a physician
and researcher at the NCI, thought differently. As clinical director of
the NCI's 1984 Special Task Force on AIDS, Broder was
determined to do something. Needing a larger lab, Broder went to
the pharmaceutical industry for support.
As Broder canvassed the drug industry, he promised to test
potentially useful compounds in NCI labs if the companies would
commit to develop and market drugs that showed potential. One of
the companies he approached was Burroughs Wellcome, the
American subsidiary of the British firm Wellcome PLC. Burroughs
Wellcome had worked on nucleoside analogues, a class of antivirals that
Broder thought might work against
HIV. Burroughs Wellcome had successfully brought to market an antiherpes
drug, acyclovir.
Although many companies were reluctant to work on viral agents because of
the health hazards to researchers,
Broder persevered. Finally, Burroughs Wellcome and 50 other companies began
to ship chemicals to the NCI
labs for testing. Each sample was coded with a letter to protect its
identity and to protect each company's
rights to its compounds. In early 1985, Broder and his team found a
compound that appeared to block the
spread of HIV in vitro. Coded Sample S, it was AZT sent by Burroughs
Wellcome.
There is a long road between in vitro efficacy and shipping a drug to
pharmacies, a road dominated by the laborious approval process of the FDA. The agency's mission is to keep
dangerous drugs away from the
American public, and after the thalidomide scare of the late 1950s and
early 1960s, the FDA clung tenaciously
to its policies of caution and stringent testing.
However, with the advent of AIDS, many people began to question that
caution. AZT was risky. It could be
toxic to bone marrow and cause other less drastic side effects such as
sleeplessness, headaches, nausea, and
muscular pain. Even though the FDA advocated the right of patients to
knowingly take experimental drugs, it
was extremely reluctant to approve AZT. Calls were heard to reform or
liberalize the approval process, and a
report issued by the General Accounting Office (GAO) claimed that of 209
drugs approved between 1976
and 1985, 102 had caused serious side effects, giving the lie to the myth that FDA approval automatically
safeguarded the public.
The agendas of patient advocates, ideological conservatives who opposed
government intrusion, and the
pharmaceutical industry converged in opposition to the FDA's caution. But
AIDS attracted more than its share
of false cures, and the FDA rightly kept such things out of the medical
mainstream. Nonetheless, intense public
demand (including protests by AIDS activist groups such as ACT UP) and
unusually speedy testing brought the
drug to the public by the late 1980s. AZT was hard to tolerate and, despite
the misapprehensions of its critics, it
was never thought to be a magic bullet that would cure AIDS. It was a
desperate measure in a desperate time
that at best slowed the course of the disease. AZT was, however, the first
of what came to be a major new
class of antiviral drugs.
Its approval process also had ramifications. The case of AZT became the tip
of the iceberg in a new world
where consumer pressures on the FDA, especially from disease advocacy
groups and their congressional
supporters, would lead to more, and more rapid, approval of experimental drugs for
certain conditions. It was a
controversial change that in the 1990s would create more interest in
alternative medicine, nutritional
supplements, fast-track drugs, and attempts to further weaken the FDA's
role in the name of both consumer
rights and free enterprise.
Computers and pharmaceuticals
In the quest for new and better drugs, genetic and biological technologies
came to the fore in the 1980s. These
included a combination of hard (machine-based) and wet (wet
chemistry-based) technologies. In the early
1980s, pharmaceutical companies hoped that developments in genetic
engineering and other fields of molecular
biology would lead to sophisticated drugs with complicated structures that
could act in ways as complicated
and precise as proteins. Through understanding the three-dimensional
structure and hence the function of
proteins, drug designers interested in genetic engineering hoped they could
create protein-based drugs that
replicated these structures and functions.
Unfortunately for the industry, its shareholders, and sick people who might
have been helped by these elegantly
tailored compounds, it did not turn out that way. By the end of the decade,
it was clear that whatever
economic usefulness there was in molecular biology developments (via
increased efficiency of chemical drugs),
such developments did not yet enable the manufacturing of complex
biologically derived drugs. So while
knowledge of protein structure supplied useful models for cell receptors (such as the CD4 receptors on T
cells, to which drugs might bind), it did not produce genetic wonder drugs.
Concomitant with this interest in molecular biology and genetic engineering
was the development of a new way
of conceiving drug R&D: a new soft technology of innovation. Throughout the
history of the pharmaceutical
industry, discovering new pharmacologically active compounds depended on a
"try and see" empirical approach. Chemicals were tested for efficacy in what was called random
screening or "chemical roulette," names that testify to the haphazard and chancy nature of this approach
to drug discovery. The rise of molecular biology, with its promise of genetic engineering, fostered a new
way of looking at drug design. Instead of an empirical "try and see" method, pharmaceutical designers
began to compare knowledge of human physiology and the causes of medical disorders with knowledge
of drugs and their modes of physiological action in order to conceptualize the right molecules. The ideal
design was then handed to research chemists in the laboratory, who searched for a close match. In this
quest, the biological model of cell receptor and biologically active molecule served as a guide.
The role of hard technologies in biologically based drug research cannot be
overstated. In fact, important drug
discoveries owe a considerable amount to concomitant technological
developments, particularly in imaging
technology and computers. X-ray crystallography, scanning electron
microscopy, NMR, and laser and
magnetic- and optical-based imaging techniques allow the visualization of
atoms within a molecule. This
capability is crucial, as it is precisely this three-dimensional
arrangement that gives matter its chemically and
biologically useful qualities.
Computers were of particular use in this brave new world of drug design,
and their power and capabilities
increased dramatically during the 1980s. This growing computational power enabled
researchers to work through the complex mathematics that describe the
molecular structure of idealized drugs.
Drug designers, in turn, could use the increasingly powerful imaging
capabilities of computers to convert
mathematical protein models into three-dimensional images. Gone were the
days when modelers used sticks,
balls, and wire to create models of limited scale and complexity. In the
1980s, they used computers to
transform mathematical equations into interactive, virtual pictures of
elegant new models made up of thousands
of different atoms.
Since the 1970s, the pharmaceutical industry had been using computers to
design drugs to match receptors,
and it was one of the first industries to harness the steadily increasing
power of computers to design molecules.
Software applications to simulate drugs first became popular in the 1980s,
as did genetics-based algorithms
and fuzzy logic. Research on genetic algorithms began in the 1970s and
continues today. Although not popular
until the early 1990s, genetic algorithms allow drug designers to "evolve" a best fit to a target sequence
through successive generations until a solution is found. Such algorithms have
been used to predict physiological
properties and bioactivity. Fuzzy logic, which formalizes imprecise
concepts by defining degrees of truth or
falsehood, has proven useful in modeling pharmacological action, protein
structure, and receptors.
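
To make the idea concrete, the short Python sketch below shows the bare mechanics of a genetic algorithm evolving candidate sequences toward a target through selection, crossover, and mutation. It is purely illustrative: the target string, the position-match scoring, and all parameters are invented for demonstration and do not reconstruct any actual drug-design software of the period.

import random

# Hypothetical "target sequence" the algorithm should evolve toward (invented example).
TARGET = "ACDEFGHIKLMNPQRS"
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"    # the 20 one-letter amino acid codes
POP_SIZE, MUTATION_RATE, GENERATIONS = 200, 0.05, 500

def fitness(candidate: str) -> int:
    """Score a candidate by how many positions match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def random_candidate() -> str:
    return "".join(random.choice(ALPHABET) for _ in TARGET)

def crossover(parent_a: str, parent_b: str) -> str:
    """Single-point crossover: splice the two parents at a random cut."""
    cut = random.randrange(1, len(TARGET))
    return parent_a[:cut] + parent_b[cut:]

def mutate(candidate: str) -> str:
    """Randomly replace residues at the given mutation rate."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in candidate
    )

population = [random_candidate() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                                    # a perfect fit (solution) was found
    survivors = population[: POP_SIZE // 2]      # selection: keep the fitter half
    population = survivors + [
        mutate(crossover(random.choice(survivors), random.choice(survivors)))
        for _ in range(POP_SIZE - len(survivors))
    ]

print(generation, population[0], fitness(population[0]))

Real systems scored candidates against predicted receptor fit or measured bioactivity rather than a literal target string, but the generate-score-select-vary loop is the same.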
The business of biotechnology
The best drug in the world is worthless if it cannot be developed,
marketed, or manufactured. By the
mid-1980s, small biotechnology firms were struggling for survival, which
led to the formation of mutually
beneficial partnerships with large pharmaceutical companies and a host of
corporate buyouts of the smaller
firms by Big Pharma.
Eli Lilly was one of the big pharma companies that formed early
partnerships with smaller biotech firms.
Beginning in the 1970s, Lilly was one of the first drug companies to enter
into biotechnology research. By the
mid-1980s, Lilly had already put two biotechnology-based drugs into
production: insulin and human growth
hormone. Lilly produced human insulin through recombinant DNA techniques
and marketed it, beginning in
1982, as Humulin. The human genes responsible for producing insulin were
grafted into bacterial cells through a
technique first developed in the production of interferon in the 1970s.
Insulin, produced by the bacteria, was
then purified using monoclonal antibodies. Diabetics no longer had to take
insulin isolated from pigs.
By 1987, Lilly ranked second among all institutions (including
universities) and first among companies (including both large drug
manufacturers and small biotechnology companies) in U.S. patents
for genetically engineered drugs. By the late 1980s, Lilly recognized
the link between genetics, modeling, and computational power and,
already well invested in computer hardware, the company moved
to install a supercomputer.
In 1988, typical of many of the big pharma companies, Lilly formed
a partnership with a small biotechnology company: Agouron
Pharmaceuticals, a company that specialized in three-dimensional
computerized drug design. The partnership gave Lilly expertise in
an area it was already interested in, as well as manufacturing and
marketing rights to new products, while Agouron gained a stable
source of funding. Such partnerships united small firms that had
narrow but potentially lucrative specializations with larger
companies that already had development and marketing structures
in place. Other partnerships between small and large firms allowed
large drug companies to catch up in a part of the industry in which
they were not strongly represented. This commercialization of drug
discovery allowed companies to apply the results of biotechnology
and genetic engineering on an increasing scale in the late 1980s, a
process that continues.
The rise of drug resistance
New developments in the West in the late 1980s had particular implications
for drugs and pharmaceutical
technologies. Diseases that were thought to have been eliminated in
developed countries reappeared in the late
1980s with a frightening twist: they had developed drug resistance.
Tuberculosis in particular experienced a resurgence. In the mid-1980s, the
worldwide decline in tuberculosis
cases leveled off and then began to rise. New cases of tuberculosis were
highest in less developed countries,
and immigration was blamed for the increased number of cases in developed
countries. (However, throughout
the 20th century in the United States, tuberculosis was a constant and
continued health problem for senior
citizens, Native Americans, and the urban and rural poor. In 1981, an
estimated 1 billion people, one-third of the world's population, were infected. So thinking about tuberculosis in
terms of a returned epidemic
obscures the unabated high incidence of the disease worldwide over the
preceding decades.)
The most troubling aspect of the reappearance of
tuberculosis was its resistance to not just one or two drugs,
but to multiple drugs.
Multidrug resistance stemmed from several sources. Every
use of an antibiotic against a microorganism is an instance of
natural selection in action, an evolutionary version of the Red
Queen's race, in which you have to run as fast as you can just
to stay in place. Using an antibiotic kills susceptible
organisms. Yet mutant organisms are present in every
population. If even a single pathogenic organism survives, it
can multiply freely.
Agricultural and medical practices have contributed to
resistant strains of various organisms. In agriculture, animal
feed is regularly and widely supplemented with antibiotics in
subtherapeutic doses. In medical practice, there has been
widespread indiscriminate and inappropriate use of antibiotics
to the degree that hospitals have become reservoirs of
resistant organisms.
Some tuberculosis patients unwittingly fostered
multidrug-resistant tuberculosis strains by failing to comply
with the admittedly lengthy, but required, drug treatment
regimen.
In this context, developing new and presumably more powerful drugs and
technologies became even more
important. One such technology was combinatorial chemistry, a nascent
science at the end of the 1980s.
Combinatorial chemistry sought to find new drugs by, in essence, mixing and
matching appropriate starter
compounds and reagents and then assessing them for suitability. Computers
became increasingly important as the approach matured. Specialized software was developed that could
not only select appropriate starting chemicals but also sort through the staggering number of potential
drugs that resulted.
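
As a rough illustration of that mix-and-match logic, the following Python sketch enumerates every combination of a few hypothetical building blocks and keeps only candidates passing a toy molecular-weight filter. The fragment names, weights, and cutoff are all invented; real library-design software applied far richer chemical rules.

from itertools import product

# Invented building blocks: (name, approximate fragment weight in daltons)
scaffolds = [("scaffold_A", 120.0), ("scaffold_B", 150.0), ("scaffold_C", 210.0)]
linkers   = [("amide", 43.0), ("ester", 44.0), ("ether", 16.0)]
side_r    = [("methyl", 15.0), ("phenyl", 77.0), ("hydroxyl", 17.0)]

MAX_WEIGHT = 320.0   # invented weight cutoff used as a toy suitability filter

library = []
for (s, sw), (l, lw), (r, rw) in product(scaffolds, linkers, side_r):
    weight = sw + lw + rw
    if weight <= MAX_WEIGHT:          # keep only candidates passing the filter
        library.append((f"{s}-{l}-{r}", round(weight, 1)))

print(f"{len(library)} of {len(scaffolds) * len(linkers) * len(side_r)} combinations pass")
for name, weight in library[:5]:
    print(name, weight)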
Prevention, the best cure
Even as the pharmaceutical industry produced ever more sophisticated drugs,
there was a new emphasis in
health care: preventive medicine.
While new drugs were being designed, medicine focused on preventing disease
rather than simply trying to
restore some facsimile of health after disease had developed. The link between
exercise, diet, and health dates back
4000 years or more. More recently, medical texts from the 18th and 19th
centuries noted that active patients
were healthier patients.
In the early 1900s, eminent physician Sir William Osler characterized heart
disease as rare. By the 1980s, in
many Western countries some 30% of all deaths were attributed to heart
disease, and for every two people
who died from a heart attack, another three suffered one but survived.
During this decade, scientists finally
made a definitive connection between heart disease and diet, cholesterol,
and exercise levels.
So what prompted this change in perspective? In part, the answer lies with
the spread of managed care.
Managed care started in the United States before World War II, when
Kaiser-Permanente got its start. Health
maintenance organizations (HMOs), of which Kaiser was and is the archetypal representative, and which
have been controversial throughout, spread during the 1980s as part of an effort to
contain rising medical costs. One way to
keep costs down was to prevent people from becoming ill in the first place.
But another reason for the growth of managed care had to do with a profound
shift in the way that diseases,
particularly diseases of lifestyle, were approached. Coronary heart disease
has long been considered the
emblematic disease of lifestyle. During the 1980s, a causal model of heart
disease that had been around since
the 1950s suddenly became the dominant, if not the only, paradigm for heart
disease. This was the risk factor
approach.
This view of heart disease (its origins, its outcomes, its causes) rests on a set of unquestioned and unstated
assumptions about how individuals contribute to disease. Drawing as much from population studies of the
relationship between heart disease and individual diet, genetic heritage, and habits as from the biochemical
and physiological causes of atherosclerosis, the risk factor approach tended to be more holistic.
Whereas the older view of disease prevention focused on identifying those
who either did not know they were
affected or who had not sought medical attention (for instance,
tuberculosis patients earlier in the century), this
new conceptual technology aimed to identify individuals who were at risk
for a disease. According to the logic
of this approach, everyone was potentially at risk for something, which
provided a rationale for population
screening. It was a new way to understand individual responsibility for and
contribution to disease.
Focusing on risk factors reflected a cultural preoccupation with the
individual and the notion that individuals were responsible for their
own health and illness. As part of a wave of new conservatism
against the earlier paternalism of the Great Society, accepting the risk
factor approach implied that social contributions to disease, such as
the machinations of the tobacco and food industries, poverty, and
work-related sedentary lifestyles, were not to blame for illness.
Individuals were considered responsible for choices and behavior
that ran counter to known risk factors.
While heart disease became a key focus of the 1980s, other
disorders, including breast, prostate, and colon cancer, were
ultimately subsumed by this risk factor approach. AIDS, the ultimate
risk factor disease, became the epitome of this approach. Debates
raged about the disease and related issues of homosexuality,
condoms, abstinence, and needle exchange and tore at the fabric of
society as the 1990s dawned.
Conclusion
The 1980s saw the resurgence of old perils, the emergence of new ones, and
the rapid mobilization of new
biotechnological tools to combat both. The decade also saw the surge and
temporary fallback of
entrepreneurial biotechnology. Throughout the decade, the hard technologies of genetic engineering and
developments in immunology, automation, genomics, and computers, combined with the soft technologies of
personal action, acknowledgment of risk, and changes in government regulations and ways of thinking,
affected the direction of biomedicine and pharmacy. The coming decade would see the explosive growth of
these various trends, into both flowers and weeds.
Harnessing genes, recasting flesh (1990s)
In the 1990s, the Human Genome Project took off like a
rocket, along with many global genomic initiatives aimed at plants,
animals, and
microorganisms. Gene therapy developed, as did the potential of human
cloning
and the use of human tissues as medicine. The new technologies provided
hope
for solving the failures of the old through a new paradigm, one that was more
complex, holistic, and individualized. The changing view became one of medicines
and therapies based on knowledge, not trial and error; on human flesh, not
nature's pharmacy.
The new therapeutic paradigm evolved from an earlier promise of hormone
drugs
and the flowering of intelligent drug design. It grew from an understanding
of
receptors and from breakthroughs in genomics and genetic engineering. It
found
favor in the new power of combinatorial chemistry and the modeling and data
management abilities of the fastest computers. Hope was renewed for a next
generation of magic bullets born of human genes. As the 1990s ended, many
genetically based drugs were in clinical trials and a wealth of genome
sequences, their promise as yet unknown,
led scientists to reason that the 21st century would be the biotech
century.
Computers and combinatorial technology
Robotics and automation allowed researchers to finally break through a host
of constraints on rational drug
design. Achievements in miniaturization in robotics and computer systems
allowed the manipulation of many
thousands of samples and reactions in the time and space where previously
only a few could be performed.
They permitted the final transformation of pharmacology from the tedious,
hit-and-miss science based primarily
on organic synthesis to one based firmly on physiology and complex
biochemistry, allowing explosive
movement into rational drug discovery in both laboratory design and
natural-product surveys. (And even when
the technology remained hit-and-miss because of the lack of a compatible
knowledge base, the sheer quantity
of samples and reactions now testable created efficiencies of scale that
made the random nature of the process
extraordinarily worthwhile.)
Combinatorial chemists produce libraries of chemicals based on the
controlled and sequential modification of
generally immobilized or otherwise tagged chemical starting blocks. These
original moieties are chosen, under
optimal knowledge conditions, for their predicted or possible behavior in a
functional drug, protein, polymer,
or pesticide. Developing the knowledge base for starting materials proved
to be one of the greatest benefits of
computer modeling of receptors and the development of computational
libraries (multiple structures derived
from computer algorithms that analyze and predict potentially useful
sequences from databases of gene or
protein sequences, or structural information from previous drugs). Here the
search for natural products
remained critical for the discovery of new starting places.
Finding new drug starting places, as much as finding drugs that were useful
as is, became important in the
1990s as companies tried to tap into the knowledge base of traditional
medical practitioners in the developing
world through collaboration rather than appropriation. In this manner,
several companies promoted the analysis
of the biodiversity available in tropical rainforests and oceans. This
added value of biodiversity became both
the rallying cry for environmentalists and a point of solidarity for
political muscle-building in the developing
world. As the industrialized world's demand for new drugs (and associated
profits) increased, developing
nations sought to prevent the perceived exploitation of their heritage.
And just as previous sources of drugs from the natural world were not
wholly ignored (even as the human
genome became the grail of modern medical hopes), combinatorial
approaches created a new demand for
the services of traditional organic and analytical chemistry. Although
computer models were beneficial, new
wet chemistry techniques still had to be defined on the basis of the new
discoveries in genomics and
proteomics. They had to be modified for microsystems and mass
production: the triumph of the microtiter
plate over the flask, the test tube, and the beaker.
Drugs were still chemical products after all.
High-throughput screening
The vast increase in the number of potential drugs produced through
combinatorial methods created a new
bottleneck in the system: screening and evaluating these libraries, which
held hundreds of thousands of
candidates. To conduct high-throughput screening (HTS) on this excess of
riches, new and more automated
assays were pressed into service. Part of this move into HTS was due to the
burgeoning of useful bioassays,
which permitted screening by individual cells and cell components, tissues,
engineered receptor proteins,
nucleic acid sequences, and immunologicals. New forms of assays
proliferated, and so did opportunities for
evaluating these assays quickly through new technologies.
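
The data-handling side of HTS can be suggested with a minimal Python sketch: given simulated fluorescence readings from a single 96-well plate, flag wells whose signal rises well above the plate's own background. The readings, the spiked "hit" positions, and the three-standard-deviation cutoff are invented for illustration.

import random
import statistics

# Simulate a 96-well plate of fluorescence readings: mostly background noise,
# with a few wells spiked to represent active ("hit") compounds.
random.seed(1)
plate = [random.gauss(100.0, 10.0) for _ in range(96)]
for hit_well in (7, 42, 88):                 # invented positions of active compounds
    plate[hit_well] += 120.0

mean = statistics.mean(plate)
stdev = statistics.pstdev(plate)
threshold = mean + 3 * stdev                 # a simple 3-sigma hit criterion

hits = [(well, round(signal, 1)) for well, signal in enumerate(plate)
        if signal > threshold]
print(f"threshold = {threshold:.1f}")
print("candidate hits:", hits)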
Researchers continued to move away from radioactivity in bioassays,
automated sequencing, and synthesis by
using various new tagging molecules to directly monitor cells, substrates,
and reaction products by fluorescence
or phosphorescence. Fluorescence made it possible to examine, for the first
time, the behavior of single
molecules or molecular species in in vivo systems. Roger Tsien and
colleagues, for example, constructed
variants of the standard green fluorescent protein for use in a
calmodulin-based chimera to create a
genetically based fluorescent indicator of Ca2+. They called this marker
"yellow cameleon" and used it in
transgenic Caenorhabditis elegans to follow calcium metabolism during
muscle contraction in living organisms.
Of particular promise was the use of such light technologies with DNA
microarrays, which allowed
quantitative analysis and comparison of gene expression by multicolor
spectral imaging. Many genes differ in their levels of expression, especially between cancerous and normal
cells, and microarray techniques showed promise in discovering these differentially expressed genes.
Microarrays thus became a basic
research tool and a highly promising candidate for HTS in drug development.
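
A minimal sketch of the underlying arithmetic, with invented intensities: for each spotted gene, the two channel signals (say, tumor and normal) are reduced to a log2 ratio, and genes beyond a chosen fold-change cutoff are reported as differentially expressed.

import math

# Invented two-channel intensities for a handful of spots: (gene, tumor, normal).
spots = [
    ("geneA", 5200.0, 1250.0),
    ("geneB",  980.0, 1010.0),
    ("geneC",  310.0, 2600.0),
    ("geneD", 1500.0, 1480.0),
]

FOLD_CUTOFF = 2.0                                 # report changes of 2-fold or more
log_cutoff = math.log2(FOLD_CUTOFF)

for gene, tumor, normal in spots:
    ratio = math.log2(tumor / normal)             # log2 ratio of the two channels
    if abs(ratio) >= log_cutoff:
        direction = "up" if ratio > 0 else "down"
        print(f"{gene}: log2 ratio {ratio:+.2f} ({direction} in tumor)")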
Genomics meets proteomics
Knowledge of details of the genetic code, first learned during the 1960s,
achieved practical application during
the 1990s on a previously unimaginable scale. Moving from gene to protein
and back again provided an
explosion of information as the human (and other) genome projects racked up
spectacular successes. Planned
at the end of the 1980s, the U.S. Human Genome Project and the international Human Genome
Organization (HUGO) led the
way. Started first as a collection of government and university
collaborations, the search for the human genome
was rapidly adopted by segments of industry. The issue of patenting human,
plant, and animal genes would be
a persistent controversy.
Inspired by this new obsession with genomics, the 1990s may ultimately be
best known for the production of
the first complete genetic maps. The first full microorganism genome was
sequenced in 1995 (Haemophilus influenzae, by Craig Venter and colleagues at The Institute for Genomic
Research). This achievement was followed rapidly by the genome sequencing of Saccharomyces cerevisiae
(baker's yeast) in 1996; Escherichia coli, Borrelia burgdorferi, and Helicobacter pylori in 1997; the
nematode C. elegans in 1998;
and the first sequenced human chromosome (22) in 1999. The entrance of
industry into the race to sequence
the human genome at the tail end of the decade sped up the worldwide
effort, albeit amid controversy.
Hot on the heels of the genomics blastoff was the development of
proteomics, the science of analyzing, predicting, and using the
proteins produced from the genes and from the cellular processing
performed on these macromolecules before they achieve full
functionality in cells.
Both proteomics and genomics rely on bioinformatics to be useful.
Bioinformatics is essentially the computerized storage and analysis of
biological data, from standard gene sequence databases (such as the
online repository GenBank maintained by the NIH) to complex fuzzy
logic systems such as GRAIL (developed in 1991 by Edward
Uberbacher of Oak Ridge National Laboratory). GRAIL and more
than a dozen other programs were used to find prospective genes in
genomic databases such as GenBank by employing various pattern
recognition techniques.
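
GRAIL itself combined several statistical "sensors" and neural networks, but the simplest pattern-recognition step in gene finding can be sketched as an open-reading-frame scan, as in the Python fragment below. It is illustrative only (the DNA string is invented), not a reconstruction of GRAIL.

# Invented DNA fragment; a real scan would run over genomic database entries.
dna = "TTATGAAATTTGGGCCCAAATAGCCATGGCTGCTGCTGCTGCTGCTGCTGCTGCTTAACC"

START, STOPS = "ATG", {"TAA", "TAG", "TGA"}
MIN_CODONS = 5                                    # toy length filter for "gene-like" ORFs

def find_orfs(seq: str):
    """Return (start, end, length-in-codons) for simple same-strand ORFs."""
    orfs = []
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for idx, codon in enumerate(codons):
            if codon == START and start is None:
                start = idx                       # remember the first start codon
            elif codon in STOPS and start is not None:
                if idx - start >= MIN_CODONS:     # keep only sufficiently long frames
                    orfs.append((frame + 3 * start, frame + 3 * idx + 3, idx - start))
                start = None
    return orfs

for begin, end, length in find_orfs(dna):
    print(f"candidate ORF at {begin}-{end}, {length} codons")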
Pattern recognition techniques were also at the heart of the new
DNA microarrays discussed above, and they were increasingly used
to detect comparative patterns of gene transcription in cells under
various conditions and states (including diseased vs healthy).
Human biotechnology
In September 1990, the first human gene therapy was started by W.
French Anderson at NIH in an attempt to cure adenosine deaminase
(ADA) deficiency (referred to as bubble-boy syndrome) by
inserting the correct gene for ADA into an afflicted four-year-old girl.
Although the treatment did not provide a
complete cure, it did allow the young patient to live a more normal life
with supplemental ADA injections.
Other attempts at gene therapy also remained more promising than
successful. Clinical trials on humans were
disappointing compared with the phenomenal successes in mice, although
limited tumor suppression did occur
in some cancers, and there were promising reports on the treatment of
hemophilia. Jesse Gelsinger, a teenager
who suffered from the life-threatening liver disorder ornithine
transcarbamylase deficiency, volunteered for
adenovirus-delivered gene therapy at a University of Pennsylvania clinical
trial in 1999. His subsequent death
sent a shock wave through the entire research community, exposed apparent
flaws in regulatory protocols and
compliance, and increased public distrust of one more aspect of harnessing
genes.
Using gene products as drugs, however, was a different story. From
recombinant human insulin sold in the
1980s to the humanized antibodies of the 1990s, the successes of harnessing
the human genome, whether directly (in the case of the gene product) or by using antisense
techniques as inhibitors of human genes (in 1998, fomivirsen, used to treat cytomegalovirus, became the
first approved antisense therapeutic), proved
tempting to the research laboratories of most major pharmaceutical
companies. Many biotechnology
medicines (from erythropoietin, tumor necrosis factor, dismutases, growth hormones, and interferons to
interleukins and humanized monoclonal antibodies) entered clinical trials
throughout the decade.
Beginning in the 1990s, stem cell therapy held great promise. This
treatment uses human cells to repair and
ameliorate inborn or acquired medical conditions, from Parkinson's disease
and diabetes to traumatic spinal
paralysis. By 1998, embryonic stem cells could be grown in vitro, which
promised a wealth of new
opportunities for this precious (and controversial) resource.
Promising too were the new forms of tissue engineering for therapeutic
purposes. Great strides were made in
tissue, organ, and bone replacements. The demand for transplants, growing
at 15% per year by the end of the
decade, led to the search for appropriate artificial or animal substitutes.
Cartilage repair systems, such as
Carticel by Genzyme Tissue Repair, became commonplace. In the Carticel process, patients' cells shipped
to the company were cultured and subsequently reimplanted.
Second-generation products permitted
autologous cells to be cultured on membranes, allowing tissue formation in
vitro. Several companies focused
on developing orthobiologics: proteins, such as growth factors, that
stimulate the patient's ability to
regenerate tissues.
Epicel, a graft made from autologous cells, was also developed by
Genzyme Tissue Repair to replace the skin of burn victims with
greater than 50% skin damage. In 1998, Organogenesis introduced
the first FDA-approved, ready-to-order Apligraf human skin
replacement, which was made of living human epidermal
keratinocytes and dermal fibroblasts. Also undergoing research in
1999 was Vitrix soft tissue replacement, which was made of
fibroblasts and collagen. By the end of the decade, artificial liver
systems (which work outside the body) were developed as
temporary blood cleansers providing detoxification and various
digestion-related processes. In many cases, such treatments allowed
the patient's own liver to regenerate during the metabolic rest.
Such uses of cells and tissues raised numerous ethical questions,
which were galvanized in the media by the 1996 arrival of the clonal
sheep Dolly. Reaction against the power of the new biotechnology
was not restricted to fears of dehumanizing humanity through
"xeroxing." The possibility of routine xenotransplantation (using
animal organs as replacements in humans) came to the fore with
advances in immunology and genetic engineering that promised the
ability to humanize animal tissues (specifically, those of pigs) in ways
similar to the development of humanized antibodies in mice.
The issue of xenotransplantation not only raised fears of new
diseases creeping into the human population from animal donors, but
was seen by some as a further degradation of human dignity either
intrinsically or through the misuse of animals. The animal rights lobby
throughout the decade argued passionately against the use of animals
for human health purposes.
The Red Queen's race
Continuing the problems seen in the 1980s, old and new diseases
were increasingly immune to the array of weapons devised against
them. Like Alice in Through the Looking Glass, drug researchers
had to run as fast as they could just to keep up in the race against
bacterial resistance to traditional antibiotics (see Chapter 7). As the
decade progressed, more diseases became untreatable with the
standard suite of drugs. Various streptococcal infections, strains of
tuberculosis bacteria, pathogenic E. coli, gonorrhea, and the
so-called flesh-eating bacteria (necrotizing fasciitis, most commonly
caused by group A streptococcus) all became immune to previously
successful magic bullets. Patients died who earlier would have lived.
As the problem manifested, pharmaceutical, software, and instrument
companies alike turned to combinatorial chemistry and HTS technologies in
an effort to regain the racing edge.
AIDS remained a profoundly disturbing example of the failure of technology
to master a disease, despite the
incredible advances in understanding its biology that took place in the
1990s. Vaccine efforts occupied much
of the popular press, and genetically engineered viral fragments seemed to
be the best hope. But the
proliferation of viral strains erased hope for a single, easy form of
vaccination. Resistance to AZT therapy
increased, and even the new protease inhibitors and so-called drug
cocktails developed in the 1990s proved
to be only stopgap measures as viral strains appeared that were resistant
to everything thrown at them.
Individual lives were prolonged, and the death rate in Western countries,
where the expensive new drugs were
available, dropped precipitously. But the ultimate solution to AIDS had not
been found, nor even a
countermeasure to its spread in the developing world and among poor
populations of industrialized nations.
The middle of the decade saw a resurgence of the old plague (bubonic) in
India, and even polio remained a
problem in the developing world. In Africa, Ebola resurfaced in scattered
outbreaks, although it was nothing
compared with the continental devastation caused by AIDS. In the rest of
the world, fears of biological
warfare raised by the Gulf War continued. Vaccination efforts were stepped
up for many diseases. The
Assembly of the World Health Organization set a global goal in 1990 of a
95% reduction in measles deaths in
1995 compared with pre-immunization levels. By the deadline, estimated
global coverage for measles vaccine
had reached 78%, at the same time as the industrialized world experienced a
backlash against vaccination
because of concerns about adverse side effects.
Vaccine technology continued to improve with the development of recombinant
vaccines for several diseases,
new efforts to produce vaccines for old scourges such as malaria, and new
nasal delivery systems that
stimulated the mucosal-associated antibody system.
DNA vaccines (the injection of engineered plasmids into human cells to stimulate antigen production and
immunization) were first described in a 1989 patent and published in 1990 by Wolff, Malone, Felgner, and
colleagues. They entered clinical trials in 1995. Although one editor of Bio/Technology called this the
"Third Vaccine Revolution," by the end of the decade the reality of this expansive
claim remained in doubt, especially
because of the continuing debate over genetic engineering.
Efforts to develop food-based vaccines through the production of
transgenics engineered with the appropriate
antigens continued. This research stimulated studies on mucosal immunity
and efforts to enable complex
proteins to cross the gut-blood barrier.
Even with these technological developments, the decade ended with the
negatives of ever-expanding disease
problems, exacerbated by predictions that global warming would lead to new
epidemics of insectborne and
tropical diseases. However, a note of optimism remained that rational
design and automated production
technologies would ultimately be able to defeat these diseases.
High tech and new mech
To improve the front end of rational drug design and to foster the use and
growth of knowledge bases in
genomics and proteomics, many old and new technologies were adapted to
pharmaceutical use in the 1990s.
In the 1990s, the use of mass spectrometry (MS) for bioanalysis underwent a renaissance, with
improvements such as ultrahigh-performance MS using Fourier-transform ion
cyclotron resonance (FT-ICR
MS) and tandem-in-time (multidimensional) MS for biological macromolecules.
Improved techniques such as
peak-parking (reducing the column flow rate into the mass spectrometer the
instant a peak is detected, to
allow samples to be split and analyzed by multiple MS nearly
simultaneously) added several dimensions that
were previously impossible. These changes improved the ability to analyze
the complex mixtures required in
studies of cellular metabolism and gene regulation. Results from
multidimensional runs were analyzed by
increasingly sophisticated bioinformatics programs and used to improve
their knowledge base. In combination
with HPLC and various capillary electrophoretic systems, MS became part of
a paradigm for pharmaceutical
R&D as a valuable new approach for identifying drug targets and protein
function.
Similarly, the development of multidimensional NMR techniques,
especially those using more powerful instruments (e.g., 500-MHz
NMR) opened the door to solving the structure of proteins and
peptides in aqueous environments, as they exist in biological systems.
The new NMR techniques allowed observations of the physical
flexibility of proteins and the dynamics of their interactions with other
molecules, a huge advantage in studies of a protein's biochemical
function, especially in receptors and their target molecules (including
potential drugs).
By viewing the computer-generated three-dimensional structure of
the protein, which was made possible by the data gathered from
these instruments, the way in which a ligand fits into a protein's active
site could be directly observed and studied for the first time. The
three-dimensional structure provided information about biological
function, including the catalysis of reaction and binding of molecules
such as DNA, RNA, and other proteins. In drug design, ligand
binding by a target protein was used to induce the ultimate effects of
interest, such as cell growth or cell death.
By using new technologies to study the structure of the target protein
in a disease and learn about its active or ligand-binding sites, rational
drug design sought to design inhibitors or activators that elicited a
response. This correlation between structure and biological function
(known as the structure-activity relationship, or SAR) became a
fundamental underpinning of the revolution in bioinformatics. In the 1990s,
the SAR was the basis by which
genomics and proteomics were translated into pharmaceutical products.
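
A toy quantitative illustration of the SAR idea, in Python: fit a linear relationship between a few invented molecular descriptors and measured activity by least squares, then use the fitted coefficients to estimate the activity of an untested candidate. All descriptors, activities, and the candidate are fabricated; real SAR and QSAR models of the decade were far more elaborate.

import numpy as np

# Invented training set: each row holds two descriptors for a compound,
# e.g., a hydrophobicity-like term and a size-like term; activities are fabricated.
descriptors = np.array([
    [1.2, 210.0],
    [2.5, 305.0],
    [0.8, 180.0],
    [3.1, 340.0],
    [1.9, 260.0],
])
activity = np.array([5.1, 6.8, 4.6, 7.4, 6.0])   # e.g., a -log(IC50)-style measure, invented

# Ordinary least squares: activity ~ X @ coefficients, with an intercept column appended.
X = np.hstack([descriptors, np.ones((len(descriptors), 1))])
coefficients, *_ = np.linalg.lstsq(X, activity, rcond=None)

# Predict the activity of an untested (equally invented) candidate compound.
candidate = np.array([2.0, 290.0, 1.0])
predicted = candidate @ coefficients
print("fitted coefficients:", np.round(coefficients, 3))
print("predicted activity for candidate:", round(float(predicted), 2))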
The new Big Pharma
Ultimately, the biggest change in the pharmaceutical industry, enabled by
the progression of technologies
throughout the century and culminating in the 1990s, was the aforementioned
transformation from a
hit-and-miss approach to rational drug discovery in both laboratory design
and natural-product surveys.
A new business atmosphere, first seen in the 1980s and institutionalized in
the 1990s, revealed itself. It was
characterized by mergers and takeovers, and by a dramatic increase in the
use of contract research
organizations, not only for clinical development but even for basic R&D.
Big Pharma confronted a new
business climate and new regulations, born in part from dealing with world
market forces and protests by
activists in developing countries.
Marketing changed dramatically in the 1990s, partly because of a new
consumerism. The Internet made
possible the direct purchase of medicines by drug consumers and of raw
materials by drug producers,
transforming the nature of business. Direct-to-consumer advertising
proliferated on radio and TV because of
new FDA regulations in 1997 that liberalized requirements for the
presentation of risks of medications on
electronic media compared with print.
The phenomenal demand for nutritional supplements and so-called alternative
medicines created both new
opportunities and increased competition in the industry, which led to major
scandals in vitamin price-fixing
among some of the most respected, or at least some of the biggest, drug
corporations. So powerful was the
new consumer demand that it represented one of the few times in recent
history that the burgeoning power of
the FDA was thwarted when the agency attempted to control nutritional
supplements as drugs. (The FDA
retained the right to regulate them as foods.)
Promise and peril
At the start of the Pharmaceutical Century, the average life expectancy of
Americans was 47. At century's
end, the average child born in the United States was projected to live to
76. As the 1900s gave way to the
2000s, biotechnology provided the promise of even more astounding advances
in health and longevity. But
concomitant with these technological changes was a sea change in the vision
of what it was to be a human
being.
In the 19th century, the natural world was the source of most medicines.
With the dawn of magic bullets in the
20th century, complex organic chemistry opened up a world of drugs created
in the laboratory, either modified
from nature or synthesized de novo. As the Pharmaceutical Century
progressed, from the first knowledge of
human hormones to the discovery of the nature of genes and the tools for
genetic engineering, the modern
paradigm saw a recasting of what human flesh was for: suddenly it was a
source of medicines, tissues, and
patentable knowledge. Humankind, not the natural world, became the
hoped-for premier source of drug
discovery.
With genes and chemicals suddenly capable of manipulating the warp
and woof of the human loom, both mind and body alike, the
human pattern seemed to become fluid to design. According to
pessimists, even if biotechnology were not abused to create
superhumans, pharmaceuticals and health care could become the
greatest differentiators of human groups in history, not by genetic
races, but by economic factors. The new knowledge was found in
the highest and most expensive technologies, and often in the hands
of those more interested in patents than panaceas.
Yet with this peril of inequality comes the promise of human
transformation for good. According to optimists,
biotechnologyespecially the use of transgenic plants and animals
for the production of new drugs and vaccines, xenotransplantation,
and the likepromises cheaper, more universal health care.
Meanwhile, lifestyle drugs (pharmaceuticals for nonacute conditions
such as sterility, impotence, and baldness) have also emerged as a
fast-growing category.
From aspirin to Herceptin (a monoclonal antibody that blocks the
overexpressed HER2 receptor in breast cancer patients), from herbal
medicines to transgenic plants, from horse
serum to xenotransplantation, from
animal insulin to recombinant human growth hormone, the Pharmaceutical
Century was one of transformation.
It is too soon to predict what pattern the process will weave, but the
human loom has gone high tech. The 21st
century will be a brave new tapestry.