Malaria is one of the most common infectious diseases and an enormous public health
problem. The disease is caused by protozoan parasites of the genus Plasmodium. Only
four species of the Plasmodium parasite can infect humans; the most serious forms of the
disease are caused by Plasmodium falciparum and Plasmodium vivax, but the related
species Plasmodium ovale and Plasmodium malariae can also affect humans. This group
of human-pathogenic Plasmodium species is usually referred to as the malaria parasites.
Malaria parasites are transmitted by female Anopheles mosquitoes. The parasites multiply
within red blood cells, causing symptoms of anemia (light-headedness, shortness of
breath, tachycardia, etc.), as well as other general symptoms such as fever, chills, nausea,
flu-like illness and, in severe cases, coma and death. Malaria
transmission can be reduced by preventing mosquito bites with mosquito nets and insect
repellents, or by mosquito control measures such as spraying insecticides inside houses
and draining standing water where mosquitoes lay their eggs.
No vaccine is currently available for malaria, although several are under development;
preventative drugs must be taken continuously to reduce the risk of infection. These
prophylactic drug treatments are often too expensive for most people living in endemic
areas. Most adults in endemic areas have a degree of recurrent long-term infection and
also possess partial immunity (resistance); this resistance wanes with time, and such
adults may become susceptible to severe malaria if they have spent a significant amount
of time in non-endemic areas. They are strongly advised to take full precautions if they
return to an endemic area. Malaria infections are treated through the
use of antimalarial drugs, such as quinine or artemisinin derivatives, although drug
resistance is increasingly common.
Malaria causes about 400–900 million cases of fever and approximately one to three
million deaths annually[18][19] — this represents at least one death every 30 seconds. The
vast majority of cases occur in children under the age of 5 years;[20] pregnant women are
also especially vulnerable. Despite efforts to reduce transmission and increase treatment,
there has been little change in which areas are at risk of this disease since 1992.[21]
Indeed, if the prevalence of malaria stays on its present upwards course, the death rate
could double in the next twenty years.[18] Precise statistics are unknown because many
cases occur in rural areas where people do not have access to hospitals or the means to
afford health care. Consequently, the majority of cases are undocumented.[18]
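The "one death every 30 seconds" figure follows from the annual death toll by simple arithmetic. A minimal sketch of that calculation, assuming the quoted range of one to three million deaths per year:

```python
# Average interval between malaria deaths, given an annual death toll.
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 31.6 million seconds

def seconds_per_death(deaths_per_year):
    """Average number of seconds between deaths at a given annual toll."""
    return SECONDS_PER_YEAR / deaths_per_year

# One to three million deaths per year, as quoted above:
print(seconds_per_death(1_000_000))  # ~31.6 s -> "at least one death every 30 seconds"
print(seconds_per_death(3_000_000))  # ~10.5 s at the upper end of the estimate
```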
Although co-infection with HIV and malaria does cause increased mortality, this is less
of a problem than with HIV/tuberculosis co-infection, due to the two diseases usually
attacking different age-ranges, with malaria being most common in the young and active
tuberculosis most common in the old.[22] Although HIV/malaria co-infection produces
less severe symptoms than the interaction between HIV and TB, HIV and malaria do
contribute to each other's spread. This effect comes from malaria increasing viral load
and HIV infection increasing a person's susceptibility to malaria infection.[23]
Malaria is presently endemic in a broad band around the equator, in areas of the
Americas, many parts of Asia, and much of Africa; however, it is in sub-Saharan Africa
where 85–90% of malaria fatalities occur.[24] The geographic distribution of malaria
within large regions is complex, and malaria-afflicted and malaria-free areas are often
found close to each other.[25] In drier areas, outbreaks of malaria can be predicted with
reasonable accuracy by mapping rainfall.[26] Malaria is more common in rural areas than
in cities; this is in contrast to dengue fever where urban areas present the greater risk.[27]
For example, the cities of Vietnam, Laos and Cambodia are essentially malaria-free, but
the disease is present in many rural regions.[28] By contrast, in Africa malaria is present in
both rural and urban areas, though the risk is lower in the larger cities.[29] The global
endemic levels of malaria have not been mapped since the 1960s. However, the
Wellcome Trust, UK, has funded the Malaria Atlas Project[30] to rectify this, providing a
more contemporary and robust means with which to assess current and future malaria
disease burden.
Socio-economic effects
Malaria is not just a disease commonly associated with poverty, but is also a cause of
poverty and a major hindrance to economic development. The disease has been
associated with major negative economic effects on regions where it is widespread. A
comparison of average per capita GDP in 1995, adjusted to give parity of purchasing
power, between malarious and non-malarious countries demonstrates a fivefold
difference ($1,526 USD versus $8,268 USD). Moreover, in countries where malaria is
common, average per capita GDP has risen (between 1965 and 1990) only 0.4% per year,
compared to 2.4% per year in other countries.[31] However, correlation does not
demonstrate causation, and the prevalence is at least partly because these regions do not
have the financial capacities to prevent malaria. In its entirety, the economic impact of
malaria has been estimated to cost Africa $12 billion USD every year. The economic
impact includes costs of health care, working days lost due to sickness, days lost in
education, decreased productivity due to brain damage from cerebral malaria, and loss of
investment and tourism.[20] In some countries with a heavy malaria burden, the disease
may account for as much as 40% of public health expenditure, 30-50% of inpatient
admissions, and up to 50% of outpatient visits.[32]
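The quoted economic figures can be verified with basic arithmetic. A sketch, using only the GDP values and growth rates stated above:

```python
# Per capita GDP (1995, purchasing-power adjusted), as quoted in the text.
gdp_malarious = 1526
gdp_non_malarious = 8268
print(gdp_non_malarious / gdp_malarious)  # ~5.4, the "fivefold difference"

# Compound growth over 1965-1990 (25 years) at the quoted annual rates.
years = 25
growth_malarious = (1 + 0.004) ** years  # 0.4% per year -> ~10% total growth
growth_other = (1 + 0.024) ** years      # 2.4% per year -> ~81% total growth
print(growth_malarious, growth_other)
```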
Symptoms
Symptoms of malaria include fever, shivering, arthralgia (joint pain), vomiting, anemia
(caused by hemolysis), hemoglobinuria, and convulsions. There may be a feeling of
tingling in the skin, particularly with malaria caused by P. falciparum.[citation needed] The
classical symptom of malaria is cyclical occurrence of sudden coldness followed by rigor
and then fever and sweating lasting four to six hours, occurring every two days in P.
vivax and P. ovale infections, and every three days for P. malariae.[33] P. falciparum can
have recurrent fever every 36-48 hours or a less pronounced and almost continuous fever.
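The classical fever patterns correspond to fixed cycle lengths: a two-day (48-hour) cycle for P. vivax and P. ovale and a three-day (72-hour) cycle for P. malariae. A minimal sketch of the expected paroxysm schedule, assuming a perfectly regular cycle (real cycles are often irregular, as noted above for P. falciparum):

```python
def paroxysm_times(cycle_hours, n, start=0):
    """Hours (from the first episode) at which the next n fever paroxysms are expected."""
    return [start + cycle_hours * i for i in range(n)]

print(paroxysm_times(48, 4))  # P. vivax / P. ovale: [0, 48, 96, 144]
print(paroxysm_times(72, 4))  # P. malariae: [0, 72, 144, 216]
```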
For reasons that are poorly understood, but which may be related to high intracranial
pressure, children with malaria frequently exhibit abnormal posturing, a sign indicating
severe brain damage.[34] Malaria has been found to cause cognitive impairments,
especially in children. It causes widespread anemia during a period of rapid brain
development, as well as direct brain damage. This neurologic damage results from cerebral
malaria to which children are more vulnerable.[35]
Severe malaria is almost exclusively caused by P. falciparum infection and usually arises
6-14 days after infection.[36] Consequences of severe malaria include coma and death if
untreated—young children and pregnant women are especially vulnerable. Splenomegaly
(enlarged spleen), severe headache, cerebral ischemia, hepatomegaly (enlarged liver),
hypoglycemia, and hemoglobinuria with renal failure may occur. Renal failure may cause
blackwater fever, where hemoglobin from lysed red blood cells leaks into the urine.
Severe malaria can progress extremely rapidly and cause death within hours or days.[36] In
the most severe cases of the disease fatality rates can exceed 20%, even with intensive
care and treatment.[37] In endemic areas, treatment is often less satisfactory and the overall
fatality rate for all cases of malaria can be as high as one in ten.[38] Over the longer term,
developmental impairments have been documented in children who have suffered
episodes of severe malaria.[39]
Chronic malaria is seen in both P. vivax and P. ovale, but not in P. falciparum. Here, the
disease can relapse months or years after exposure, due to the presence of latent parasites
in the liver. Describing a case of malaria as cured by observing the disappearance of
parasites from the bloodstream can therefore be deceptive. The longest incubation period
reported for a P. vivax infection is 30 years.[36] Approximately one in five of P. vivax
malaria cases in temperate areas involve overwintering by hypnozoites (i.e., relapses
begin the year after the mosquito bite).[40]
Causes
A Plasmodium sporozoite traverses the cytoplasm of a mosquito midgut epithelial cell in
this false-color electron micrograph.
Malaria parasites
Only female mosquitoes feed on blood, thus males do not transmit the disease. The
females of the Anopheles genus of mosquito prefer to feed at night. They usually start
searching for a meal at dusk, and will continue throughout the night until taking a meal.
Malaria parasites can also be transmitted by blood transfusions, although this is rare.[47]
Pathogenesis
The life cycle of malaria parasites in the human body: a mosquito infects a person by
taking a blood meal. First, sporozoites enter the bloodstream and migrate to the liver.
They infect liver cells (hepatocytes), where they multiply into merozoites, rupture the
liver cells, and escape back into the bloodstream. Then, the merozoites infect red blood
cells, where they develop into ring forms, then trophozoites (a feeding stage), then
schizonts (a reproduction stage), then back into merozoites. Sexual forms called
gametocytes are also produced, which if taken up by a mosquito will infect the insect and
continue the life cycle.
Some P. vivax and P. ovale sporozoites do not immediately develop into exoerythrocytic-
phase merozoites, but instead produce hypnozoites that remain dormant for periods
ranging from several months (6–12 months is typical) to as long as three years. After a
period of dormancy, they reactivate and produce merozoites. Hypnozoites are responsible
for long incubation and late relapses in these two species of malaria.[50]
The parasite is relatively protected from attack by the body's immune system because for
most of its human life cycle it resides within the liver and blood cells and is relatively
invisible to immune surveillance. However, circulating infected blood cells are destroyed
in the spleen. To avoid this fate, the P. falciparum parasite displays adhesive proteins on
the surface of the infected blood cells, causing the blood cells to stick to the walls of
small blood vessels, thereby sequestering the parasite from passage through the general
circulation and the spleen.[51] This "stickiness" is the main factor giving rise to
hemorrhagic complications of malaria. High endothelial venules (the smallest branches of
the circulatory system) can be blocked by the attachment of masses of these infected red
blood cells. The blockage of these vessels gives rise to syndromes such as placental and
cerebral malaria. In cerebral malaria the sequestered red blood cells can breach the blood
brain barrier possibly leading to coma.[52]
Although the red blood cell surface adhesive proteins (called PfEMP1, for Plasmodium
falciparum erythrocyte membrane protein 1) are exposed to the immune system they do
not serve as good immune targets because of their extreme diversity; there are at least 60
variations of the protein within a single parasite and perhaps limitless versions within
parasite populations.[51] Like a thief changing disguises or a spy with multiple passports,
the parasite switches between a broad repertoire of PfEMP1 surface proteins, thus staying
one step ahead of the pursuing immune system.
Some merozoites turn into male and female gametocytes. If a mosquito pierces the skin
of an infected person, it potentially picks up gametocytes within the blood. Fertilization
and sexual recombination of the parasite occurs in the mosquito's gut, thereby defining
the mosquito as the definitive host of the disease. New sporozoites develop and travel to
the mosquito's salivary gland, completing the cycle. Pregnant women are especially
attractive to the mosquitoes,[53] and malaria in pregnant women is an important cause of
stillbirths, infant mortality and low birth weight,[54] particularly in P. falciparum infection,
but also in other species infection, such as P. vivax.[55]
Thalassaemias
Another well documented set of mutations found in the human genome associated with
malaria are those involved in causing blood disorders known as thalassaemias. Studies in
Sardinia and Papua New Guinea have found that the gene frequency of β-thalassaemias is
related to the level of malarial endemicity in a given population. A study on more than
500 children in Liberia found that those with β-thalassaemia had a 50% decreased chance
of getting clinical malaria. Similar studies have found links between gene frequency and
malaria endemicity in the α+ form of α-thalassaemia. Presumably these genes have also
been selected in the course of human evolution.
Duffy antigens
The Duffy antigens are antigens expressed on red blood cells and other cells in the body
acting as a chemokine receptor. The expression of Duffy antigens on blood cells is
encoded by Fy genes (Fya, Fyb, Fyc etc.). Plasmodium vivax malaria uses the Duffy
antigen to enter blood cells. However, it is possible to express no Duffy antigen on red
blood cells (Fy-/Fy-). This genotype confers complete resistance to P. vivax infection.
The genotype is very rare in European, Asian and American populations, but is found in
almost all of the indigenous population of West and Central Africa.[57] This is thought to
be due to very high exposure to P. vivax in Africa in the last few thousand years.
HLA and interleukin-4
HLA-B53 is associated with low risk of severe malaria. This MHC class I molecule
presents liver-stage and sporozoite antigens to T cells. Interleukin-4, encoded by IL4, is
produced by activated T cells and promotes proliferation and differentiation of antibody-
producing B cells. A study of the Fulani of Burkina Faso, who have both fewer malaria
attacks and higher levels of antimalarial antibodies than do neighboring ethnic groups,
found that the IL4-524 T allele was associated with elevated antibody levels against
malaria antigens, which raises the possibility that this might be a factor in increased
resistance to malaria.[58]
Diagnosis
Blood smear from a P. falciparum culture (K1 strain). Several red blood cells have ring
stages inside them. Close to the center there is a schizont and on the left a trophozoite.
Symptomatic diagnosis
Areas that cannot afford even simple laboratory diagnostic tests often use only a history
of subjective fever as the indication to treat for malaria. Using Giemsa-stained blood
smears from children in Malawi, one study showed that unnecessary treatment for
malaria was significantly decreased when clinical predictors (rectal temperature, nailbed
pallor, and splenomegaly) were used as treatment indications, rather than the current
national policy of using only a history of subjective fevers (sensitivity increased from
21% to 41%).[60]
From the thick film, an experienced microscopist can detect parasite levels (or
parasitemia) as low as 0.0000001% of red blood cells. Diagnosis of species can
be difficult because the early trophozoites ("ring form") of all four species look identical
and it is never possible to diagnose species on the basis of a single ring form; species
identification is always based on several trophozoites.
Field tests
In areas where microscopy is not available, or where laboratory staff are not experienced
at malaria diagnosis, there are antigen detection tests that require only a drop of blood.[62]
Immunochromatographic tests (also called malaria rapid diagnostic tests, antigen-capture
assays or "dipsticks") have been developed, distributed and field-tested. These
tests use finger-stick or venous blood, the completed test takes a total of 15-20 minutes,
and a laboratory is not needed. The threshold of detection by these rapid diagnostic tests
is in the range of 100 parasites/µl of blood, compared to 5 by thick film microscopy. The
first rapid diagnostic tests used P. falciparum glutamate dehydrogenase (pGluDH) as the
antigen.[63] pGluDH was soon replaced by P. falciparum lactate dehydrogenase (pLDH),
a 33 kDa oxidoreductase [EC 1.1.1.27]. This is the last enzyme of the glycolytic pathway,
essential for ATP generation, and one of the most abundant enzymes expressed by P.
falciparum. pLDH does not persist in the blood but clears about the same time as the parasites
following successful treatment. The lack of antigen persistence after treatment makes the
pLDH test useful in predicting treatment failure. In this respect, pLDH is similar to
pGluDH. The OptiMAL-IT assay can distinguish between P. falciparum and P. vivax
because of antigenic differences between their pLDH isoenzymes. OptiMAL-IT will
reliably detect falciparum down to 0.01% parasitemia and non-falciparum down to 0.1%.
Paracheck-Pf will detect parasitemias down to 0.002% but will not distinguish between
falciparum and non-falciparum malaria. Parasite nucleic acids are detected using
polymerase chain reaction. This technique is more accurate than microscopy. However, it
is expensive, and requires a specialized laboratory. Moreover, levels of parasitemia are
not necessarily correlative with the progression of disease, particularly when the parasite
is able to adhere to blood vessel walls. Therefore more sensitive, low-tech diagnosis tools
need to be developed in order to detect low levels of parasitaemia in the field.
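The detection limits above are stated in two different units: parasites per microlitre of blood and parasitemia as a percentage of infected red blood cells. Assuming a typical density of about 5 × 10^6 red blood cells per microlitre (an assumed figure; the text does not state one), the two units can be interconverted. On that assumption, the 100 parasites/µl rapid-test threshold corresponds to 0.002% parasitemia:

```python
RBC_PER_UL = 5_000_000  # assumed typical red blood cell count per microlitre of blood

def parasites_per_ul_to_percent(parasites_per_ul):
    """Parasite density (parasites/µl) -> parasitemia as % of red blood cells."""
    return parasites_per_ul / RBC_PER_UL * 100

def percent_to_parasites_per_ul(percent):
    """Parasitemia (% of red blood cells) -> parasite density (parasites/µl)."""
    return percent / 100 * RBC_PER_UL

print(parasites_per_ul_to_percent(100))   # RDT threshold -> 0.002 (%)
print(percent_to_parasites_per_ul(0.01))  # OptiMAL-IT falciparum limit -> 500.0 parasites/µl
```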
Molecular methods
Molecular methods are available in some clinical laboratories and rapid real-time assays
(for example, QT-NASBA based on the polymerase chain reaction)[65] are being
developed with the hope of being able to deploy them in endemic areas.
Treatment
Active malaria infection with P. falciparum is a medical emergency requiring
hospitalization. Infection with P. vivax, P. ovale or P. malariae can often be treated on an
outpatient basis. Treatment of malaria involves supportive measures as well as specific
antimalarial drugs. When properly treated, someone with malaria can expect a complete
recovery.[67]
Antimalarial drugs
There are several families of drugs used to treat malaria. Chloroquine is very cheap and,
until recently, was very effective, which made it the antimalarial drug of choice for many
years in most parts of the world. However, resistance of Plasmodium falciparum to
chloroquine has spread recently from Asia to Africa, making the drug ineffective against
the most dangerous Plasmodium strain in many affected regions of the world. In those
areas where chloroquine is still effective it remains the first choice. Unfortunately,
chloroquine-resistance is associated with reduced sensitivity to other drugs such as
quinine and amodiaquine.[68]
There are several other substances that are used for treatment and, in some cases, for
prevention (prophylaxis). Many drugs may be used for both purposes; larger doses are
used to treat cases of malaria. Their deployment depends mainly on the frequency of
resistant parasites in the area where the drug is used. One drug currently being
investigated for possible use as an anti-malarial, especially for treatment of drug-resistant
strains, is the beta blocker propranolol. Propranolol has been shown to block
Plasmodium's ability to enter red blood cells and establish an infection, as well as parasite
replication. A December 2006 study by Northwestern University researchers suggested
that propranolol may reduce the dosages required for existing drugs to be effective
against P. falciparum by 5- to 10-fold, suggesting a role in combination therapies.[69]
The development of drugs was facilitated when Plasmodium falciparum was successfully
cultured.[71] This allowed in vitro testing of new drug candidates.
Extracts of the plant Artemisia annua, containing the compound artemisinin or its
semi-synthetic derivatives (substances unrelated to quinine), offer efficacy rates of over 90%, but
their supply is not meeting demand.[72] In 2007, the Bill & Melinda Gates Foundation
contributed $13.6m to support research at the University of York to develop fast and
high-yield strains of artemisia, with researchers predicting an increase in yield of up to
1000% compared to current varieties.[73] One study in Rwanda showed that children with
uncomplicated P. falciparum malaria demonstrated fewer clinical and parasitological
failures on post-treatment day 28 when amodiaquine was combined with artesunate,
rather than administered alone (OR = 0.34). However, increased resistance to
amodiaquine during this study period was also noted.[74] Since 2001 the World Health
Organization has recommended using artemisinin-based combination therapy (ACT) as
first-line treatment for uncomplicated malaria in areas experiencing resistance to older
medications. The most recent WHO treatment guidelines for malaria recommend four
different ACTs. While numerous countries, including most African nations, have adopted
the change in their official malaria treatment policies, cost remains a major barrier to
ACT implementation. Because ACTs cost up to twenty times as much as older
medications, they remain unaffordable in many malaria-endemic countries. The
molecular target of artemisinin is controversial, although recent studies suggest that
SERCA, a calcium pump in the endoplasmic reticulum, may be associated with
artemisinin resistance.[75] Malaria parasites can develop resistance to artemisinin and
resistance can be produced by mutation of SERCA.[76] However, other studies suggest the
mitochondrion is the major target for artemisinin and its analogs.[77]
In February 2002, the journal Science and other press outlets[78] announced progress on a
new treatment for infected individuals. A team of French and South African researchers
had identified a new drug they were calling "G25".[79] It cured malaria in test primates by
blocking the ability of the parasite to copy itself within the red blood cells of its victims.
In 2005 the same team of researchers published their research on achieving an oral form,
which they refer to as "TE3" or "te3".[80] As of early 2006, there is no information in the
mainstream press as to when this family of drugs will become commercially available.
In 1996, Professor Geoff McFadden stumbled upon the work of British biologist Ian
Wilson, who had discovered that the plasmodia responsible for causing malaria retained
parts of chloroplasts,[81] organelles usually found in plants, complete with their own
functioning genomes. This led Professor McFadden to the realisation that any number of
herbicides might in fact be successful in the fight against malaria, and so he set about
trialling large numbers of them, enjoying a 75% success rate.
These "apicoplasts" are thought to have originated through the endosymbiosis of algae[82]
and play a crucial role in fatty acid bio-synthesis in plasmodia.[83] To date, 466 proteins
have been found to be produced by apicoplasts[84] and these are now being looked at as
possible targets for novel anti-malarial drugs.
Although effective anti-malarial drugs are on the market, the disease remains a threat to
people living in endemic areas who have no proper and prompt access to effective drugs.
Access to pharmacies and health facilities, as well as drug costs, are major obstacles.
Médecins Sans Frontières estimates that the cost of treating a malaria-infected person in
an endemic country was between US$0.25 and $2.40 per dose in 2002.[85]
Prevention
Methods used to prevent the spread of disease, or to protect individuals in areas where
malaria is endemic, include prophylactic drugs, mosquito eradication, and the prevention
of mosquito bites. The continued existence of malaria in an area requires a combination
of high human population density, high mosquito population density, and high rates of
transmission from humans to mosquitoes and from mosquitoes to humans. If any of these
is lowered sufficiently, the parasite will sooner or later disappear from that area, as
happened in North America, Europe and much of the Middle East. However, unless the
parasite is eliminated from the whole world, it could become re-established if conditions
revert to a combination that favors the parasite's reproduction. Many countries are seeing
an increasing number of imported malaria cases due to extensive travel and migration.
(See Anopheles.)
There is currently no vaccine that will prevent malaria, but this is an active field of
research.
Many researchers argue that prevention of malaria may be more cost-effective than
treatment of the disease in the long run, but the capital costs required are out of reach of
many of the world's poorest people. Economic adviser Jeffrey Sachs estimates that
malaria can be controlled for US$3 billion in aid per year. It has been argued that, in
order to meet the Millennium Development Goals, money should be redirected from
HIV/AIDS treatment to malaria prevention, which for the same amount of money would
provide greater benefit to African economies.[90]
Brazil, Eritrea, India, and Vietnam have, unlike many other developing nations,
successfully reduced the malaria burden. Common success factors included conducive
country conditions, a targeted technical approach using a package of effective tools, data-
driven decision-making, active leadership at all levels of government, involvement of
communities, decentralized implementation and control of finances, skilled technical and
managerial capacity at national and sub-national levels, hands-on technical and
programmatic support from partner agencies, and sufficient and flexible financing.[91]
Vector control
Before DDT, malaria was also successfully eradicated or controlled in several tropical
areas by removing or poisoning the breeding grounds of the mosquitoes or the aquatic
habitats of the larval stages, for example by filling in or applying oil to places with standing
water. These methods have seen little application in Africa for more than half a
century.[92]
On December 21, 2007, a study published in PLoS Pathogens found that the hemolytic C-
type lectin CEL-III from Cucumaria echinata, a sea cucumber found in the Bay of
Bengal, impaired the development of the malaria parasite when produced by transgenic
mosquitoes.[96][97] This could potentially be used one day to control malaria by using
genetically modified mosquitoes refractory to the parasites, although the authors of the
study recognize that there are numerous scientific and ethical problems to be overcome
before such a control strategy could be implemented.
Prophylactic drugs
Several drugs, most of which are also used for treatment of malaria, can be taken
preventively. Generally, these drugs are taken daily or weekly, at a lower dose than
would be used for treatment of a person who had actually contracted the disease. Use of
prophylactic drugs is seldom practical for full-time residents of malaria-endemic areas,
and their use is usually restricted to short-term visitors and travelers to malarial regions.
This is due to the cost of purchasing the drugs, negative side effects from long-term use,
and because some effective anti-malarial drugs are difficult to obtain outside of wealthy
nations.
Quinine was used starting in the seventeenth century as a prophylactic against malaria.
The development of more effective alternatives such as quinacrine, chloroquine, and
primaquine in the twentieth century reduced the reliance on quinine. Today, quinine is
still used to treat chloroquine resistant Plasmodium falciparum, as well as severe and
cerebral stages of malaria, but is not generally used for prophylaxis. It is of historical
note that Samuel Hahnemann observed in the late 18th century that over-dosing of
quinine leads to a symptomatic state very similar to that of malaria itself. This led
Hahnemann to develop the Law of Similars and the subsequent medical system of
homeopathy.
Indoor residual spraying (IRS) is the practice of spraying insecticides on the interior
walls of homes in malaria-affected areas. After feeding, many mosquito species rest on a
nearby surface while digesting the blood meal, so if the walls of dwellings have been
coated with insecticides, the resting mosquitoes will be killed before they can bite another
victim and transfer the malaria parasite.
The first and historically the most popular insecticide used for IRS is DDT. While it was
initially used exclusively to combat malaria, its use quickly spread to agriculture. In
time, pest-control, rather than disease-control, came to dominate DDT use, and this large-
scale agricultural use led to the evolution of resistant mosquitoes in many regions. During
the 1960s, awareness of the negative consequences of its indiscriminate use increased,
ultimately leading to bans on agricultural applications of DDT in many countries in the
1970s.
Though DDT has never been banned for use in malaria control and there are several other
insecticides suitable for IRS, some advocates have claimed that bans are responsible for
tens of millions of deaths in tropical countries where DDT had once been effective in
controlling malaria. Furthermore, most of the problems associated with DDT use stem
specifically from its industrial-scale application in agriculture, rather than its use in
public health.[98]
The World Health Organization (WHO) currently advises the use of 12 different
insecticides in IRS operations. These include DDT and a series of alternative insecticides
(such as the pyrethroids permethrin and deltamethrin) to both combat malaria in areas
where mosquitoes are DDT-resistant, and to slow the evolution of resistance.[99] This
public health use of small amounts of DDT is permitted under the Stockholm Convention
on Persistent Organic Pollutants (POPs), which prohibits the agricultural use of DDT.[100]
However, because of its legacy, many developed countries discourage DDT use even in
small quantities.[101]
Mosquito nets help keep mosquitoes away from people, and thus greatly reduce the
infection and transmission of malaria. The nets are not a perfect barrier, so they are often
treated with an insecticide designed to kill the mosquito before it has time to search for a
way past the net. Insecticide-treated nets (ITN) are estimated to be twice as effective as
untreated nets,[90] and offer greater than 70% protection compared with no net.[102] Since
the Anopheles mosquitoes feed at night, the preferred method is to hang a large "bed net"
above the center of a bed such that it drapes down and covers the bed completely.
For maximum effectiveness, the nets should be re-impregnated with insecticide every six
months. This process poses a significant logistical problem in rural areas. New
technologies like Olyset or DawaPlus allow for production of long-lasting insecticidal
mosquito nets (LLINs), which release insecticide for approximately 5 years,[103] and cost
about US$5.50. ITNs have the advantage of protecting people sleeping under the net and
simultaneously killing mosquitoes that contact the net. This has the effect of killing the
most dangerous mosquitoes. Some protection is also provided to others, including people
sleeping in the same room but not under the net.
Unfortunately, the cost of treating malaria is high relative to income, and the illness
results in lost wages; the cost of a mosquito net is therefore often unaffordable to people
in developing countries, especially those most at risk. Only about 1 in 20 people in
Africa owns a bed net.[90] Although the nets are shipped into Africa, mainly from Europe,
as free development aid, they quickly become valuable trade goods. Many are used for
fishing instead: by combining hundreds of donated mosquito nets, whole river sections
can be completely shut off, catching even the smallest fish.[104]
A study among Afghan refugees in Pakistan found that treating top-sheets and chaddars
(head coverings) with permethrin has similar effectiveness to using a treated net, but is
much cheaper.[105]
A new approach, announced in Science on June 10, 2005, uses spores of the fungus
Beauveria bassiana, sprayed on walls and bed nets, to kill mosquitoes. While some
mosquitoes have developed resistance to chemical insecticides, they have not been found
to develop resistance to fungal infections.[106]
Vaccination
Vaccines for malaria are under development, with no completely effective vaccine yet
available. The first promising studies demonstrating the potential for a malaria vaccine
were performed in 1967 by immunizing mice with live, radiation-attenuated sporozoites,
providing protection to about 60% of the mice upon subsequent injection with normal,
viable sporozoites.[107] Since the 1970s, there has been considerable effort to develop
similar vaccination strategies in humans. It was determined that an individual can be
protected from a P. falciparum infection if they receive over 1000 bites from infected,
irradiated mosquitoes.[108]
It has been generally accepted that it is impractical to provide at-risk individuals with this
vaccination strategy, but that has recently been challenged by the work of Dr. Stephen
Hoffman of Sanaria, one of the key researchers who originally sequenced the genome of
Plasmodium falciparum. His most recent work has revolved around solving the logistical
problem of isolating and preparing parasites equivalent to 1,000 irradiated mosquitoes
for mass storage and inoculation of human beings. The company
has recently received several multi-million dollar grants from the Bill & Melinda Gates
Foundation and the U.S. government to begin early clinical studies in 2007 and 2008.[109]
The Seattle Biomedical Research Institute (SBRI), funded by the Malaria Vaccine
Initiative, assures potential volunteers that "the [2009] clinical trials won't be a life-
threatening experience. While many volunteers [in Seattle] will actually contract malaria,
the cloned strain used in the experiments can be quickly cured, and does not cause a
recurring form of the disease." "Some participants will get experimental drugs or
vaccines, while others will get placebo."[110]
Instead, much work has been performed to try to understand the immunological
processes that provide protection after immunization with irradiated sporozoites. After
the mouse vaccination study in 1967,[107] it was hypothesized that the injected sporozoites
themselves were being recognized by the immune system, which was in turn creating
antibodies against the parasite. It was determined that the immune system was creating
antibodies against the circumsporozoite protein (CSP) which coated the sporozoite.[111]
Moreover, antibodies against CSP prevented the sporozoite from invading hepatocytes.
[112]
CSP was therefore chosen as the most promising protein on which to develop a
vaccine against the malaria sporozoite. It is for these historical reasons that vaccines
based on CSP are the most numerous of all malaria vaccines.
The first vaccine to undergo field trials was SPf66, developed by Manuel Elkin Patarroyo
in 1987. It combines antigens from the sporozoite (using CS repeats) and merozoite
stages. During phase I trials it demonstrated a 75% efficacy rate and appeared to be well
tolerated and immunogenic. The phase IIb and III trials were less promising, with efficacy
falling to between 38.8% and 60.2%. A trial carried out in Tanzania in 1993 demonstrated
31% efficacy after a year's follow-up, while the most recent (though controversial) study
in the Gambia showed no effect. Despite the relatively long trial periods and the number
of studies carried out, it is still not known how the SPf66 vaccine confers immunity; it
therefore remains an unlikely solution to malaria. The CSP vaccine was the next to appear
promising enough to undergo trials. It is also based on the circumsporozoite protein, but
additionally has the recombinant (Asn-Ala-Pro15Asn-Val-Asp-Pro)2-Leu-Arg(R32LR)
protein covalently bound to a purified Pseudomonas aeruginosa toxin (A9). At an early
stage, however, a complete lack of protective immunity was demonstrated in those
inoculated: the study group used in Kenya had an 82% incidence of parasitaemia, while
the control group had an 89% incidence. The vaccine was also intended to raise a
T-lymphocyte response in those exposed, but this was not observed either.
The efficacy of Patarroyo's vaccine has been disputed: some US scientists concluded in
The Lancet (1997) that "the vaccine was not effective and should be dropped", while
Patarroyo accused them of "arrogance", attributing their assertions to the fact that he
came from a developing country.
The RTS,S/AS02A vaccine is the candidate furthest along in vaccine trials. It is being
developed by a partnership between the PATH Malaria Vaccine Initiative (a grantee of
the Gates Foundation), the pharmaceutical company GlaxoSmithKline, and the Walter
Reed Army Institute of Research.[115] In the vaccine, a portion of CSP has been fused to
the immunogenic "S antigen" of the hepatitis B virus; this recombinant protein is injected
alongside the potent AS02A adjuvant.[113] In October 2004, the RTS,S/AS02A researchers
announced results of a Phase IIb trial, indicating the vaccine reduced infection risk by
approximately 30% and severity of infection by over 50%. The study looked at over
2,000 Mozambican children.[116] More recent testing of the RTS,S/AS02A vaccine has
focused on the safety and efficacy of administering it earlier in infancy: In October 2007,
the researchers announced results of a phase I/IIb trial conducted on 214 Mozambican
infants between the ages of 10 and 18 months in which the full three-dose course of the
vaccine led to a 62% reduction of infection with no serious side-effects save some pain at
the point of injection.[117] Further research will delay this vaccine from commercial
release until around 2011.[118]
Other methods
Education in recognising the symptoms of malaria has reduced the number of cases in
some areas of the developing world by as much as 20%. Recognising the disease in the
early stages can also stop it from becoming a killer. Education can also teach people to
cover areas of stagnant, still water, e.g. water tanks, which are ideal breeding grounds
for the mosquito, thus cutting the risk of transmission between people. This is mostly
put into practice in urban areas, where large centres of population are confined in a
small space and transmission is most likely.
The Malaria Control Project is currently using downtime computing power donated by
individual volunteers around the world (see Volunteer computing and BOINC) to
simulate models of the health effects and transmission dynamics in order to find the best
method or combination of methods for malaria control. This modeling is extremely
computer intensive due to the simulations of large human populations with a vast range
of parameters related to biological and social factors that influence the spread of the
disease. It is expected to take a few months using volunteered computing power
compared to the 40 years it would have taken with the current resources available to the
scientists who developed the program.[119]
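The kind of transmission-dynamics modelling described above can be illustrated with a
drastically simplified Ross-Macdonald-style sketch (the project's actual models are far
richer, simulating large populations with many biological and social parameters). All
parameter values and function names here are illustrative assumptions:

```python
# Minimal sketch of a Ross-Macdonald-style malaria transmission model.
# Parameter values are illustrative assumptions, not calibrated estimates.

def simulate(days, m=10.0, a=0.3, b=0.1, c=0.5, r=0.02, g=0.1,
             x0=0.01, y0=0.01):
    """Iterate the coupled infected fractions of humans (x) and mosquitoes (y).

    m: mosquitoes per human, a: daily bite rate,
    b: mosquito-to-human infectivity, c: human-to-mosquito infectivity,
    r: human recovery rate, g: mosquito death rate.
    Uses a simple daily Euler step, clamped to [0, 1].
    """
    x, y = x0, y0
    for _ in range(days):
        dx = m * a * b * y * (1 - x) - r * x   # new human infections - recoveries
        dy = a * c * x * (1 - y) - g * y       # new mosquito infections - deaths
        x = min(max(x + dx, 0.0), 1.0)
        y = min(max(y + dy, 0.0), 1.0)
    return x, y
```

With these (assumed) parameters the basic reproduction number exceeds one, so the
infected human fraction climbs from 1% toward an endemic equilibrium over a simulated
year, which is the qualitative behaviour such control models probe when comparing
interventions.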
Antimalarial drug
Antimalarial drugs are designed to prevent or cure malaria. Some antimalarial agents,
particularly chloroquine and hydroxychloroquine, are also used in the treatment of
rheumatoid arthritis and lupus-associated arthritis. Many such drugs are currently on the
market; quinine is the oldest and best known. Two types of antimalarial drug are
distinguished: prophylactic drugs, which are taken as prevention and require continuous
administration to reduce the risk of infection, and therapy drugs, which are taken once a
person is already infected.
Prophylactic drugs
Quinine
Quinine has a long history, stretching from Peru and the discovery of the cinchona tree
and the potential uses of its bark to the present day and a collection of derivatives that are
still frequently used in the prevention and treatment of malaria. Quinine is an alkaloid
that acts as a blood schizonticide and weak gametocide against Plasmodium vivax and
Plasmodium malariae. As an alkaloid, it accumulates in the food vacuoles of
Plasmodium species, especially Plasmodium falciparum. It acts by inhibiting hemozoin
biocrystallization, thus facilitating the aggregation of cytotoxic heme. Quinine is less
effective and more toxic as a blood schizonticide than chloroquine; however, it is still
very effective and widely used in the treatment of acute cases of severe P. falciparum
malaria. It is especially useful in areas known to have high levels of resistance to
chloroquine, mefloquine and sulfa drug combinations with pyrimethamine. Quinine is
also used in post-exposure treatment of individuals returning from an area where malaria
is endemic.
The treatment regimen of quinine is complex and is determined largely by the parasite's
level of resistance and the reason for drug therapy (i.e. acute treatment or prophylaxis).
The World Health Organization recommendation for quinine is 8 mg/kg three times daily
for 3 days (in areas where the level of adherence is questionable) or for 7 days (where
parasites are sensitive to quinine). In areas where there is an increased level of resistance
to quinine, 8 mg/kg three times daily for 7 days is recommended, combined with
doxycycline, tetracycline or clindamycin. Doses can be given by the oral, intravenous or
intramuscular route; the recommended method depends on the urgency of treatment and
the available facilities (i.e. sterilised needles for IV or IM injections).
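As a worked example, the WHO quinine regimen above can be expressed as a small dose
calculation. This helper and its names are this sketch's own invention, for illustration
only, not a clinical tool:

```python
# Hypothetical helper for the WHO quinine regimen described above:
# 8 mg/kg three times daily, for a 3-day or 7-day course.
# Illustration only -- not clinical guidance.

def quinine_course_mg(weight_kg, days):
    """Total quinine (mg) for a course at 8 mg/kg three times daily."""
    if days not in (3, 7):
        raise ValueError("the WHO regimen uses a 3- or 7-day course")
    per_dose_mg = 8 * weight_kg
    return per_dose_mg * 3 * days

# A 60 kg adult on the 7-day course receives 480 mg per dose,
# three times daily: 480 * 3 * 7 = 10,080 mg in total.
```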
Other Alkaloids
Quinimax and quinidine are the two alkaloids related to quinine most commonly used in
the treatment or prevention of malaria. Quinimax is a combination of four alkaloids
(quinine, quinidine, cinchonine and cinchonidine). This combination has been shown in
several studies to be more effective than quinine alone, supposedly due to a synergistic
action between the four cinchona derivatives. Quinidine is a direct derivative of quinine;
it is a diastereoisomer and thus has similar anti-malarial properties to the parent
compound. Quinidine is recommended only for the treatment of severe cases of malaria.
Chloroquine
Chloroquine was until recently the most widely used anti-malarial. It was the original
prototype from which most other treatments are derived. It is also the least expensive,
best tested and safest of all available drugs. The emergence of drug-resistant parasitic
strains is rapidly decreasing its effectiveness; however, it is still the first-line drug of
choice in most sub-Saharan African countries. It is now suggested that it be used in
combination with other antimalarial drugs to extend its effective usage. A closely related
drug, nivaquine (chloroquine phosphate), is also available; popular preparations based on
this compound include Chloroquine FNA, Resochin and Dawaquin.
Children and adults should receive 25 mg of chloroquine per kg, given over 3 days. A
pharmacokinetically superior regimen, recommended by the WHO, involves an initial
dose of 10 mg/kg followed 6-8 hours later by 5 mg/kg, then 5 mg/kg on each of the
following 2 days. For chemoprophylaxis, 5 mg/kg/week (single dose) or 10 mg/kg/week
divided into 6 daily doses is advised. Chloroquine is only recommended as a prophylactic
drug in regions affected only by P. vivax and sensitive P. falciparum strains. Chloroquine
has been used in the treatment of malaria for many years, and no abortifacient or
teratogenic effects have been reported during this time; it is therefore considered very
safe to use during pregnancy. However, itching can occur at intolerable levels.
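A quick arithmetic check confirms that the WHO split schedule delivers the same
25 mg/kg total as the plain 3-day regimen. The code below is a sketch with names of its
own invention:

```python
# The WHO chloroquine schedule above, in mg/kg:
# 10 initially, 5 after 6-8 hours, then 5 on each of the next 2 days.
WHO_SPLIT_REGIMEN = [10, 5, 5, 5]

def total_dose_mg(weight_kg, regimen=WHO_SPLIT_REGIMEN):
    """Total chloroquine (mg) delivered by a per-kg regimen."""
    return sum(dose_per_kg * weight_kg for dose_per_kg in regimen)

# Both schedules give 25 mg/kg overall; for a 50 kg patient
# that is 1,250 mg of chloroquine over the 3 days.
```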
Amodiaquine
Amodiaquine should be given in doses between 25 mg/kg and 35 mg/kg over 3 days, in a
similar method to that used for chloroquine administration. Adverse reactions are
generally similar in severity and type to those seen with chloroquine treatment. In
addition, bradycardia, itching, nausea, vomiting and some abdominal pain have been
recorded. Some blood and hepatic disorders have also been seen in a small number of
patients.
Pyrimethamine
Pyrimethamine is used in the treatment of uncomplicated malaria. It is particularly useful
against chloroquine-resistant P. falciparum strains when combined with sulphadoxine. It
acts by inhibiting dihydrofolate reductase in the parasite, thus preventing the biosynthesis
of purines and pyrimidines and thereby halting DNA synthesis, cell division and
reproduction. It acts primarily on schizonts during the hepatic and erythrocytic phases.
Sulphadoxine
Proguanil
Mefloquine
Mefloquine was developed during the Vietnam War and is chemically related to quinine.
It was developed to protect American troops against multi-drug resistant P. falciparum. It
is a very potent blood schizonticide with a long half-life. It is thought to act by forming
toxic heme complexes that damage parasitic food vacuoles. It is now used solely for the
prevention of resistant strains of P. falciparum, despite also being effective against P.
vivax, P. ovale and P. malariae. Mefloquine is effective in prophylaxis and for acute
therapy. It is now strictly used for resistant strains (and is usually combined with
artesunate); chloroquine/proguanil or sulfa drug-pyrimethamine combinations should be
used in all other Plasmodium infections.
Mefloquine can only be taken for up to 6 months owing to its side effects. After this,
other drugs (such as those based on paludrine/nivaquine) need to be taken.[2]
Atovaquone
Recently, a new antimalarial drug, atovaquone, has become available. It is very effective
because parasite populations have not yet been exposed long enough to develop
resistance. It also lacks side effects such as the cardiovascular effects of mefloquine,
which can trigger heart-rhythm problems. The drug is available under the name
Malarone, but it is very expensive (costing considerably more than Lariam).
Primaquine
Primaquine is a highly active 8-aminoquinoline used in treating all types of malaria
infection. It is most effective against gametocytes but also acts on hypnozoites, blood
schizonts and the dormant plasmodia of P. vivax and P. ovale. It is the only known drug
to cure both relapsing malaria infections and acute cases. The mechanism of action is not
fully understood, but it is thought to act in part by creating oxygen free radicals that
interfere with the plasmodial electron transport chain during respiration.
For the prevention of relapse in P. vivax and P. ovale, 0.15 mg/kg should be given for 14
days. As a gametocytocidal drug in P. falciparum infections, a single dose of 0.75 mg/kg
repeated 7 days later is sufficient. This treatment method is only used in conjunction with
another effective blood schizonticidal drug. There are few significant side effects,
although it has been shown that primaquine may cause anorexia, nausea, vomiting,
cramps, chest weakness, anaemia, some suppression of myeloid activity and abdominal
pains. In cases of overdose, granulocytopenia may occur.
• Artemisinin has a very rapid action, and the vast majority of acute patients treated
show significant improvement within 1-3 days of receiving treatment. It has
demonstrated the fastest clearance of all anti-malarials currently used and acts
primarily on the trophozoite phase, thus preventing progression of the disease. It is
converted to the active metabolite dihydroartemisinin, which then inhibits the
sarcoplasmic/endoplasmic reticulum calcium ATPase encoded by P. falciparum.
On the first day of treatment 20 mg/kg should be given; this dose is then reduced
to 10 mg/kg per day for the 6 following days. Few side effects are associated with
artemisinin use; however, headaches, nausea, vomiting, abnormal bleeding, dark
urine, itching and some drug fever have been reported by a small number of
patients. Some cardiac changes were reported during a clinical trial, notably non-
specific ST changes and a first-degree atrioventricular block (these disappeared
when the patients recovered from the malarial fever).
Therapy drugs
Halofantrine
Halofantrine is a relatively new drug, developed by the Walter Reed Army Institute of
Research in the 1960s. It is a phenanthrene methanol chemically related to quinine,
acting as a blood schizonticide effective against all Plasmodium parasites. Its mechanism
of action is similar to that of other anti-malarials: cytotoxic complexes are formed with
ferriprotoporphyrin IX that cause plasmodial membrane damage. Despite being effective
against drug-resistant parasites, halofantrine is not commonly used in the treatment
(prophylactic or therapeutic) of malaria because of its high cost; it also has very variable
bioavailability and has been shown to have potentially high levels of cardiotoxicity. It is
still a useful drug and can be used in patients who are known to be free of heart disease
and are suffering from severe and resistant forms of acute malaria. A popular drug based
on halofantrine is Halfan. The level of governmental control and the prescription-only
basis on which it can be used contribute to the cost, so halofantrine is not frequently
used.
Other agents
Doxycycline
When treating acute cases in combination with quinine, 100 mg of doxycycline should be
given per day for 7 days. In prophylactic therapy, 100 mg (adult dose) of doxycycline
should be given every day during exposure to malaria.
The most commonly experienced side effects are permanent enamel hypoplasia, transient
depression of bone growth, gastrointestinal disturbances and some increased
photosensitivity. Because of its effect on bone and tooth growth, it is not used in children
under 8, pregnant or lactating women, or those with known hepatic dysfunction.
Tetracycline is only used in combination therapy for acute cases of P. falciparum
infection, because of its slow onset. Unlike doxycycline, it is not used in
chemoprophylaxis. For tetracycline, 250 mg is the recommended adult dosage (it should
not be used in children), for 5 or 7 days depending on the level of adherence and
compliance expected. Oesophageal ulceration, gastrointestinal upset, interference with
the process of ossification and depression of bone growth are known to occur, and the
majority of side effects associated with doxycycline are also experienced.
Clindamycin
Clindamycin should be given in conjunction with quinine as a 300 mg dose (in adults)
four times a day for 5 days. The only side effects recorded in patients taking clindamycin
are nausea, vomiting, and abdominal pains and cramps; these can be alleviated by
consuming large quantities of water and food when taking the drug.
Pseudomembranous colitis (caused by Clostridium difficile) has also developed in some
patients; this condition may be fatal in a small number of cases.
Drug regimens
The following regimens are recommended by the WHO, UK HPA and CDC for adults
and children aged 12 and over:
• chloroquine 300 to 310 mg once weekly, and proguanil 200 mg once daily
(started one week before travel, and continued for four weeks after returning);
• doxycycline 100 mg once daily (started one day before travel, and continued for
four weeks after returning);
• mefloquine 228 to 250 mg once weekly (started two-and-a-half weeks before
travel, and continued for four weeks after returning);
• Malarone 1 tablet daily (started one day before travel, and continued for 1 week
after returning).
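The start/stop rules in the regimen list above lend themselves to a small scheduling
sketch. The drug keys, window lengths (taken from the bullets, with "two-and-a-half
weeks" rounded to 17 days), and function names are this example's own assumptions:

```python
from datetime import date, timedelta

# (days before departure to start, days after return to continue),
# per the regimen list above; illustrative only.
PROPHYLAXIS_WINDOWS = {
    "chloroquine+proguanil": (7, 28),
    "doxycycline": (1, 28),
    "mefloquine": (17, 28),   # ~two-and-a-half weeks before travel
    "malarone": (1, 7),
}

def dosing_window(drug, departure, return_date):
    """First and last day on which the prophylactic should be taken."""
    lead_days, tail_days = PROPHYLAXIS_WINDOWS[drug]
    return (departure - timedelta(days=lead_days),
            return_date + timedelta(days=tail_days))
```

For a trip from 10 to 24 June, for example, this puts doxycycline from 9 June through
22 July.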
Resistance to antimalarials
Anti-malarial drug resistance has been defined as: "the ability of a parasite to survive
and/or multiply despite the administration and absorption of a drug given in doses equal
to or higher than those usually recommended but within tolerance of the subject. The
drug in question must gain access to the parasite or the infected red blood cell for the
duration of the time necessary for its normal action." In most instances this refers to
parasites that remain following an observed treatment, thus excluding all cases where
anti-malarial prophylaxis has failed. For a case to be defined as resistant, the patient in
question must have received a known and observed anti-malarial therapy while the blood
drug and metabolite concentrations were monitored concurrently. The techniques used to
demonstrate resistance are in vivo, in vitro and animal-model testing, and the most
recently developed molecular techniques.
Drug-resistant parasites are often used to explain malaria treatment failure. However,
these are two potentially very different clinical scenarios: the failure to clear parasitaemia
and recover from an acute clinical episode when a suitable treatment has been given, and
anti-malarial resistance in its true form. Drug resistance may lead to treatment failure, but
treatment failure is not necessarily caused by drug resistance, even though failure can
assist resistance's development. A multitude of factors can be involved, including
non-compliance and poor adherence, poor drug quality, interactions with other
pharmaceuticals, poor absorption, misdiagnosis and incorrect doses. The majority of
these factors also contribute to the development of drug resistance.
The generation of resistance can be complicated and varies between Plasmodium species.
It is generally accepted to be initiated primarily through a spontaneous mutation that
provides some evolutionary benefit, reducing the parasite's sensitivity to a given
anti-malarial. This can be caused by a single point mutation or by multiple mutations. In
most instances a mutation will be fatal for the parasite, or the drug pressure will remove
parasites that remain susceptible; however, some resistant parasites survive. Resistance
can then become firmly established within a parasite population, persisting for long
periods of time.
Plasmodium species have developed resistance against antifolate combination drugs, the
most commonly used being sulfadoxine and pyrimethamine. Two gene mutations are
thought to be responsible, allowing synergistic blockade of two enzymes involved in
folate synthesis; regional variations in the specific mutations give differing levels of
resistance.
Spread of resistance
There is no single factor that confers the greatest degree of influence on the spread of
drug resistance, but a number of plausible causes associated with an increase have been
acknowledged. These include aspects of economics, human behaviour, pharmacokinetics,
and the biology of vectors and parasites.
Prevention of resistance
Provisions essential to this process include the delivery of fast primary care by staff who
are well trained and supported with the necessary supplies for efficient treatment. This in
itself is inadequate across large areas where malaria is endemic, which presents an initial
problem. One method proposed to work around the fundamental gaps in some countries'
health-care infrastructure is partial privatisation, enabling drugs to be purchased on the
open market from sources not officially related to the health-care industry. Although this
is now gaining some support, there are many problems related to limited access and
improper drug use, which could potentially increase the rate of resistance development to
an even greater extent.
There are two general approaches to preventing the spread of resistance: preventing
malaria infections, and preventing the transmission of resistant parasites.
Preventing malaria infections from developing has a substantial effect on the potential
rate of development of resistance: directly reducing the number of malaria cases
decreases the need for anti-malarial therapy. Preventing the transmission of
resistant parasites limits the risk of resistant malarial infections becoming endemic and
can be controlled by a variety of non-medical methods including insecticide-treated bed
nets, indoor residual spraying, environmental controls (such as swamp draining) and
personal protective methods such as using mosquito repellent. Chemoprophylaxis also
plays a role in limiting the transmission of malaria infection and resistance in defined
populations (for example, travellers).
Combination therapy
The problem of the development of malaria resistance must be weighed against the
essential goal of anti-malarial care; that is to reduce morbidity and mortality. Thus a
balance must be reached that attempts to achieve both goals whilst not compromising
either too much. The most successful attempts so far have been in the administration of
combination therapy. This can be defined as ‘the simultaneous use of two or more blood
schizonticidal drugs with independent modes of action and different biochemical targets
in the parasite’. There is much evidence to support the use of combination therapies,
some of which has been discussed previously; however, several problems prevent their
wide use in the areas where they are most advisable. These include: identifying the most
suitable drug for different epidemiological situations, the expense of combined therapy
(over 10 times that of traditional monotherapy), how soon the programmes should be
introduced, and problems linked to policy implementation and compliance.
The combinations of drugs currently prescribed can be divided into two categories:
non-artemisinin- and quinine-based combinations, and artemisinin-based combinations.
Artemisinin-based combinations
Artemisinin has a very different mode of action from conventional anti-malarials (see
above), which makes it particularly useful in the treatment of resistant infections.
However, to prevent the development of resistance to this drug, it is recommended only
in combination with another, non-artemisinin-based therapy. It produces a very rapid
reduction in the parasite biomass with an associated reduction in clinical symptoms, and
it is known to reduce the transmission of gametocytes, thus decreasing the potential for
the spread of resistant alleles. At present there is no known resistance to artemisinin and
very few reported side effects, although these data are limited.
• Artesunate and amodiaquine – This combination has also been tested and proved
more efficacious than, and similarly well tolerated to, the chloroquine
combination. The cure rate was greater than 90%, potentially providing a viable
alternative where levels of chloroquine resistance are high. The main
disadvantage is a suggested link with neutropenia. The recommended dosage is
4 mg/kg of artesunate and 10 mg/kg of amodiaquine per day for 3 days.
Other combinations
There are several anti-malarial combinations currently in development that are hoped to
be highly efficacious, cost-effective, safe and well tolerated. These are newly developed
compounds rather than derivatives of currently used drugs, decreasing the likelihood of
resistance.
• Pyronaridine and artesunate have been tested and demonstrated a clinical response
rate of 100% in one trial in Hainan (an area with high levels of P. falciparum
resistance to pyronaridine).
Amoebiasis
Amoebiasis, or amebiasis, is caused by the amoeba Entamoeba histolytica. It is an
intestinal infection that may or may not be symptomatic and can persist in an infected
person for several years. It is estimated to cause 70,000 deaths per year worldwide.
Symptoms, when present, can range from mild diarrhea to dysentery with blood and
mucus in the stool.
When symptoms are present it is generally known as invasive amoebiasis and occurs in
two major forms. Invasion of the intestinal lining causes "amoebic dysentery" or
"amoebic colitis". If the parasite reaches the bloodstream it can spread through the body,
most frequently ending up in the liver where it causes "amoebic liver abscesses". When
no symptoms are present, the infected individual is still a carrier, able to spread the
parasite to others through poor hygienic practices. While symptoms at onset can be
similar to Bacillary dysentery, amoebiasis is not bacteriological in origin and treatments
differ, although both infections can be prevented by good sanitary practices.
Transmission
Amoebiasis is usually transmitted by contamination of drinking water and foods with
feces, but it can also be transmitted indirectly through contact with dirty hands or objects
as well as by anal-oral contact.
Infection is spread through ingestion of the cyst form of the parasite, a resistant structure
that is found in stools. There may also be free amoebae, or trophozoites, that do not form
cysts but these die quickly after leaving the body and are only rarely the source of new
infections. Since amoebiasis is transmitted through contaminated food and water, it is
often endemic in the poorer regions of the world due to less well developed waste
disposal systems and untreated water supplies.
Contact with contaminated water, for example by washing or brushing your teeth in water
from a contaminated source, or ingesting vegetables washed in such water, can lead to
infection as well.
Prevention
To help prevent the spread of amoebiasis around the home:
• Wash hands thoroughly with soap and hot running water for at least 10 seconds
after using the toilet or changing a baby's diaper, and before handling food.
• Clean bathrooms and toilets often; pay particular attention to toilet seats and taps.
• Avoid sharing towels or face washers.
• Avoid raw vegetables when in endemic areas, as they may have been fertilized
using human feces.
• Boil water or treat with iodine tablets.
Infections can sometimes last for years. Symptoms take from a few days to a few weeks
to develop and manifest themselves, but usually it is about two to four weeks. Symptoms
can range from mild diarrhea to dysentery with blood and mucus. The blood comes from
amoebae invading the lining of the intestine. In about 10% of invasive cases the amoebae
enter the bloodstream and may travel to other organs in the body. Most commonly this
means the liver, as this is where blood from the intestine reaches first, but they can end
up almost anywhere.
Onset time is highly variable and the average asymptomatic infection persists for over a
year. It is theorized that the absence of symptoms or their intensity may vary with such
factors as strain of amoeba, immune response of the host, and perhaps associated bacteria
and viruses.
In asymptomatic infections the amoeba lives by eating and digesting bacteria and food
particles in the gut, a part of the gastrointestinal tract. It does not usually come
in contact with the intestine itself due to the protective layer of mucus that lines the gut.
Disease occurs when amoeba comes in contact with the cells lining the intestine. It then
secretes the same substances it uses to digest bacteria, which include enzymes that
destroy cell membranes and proteins. This process can lead to penetration and digestion
of human tissues, resulting first in flask-shaped ulcers in the intestine. Entamoeba
histolytica ingests the destroyed cells by phagocytosis and is often seen with red blood
cells inside when viewed in stool samples. Especially in Latin America, a
granulomatous mass (known as an amoeboma) may form in the wall of the ascending
colon or rectum due to long-lasting cellular response, and is sometimes confused with
cancer.[1]
Microscopy is still by far the most widespread method of diagnosis around the world.
However it is not as sensitive or accurate in diagnosis as the other tests available. It is
important to distinguish the E. histolytica cyst from the cysts of nonpathogenic intestinal
protozoa such as Entamoeba coli by its appearance. E. histolytica cysts have a maximum
of four nuclei, while the commensal Entamoeba coli cyst has up to eight nuclei. Additionally,
in E. histolytica, the endosome is centrally located in the nucleus, while it is usually off-
center in Entamoeba coli. Finally, chromatoidal bodies in E. histolytica cysts are
rounded, while they are jagged in Entamoeba coli. However, other species, Entamoeba
dispar and E. moshkovskii, are also commensals and cannot be distinguished from E.
histolytica under the microscope. Because E. dispar is much more common than E.
histolytica in most parts of the world, microscopy alone leads to frequent misdiagnosis of
E. histolytica infection. The WHO recommends that infections diagnosed by
microscopy alone should not be treated if they are asymptomatic and there is no other
reason to suspect that the infection is actually E. histolytica.
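The microscopic criteria above can be summarized as a simple decision sketch. This is an illustration only: the function name, feature encoding, and return labels are invented here, and real diagnosis depends on an experienced microscopist.

```python
# Illustrative sketch of the cyst-morphology criteria described above.
# Feature names and encodings are hypothetical, not a diagnostic tool.

def classify_cyst(nuclei: int, endosome: str, chromatoidal: str) -> str:
    """Suggest an Entamoeba species from cyst morphology."""
    if nuclei > 4:
        # Up to eight nuclei, off-center endosome, and jagged chromatoidal
        # bodies are typical of the commensal Entamoeba coli.
        return "Entamoeba coli"
    if endosome == "central" and chromatoidal == "rounded":
        # Consistent with E. histolytica -- but morphology cannot separate
        # it from the commensals E. dispar and E. moshkovskii.
        return "E. histolytica/dispar/moshkovskii"
    return "indeterminate"

print(classify_cyst(8, "off-center", "jagged"))  # Entamoeba coli
print(classify_cyst(4, "central", "rounded"))    # E. histolytica/dispar/moshkovskii
```

The "indeterminate" branch reflects the point made above: when the features conflict, microscopy alone cannot settle the species.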
Treatment
E. histolytica infections occur both in the intestine and (in people with symptoms) in the
tissue of the intestine and/or liver. As a result, two different classes of drug are needed to
rid the body of the infection, one for each location. Metronidazole, or a related drug such
as Tinidazole, Secnidazole or Ornidazole, is used to destroy amebae that have invaded
tissue. These are rapidly absorbed into the bloodstream and transported to the site of
infection. Because they are rapidly absorbed there is almost none remaining in the
intestine. Since most of the amebae remain in the intestine when tissue invasion occurs, it
is important to get rid of those also or the patient will be at risk of developing another
case of invasive disease. Several drugs are available for treating intestinal infections, the
most effective of which has been shown to be Paromomycin (also known as Humatin);
Diloxanide Furoate (also known as Furamide) is used in the US and Iodoquinol (also
known as Yodoxin) is used in certain other countries. Both tissue and luminal drugs must
be used to treat infections, with Metronidazole usually being given first, followed by
Paromomycin or Diloxanide. E. dispar does not require treatment, but many laboratories
(even in the developed world) do not have the facilities to distinguish this from E.
histolytica.
For amebic dysentery a multi-pronged approach must be used, starting with one of:
In addition to the above, one of the following luminal amebicides should be prescribed as
an adjunctive treatment, either concurrently or sequentially, to destroy E. histolytica in
the colon:
Doses for children are calculated by body weight and a pharmacist should be consulted
for help.
Herbal treatments
In Mexico, it is common to use herbal tinctures of chaparro amargo (Castela texana). 30
drops are taken in a small glass of water first thing in the morning, and 30 drops before
the last meal of the day, for seven days straight. After taking a seven day break from the
treatment, it is resumed for seven days. Some mild cramping may be felt; it is claimed
this means that the amoebas are dying and will be expelled from the body. Many
Mexicans use the chaparro amargo treatment regularly, three times a year. The efficacy
of such treatments has not been scientifically proven.
A 1998 study in Africa suggests that 2 tablespoons per week of papaya seeds may have
some antiamoebic action and aid in prevention of amoebiasis, but this remains
unconfirmed. Papaya fruit and seeds are often considered beneficial to digestion in areas
where this plant is common.
Complications
In the majority of cases, amoebas remain in the gastrointestinal tract of the hosts. Severe
ulceration of the gastrointestinal mucosal surfaces occurs in less than 16% of cases. In
fewer cases, the parasite invades the soft tissues, most commonly the liver. Only rarely
are masses formed (amoebomas) that lead to intestinal obstruction.
Populations at risk
All people are believed to be susceptible to infection, and there is no evidence that
individuals with impaired or immature immunity suffer more severe forms of the
disease.
Food analysis
E. histolytica cysts may be recovered from contaminated food by methods similar to
those used for recovering Giardia lamblia cysts from feces. Filtration is probably the
most practical method for recovery from drinking water and liquid foods. E. histolytica
cysts must be distinguished from cysts of other parasitic (but nonpathogenic) protozoa
and from cysts of free-living protozoa, as discussed above. Recovery procedures are not
very accurate; cysts are easily lost or damaged beyond recognition, which leads to many
false-negative results in recovery tests.[3]
Outbreaks
The most dramatic incident in the USA was the Chicago World's Fair outbreak of 1933,
caused by contaminated drinking water; defective plumbing had permitted sewage to
contaminate the water supply. There were 1,000 cases (with 58 deaths). In 1998 there was an
outbreak of amoebiasis in the Republic of Georgia. One hundred and seventy-seven
cases were reported between 26 May and 3 September 1998, including 71 cases of
intestinal amoebiasis and 106 probable cases of liver abscess. In recent times, food
handlers are suspected of causing many scattered infections.
Giardiasis
Giardiasis — popularly known as beaver fever or backpacker's diarrhea — is a
disease caused by the flagellate protozoan Giardia lamblia (also sometimes called
Giardia intestinalis and Giardia duodenalis).[1] The giardia organism inhabits the
digestive tract of a wide variety of domestic and wild animal species, including humans.
It is a common cause of gastroenteritis in humans, infecting approximately 200 million
people worldwide.
Transmission
Giardiasis is passed via the fecal-oral route. Primary routes are personal contact and
contaminated water and food. People who spend time in institutional or day-care
environments are more susceptible, as are travelers and those who consume improperly
treated water. It is a particular danger to people hiking or backpacking in wilderness areas
worldwide. Giardia is suspected to be zoonotic—communicable between animals and
humans. Major reservoir hosts include beavers, dogs, cats, horses, cattle and birds.
Symptoms
Symptoms include loss of appetite, lethargy, fever, explosive diarrhea, hematuria (blood
in urine), loose or watery stool, stomach cramps, upset stomach, projectile vomiting
(uncommon), bloating, flatulence, and burping (often sulphurous). Symptoms typically
begin 1–2 weeks after infection and may wane and reappear cyclically. Symptoms are
caused by Giardia organisms coating the inside of the small intestine and blocking
nutrient absorption. Most people are asymptomatic; only about a third of infected people
exhibit symptoms. Untreated, symptoms may last for six weeks or longer.
Symptomatic infections are well recognised as causing lactose intolerance,[2] which, while
usually temporary, may become permanent.[3][4] Although hydrogen breath tests indicate
poorer rates of carbohydrate absorption in those asymptomatically infected, such tests are
not diagnostic of infection.[5] It has been suggested that these observations are explained
by symptomatic giardia infection allowing for the overgrowth of other bacteria.[6][5]
Some studies have shown that giardiasis should be considered as a cause of vitamin B12
deficiency, as a result of the problems it causes in the intestinal absorption system.[7]
Treatment
Drugs used to treat adults include metronidazole, albendazole and quinacrine.
Furazolidone and nitazoxanide may be used in children. Treatment is not always
necessary, as the body can defeat the infection by itself.
The drug tinidazole can treat giardiasis in a single treatment of 2000 mg, instead of the
longer treatment of the other medications listed. The shorter duration of treatment may
also cause less patient distress. Tinidazole is now approved by the FDA[8] and available to
U.S. patients.
Lab Diagnosis
• The mainstay of diagnosis of giardiasis is stool microscopy, which can detect
motile trophozoites or the distinctive oval G. lamblia cysts.
• The entero-test uses a gelatin capsule with an attached thread. One end is
attached to the inner aspect of the patient's cheek, and the capsule is swallowed.
Later the thread is withdrawn and shaken in saline to release trophozoites which
can be detected microscopically.
Anthelmintic
Anthelmintics or antihelminthics are drugs that expel parasitic worms (helminths) from
the body, by either stunning or killing them. They may also be called vermifuges
(stunning) or vermicides (killing).
Pharmaceutical classes
Examples of pharmaceuticals used as anthelmintics include:
Please note that many of these pharmaceuticals are extremely toxic. Taken in improper
dosages they can be dangerous to humans as well as lethal to parasites.
Natural anthelmintics
Examples of naturally occurring anthelmintics include:
Please note that many natural vermifuges or anthelmintics are poisonous and, in improper
dosages, dangerous to humans as well as parasites.
Anthelmintic resistance
The ability of worms to survive treatments that are generally effective at the
recommended dose rate is considered a major threat to the current and future control of
worm parasites of small ruminants and horses.
The clinical definition of resistance is a reduction of 95% or less in a fecal egg count
reduction test.
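As a minimal sketch, the percentage behind this definition can be computed from mean pre- and post-treatment egg counts; the function names and the example counts below are invented for illustration.

```python
def egg_count_reduction(pre_mean: float, post_mean: float) -> float:
    """Percent reduction in mean fecal egg count after treatment."""
    return 100.0 * (1.0 - post_mean / pre_mean)

def suggests_resistance(pre_mean: float, post_mean: float) -> bool:
    """True when the reduction is 95% or less (the definition above)."""
    return egg_count_reduction(pre_mean, post_mean) <= 95.0

# Hypothetical herd means, in eggs per gram of feces:
print(egg_count_reduction(1000, 20))    # ~98% reduction: treatment effective
print(suggests_resistance(1000, 150))   # True: only an 85% reduction
```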
Development of resistance
Treatment eliminates worms whose genotype renders them susceptible. Worms that are
resistant survive and pass on their "resistance" genes. Resistant worms accumulate and
finally treatment failure occurs.
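This selection process can be sketched numerically. The starting resistant fraction and the survival rates below are arbitrary assumptions chosen only to show how resistant worms accumulate over repeated treatments.

```python
def resistant_fraction(p0: float, treatments: int,
                       surv_resistant: float = 0.95,
                       surv_susceptible: float = 0.05) -> float:
    """Fraction of resistant worms after repeated treatments.

    Each treatment lets surv_resistant of resistant worms and
    surv_susceptible of susceptible worms survive; the survivors
    found the next generation (hypothetical survival rates).
    """
    p = p0
    for _ in range(treatments):
        resistant = p * surv_resistant
        susceptible = (1.0 - p) * surv_susceptible
        p = resistant / (resistant + susceptible)
    return p

# Starting from 1% resistant worms, resistance rises rapidly:
for n in (0, 1, 2, 4):
    print(n, round(resistant_fraction(0.01, n), 3))
```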
Bioassay
Bioassay is a commonly used shorthand for biological assay, a type of scientific
experiment.
Immunosuppressant
An immunosuppressant is a substance that performs immunosuppression of the immune
system. Immunosuppressants may be either exogenous, as immunosuppressive drugs, or
endogenous, as e.g. testosterone.[1]
After organ transplantation, the body will nearly always reject the new organ(s) due to
differences in human leukocyte antigen haplotypes between the donor and recipient. As a
result, the immune system detects the new tissue as "hostile", and attempts to remove it
by attacking it with recipient leukocytes, resulting in the death of the tissue.
Immunosuppressants are applied as a countermeasure; the side effect is that the body
becomes extremely vulnerable to infections and malignancy, much like in an advanced
HIV infection.
Immunosuppressive drug
Immunosuppressive drugs, immunosuppressive agents, or immunosuppressants are
drugs that inhibit or prevent activity of the immune system. They are used in
immunosuppressive therapy to:
• Prevent the rejection of transplanted organs and tissues (e.g., bone marrow, heart,
kidney, liver)
• Treat autoimmune diseases or diseases that are most likely of autoimmune origin
(e.g., rheumatoid arthritis, multiple sclerosis, myasthenia gravis, systemic lupus
erythematosus, Crohn's disease, pemphigus, and ulcerative colitis).
• Treat some other non-autoimmune inflammatory diseases (e.g., long term allergic
asthma control).
These drugs are not without side-effects and risks. Because the majority of them act non-
selectively, the immune system is less able to resist infections and the spread of
malignant cells. There are also other side-effects, such as hypertension, dyslipidemia,
hyperglycemia, peptic ulcers, and liver and kidney injury. The immunosuppressive drugs
also interact with other medicines and affect their metabolism and action. Actual or
suspected immunosuppressive agents can be evaluated in terms of their effects on
lymphocyte subpopulations in tissues using immunohistochemistry.[1]
• glucocorticoids
• cytostatics
• antibodies
• drugs acting on immunophilins
• other drugs.
Glucocorticoids
In pharmacologic (supraphysiologic) doses, glucocorticoids are used to suppress various
allergic, inflammatory, and autoimmune disorders. They are also administered as
post-transplant immunosuppressants to prevent acute transplant rejection and graft-
versus-host disease. Nevertheless, they do not prevent infection and also inhibit later
reparative processes.
Immunosuppressive mechanism
Glucocorticoids suppress cell-mediated immunity. They act by inhibiting genes that
code for the cytokines IL-1, IL-2, IL-3, IL-4, IL-5, IL-6, IL-8, and TNF-α, the most
important of which is IL-2. Reduced cytokine production in turn reduces T cell
proliferation.
Glucocorticoids also suppress the humoral immunity, causing B cells to express smaller
amounts of IL-2 and IL-2 receptors. This diminishes both B cell clone expansion and
antibody synthesis.
Antiinflammatory effects
Glucocorticoids influence all types of inflammatory events, no matter what their cause.
They induce the lipocortin-1 (annexin-1) synthesis, which then binds to cell membranes
preventing the phospholipase A2 from coming into contact with its substrate arachidonic
acid. This leads to diminished eicosanoid production. The cyclooxygenase (both COX-1
and COX-2) expression is also suppressed, potentiating the effect.
Glucocorticoids also stimulate the escape of lipocortin-1 into the extracellular space,
where it binds to leukocyte membrane receptors and inhibits various inflammatory events:
epithelial adhesion, emigration, chemotaxis, phagocytosis, respiratory burst, and the
release of various inflammatory mediators (lysosomal enzymes, cytokines, tissue
plasminogen activator, chemokines, etc.) from neutrophils, macrophages, and mastocytes.
Cytostatics
Cytostatics inhibit cell division. In immunotherapy, they are used in smaller doses than in
the treatment of malignant diseases. They affect the proliferation of both T cells and B
cells. Purine analogs are the most effective and are therefore the most frequently administered.
Alkylating agents
Antimetabolites
By preventing the clonal expansion of lymphocytes in the induction phase of the immune
response, antimetabolites affect both cell-mediated and humoral immunity. They are also
effective in the treatment of autoimmune diseases.
Cytotoxic antibiotics
Antibodies
Antibodies are used as a quick and potent immunosuppression method to prevent the
acute rejection reaction.
Polyclonal antibodies
Heterologous polyclonal antibodies are obtained from the serum of animals (e.g., rabbit,
horse) that have been injected with the patient's thymocytes or lymphocytes. Antilymphocyte
globulin (ALG) and antithymocyte globulin (ATG) are in use. They are part of the treatment
of steroid-resistant acute rejection and of severe aplastic anemia. However, they are
added primarily to other immunosuppressives to diminish their dosage and toxicity. They
also allow transition to cyclosporine therapy.
Polyclonal antibodies inhibit T lymphocytes and cause their lysis, both through
complement-mediated cytolysis and through cell-mediated opsonization followed by
removal from the circulation by reticuloendothelial cells in the spleen and liver. In this way,
polyclonal antibodies inhibit cell-mediated immune reactions, including graft rejection,
delayed hypersensitivity (i.e., the tuberculin skin reaction), and graft-versus-host disease
(GVHD), but they also influence thymus-dependent antibody production.
As of March 2005, two preparations are available on the market: Atgam®, obtained
from horse serum, and Thymoglobuline®, obtained from rabbit serum.
Polyclonal antibodies affect all lymphocytes and cause general immunosuppression,
possibly leading to post-transplant lymphoproliferative disorders (PTLD) or serious
infections, especially by cytomegalovirus. To reduce these risks, treatment is provided in
a hospital, where adequate isolation from infection is available. They are usually
administered for five days intravenously in the appropriate quantity. Patients stay in the
hospital as long as three weeks to give the immune system time to recover to a point
where there is no longer a risk of serum sickness.
Monoclonal antibodies
Monoclonal antibodies are directed towards exactly defined antigens. Therefore, they
cause fewer side-effects. Especially significant are the IL-2 receptor- (CD25-) and CD3-
directed antibodies. They are used to prevent the rejection of transplanted organs, but also
to track changes in the lymphocyte subpopulations. It is reasonable to expect similar new
drugs in the future.
As of 2007 OKT3 (also called muromonab-CD3) is the only approved anti-CD3 antibody. It is a
murine anti-CD3 monoclonal antibody of the IgG2a type that prevents T-cell activation
and proliferation by binding the T-cell receptor complex present on all differentiated T
cells. As such it is one of the most potent immunosuppressive substances and is
administered to control the steroid- and/or polyclonal antibodies-resistant acute rejection
episodes. As it acts more specifically than polyclonal antibodies it is also used
prophylactically in transplantations.
At present OKT3's mechanism of action is only partially understood. It is known that
the molecule binds the TCR/CD3 receptor complex. In the first few administrations this
binding non-specifically activates T cells, leading to a serious cytokine release syndrome
30 to 60 minutes later. It is characterized by fever, myalgia, headache, and arthralgia.
Sometimes it develops into a life-threatening reaction of the cardiovascular system and
the central nervous system, requiring lengthy therapy. Past this period, OKT3 blocks
TCR-antigen binding and causes a conformational change in, or the removal of, the entire
TCR/CD3 complex from the T-cell surface. This lowers the number of available T-cells,
perhaps by sensitizing them for the uptake by the epithelial reticular cells. The cross-
binding of CD3 molecules as well activates an intracellular signal causing the T cell
anergy or apoptosis, unless the cells receive another signal through a co-stimulatory
molecule. CD3 antibodies shift the balance from Th1 to Th2 cells.
When deciding to include OKT3 in the treatment a healthcare practitioner must consider
not only its great efficiency but also its toxic side-effects. The risk of excessive
immunosuppression and the risk of development of neutralizing antibodies could make it
inefficacious. Although CD3 antibodies act more specifically than polyclonal antibodies,
they lower the cell-mediated immunity significantly, predisposing the patient to
opportunistic infections and malignancies.
Interleukin-2 is an important immune system regulator necessary for the clonal expansion
and survival of activated T lymphocytes. Its effects are mediated by the trimeric cell-
surface receptor IL-2R, consisting of the α, β, and γ chains. The α chain (CD25, T-cell
activation antigen, TAC) is expressed only by already-activated T lymphocytes.
Therefore, it is of special significance to the selective immunosuppressive treatment, and
the research has been focused on the development of effective and safe anti-IL-2
antibodies. By the use of recombinant gene technology, the mouse anti-Tac antibodies
have been modified, leading to the presentation of two chimeric mouse/human anti-Tac
antibodies in 1998: basiliximab (Simulect®) and daclizumab (Zenapax®).
These drugs act by binding the IL-2 receptor's α chain, preventing the IL-2-induced
clonal expansion of activated lymphocytes and shortening their survival. They are used in
the prophylaxis of the acute organ rejection after the bilateral kidney transplantation, both
being similarly effective and with only few side-effects.
Together with tacrolimus, cyclosporin is a calcineurin inhibitor. It has been in use since
1983 and is one of the most widely used immunosuppressive drugs. It is a cyclic fungal
peptide, composed of 11 amino acids.
Cyclosporin is used in the treatment of acute rejection reactions, but has been
increasingly substituted with newer immunosuppressants, as it is nephrotoxic.
The drug is used particularly in the liver and kidney transplantations, although in some
clinics it is used in heart, lung and heart/lung transplants. It binds to an immunophilin,
followed by the binding of the complex to calcineurin and the inhibition of its
phosphatase activity. In this way, it prevents the passage of G0 into G1 phase.
Tacrolimus is more potent than cyclosporin and has less-pronounced side-effects.
Unlike cyclosporine and tacrolimus, which affect the first phase of T lymphocyte
activation, sirolimus affects the second phase, namely signal transduction and
clonal proliferation. It binds to the same receptor (immunophilin) as tacrolimus, but
the resulting complex inhibits not calcineurin but another protein, mTOR. Therefore,
sirolimus acts synergistically with cyclosporine and, in combination with other
immunosuppressants, has few side-effects. Also, it indirectly inhibits several T
lymphocyte kinases and phosphatases, preventing the transmission of signal into their
activity and the transition of the cell cycle from G1 to S phase. In a similar manner, it
prevents the B cell differentiation to the plasma cells, which lowers the quantity of IgM,
IgG, and IgA antibodies produced. It acts as an immunoregulatory agent, and is also
active against tumors that involve the PI3K/AKT/mTOR pathway.
Other drugs
Interferons
IFN-β suppresses the production of Th1 cytokines and the activation of monocytes. It is
used to slow down the progression of multiple sclerosis. IFN-γ is able to trigger
lymphocytic apoptosis.
Opioids
Prolonged use of opioids may cause immunosuppression of both innate and adaptive
immunity.[2] Decrease in proliferation as well as immune function has been observed in
macrophages, as well as lymphocytes. It is thought that these effects are mediated by
opioid receptors expressed on the surface of these immune cells.[2]
TNF or the effects of TNF are also suppressed by various natural compounds, including
curcumin (an ingredient in turmeric) and catechins (in green tea).
These drugs may raise the risk of contracting tuberculosis or of causing a latent infection
to become active. Infliximab and adalimumab have label warnings stating that patients
should be evaluated for latent TB infection, and treatment should be initiated prior to
starting therapy with them.
Mycophenolate
Myriocin has been reported to be 10 to 100 times more potent than cyclosporin.
Toxicology
Toxicology (from the Greek words toxicos and logos) is the study of the adverse effects
of chemicals on living organisms.[1] It is the study of symptoms, mechanisms, treatments
and detection of poisoning, especially the poisoning of people.
Toxicity of metabolites
Many substances regarded as poisons are toxic only indirectly. An example is "wood
alcohol," or methanol, which is chemically converted to formaldehyde and formic acid in
the liver. It is the formaldehyde and formic acid that cause the toxic effects of methanol
exposure. Many drug molecules are made toxic in the liver, a good example being
acetaminophen (paracetamol), especially in the presence of alcohol. The genetic
variability of certain liver enzymes makes the toxicity of many compounds differ
between one individual and the next. Because demands placed on one liver enzyme can
induce activity in another, many molecules become toxic only in combination with
others. A family of activities that engages many toxicologists includes identifying which
liver enzymes convert a molecule into a poison, what are the toxic products of the
conversion and under what conditions and in which individuals this conversion takes
place.
Chemical toxicology
Chemical toxicology is a scientific discipline involving the study of structure and
mechanism related to the toxic effects of chemical agents, and encompasses technology
advances in research related to chemical aspects of toxicology. Research in this area is
strongly multidisciplinary, spanning computational chemistry and synthetic chemistry,
proteomics and metabolomics, drug discovery, drug metabolism and mechanisms of
action, bioinformatics, bioanalytical chemistry, chemical biology, and molecular
epidemiology. Molecular profiling approaches to toxicology are also referred to as
toxicogenomics.[5]
Toxicity
Toxicity is the degree to which a substance is able to damage an exposed organism.
Toxicity can refer to the effect on a whole organism, such as a human, bacterium, or
plant, as well as the effect on a substructure of the organism, such as a cell (cytotoxicity)
or an organ (organotoxicity, e.g. of the liver: hepatotoxicity). By extension, the word
may be used metaphorically to describe toxic effects on larger and more complex groups,
such as the family unit or society at large.
A central concept of toxicology is that effects are dose-dependent; even water can lead to
water intoxication when taken in large enough doses, whereas for even a very toxic
substance such as snake venom there is a dose below which there is no detectable toxic
effect.
The skull and crossbones is a common symbol for toxicity.
Types of toxicity
There are generally three types of toxic entities: chemical, biological, and physical.
Toxicity can be measured by the effects on the target (organism, organ, tissue or cell).
Because individuals typically have different levels of response to the same dose of a
toxin, a population-level measure of toxicity is often used which relates the probability of
an outcome for a given individual in a population. One such measure is the LD50. When
such data do not exist, estimates are made by comparison to known similar toxic substances,
or to similar exposures in similar organisms. Then "safety factors" are added to account
for uncertainties in data and evaluation processes. For example, if a dose of toxin is safe
for a laboratory rat, one might assume that one tenth that dose would be safe for a human,
allowing a safety factor of 10 to allow for interspecies differences between two
mammals; if the data are from fish, one might use a factor of 100 to account for the
greater difference between two chordate classes (fish and mammals). Similarly, an extra
protection factor may be used for individuals believed to be more susceptible to toxic
effects such as in pregnancy or with certain diseases. Or, a newly synthesized and
previously unstudied chemical that is believed to be very similar in effect to another
compound could be assigned an additional protection factor of 10 to account for possible
differences in effects that are probably much smaller. Obviously, this approach is very
approximate; but such protection factors are deliberately very conservative and the
method has been found to be useful in a wide variety of applications.
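The stacked safety factors described above amount to repeated division. A minimal sketch follows, with an invented animal dose and the 10× factors used as examples in the text:

```python
def protected_dose(animal_dose_mg_per_kg: float, factors: list) -> float:
    """Divide an animal-derived 'safe' dose by each uncertainty factor in turn."""
    dose = animal_dose_mg_per_kg
    for factor in factors:
        dose /= factor
    return dose

# Hypothetical rat-safe dose of 50 mg/kg, with a 10x interspecies
# factor and a further 10x factor for susceptible individuals:
print(protected_dose(50.0, [10, 10]))  # 0.5 mg/kg
```

Adding a factor simply multiplies the denominator, which is why each extra source of uncertainty makes the estimate an order of magnitude more conservative.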
Assessing all aspects of the toxicity of cancer-causing agents involves additional issues,
since it is not certain if there is a minimal effective dose for carcinogens, or whether the
risk is just too small to see. In addition, it is possible that a single cell transformed into a
cancer cell is all it takes to develop the full effect (the "one hit" theory).
It is more difficult to assess the toxicity of chemical mixtures than of single, pure
chemicals, because each component displays its own toxicity and components may interact
to produce enhanced or diminished effects. Common mixtures include gasoline, cigarette
smoke, and industrial waste. Even more complex are situations with more than one type
of toxic entity, such as the discharge from a malfunctioning sewage treatment plant, with
both chemical and biological agents.
• Acute exposure: a single exposure to a toxic substance which may result in severe
biological harm or death; acute exposures are usually characterized as lasting no
longer than a day.
• Chronic exposure: continuous exposure to a toxin over an extended period of time,
often measured in months or years, which can cause irreversible side effects.
Acute toxicity
Acute toxicity describes the adverse effects of a substance which result either from a
single exposure[1] or from multiple exposures in a short space of time (usually less than
24 hours).[2] To be described as acute toxicity, the adverse effects should occur within
14 days of the administration of the substance.[2]
Acute toxicity is distinguished from chronic toxicity, which describes the adverse health
effects from repeated exposures, often at lower levels, to a substance over a longer time
period (months or years).
It is obviously unethical to test for acute (or chronic) toxicity in humans. However, some
information can be gained from investigating accidental human exposures (e.g. factory
accidents). Otherwise, most acute toxicity data comes from animal testing or, more
recently, in vitro testing methods and inference from data on similar substances.[1][3]
Measures of acute toxicity
Regulatory values
Limits for short-term exposure, such as STELs or CVs, are only defined if there is a
particular acute toxicity associated with a substance.
Experimental values
Chronic toxicity
Chronic toxicity is the property of a substance that has toxic effects on a living organism
when that organism is exposed to the substance continuously or repeatedly; it contrasts
with acute toxicity.
For example, if a person drinks too much alcohol on a regular basis, their health may
suffer as a result. The alcohol does not have a long biological half-life, but it is supplied
on a regular basis to the body of the person.
• Prolonged internal exposure because a substance remains in the body for a long
time
For example, if a person were to ingest radium, much of it would be absorbed into the
bones, where it would exert a harmful effect on the person's health. The radium might cause
a disturbance in the blood-cell-forming part of the bone (the bone marrow).
Secondary poisoning
Secondary poisoning is the damage caused indirectly by a non-biological pesticide.
It appears in a number of forms. The most prevalent among them is groundwater
poisoning, which results from uncontrolled spraying of pesticides in farm fields that are in
close proximity to where groundwater is stored. Another possibility is the poisoning of
insects or domestic rodents, after which pets and infants may become poisoned as a result
of eating food which has been in contact with the toxic carcasses.
Toxicity – Alcohol
Alcohols often have an odor described as 'biting' that 'hangs' in the nasal passages.
Ethanol in the form of alcoholic beverages has been consumed by humans since pre-
historic times, for a variety of hygienic, dietary, medicinal, religious, and recreational
reasons. The consumption of large doses results in drunkenness or intoxication (which
may lead to a hangover as the effect wears off) and, depending on the dose and regularity
of use, can cause acute respiratory failure or death and with chronic use has medical
repercussions. Because alcohol impairs judgment, it can often be a catalyst for reckless or
irresponsible behavior. The LD50 of ethanol in rats is 10,300 mg/kg.[5]
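As a point of arithmetic only: an LD50 is expressed per kilogram of body weight, so the absolute dose it corresponds to scales with the animal's mass. The body weight below is an assumed example value, and rat LD50 figures do not translate directly to humans.

```python
LD50_ETHANOL_RAT = 10_300  # mg per kg body weight, from the text

def absolute_dose_mg(ld50_mg_per_kg, body_weight_kg):
    """Total dose in mg corresponding to a per-kg LD50 for a given mass."""
    return ld50_mg_per_kg * body_weight_kg

# For a hypothetical 0.25 kg rat, the LD50 corresponds to about 2.6 g.
print(absolute_dose_mg(LD50_ETHANOL_RAT, 0.25))  # 2575.0 (mg)
```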
Other alcohols are substantially more poisonous than ethanol, partly because they take
much longer to be metabolized, and often their metabolism produces even more toxic
substances. Methanol, or wood alcohol, for instance, is oxidized by alcohol
dehydrogenase enzymes in the liver to the poisonous formaldehyde, which can cause
blindness or death.[2]
Methanol itself, while poisonous, has a much weaker sedative effect than ethanol. Some
longer-chain alcohols such as n-propanol, isopropanol, n-butanol, t-butanol and 2-methyl-
2-butanol do however have stronger sedative effects, but also have higher toxicity than
ethanol.[8][9] These longer chain alcohols are found as contaminants in some alcoholic
beverages and are known as fusel alcohols,[10][11] and are reputed to cause severe
hangovers although it is unclear if the fusel alcohols are actually responsible.[12] Many
longer chain alcohols are used in industry as solvents and are occasionally abused by
alcoholics,[13][14] leading to a range of adverse health effects.[15]
Snake identification
The three types of venomous snakes that cause the majority of major clinical problems
are the viper, krait and cobra. Knowledge of what species are present locally can be
crucially important, as is knowledge of typical signs and symptoms of envenoming by
each species of snake.
Scoring systems can be used to try to determine the biting snake based on clinical features,[9] but these scoring systems are highly specific to a particular geographical area.
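Such a scoring system can be sketched as a weighted tally over observed clinical features. The features, point weights, and species below are entirely invented for illustration; real scoring systems are validated only for one geographical area.

```python
# Hypothetical point weights: feature -> {species: points}.
FEATURE_WEIGHTS = {
    "local_swelling":         {"viper": 2, "cobra": 1, "krait": 0},
    "ptosis":                 {"viper": 0, "cobra": 2, "krait": 2},
    "incoagulable_blood":     {"viper": 2, "cobra": 0, "krait": 0},
    "bite_at_night_indoors":  {"viper": 0, "cobra": 0, "krait": 2},
}

def likely_species(observed_features):
    """Sum points per species over the observed clinical features and
    return the highest-scoring species with the full score table."""
    scores = {"viper": 0, "cobra": 0, "krait": 0}
    for feature in observed_features:
        for species, points in FEATURE_WEIGHTS.get(feature, {}).items():
            scores[species] += points
    return max(scores, key=scores.get), scores

species, scores = likely_species(["ptosis", "bite_at_night_indoors"])
print(species)  # krait, under these invented weights
```

A real system would also handle ties and low-confidence totals rather than always returning the top-scoring species.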
First Aid
Snakebite first aid recommendations vary, in part because different snakes have different
types of venom. Some have little local effect, but life-threatening systemic effects, in
which case containing the venom in the region of the bite (e.g., by pressure
immobilization) is highly desirable. Other venoms instigate localized tissue damage
around the bitten area, and immobilization may increase the severity of the damage in
this area, but also reduce the total area affected; whether this trade-off is desirable
remains a point of controversy.
Because snakes vary from one country to another, first aid methods also vary; treatment
methods suited for rattlesnake bite in the United States might well be fatal if applied to a
tiger snake bite in Australia. As always, this article is not a legitimate substitute for
professional medical advice. Readers are strongly advised to obtain guidelines from a
reputable first aid organization in their own region, and to beware of homegrown or
anecdotal remedies.
1. Protect the patient (and others, including yourself) from further bites. While
identifying the species is desirable in certain regions, do not risk further bites or
delay proper medical treatment by attempting to capture or kill the snake. If the
snake has not already fled, carefully remove the patient from the immediate area.
2. Keep the patient calm and call for help to arrange for transport to the nearest
hospital emergency room, where antivenin for snakes common to the area will
often be available.
3. Make sure to keep the bitten limb in a functional position and below the victim's
heart level so as to minimize blood returning to the heart and other organs of the
body.
4. Do not give the patient anything to eat or drink. This is especially important with
consumable alcohol, a known vasodilator which will speed up the absorption of
venom. Do not administer stimulants or pain medications to the victim, unless
specifically directed to do so by a physician.
5. Remove any items or clothing which may constrict the bitten limb if it swells
(rings, bracelets, watches, footwear, etc.)
6. Keep the patient as still as possible.
7. Do not incise the bitten site.
Many organizations, including the American Medical Association and American Red
Cross, recommend washing the bite with soap and water. However, do not attempt to
clean the area with any type of chemical.
Treatment for Australian snake bites (which may differ from treatment in other parts of the world)
stringently recommends against cleaning the wound. Traces of venom left on the
skin/bandages from the strike can be used in combination with a snake bite identification
kit to identify the species of snake. This speeds determination of which anti-venom to
administer in the emergency room.
Pressure immobilization
Pressure immobilization is not appropriate for cytotoxic bites such as those of most
vipers,[10][11][12] but is highly effective against neurotoxic venoms such as those of most
elapids.[13][14][15] Developed by Struan Sutherland in 1978,[16] the object of pressure
immobilization is to contain venom within a bitten limb and prevent it from moving
through the lymphatic system to the vital organs in the body core. This therapy has two
components: pressure to prevent lymphatic drainage, and immobilization of the bitten
limb to prevent the pumping action of the skeletal muscles. Pressure is preferably applied
with an elastic bandage, but any cloth will do in an emergency. Bandaging begins two to
four inches above the bite (i.e. between the bite and the heart), winding around in
overlapping turns and moving up towards the heart, then back down over the bite and
past it towards the hand or foot. Then the limb must be held immobile: not used, and if
possible held with a splint or sling. The bandage should be about as tight as when
strapping a sprained ankle. It must not cut off blood flow, or even be uncomfortable; if it
is uncomfortable, the patient will unconsciously flex the limb, defeating the
immobilization portion of the therapy. The location of the bite should be clearly marked
on the outside of the bandages. Some peripheral edema is an expected consequence of
this process.
Outmoded treatments
The following treatments have all been recommended at one time or another, but are now
considered to be ineffective or outright dangerous, and should not be used under any
circumstances. Many cases in which such treatments appear to work are in fact the result
of dry bites.
In extreme cases, where the victims were in remote areas, all of these misguided attempts
at treatment have resulted in injuries far worse than an otherwise mild to moderate
snakebite. In worst case scenarios, thoroughly constricting tourniquets have been applied
to bitten limbs, thus completely shutting off blood flow to the area. By the time the
victims finally reached appropriate medical facilities their limbs had to be amputated.
Drug interaction
A drug interaction is a situation in which a substance affects the activity of a drug: the effects are increased or decreased, or the combination produces a new effect that neither produces on its own. Typically, interactions between drugs come to mind (drug-drug interactions). However, interactions may also exist between drugs and foods (drug-food interactions), as well as between drugs and herbs (drug-herb interactions).
Generally speaking, drug interactions are avoided, due to the possibility of poor or
unexpected outcomes. However, drug interactions have been deliberately used, such as
co-administering probenecid with penicillin prior to mass production of penicillin.
Because penicillin was difficult to manufacture, it was worthwhile to find a way to
reduce the amount required. Probenecid retards the excretion of penicillin, so a dose of penicillin persists longer when the two are taken together, allowing patients to use less penicillin over a course of therapy.
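The probenecid effect can be illustrated with a first-order elimination sketch: retarding excretion means a smaller elimination rate constant, so the same dose stays above a therapeutic threshold for longer. The rate constants and concentrations below are invented, not real pharmacokinetic parameters.

```python
import math

def time_above_threshold(c0, threshold, k_elim):
    """Hours until C(t) = c0 * exp(-k_elim * t) decays to the threshold."""
    return math.log(c0 / threshold) / k_elim

k_alone = 0.8             # per hour, penicillin alone (assumed)
k_with_probenecid = 0.4   # per hour, excretion retarded (assumed)

alone = time_above_threshold(c0=100, threshold=10, k_elim=k_alone)
combined = time_above_threshold(c0=100, threshold=10, k_elim=k_with_probenecid)

print(round(alone, 2))     # 2.88 hours
print(round(combined, 2))  # 5.76 hours -- the same dose lasts twice as long
```

Halving the elimination constant exactly doubles the time above threshold in this model, which is the sense in which probenecid made each dose of penicillin go further.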
Drug interactions may be the result of various processes. These processes may include
alterations in the pharmacokinetics of the drug, such as alterations in the Absorption,
Distribution, Metabolism, and Excretion (ADME) of a drug. Alternatively, drug
interactions may be the result of the pharmacodynamic properties of the drug, e.g. the co-
administration of a receptor antagonist and an agonist for the same receptor.
One notable system involved in metabolic drug interactions is the enzyme system
comprising the cytochrome P450 oxidases. This system may be affected by either
enzyme induction or enzyme inhibition, as discussed in the examples below.
Enzyme induction or inhibition can have different outcomes depending on the nature of the drugs involved. For example, if Drug B is a prodrug, then metabolic activation is required for the drug to reach its active form. In that case, enzyme induction by Drug A would increase the effectiveness of Drug B by speeding its conversion to the active form, whereas enzyme inhibition by Drug A would decrease the effectiveness of Drug B.
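The prodrug logic above amounts to a small truth table, sketched below. The labels are qualitative only; no real drugs or pharmacokinetic data are involved.

```python
def effect_change(metabolism_change, is_prodrug):
    """Qualitative effect on Drug B when Drug A alters the enzymes
    that metabolize it.  Induction speeds metabolism: it clears an
    active drug faster (less effect) but activates a prodrug faster
    (more effect).  Inhibition does the reverse."""
    if metabolism_change == "induction":
        return "increased effect" if is_prodrug else "decreased effect"
    if metabolism_change == "inhibition":
        return "decreased effect" if is_prodrug else "increased effect"
    raise ValueError("metabolism_change must be 'induction' or 'inhibition'")

print(effect_change("induction", is_prodrug=True))    # increased effect
print(effect_change("inhibition", is_prodrug=False))  # increased effect
```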