SCIENCE: MUTATING MICROBES

Medicine seemed to have beaten infectious diseases. But antibiotic-resistant bacteria and new viruses have evolved, and the modern megacity helps them spread even faster. In a second extract from The Coming Plague, Laurie Garrett asks whether humans can survive

Laurie Garrett
Saturday 16 September 1995 23:02 BST

IN 6000 BC there were fewer humans on earth than now occupy New York and Tokyo. The world's roughly 30 million prehistoric residents were scattered over vast expanses of the warmer parts of the planet, and few of them ever ventured far from their birthplace. By 60 BC, the empires of Rome and China boasted urban centres of tens of thousands of people, which functioned as the hubs of trade and culture for the planet's 300 million residents. Cities afforded the disease-causing micro-organisms a range of opportunities unavailable in rural settings.

The more people per square mile, the more ways a micro-organism could pass from one hapless human to another. People would pass the agent to other people as they touched or breathed upon one another, prepared food, defecated or urinated into bodies of water with multiple uses, travelled to distant places, built centres for sexual activity that allowed microbes to exploit another method of transmission, produced prodigious quantities of waste that could serve as food for rodent and insect vectors, dammed rivers and collected rain water in cisterns, creating breeding pools for disease-carrying mosquitoes, and often responded to epidemics in hysterical ways that ended up assisting the persistent microbes. In ancient Rome, only about one of every three residents saw the ripe old age of 30, compared with 70 per cent of their rural counterparts.

Cities, in short, were microbe heavens, or, as the British biochemist John Cairns put it, "graveyards of mankind".

BY 1980, with five billion people on the planet, up from a mere 1.7 billion in 1925, global urbanisation was irrepressible and breathtakingly rapid. The most dramatic shifts were occurring in Africa and South Asia, where tidal waves of people were pouring continuously into the cities; some cities in these regions doubled in size in a single decade.

The bulk of this population surge occurred in a handful of so-called megacities - urban centres inhabited by more than 10 million people. In 1950 there were two: New York and London. By 1980, there were 10: Buenos Aires, Rio de Janeiro, Sao Paulo, Mexico City, Los Angeles, New York, Peking, Shanghai, Tokyo, and London. But this was only the beginning. It is predicted that by 2000 there will be 3.1 billion people living in increasingly crowded cities, with the majority crammed into 24 megacities, most of them located in the world's poorest countries.

During the 1970s and 1980s this crush of urban humanity was badly affecting human health, even in the wealthier nations. By 1985, less than 40 per cent of Tokyo's housing was connected to proper sewage systems, and tons of untreated human waste were streaming into the ocean. Hong Kong was dumping one million tons of unprocessed human waste into the South China Sea daily. Taiwan had sewage service for only 200,000 of its 20 million people, two-thirds of them living in its four largest cities.

But for the poorest developing countries, the burden of making their growing urban ecologies safe for humans, rather than heavens for microbes, is proving impossible. Except for a handful of East Asian states with strong industrial capacities (South Korea, Malaysia, Singapore), the developing world has no cash. Many urban centres are coming to resemble their teeming counterparts in 19th-century Europe.

In 1980 a quarter of Bangkok's residents had no access to health care, according to the World Bank. In the Dharavi slum of Bombay, then inhabited by over 500,000 people, 75 per cent of the women suffered from chronic anaemia, 60 per cent of the population was malnourished, pneumonia afflicted nearly all children, and most residents contracted gastrointestinal disorders due to parasitic infections. The flood of people into the Sudanese capital, Khartoum, led to epidemics of malaria, diarrhoeal diseases, anaemia, measles, whooping cough, and diphtheria. In the Ivory Coast, rural tuberculosis rates were down to 0.5 per cent - a success story. But in the large capital city of Abidjan the TB rate was three per cent and climbing.

No disease has surged into the cities more strongly than a new deadly form of Ki denga pepo, Swahili for "it is a sudden overtaking by a spirit". The phrase was used by East Africans to describe a mosquito-carried disease that produced horrible headaches, eye pain, and a swelling achiness of the joints.

In the 19th century an earlier form was endemic throughout the Americas, known as dengue, a Spanish adaptation of the Swahili denga. In most cases it wasn't life-threatening, though it was certainly a miserable experience. The disease was caused by four different strains of dengue virus - cousins of the yellow fever microbe - which were carried by mosquitoes, particularly the female Aedes aegypti.

As countries throughout the world conducted A. aegypti eradication campaigns during the early 20th century to rid the earth of yellow fever, dengue outbreaks virtually ceased. Then, in 1953, Manila was hit by an apparently new form of dengue that caused haemorrhagic petechial skin rashes - pinpoint-sized red spots, sites of breakthrough bleeding - shock, and searing fevers. The disease seemed more lethal than any previous dengue outbreak, and was caused by the viral strain dengue-2.

Five years later dengue haemorrhagic fever hit Bangkok, where it persisted for five years, eventually sickening 10,367 people and killing 694. US Army medical researcher Dr Scott Halstead, then at the military's laboratory there, teamed up with Thai microbiologist Charas Yamarat to figure out the origin of the apparently new deadly disease. They determined that, as with yellow fever, the A. aegypti mosquito carrying the dengue-2 virus was a fully urbanised insect. Lacking the aggressive characteristics of wild jungle mosquitoes, A. aegypti thrived only in proximity to human beings, laying its eggs in open containers of fresh water and maturing inside human shelters.

When the men closely examined the medical records of people who suffered acute dengue haemorrhagic fever they discovered that nearly all had recently been exposed to another, milder dengue strain. Though that first infection caused little or no apparent illness, it sensitised the humans' immune systems for the later arrival of dengue-2.

Usually when people develop strong antibody immune responses against a virus they are protected against future exposure to the microbe. But dengue-2 had evolved an extraordinary ability to trick the immune system and take control of its primary defence cells, the macrophages. In that way dengue-2 gained entry to every organ in the body, carried inside macrophages that served as Trojan horses for the virus. As the immune system struggled to overcome its invaders, various biochemical reactions produced soaring fevers - as high as 107°F - convulsions, classic allergy-like shock, and death.

The new dengue disease paradigm spread through South and East Asia. During the 1950s and 1960s, dengue types 1, 2, and 3 all made sporadic appearances in the Americas, until a slackening in mosquito abatement programmes allowed the A. aegypti population to grow to critical proportions.

A key factor in the expansion of dengue threats to the Americas was the 1985 arrival of A. albopictus. Carried aboard a shipment of waterlogged used tyres sent from Japan for retreading in Houston, Texas, the extremely aggressive mosquitoes - capable of carrying both dengue and yellow fever - quickly out-competed more timid domestic mosquito species. Within two years A. albopictus tiger mosquitoes were seeking human blood in the cities and towns of 17 US states.

When Tom Monath, then director of the US Army Medical Research Institute of Infectious Diseases, set out to reconstruct the events that led to the global emergence of urban dengue haemorrhagic fever, he concluded that every advance of the microbes was a direct result of human activities. World War II, he determined, was responsible for the emergence of A. aegypti-carried dengue in Asia. Massive human migrations, aerial bombing campaigns, densely populated refugee camps, and the wartime disruption of all mosquito control efforts allowed for an unprecedented surge in the insect's population. The mosquitoes were able to use bomb craters filled with water as breeding sites and to draw blood from millions of war victims whose destroyed homes no longer provided night-time protection from hungry insects.

The Korean and Vietnam wars only created further opportunities for mosquito breeding and dengue cross-fertilisation. By the conclusion of the Vietnam conflict in 1975, dengues of all four types were endemic in urban centres throughout the region. A Cuban epidemic in 1981, interestingly, followed a period of intensive postwar exchange of personnel between the two countries for professional training and Vietnamese reconstruction efforts.

By the time dengue hit Latin America in the 1960s, conditions in the slums of cities like Sao Paulo, Rio de Janeiro, Caracas, and Santiago, Chile, were similar, from the mosquitoes' perspective, to those in wartime Asia. The surge in commercial air traffic throughout the world during the 1970s, Monath concluded, facilitated the spread of people whose bodies were incubating as yet asymptomatic infections of dengue-1, -2, -3, and -4.

By 1981, dengue was arriving in Manila each year shortly after the onset of the rainy season; tens of thousands of children contracted dengue haemorrhagic fever; and 15 per cent of those children died. By then, dengue was one of Asia's most prevalent childhood illnesses.

All things considered, Uwe Brinkmann, then of the London School of Hygiene and Tropical Medicine, estimated in 1981 that some 300 million residents of cities in developing countries suffered debilitating illnesses at any given time due to chronic parasitic infections, over and above periodic viral epidemics, such as dengue. And though the costs of prevention through large-scale housing, sewage systems, potable water, insect control, and improvement in rubbish collection might seem daunting to the governments of poor countries, Brinkmann argued that the price of doing nothing was far greater, a tremendous toll in human life and productivity.

Tragically, events during the 1980s, when HIV and Aids cut a swath across continents, would prove far worse than Brinkmann had imagined.

AT THE annual meeting of the American Society of Tropical Medicine and Hygiene in 1989, 800 experts staged an extraordinary war games scenario, envisioning a horrendous epidemic in a mythical African region. The purpose was to reveal weaknesses in the public health emergency system that could later be corrected. What transpired was an event eerily prescient of the Rwandan horror of 1994.

In the fictional African nation of Changa, ethnic civil war was raging. Over six months, an estimated 125,000 civilians had been slaughtered, and nearly a quarter of a million people had fled to a squalid encampment just over the border. Conditions were atrocious, with drug-resistant malaria, malnutrition and tuberculosis rampant. Some 25 per cent of the adult refugees were HIV-positive. An international relief effort was under way, while UN peacekeeping forces guarded the borders. As key scientists played their roles at the meeting in Honolulu, a terrible epidemic unfolded among the refugees, multinational health providers, and UN forces. Before it was even noticed, ailing individuals infected with a mysterious microbe had travelled to the United States, the Philippines, Thailand, Germany and neighbouring African countries.

And although every imaginable effort was made to swiftly control the mysterious microbe, within 10 days the virus would have been carried by infected relief workers and soldiers from its African epicentre to the following locations: Bangkok, Manila, Frankfurt, Geneva, Fayetteville, Washington DC, New York, Honolulu, and the US Army's Medical Research Institute of Infectious Diseases at Fort Detrick. Within a month, a global pandemic of what appeared to be an airborne, nearly 100 per cent lethal virus was under way. "It would be very close to Andromeda," said Karl Johnson, former head of the Special Pathogens branch of the US Centers for Disease Control (CDC), referring to the catastrophic killer in Michael Crichton's medical thriller The Andromeda Strain.

The war games revealed an appalling state of non-readiness. There were no prepackaged infectious disease hospitals anywhere in the United States or at WHO in Geneva that were ready at a moment's notice to be airlifted into an epidemic. Virtually no civilian hospitals in the US were equipped to handle a highly contagious, lethal microbe, either in patients or inside petri dishes in their laboratories.

Only one permanent maximum-containment facility existed inside the US Public Health Service system. The vast network of overseas high-security laboratories that had been run by the Rockefeller Foundation and the federally funded CDC no longer existed.

In the 1960s, when biological warfare research was under way in the United States and the Soviet Union, both the US military and the civilian Public Health Service maintained supplies of special respirators that used ultraviolet light to decontaminate air before it was inhaled. "Where are those masks now?" Johnson asked. "Does anybody know?"

None of the experts had the slightest idea. The mood was grim, even nervous. Though all knew the war game was only a scenario, tension was high because it bore such a close resemblance to past disease emergences.

It was part of a larger picture of sharply heightened concerns about preparedness for confronting new disease. Five major US government studies addressed the issue between 1988 and 1994, as did several international agencies and organisations. These reports shared a sense of urgency and despair over the status of public health infrastructures and infectious diseases research in the US and Europe.

American scientists tended to support large-scale monitoring. Satellites, biological containment laboratories, computers and polymerase chain reaction devices for genetic fingerprinting were the tools they hoped to use to spot changes in ecologies that might promote microbial emergences. Failing that, they hoped to be equipped to swoop in with a scientific rapid strike force that would identify and destroy emerging microbes before an outbreak progressed to an epidemic.

D A Henderson, deputy secretary of Health and Human Services, felt active surveillance would best be conducted through 15 tightly networked tropical outpost laboratories, staffed by CDC scientists, colleagues from public health institutions in the host country, and academic researchers from some 50 US universities.

Henderson estimated that the system would cost $150m per year to operate, adding, "Can we afford to invest in such a programme? A better question is whether we can afford not to invest in a programme that could be a determinant in our own survival as a species."

But monitoring systems already in place seemed to be failing. In response to a 1992 report, the CDC gave Dr Ruth Berkelman the task of formulating plans for surveillance and rapid response to emerging diseases. She and her collaborators discovered a long list of serious weaknesses and flaws in the CDC's domestic surveillance system and determined that international monitoring was so haphazard as to be non-existent.

For example, in 1990 the CDC attempted for the first time to keep track of domestic disease outbreaks, using a computerised reporting system linking the federal agency to four state health departments. Over a six-month period, 233 communicable disease outbreaks were reported. The project revealed two disturbing findings: no federal or state agency routinely kept track of disease outbreaks of any kind, and once the pilot project was under way, the ability of the target states to survey such events varied radically. Vermont, for example, reported outbreaks at a rate of 14.1 per million residents, Mississippi at 0.8 per million.

Minnesota state epidemiologist Dr Michael Osterholm assisted CDC efforts by surveying all 50 state health departments. He discovered that the tremendous variations in disease reports reflected not differences in the actual incidence of such occurrences, but discrepancies in the policies and capabilities of the departments.

What Osterholm and Berkelman discovered was that nearly two decades of government belt-tightening, coupled with decreased local and state revenues, had rendered most local and regional disease reporting systems horribly deficient, often completely unreliable. Deaths were going unnoticed. Contagious outbreaks were ignored. Even diseases that physicians and hospitals were required by law to report were going unrecorded.

That being the case, officials could only guess at the real incidence of such ailments as penicillin-resistant gonorrhoea, vancomycin-resistant enterococcus, E. coli O157 food poisoning, multiply drug-resistant tuberculosis, or Lyme disease. As more disease crises cropped up, such as various antibiotic-resistant bacterial diseases, beleaguered state and local health agencies loudly protested against CDC proposals to expand the mandatory disease reporting list - they couldn't keep up.

At the international level the situation was even worse. In 1993 the CDC's Jim LeDuc, working out of WHO headquarters in Geneva, surveyed the 34 laboratories worldwide that were supposed to alert the global medical community to outbreaks of dangerous viral diseases. (There was no similar network to follow bacterial outbreaks or parasitic disease trends.) He discovered that only half the labs could reliably diagnose yellow fever; the 1993 Kenya epidemic undoubtedly got out of control because of the regional laboratory's failure to diagnose the cause of the outbreak.

For other microbes the labs were even less prepared: 56 per cent couldn't properly identify hantaviruses; 82 per cent missed California encephalitis. For the less common haemorrhagic disease-producing microbes such as Ebola, Lassa, and Machupo, virtually none had the biological reagents to even try to conduct diagnostic tests.

As a first line of defence against emerging diseases - at least the viruses - LeDuc advocated a modest $1.8m one-shot programme to upgrade all the laboratories and tighten the WHONet voluntary reporting system that linked key hospitals and medical systems worldwide. His proposal was formally endorsed on 26 April 1994 by WHO and a panel of experts chaired by Nobel laureate Joshua Lederberg. Months after the proposal went out to the wealthy nations of the world, LeDuc was still waiting for some dollars, marks, yen or other solid currency.

Meanwhile, physicians working in the midst of crises argued that what was needed was far more fundamental. "You need people on the ground to spot these things first," one said. "You need a health care system. And you need a place to call." If the government is your enemy - if you and your people are victims of oppression - whom do you call?

After years of battling Lassa, the scientists working in West Africa saw civil war in Liberia and government instability in Nigeria wash away all their efforts, and outbreaks of the rat-borne disease become commonplace. All told, it seemed in 1993 that more than 21 million people on earth were living under conditions ideal for microbial emergence: denied governmental representation that might improve their lot; starving; without safe, permanent housing; lacking nearly all forms of basic health care and sanitation. The situation only worsened in 1994, as more than half a million Rwandans fled their country.

"This is a public health crisis," former CDC director Dr William Foege argued. "One trillion dollars is spent on weapons annually. Of the 14 million kids who died in 1989, nine million [deaths] could have been prevented for two and a half billion dollars. That's what's spent in the US annually for cigarette advertising."

Foege felt that, by the 1990s, the globalisation of microbes had so thoroughly intertwined international and domestic American health that it was impossible to ensure a disease-free existence for people in North America and Western Europe without providing similar assurances for residents of Azerbaijan, Cote d'Ivoire and Bangladesh. Yet his kind of global thinking was no longer in vogue at CDC and WHO or inside the governmental health bureaucracies in Washington, Paris and London.

In the belt-tightening world of the 1990s, no one seemed much interested in contributing cash for the development of primary health infrastructures in countries like Armenia, Romania, Albania, Burma or the Dominican Republic. The scale of the problem seemed too great, the pay-off for donors too modest.

Ultimately, humanity will have to change its perspective on its place in earth's ecology if the species hopes to stave off or survive the next plague. Microbes, and their vectors, recognise none of the artificial boundaries erected by human beings. Their world is bounded only by natural limitations: temperature, pH, ultraviolet light, the presence of vulnerable hosts, and mobile vectors. By force of numbers they overwhelm us. And they are evolving far more rapidly than Homo sapiens, adapting to changes in their environments by mutating, undergoing high-speed natural selection, or drawing plasmids and transposons from the vast mobile genetic lending library in their environments.

In this fluid complexity, human beings elbow their way without concern into one ecosphere after another. The human race seems equally complacent about blazing a path into a rain forest with bulldozers and arson or using an antibiotic "scorched earth" policy to chase unwanted microbes across the duodenum.

Only by appreciating the fine nuances in their ecologies can human beings hope to understand how their actions, on the macro level, affect their micro competitors and predators.

Time is short.

As the human population swells, surging past the six billion mark at the millennium, the opportunities for pathogenic microbes multiply. If, as some have predicted, 100 million of those people might then be infected with HIV, the microbes will have an enormous pool of walking immune-deficient petri dishes in which to thrive, swap genes and undergo endless evolutionary experiments.

"The world is just one village. Our tolerance of disease in any place is at our own peril," Joshua Lederberg told a New York gathering earlier this year. "Are we better off today than we were a century ago? In most respects, we're worse off. We have been neglectful of the microbes, and that is a recurring theme that is coming back to haunt us."

'The Coming Plague' by Laurie Garrett is published by Virago at £20

The bacterial world is in a state of constant evolution and changes so quickly it can leave humans gasping.

Penicillin, the postwar wonder drug that swept the world with its healing power, could cure almost all infections caused by the bacterium Staphylococcus in the early 1950s. (So effective was it in the early years that during the war army physicians collected the urine of patients who were on the drug and crystallised excreted penicillin for re-use.) But by the late 1960s, strains of staph had evolved to resist it. So physicians switched to another antibiotic, methicillin.

By the early 1990s, about 40 per cent of the staph cases in big American hospitals were resistant to methicillin as well. Around the world, illness and deaths soared from resistant staph, which usually strikes people already in hospital. And super-strains of the bacteria were emerging. Australian researchers, for example, have seen a patient with a strain that could resist, to varying degrees, some 31 drugs.

Driven by the genetic maxim "survive and reproduce", microbes relentlessly eat, divide and multiply, secreting defensive poisons to thwart their attackers, hiding when necessary, and, if all else fails, mutating. These mutations occur every second, most proving disastrous to the microbe. But every now and then - say, daily - a mutation occurs that gives some microbe a new advantage over its environment and competitors. When that happens and the microbe flourishes, it sometimes shares its new strength with other microbes. Resistance to an antibiotic can be passed around through movable bits of DNA, genetic matter called plasmids or transposons. Once an organism gains resistance this way, it can pass it on to the next generation as well.

And once a resistant strain exists, humans often help it become dominant by eliminating the competition. In a person infected with both resistant and susceptible organisms, for instance, antibiotic treatment wipes out the weaker strains, making room for the stronger ones to flourish. How rapidly can resistant strains spread? Using genetic fingerprinting techniques, researchers in New York traced back more than 470 strains of methicillin-resistant Staphylococcus aureus, or MRSA. They found that all the bacteria descended from a strain that emerged in Cairo in 1961. By the end of that decade, the strain's descendants could be found in New York, New Jersey, Dublin, Geneva, Copenhagen, London, Kampala, Nairobi, Ontario, Halifax, Winnipeg and Saskatoon. A decade later they were seen planet-wide.

Staphylococcus wasn't the only bacterial organism that was overcoming drugs by using plasmids, mobile DNA, mutations, jumping genes and conjugative sharing of resistance factors, in which one microbe stretches out its membrane to temporarily merge with another. In fact, by 1993 nearly every common pathogenic bacterial species had developed some degree of drug resistance. And more than two dozen of these emergent strains posed life-threatening crises to humanity, having outwitted most commonly available antibiotic treatments.

Jim Henson, the creator of the Muppets, died in the spring of 1990 of a common, allegedly curable, bacterial infection. He had been struck by an apparently new mutant strain of Streptococcus that was resistant to penicillins and possessed genes for a killer toxin very similar to that found in the strain of S. aureus that causes Toxic Shock Syndrome, the bacterial infection associated with super-absorbent tampons which was identified in 1980.

Around the world, the principle holds true: overuse or misuse of antibiotics, particularly in small children and hospitalised patients, prompts emergence of resistant mutant organisms.

The basic problem with the antibiotic approach to control of pathogenic bacteria is evolution. Long before mankind discovered the chemicals, yeasts, fungi and rival bacteria had been making antibiotics and spewing the compounds around newly claimed turf to ensure that rival species couldn't invade their niches.

The rivals, of course, had long since evolved ways to rapidly mutate to withstand such chemical attacks. So rivals would make different chemicals, their foes would mutate again and the cycle repeated itself countless times over the millennia. Humans simply accelerated the natural process by exposing billions of microbes at a time to drugs derived from the natural chemicals, and doing so with less lethal efficiency. Often the genetic changes the microbes underwent in order to overcome the antibiotics offered additional advantages, enhancing the bacteria's ability to withstand wider temperature variation, outwit more elements of the host immune system, or kill host cells with greater certainty.

One of the most disturbing prospects for physicians worldwide was the emergence around 1988 of vancomycin-resistant Enterococcus faecium and faecalis. With vancomycin the only remaining reliable treatment for staph and strep infections, there was great concern that resistant enterococcal bacteria could share their resistance genes with staph and strep. Worse yet, some feared the vancomycin-resistance genes might find their way into the new toxic shock strains of the bacteria.

Physicians and scientists working outside the field of bacteriology generally assumed that, as in the past, another class of antibiotics would be developed and the problem would go away. But they were wrong. A leading bacteriologist said: "There's nothing on the shelf. Nothing in the pipeline. If we lose vancomycin we're going to be back to the 1930s with staph." LG
