The Rise, Fall, Rise, and Imminent Fall of DDT

DDT is probably the single most valuable chemical ever synthesized to prevent disease. It has been used continually in public health programs over the past sixty years and has saved millions from diseases like malaria, typhus, and yellow fever. It has experienced a rise, a fall, a rise, and now an imminent fall.

Despite a public backlash in the 1960s, mainstream scientific and public health communities continue to recognize its utility and safety. After decades of extensive study and use, DDT has not been proven to be harmful to humans. But by 1997, its future looked bleak.

DDT, the scientific name of which is dichlorodiphenyltrichloroethane, was first synthesized by Othmar Zeidler in 1874, but it was not until the 1930s that a scientist working for a Swiss chemical company discovered its insecticidal properties.

Paul Mueller stumbled upon it while searching for an insecticide to control clothes moths. He sprayed a small amount of DDT into a container and noted the slow but sure way it killed flies. He wiped the container clean, but when he added new flies, they died, too. Mueller soon realized he had come across a persistent, powerful residual insecticide.

DDT was first used by the Allies during World War II to control lice-borne typhus.

Typhus had always been a problem during wars, especially in camps for prisoners, refugees, or political detainees. In October 1943, Allied forces liberated Naples as they advanced northward through Italy.

A typhus epidemic broke out shortly after the liberation, posing a significant threat to both troops and civilians. The U.S. military mixed DDT with an inert powder and dusted it on troops and refugees, to great effect. Public health expert Fred Soper noted that “a thorough application of 10 percent DDT louse powder to the patient, his clothing and bedding and to the members of his household would greatly reduce the spread of typhus in the community.”

DDT was to prove itself to be the most effective weapon not only against typhus, but also against many other insect-borne diseases.

In 1898, Ronald Ross, a medical doctor stationed with the British army in India, discovered that mosquitoes transmit malaria. Shortly thereafter, a leading Italian zoologist, Giovanni Battista Grassi, identified the specific genus of mosquito (Anopheles) responsible for transmitting the malaria-causing parasite.

The knowledge that mosquitoes transmitted malaria gave public health experts a powerful new way to control the disease: targeting the mosquitoes as the carriers or vectors of the parasite that causes malaria.

Insecticides, notably pyrethrum, had been used in malaria control prior to DDT. They were sprayed on the inside walls of houses where the Anopheles mosquito rests after feeding. The mosquito takes up the insecticide while she rests on the wall and the toxicity kills her.

One of the significant limiting factors of this form of vector control, known as indoor residual spraying (IRS), is that it is labor-intensive and therefore expensive, especially since the insecticides used before DDT had to be sprayed every two weeks.

DDT, however, lasted for over six months. This long-lasting residual action meant that a malaria control team could cover many more houses and protect far more people.

When used in malaria control, DDT has three separate mechanisms: repellency, irritancy, and toxicity, which together are remarkably successful at halting the spread of the disease.

Repellency is the most important mechanism, and along with DDT’s long residual time, it makes DDT superior to other insecticides. Its repellency qualities have long been known, but they have largely been forgotten by the malaria-fighting community.

South Africa was one of the first countries to use DDT in malaria control. It started using the insecticide in 1946, and within a few years, the malarial areas had decreased to just 20 percent of those observed in 1946.

To preempt the development of insecticide resistance, WHO proposed to overwhelm mosquito populations with spraying, reducing them dramatically before any resistance could develop. The campaign ultimately failed due to donor fatigue, weak public health infrastructure, and poor management.

DDT resistance contributed to the failure, but it is often discussed as the only reason the campaign failed, when its contribution in most areas was negligible. The meme that resistance to DDT was the reason for its demise is a key reason why the public health community continues to ignore DDT’s main strength: repellency, which is as far as we know unaffected by resistance.

Public health agencies and the scientific literature have found no strong evidence linking DDT or its metabolites with human cancer. Given that DDT has been used in enormous quantities for over six decades and that many studies of its potential carcinogenicity have been conducted without producing evidence of such a link, we can be relatively sure that DDT does not cause human cancer.

The Return of DDT
DDT leaves stains on mud walls, which was the primary reason South Africa’s malaria control program replaced the use of DDT in 1996 with another chemical class, synthetic pyrethroids, although pressure from environmentalists certainly contributed.

What followed was one of the country’s worst malaria epidemics. Over four years, malaria cases increased by around 800 percent and malaria deaths increased tenfold.

At the same time, environmentalist pressure against the chemical was increasing. Environmental groups that had previously shown no interest in malaria, such as the World Wildlife Fund, started to profess expertise in malaria control, endorsing any intervention as long as it was not DDT.

In 2000, the South African Department of Health reintroduced DDT. In just one year, malaria cases fell nearly 80 percent in KwaZulu-Natal province, which had been hit worst by the epidemic. The South African epidemic, tragic though it was, not only ultimately strengthened the case for DDT and IRS in general, but also gave strength to other malaria control programs in Africa that wanted to use DDT and expand their insecticide spraying.

In 2000, a private mining company in Zambia restarted a malaria control program that had been discontinued in the early 1980s due to economic constraints. This malaria control program, managed and paid for by the Konkola Copper Mine, was designed to protect over 365,000 people living in almost 32,500 dwellings.

Almost 80 percent of these dwellings were sprayed inside with DDT, and after the first spraying season, malaria incidence fell by 50 percent. After the second spraying season, malaria cases fell by a further 50 percent, and today malaria mortality at the mine clinics has fallen to zero.

The Ugandan health department has wanted to use DDT since at least 2005, and in January 2007, DDT passed its environmental assessment. But as of October 2007, it has not been used.

For over fifty years, DDT has been on WHO’s list of approved insecticides for use in vector control. It experienced a resurgence with reforms to WHO’s malaria control policy in late 2005, when the then-director-general, the late J. W. Lee, appointed Arata Kochi to lead the malaria unit.

DDT is not always the appropriate intervention. In some instances, other insecticides will be better, especially for use outside. Nor is DDT a magic bullet. Other interventions, such as insecticide-treated bed nets, play a useful and sometimes critical role in malaria control.

The Imminent Fall of DDT
The voices arguing against DDT have become louder recently, in part because funding for other interventions has come under threat. Countries are using other insecticides in their expanded spray programs, but they are not using DDT.

Since the late-2005 turnaround at USAID and the September 2006 statements from WHO about the benefits of DDT, no country has started to use it again.

Uganda has come closest so far, but to no avail. Health department malaria experts in Kenya and Tanzania have told me and others that they would like to use DDT, but business continues as usual.

The United Nations is once again ramping up opposition to the use of DDT. At its third session, ending on May 4, 2007, “the Conference of the Parties of the Stockholm Convention requested its secretariat in collaboration with the World Health Organization and interested parties to develop a business plan for promoting a global partnership to develop and deploy alternatives to DDT for disease vector control.”

Because so many players stand to profit from selling alternatives to DDT, the partnership is likely to be broad, well-financed, and politically connected; and because those represented in the group are no friends of DDT, the chemical has few champions within it. The partnership may prove to be the final nail in DDT’s coffin.

DDT is no panacea, but it has a better track record on malaria control than any other intervention. Lives are lost every day because of continued opposition to its use.

With development and modernization and, perhaps, a vaccine, DDT will one day no longer be necessary, but that day is still a long way off.

This article was first published in Health Policy Outlook No. 14, November 2007, American Enterprise Institute for Public Policy Research. The author, Roger Bate, is a resident fellow at AEI.