By Erika Mills
Around 3.2 billion people—nearly half the world’s population—are at risk for malaria. In 2015, 214 million people were infected and 438,000 died of the disease, most of them children in sub-Saharan Africa, Asia, and parts of the Americas. While much remains to be done to eliminate the infection, significant progress has come through collaboration among governments, NGOs, scientists, and local communities. Malaria cases and deaths have decreased substantially over the last decade, and the World Health Organization aims to make further headway over the next 15 years, having adopted an ambitious strategy to reduce the impact of the disease even more dramatically. Each year, on April 25th, we observe World Malaria Day to shine a light on the strides made toward malaria eradication and to galvanize the international community in the fight against the disease.
The anti-malaria community kicked off the first World Malaria Day in 2008, but efforts to contain the disease predate the 21st century by millennia. From antiquity until the 1800s, the primary lines of defense across Africa, Europe, and Asia were drainage of the wet areas where the disease thrives and treatment with traditional medicines, some of which contained modern antimalarial agents. Large-scale malaria control became possible only with the scientific advances of the 19th century: the discovery of two parasites that cause malaria, the identification of the Anopheles mosquito as the main vector of transmission, the development of the first antimalarial drugs, and the synthesis of dichloro-diphenyl-trichloroethane (DDT), an insecticide whose use would later become controversial because of its carcinogenic effects and harmful environmental impact.
The U.S. spearheaded the widespread use of DDT against disease-spreading mosquitoes and eliminated malaria from the country by the mid-20th century. Malaria had been a significant problem throughout American history, but successful efforts by the U.S. government and military to contain the disease in occupied Cuba, in Central America during construction of the Panama Canal, and around army bases foreshadowed malaria elimination stateside. After World War II, the U.S. Public Health Service founded the Communicable Disease Center (an early incarnation of today’s Centers for Disease Control and Prevention) to tackle malaria elimination. In 1947, the nascent agency launched the National Malaria Eradication Program, through which the CDC partnered with state and local health agencies in 13 southeastern states, spraying DDT in homes and over large swaths of land, draining wetlands, educating the public about prevention, and surveilling areas once progress was made. By 1951, the Program had eliminated malaria from the U.S.
Heartened by the success in the U.S. and by promising field trials of DDT elsewhere in the world, the World Health Organization (WHO) launched the Global Malaria Eradication Programme in 1955. The Programme relied heavily on spraying the interiors of houses with long-acting insecticides like DDT, though distribution of antimalarial medications and surveillance of cleared areas also played a role. It achieved a measure of success, eliminating malaria from a few countries in Europe and other temperate areas and reducing the incidence of the disease in a number of other places. However, many countries, especially the poorest and least developed, saw minimal or no success, and by the end of the 1960s some areas experienced a resurgence of infection. Resistance to DDT and to some antimalarial drugs, population movements, economic crises, and poor infrastructure in many countries further complicated matters. In 1969, the WHO shuttered the Global Malaria Eradication Programme, and in the ensuing years other eradication efforts dropped off. By the end of the 1970s, the global health community had shifted its focus from malaria eradication to control, where it would remain for the next three decades.
In the 2000s, eradication re-emerged in the global anti-malaria discussion. Research and development yielded new tools, such as insecticide-treated bed nets, better drugs, and improved vector-control methods. Many governments had stronger health infrastructure and more resources. Key organizations and communities around the world called for a renewed focus on eradication. In 2008, the Roll Back Malaria Partnership, a collaboration among the United Nations, malaria-endemic countries, and stakeholders worldwide, released the Global Malaria Action Plan, which set out a framework for eliminating the disease. The first World Malaria Day was observed that year, marking a renewed commitment to the goal of malaria eradication.
The theme for World Malaria Day 2016 is “End malaria for good,” reflecting the ambitious goals of the WHO’s new 15-year plan: eliminate the disease in at least 35 countries, reduce malaria incidence and mortality by at least 90%, and prevent resurgence in all malaria-free countries. Organizations and communities are gathering on social media and at events on six continents to share information and ideas, showcase work, and celebrate successes, such as the 60% reduction in the global malaria mortality rate over the last 15 years. With continued effort, a world without malaria may be a success we can celebrate in the years to come.
For more information about activities, visit the World Malaria Day homepage. To learn about historical and contemporary efforts to eliminate malaria and address a wide range of issues that affect health all over the world, check out our online exhibition, Against the Odds: Making a Difference in Global Health.
Erika Mills is outreach coordinator for the Exhibition Program in the History of Medicine Division of the National Library of Medicine.