How public health failed America

In theory, public-health agencies could add chronic-disease-control activities without sacrificing their infectious-disease expertise. In reality, however, public-health departments have experienced a progressive decline in real spending power, particularly since the Great Recession, and as a result have chosen to cut infectious-disease programs. More than 75 percent of the nation’s larger health departments reported eliminating clinical services from 1997 to 2008.

I experienced this shift firsthand. When I began overseeing infectious diseases at New York City’s health department in 2011, I worked for one of the nation’s leading proponents of chronic-disease control: Mayor Michael Bloomberg. Because of budget cuts, we had to shrink our infectious-diseases programs; I had to close or reduce hours for our immunization, sexually transmitted disease, and tuberculosis clinics. I had to justify these decisions to appropriately disgruntled community groups and city council members by saying that the Affordable Care Act’s Medicaid expansion would pay to ensure that these services could be provided by the private sector—a claim that I based more on hope than evidence.

As local health agencies nationwide scaled back their clinics, they also lost their presence in the community. Clinics are an important way of building visibility and credibility, because most people do not understand what a public-health agency is or does. Residents see the good work a health department does, tell elected officials that its work matters, and then trust it during emergencies. Running clinics also builds the logistical expertise that emergencies demand.