Via The New York Times, an op-ed: “The Scary Shortage of Infectious-Disease Doctors.” Is this also true of countries with reasonable health-insurance systems? Excerpt:
Many have heard of the rise of drug-resistant infections. But few know about an issue that’s making this threat even scarier in the United States: the shortage of specialists capable of diagnosing and treating those infections.
Infectious diseases is one of just two medicine subspecialties that routinely do not fill all of their training spots every year in the National Resident Matching Program (the other is nephrology). Between 2009 and 2017, the number of programs filling all of their adult-infectious-disease training positions dropped by more than 40 percent.
This could not be happening at a worse time. Antibiotic-resistant microbes, known as superbugs, are pinballing around the world, killing hundreds of thousands of people every year. The Times recently reported on Candida auris, a deadly new fungus that has infected hospital patients in Illinois, New Jersey and New York.
Everyone who works in health care agrees that we need more infectious-disease doctors, yet very few actually want the job. What’s going on?
The problem is that infectious-disease specialists care for some of the most complicated patients in the health care system, yet they are among the lowest paid. It is one of the few specialties in medicine that sometimes pays worse than general practice. At many medical centers, a board-certified internist accepts a pay cut of 30 percent to 40 percent to become an infectious-disease specialist.
This has to do with the way our insurance system reimburses doctors. Medicare assigns relative value units to the thousands of services that doctors provide, and these units largely determine how much physicians are paid. The formula prioritizes invasive procedures over intellectual expertise.
The problem is that infectious-disease doctors don’t really do procedures. Theirs is a cognitive specialty, providing expert consultation, and insurance doesn’t pay much for that.
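To make the incentive concrete, here is a minimal sketch of the fee-schedule arithmetic. The shape of the formula (work, practice-expense, and malpractice RVUs, each adjusted by a geographic index, summed, then multiplied by a dollar conversion factor) is Medicare’s published one; the RVU values, the two example services, and the ~$36 conversion factor (roughly the 2019 figure) are illustrative assumptions, not actual fee-schedule entries.

```python
# Sketch of how Medicare's fee schedule converts relative value units
# (RVUs) into dollars. The structure of the formula is the published one;
# every number below is invented for illustration.

def medicare_payment(work_rvu: float, pe_rvu: float, mp_rvu: float,
                     work_gpci: float = 1.0, pe_gpci: float = 1.0,
                     mp_gpci: float = 1.0,
                     conversion_factor: float = 36.04) -> float:
    """Payment = (sum of geographically adjusted RVUs) x conversion factor."""
    total_rvu = (work_rvu * work_gpci
                 + pe_rvu * pe_gpci
                 + mp_rvu * mp_gpci)
    return total_rvu * conversion_factor

# A procedure-heavy billing code carries far more RVUs than an hour of
# cognitive consultation, so it pays several times more (RVUs hypothetical):
print(medicare_payment(work_rvu=7.5, pe_rvu=4.0, mp_rvu=1.5))  # ~$468, procedure
print(medicare_payment(work_rvu=2.0, pe_rvu=1.0, mp_rvu=0.2))  # ~$115, consult
```

The asymmetry in the example is the whole story: however long the cognitive work takes, the payment tracks the RVUs attached to the billing code, not the hours spent.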
For example, I recently diagnosed a case of fungal pneumonia. Arriving at that diagnosis took hours: first speaking with the patient’s primary care doctor and pulmonologist, then a long stretch reviewing chest X-rays and other imaging studies with a radiologist, and finally examining the cells in the patient’s lung tissue under a microscope to confirm the diagnosis and report back to the anxious patient and family. Most of this work went unpaid.
A generation ago, most well-rounded doctors knew what to do when a patient developed a fever. Not anymore. New threats like Candida auris pop up all the time, and to identify a deadly pathogen, we often have to use a sophisticated technique called matrix-assisted laser desorption ionization time-of-flight mass spectrometry. As the name suggests, it requires a bit of expertise.
Back in the day, there were only a handful of reliable antibiotics to choose from. But as more bugs have emerged, and more have become resistant to standard treatments, we’ve had to develop new drugs to fight them. The good news is that in 2018, the Food and Drug Administration approved a slate of new antibiotics to confront the rising threat of superbugs, including eravacycline, plazomicin and omadacycline, and we can expect even more this year. But very few physicians know how to use these powerful new drugs. Many don’t even know how to pronounce them.
Infectious-disease specialists are often the only health care providers in a hospital — or an entire town — who know when to use all of the new antibiotics (and when to withhold them). These experts serve as an indispensable cog in the health care machine, but if trends continue, we won’t have enough of them to go around.
The terrifying part is that most patients won’t even know about the deficit. Your doctor won’t ask a specialist for help because in some parts of the country, the service simply won’t be available. She’ll just have to wing it.