# America’s Self-Inflicted Doctor Shortage

## Metadata

- **Channel:** PolyMatter
- **YouTube:** https://www.youtube.com/watch?v=vEH4yo7PqzM
- **Source:** https://ekstraktznaniy.ru/video/24401

## Transcript

### Segment 1 (00:00 - 05:00)

Wakulla County, Florida is home to 37,000 people. There are three state parks, nearly a dozen schools, and even a small municipal airport. What there are not are doctors. Just ten serve the entire county. Which means, per capita, there are fewer doctors here in the suburbs of Tallahassee than in Afghanistan or Haiti.

And Wakulla County is not alone. 83 million Americans live in a “medically underserved” region. With so little healthcare to go around, we drive long distances for basic care, regularly forgo checkups, and wait weeks — or even months — for appointments. When we finally do see someone, they’re often overwhelmed with too many patients to remember our names.

And things are only about to get worse. Over the next ten years, the U.S. population is expected to grow by 15 to 20 million people. And the population aged 75 and older — the largest consumers of healthcare — will grow by 54%. Meanwhile, many doctors are nearing retirement. By the end of that same decade, two out of five currently practicing physicians will themselves be at least 65. Add all this up, and we’re looking at a shortage of as many as 86,000 doctors by 2036, according to the Association of American Medical Colleges. In other words: Wakulla County and places like it may already have more access to healthcare today than they will for decades to come.

The issue is not a lack of interest. “Doctor” is still one of the most coveted professions. Each year, medical schools collectively reject roughly as many applicants as they ultimately enroll. The acceptance rate at most schools is between just 2 and 5%. Nor is the problem a shortage of schools. The truth is: you could double the number of schools tomorrow and you wouldn’t create a single additional doctor. That’s because, strictly speaking, medical schools don’t create “doctors.” They create doctors in training. Before you can actually practice medicine independently, you have to complete a “residency” — a minimum of three and as many as nine years of supervised on-the-job training. And there are more aspiring residents than there are residency positions. Last year, there were 47,000 applicants for 37,000 places.

Meaning: after 13 years of pre-tertiary education, 4 years of undergrad, and 4 years of medical school (not to mention all the entrance exams in between), nearly 10,000 people find themselves blocked from progressing — denied the right to work 80 hours a week in a hospital for a $60,000 salary over the next 3 to 9 years. This is, obviously, a very personal tragedy. Medical school is already rife with stress, loneliness, and financial pressure. And yet, eight years into an 11-to-17-year post-secondary journey, these graduates have no choice but to hope for better luck next year, while the interest on their $200,000 in student loans accumulates.

But it’s also a collective tragedy. Remember: these are already the top half of applicants — the ones who scored highly enough on the MCAT to be accepted to medical school. They then stuck with it for four grueling years, successfully earning an M.D. or equivalent. In other words: here we are turning away thousands of unlucky yet perfectly qualified candidates during an already large and rapidly growing doctor shortage. All of this is to say: we could increase the number or size of medical schools, but all that would do is increase the number of graduates fighting for the same number of residencies. More competition, more pressure, more student loan defaults, zero extra doctors.
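To make those per-capita and bottleneck numbers concrete, here is a quick back-of-the-envelope check in Python. The Wakulla County figures and the applicant/position counts come from the transcript; the comparison rates (doctors per 1,000 people) are rough outside estimates added for illustration, not from the video.

```python
# Back-of-the-envelope check of the transcript's per-capita claims.
# Wakulla and residency figures are from the transcript; the
# comparison rates (per 1,000 people) are rough outside estimates.

wakulla_doctors = 10
wakulla_population = 37_000

wakulla_rate = wakulla_doctors / wakulla_population * 1_000
print(f"Wakulla County: {wakulla_rate:.2f} doctors per 1,000 people")

# Assumed comparison rates, roughly in line with World Bank figures:
comparisons = {"United States (national)": 2.6, "Afghanistan": 0.3, "Haiti": 0.2}
for place, rate in comparisons.items():
    print(f"{place}: {rate:.2f} per 1,000")

# The residency bottleneck, using the transcript's numbers:
applicants, positions = 47_000, 37_000
print(f"Unmatched applicants: {applicants - positions:,}")
```

At roughly 0.27 doctors per 1,000 people, Wakulla County indeed lands below the assumed rate for Afghanistan and an order of magnitude below the national figure.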
Clearly what we need are more residencies — the actual bottleneck. The real mystery, then, is: what’s stopping us? And the answer is: one failed 1980s experiment.

American healthcare began booming after World War II. The GI Bill funneled thousands of veterans into grad school, and the Hill-Burton Act of 1946 unleashed billions of dollars for the construction of new hospitals. Congress also began funding residencies. Teaching hospitals, the thinking was, are less efficient — they spend time on instruction and supervision that could otherwise be spent treating patients and, therefore, earning money. But that learning is ultimately in the public interest — we need an ample supply

### Segment 2 (05:00 - 10:00)

of well-trained doctors — even if it is “inefficient” in the short term. So, the government began giving hospitals a subsidy for each resident they trained. Well… it worked. Fifteen years later, the United States led the world in doctors per capita.

But it didn’t last. As healthcare spending soared to new heights, the industry found itself on a collision course with Reagan-era fiscal restraint. A researcher named Milton Roemer discovered that the number of hospital beds available in a given city was positively correlated with the average length of hospital stays in that city. The theory, which became known as Roemer’s “Law,” was: “a hospital bed built is a bed filled.” In other words: healthcare was too accessible; Americans were using more than they needed. In 1981, a government committee warned of a dangerous physician “surplus” that would soon bleed the country dry in the form of unnecessary procedures. By the year 2000, it predicted, there would be at least a hundred thousand “excess” doctors. And if excess supply was generating excess demand, it followed that we needed to restrict the former. If healthcare became harder to access, only those who “truly” needed it would bother jumping through all the hoops.

Seeing the writing on the wall and fearing that if they didn’t regulate themselves, Congress would do it for them, medical schools fell in line. For the next twenty-five years — from 1980 to 2005 — the entire U.S. healthcare pipeline was deliberately frozen in place. First, new medical schools all but stopped being built. More opened in 1976 alone than in this entire period. Second, existing schools immediately cut their enrollment — leaving millions of dollars in tuition on the table for fear of a regulatory crackdown. Third, from 1974 onward, the legal presumption was that no new hospitals were needed, either. To get permission to build one, you’d first have to prove adequate demand and apply for a “Certificate of Need.” And finally, in 1997, the U.S. government stopped subsidizing new residents. Beginning that year, a given hospital could only get paid for as many residencies as it had in 1996.

…Well, needless to say, the experiment failed. Demand for healthcare very clearly did not slow down — rather, it just kept growing and growing. It’s not that supply can never induce demand, but the effect is far from one-to-one. For every “unnecessary” procedure you avert by constraining supply, you prevent at least one person from getting treatment they truly need. Most of us are looking for reasons not to go to the hospital, not the other way around. Americans weren’t lounging in all those new hospital beds in the 50s, 60s, and 70s; they were finally getting the care they needed — care that was then deliberately taken away from them over the next twenty-five years.

Between 1980 and 2005, the U.S. population grew by 68 million people. We also grew older and richer — and thus in need of more care per capita. Yet the number of medical schools and students stayed virtually unchanged. Now, word eventually got out. In 2005, the Association of American Medical Colleges went from warning of a doctor surplus to warning of a doctor shortage. Med schools drastically increased their enrollment and new schools were built as quickly as possible — trying to make up for lost time. But even today, twenty years later, they still haven’t caught up. The “freeze” still reverberates through the pipeline. The problem is: not everyone got the memo.
Some states have repealed their Certificate of Need laws, but 35 still require permission to build a hospital. And Congress, with its trademark speed and efficiency, has added only 1,200 new subsidized residency positions since 1997. Meaning: it still, in 2026, funds nearly the same number of residents as it did when there were 73 million fewer Americans. No wonder we’re running low on doctors! Now, in theory, hospitals can hire as many residents as they wish. But for each one above the limit set 30 years ago, they lose out on about $150,000 a year in public funding. Likewise, you can — in principle — build a new hospital. But its funding cap is set after five years: however many residents you have by then is how many you’re stuck with forever. And that just

### Segment 3 (10:00 - 15:00)

isn’t much time to create a brand new teaching hospital from the ground up. But make no mistake: students, not hospitals, are the ultimate victims of this broken system. Congress effectively created a bottleneck — an artificial limit on the number of subsidized residencies — and then handed private hospitals control over who gets through. Hospitals that had large numbers of residents on December 31st, 1996 have been grandfathered in — enjoying a now 30-year stream of revenue. They hold the keys — they determine who becomes a doctor. And they don’t hold back from using that leverage over their trainees. Doctors almost universally describe their residencies as the most stressful — and sometimes worst — period of their lives. 60- or 70-hour, high-intensity work weeks are the norm, sleep is a precious commodity, and the incredibly high stakes take a steep emotional toll.

But even before residency begins, med students are at a severe disadvantage. Since hospitals hold all the cards, they determine how residencies are distributed. The only way to get one — and thus become a doctor — is to apply to the once-annual “Match.” Hospitals interview and rank candidates, and candidates rank hospitals. Then, one day in March, applicants are given their single “match.” There’s no backing out and there’s no negotiating — not even the most qualified A+ applicant can compare offers or bargain for a higher salary.

Meanwhile, med school tuition has ballooned almost uncontrollably, and the job of practicing medicine has been increasingly bogged down by a tangled web of insurance and bureaucracy. Now, for many, the juice is still well worth the squeeze. Surgeons and cardiologists, for instance, can earn half a million dollars a year or more — and they do so well into their 60s. But as the costs of becoming a doctor have grown — financially, personally, emotionally, and otherwise — so have the returns students expect in exchange. The average student leaves medical school with $206,000 in debt. At some schools, like Tulane or Chicago, that number approaches or even surpasses $300,000. Add on any loans from undergrad, plus 11 to 17 years of lost earnings, and it’s only natural that aspiring doctors would gravitate toward highly paid specialties. Primary care physicians go through most of the same process yet earn only a fraction as much. And the economics of modern medicine force them to see far too many patients to really get to know them.

The result is that while the United States has a perfectly normal number of specialists per capita, it has the lowest number of generalists among OECD countries. If you need an aortic valve replacement or a decompressive craniectomy, America is a fantastic place to get it! If you have a sore throat or need some garden-variety antibiotics, good luck! In aggregate, this means that America under-diagnoses treatable illnesses until they’ve quietly grown into serious or even fatal conditions, at which point you’re slapped with an astronomical bill.

And it’s even worse in rural areas like Wakulla County. You see, Congress didn’t just freeze the number of subsidized residencies in 1997; it also froze their geographic distribution. That’s a problem, since most doctors — 56% — end up working within 100 miles of where they did their residency. Yet Americans, at large, have moved quite a bit since then. The populations of Pittsburgh, Memphis, and Richmond have stayed relatively stable since the 90s, so their 1996-era allotments still roughly fit — they have comparatively more residents, and therefore doctors, today.
Other cities, like Las Vegas, Austin, Provo, and Orlando, have seen their populations double or even triple, and now find themselves critically low on doctors because their allotment of residencies is stuck in 1997. This is especially hard on Florida and the rest of the Sunbelt, where seniors have dramatically raised the average age and, thus, demand for healthcare. Now, the million-dollar question, of course, is: what can be done? If you ask the American Medical Association, the solution is easy: unfreeze the subsidies. AKA: “give us more money!” That’s certainly a solution. It’s probably the fastest and simplest one. But, to use a relevant analogy, that might be a mere band-aid when what we really need is a head-to-toe reconstruction.
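As an aside on the mechanics of the Match described above: the NRMP’s pairing of applicants and programs is built on the deferred-acceptance idea of Gale and Shapley (the production algorithm, Roth–Peranson, also handles couples, multi-seat programs, and other constraints). Below is a minimal sketch of the applicant-proposing version, with made-up names, complete preference lists, and one seat per hospital for simplicity.

```python
# Minimal sketch of applicant-proposing deferred acceptance
# (Gale-Shapley), the idea underlying the NRMP Match. Names and
# preferences are hypothetical; real programs have multiple seats
# and the real algorithm (Roth-Peranson) adds further constraints.

applicant_prefs = {
    "ana":  ["mercy", "city", "general"],
    "ben":  ["city", "mercy", "general"],
    "cara": ["city", "general", "mercy"],
}
hospital_prefs = {
    "mercy":   ["ben", "ana", "cara"],
    "city":    ["ana", "ben", "cara"],
    "general": ["cara", "ben", "ana"],
}

def match(applicant_prefs, hospital_prefs):
    # rank[h][a]: how hospital h ranks applicant a (lower is better)
    rank = {h: {a: i for i, a in enumerate(prefs)}
            for h, prefs in hospital_prefs.items()}
    free = list(applicant_prefs)                  # not yet tentatively matched
    next_choice = {a: 0 for a in applicant_prefs} # next hospital to propose to
    tentative = {}                                # hospital -> held applicant

    while free:
        a = free.pop()
        h = applicant_prefs[a][next_choice[a]]    # a's best hospital not yet tried
        next_choice[a] += 1
        held = tentative.get(h)
        if held is None:
            tentative[h] = a                      # open seat: hold a tentatively
        elif rank[h][a] < rank[h][held]:
            tentative[h] = a                      # hospital prefers a: bump held
            free.append(held)
        else:
            free.append(a)                        # rejected: a proposes further down

    return {a: h for h, a in tentative.items()}

print(match(applicant_prefs, hospital_prefs))
```

Deferred acceptance produces a stable matching: no applicant and hospital both prefer each other to their assigned outcome. That stability is also why the result arrives as a single, take-it-or-leave-it assignment, with no offers to compare and nothing to negotiate.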

### Segment 4 (15:00 - 18:00)

America, after all, already spends more on healthcare than any other country. In 2023, we spent $21 billion on resident subsidies alone. What we don’t do is distribute that money with any kind of strategic plan or intention. Why not, for instance, incentivize the creation of more residencies for primary care — or, better yet, primary care in rural areas — by re-allocating funding accordingly? And why not tie subsidies to individual residents, rather than hospitals — tilting the playing field ever so slightly in the direction of overworked and deeply indebted students? We could give foreign-educated doctors (many of whom are U.S. citizens) an expedited pathway to practicing medicine, without having to redo their entire residency, as they’re forced to now. We could offer a combined 6-year undergrad-and-med-school option, saving students 2 years of unnecessary time and money, as most of the world does. Or we could simply allow qualified nurses to deliver more primary care, particularly in rural areas facing acute physician shortages. Any one of these changes could make a meaningful dent in the problem. …And yet, here we are. We’ve now been waiting for reform since the proverbial beginning of time and may still be waiting until the end.

Speaking of ancient timescales, no discussion of America’s broken healthcare system is complete without a kind of spiritual palate cleanse… and what better way to get your mind off health insurance and waiting rooms than by watching Joe Scott’s new documentary about the furthest thing from that, called “Oldest and Newest Places on Earth”? As the name implies, Joe challenges himself to find, plan, and travel to the geologically oldest and youngest parts of our planet. You watch him fly all the way to the Arctic Circle in Canada’s Northwest Territories by floatplane, then to Hawaii’s Kilauea volcano. It’s equal parts science lesson, travel vlog, and even contemplative meditation on the grandness of our planet. I’m not usually much of a natural sciences guy, but I really enjoyed it. There’s something for everyone. And it’s exclusively available on today’s sponsor, Nebula.

On Nebula, you’ll find your favorite creators making bigger, better, and higher-budget dream projects… Projects like a hilarious live comedy debate show, in which comedians argue for the abolition of such things as sunscreen, Liquid Death, and Meryl Streep. Projects like RealLifeLore’s “Mad Kings” — a deep dive into the world’s most notorious dictators, from Saddam Hussein’s son to Kim Il-sung. And projects like DownieExpress’ “Europe 2 Asia” — a five-part journey between continents using only trains. Nebula is ad-free and even gives you free “Guest Passes,” so you can watch along with friends and family. Normally, Nebula costs $6 a month. But you can get it for just half that by signing up for a year with the link on screen or in the description now. That’s just $2.50 a month for all the Originals I mentioned and more. If you’re not a fan of subscriptions, you can also get $200 off Nebula Lifetime with the Lifetime link below.
