How to rank surgical residency programs

In September, Doximity, a closed online community of over 300,000 physicians, released its ratings of residency programs in nearly every specialty. Many, including me, took issue with the methodology. Emergency medicine societies met with Doximity's co-founder over the issue and echoed some of the comments I had made about the lack of objectivity and the heavy emphasis on reputation.

I wonder if it is even possible to develop a set of valid criteria to rate residency programs. Every criterion I can think of is open to question. Let's take a look at some of them.

Reputation is an unavoidable component in any rating system. Unfortunately, it is rarely based on personal knowledge of any program because there is no way for anyone not directly involved with a program to assess its quality. Reputation is built on history, but all programs have turnover of chairs and faculty. Just as in sports, maintaining a dynasty over many years can sometimes be difficult. Deciding how much weight should be given to reputation is also problematic.
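To see why the weighting matters so much, here is a toy sketch. The program names, scores, and weights below are all invented for illustration; this is not Doximity's method, just a demonstration that the choice of weights alone can reorder a ranking.

    # A toy illustration: the same two programs, ranked under two different
    # weighting schemes. All program names, scores, and weights are hypothetical.

    programs = {
        "Program A": {"reputation": 9.0, "case_volume": 6.0},
        "Program B": {"reputation": 6.0, "case_volume": 9.0},
    }

    def composite(scores, w_reputation, w_case_volume):
        # Weighted sum of the two (hypothetical) rating criteria.
        return w_reputation * scores["reputation"] + w_case_volume * scores["case_volume"]

    for w_rep, w_cases in [(0.7, 0.3), (0.3, 0.7)]:
        ranked = sorted(programs, key=lambda p: composite(programs[p], w_rep, w_cases),
                        reverse=True)
        print(f"reputation weight {w_rep}, case weight {w_cases}: {ranked}")

With reputation weighted at 0.7, Program A comes out on top; flip the weights and Program B wins. Nothing about the programs changed, only the arbitrary weighting.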

The schools that residents come from might be indicative of a program's quality, but university-based residencies tend to attract applicants from better medical schools. The other issue: who is to say which schools are the best?

Faculty and resident research is easy to measure but may be irrelevant when trying to answer the question of which programs produce the best clinical surgeons. Since professors tend to move from place to place, the current faculty may not be around for the entire 5 years of a surgery resident's training.

The number of residents who obtain subspecialty fellowships, and where those fellowships are, might be worthwhile, but it would penalize programs that attract candidates who may be exceptional but are happy to become mere general surgeons.

Resident caseloads, including volume and breadth of experience, would be very useful. However, these numbers have to be self-reported by programs, and self-reported data are often unreliable. Here are some examples of why.

For several years, M.D. Anderson has been number one on the list of cancer hospitals compiled by US News. It turns out that for 7 of those years, the hospital counted all patients admitted through its emergency department as transfers, excluding them from its mortality figures. This removed 40% of M.D. Anderson's admissions, likely including many of its sickest patients.
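A little arithmetic shows how much that kind of exclusion can flatter a mortality rate. The numbers below are invented for illustration; they are not M.D. Anderson's actual figures.

    # Hypothetical figures showing how excluding the sickest admissions
    # lowers reported mortality. All numbers are invented for illustration.

    total_admissions = 1000
    excluded = 400                       # counted as "transfers" (40% of admissions)
    included = total_admissions - excluded

    deaths_excluded = 40                 # 10% mortality in the sicker, excluded group
    deaths_included = 12                 # 2% mortality in the healthier, included group

    true_rate = (deaths_excluded + deaths_included) / total_admissions
    reported_rate = deaths_included / included

    print(f"true mortality:     {true_rate:.1%}")      # 5.2%
    print(f"reported mortality: {reported_rate:.1%}")  # 2.0%

In this example the reported rate is less than half the true rate, achieved entirely by reclassifying admissions rather than by treating anyone more effectively.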

The number and types of cases done by residents in a program have always been self-reported. The Residency Review Committee for Surgery and the American Board of Surgery have no way of independently verifying the number of cases done by residents, the level of resident participation in any specific case, or whether the minimum numbers for certain complex cases have truly been met.

So where does that leave us?

I'm not sure. I am interested in hearing what you have to say about how residency programs can be ranked.
