The Buzz This Week: 

On January 17, 2023, Harvard Medical School (HMS) announced that it would no longer participate in U.S. News & World Report's (USNWR) annual "Best Medical Schools" rankings, a publication many regard as an industry standard. The top-rated medical school stated that it would no longer submit the self-reported data required for inclusion in the rankings, objecting to USNWR's methodology and related ethical and philosophical dilemmas. This follows a spate of law school withdrawals from USNWR's ranking process in 2022. 

George Daley, MD, PhD, Dean of the Faculty of Medicine at HMS, explained in a letter quoted in the New York Times that his decision stemmed from "the principled belief that rankings cannot meaningfully reflect the high aspirations for educational excellence, graduate preparedness, and compassionate and equitable patient care that we strive to foster in our medical education programs." In response, also reported by the New York Times, Eric Gertler, Chairman and Chief Executive Officer of USNWR, stated that he believes students deserve access to all the data and information necessary to select the best school, which he believes USNWR provides and aggregates in its rankings. 

HMS' decision to withdraw from the USNWR rankings seems to have set in motion a domino effect. The medical schools at Columbia, Stanford, the University of Pennsylvania, and Mount Sinai in New York—all top-tier schools per the USNWR rankings in recent years—followed Harvard's lead within a few days. By February 1, 2023, the following medical schools had also announced that they would withdraw from the USNWR rankings, in addition to those named above:  

  • Chicago Medical School at Rosalind Franklin University of Medicine and Science   
  • Duke University School of Medicine 
  • Johns Hopkins University School of Medicine  
  • Kaiser Permanente Bernard J. Tyson School of Medicine 
  • University of Chicago Pritzker School of Medicine  
  • University of Michigan Medical School 
  • University of Washington School of Medicine 
  • Washington University School of Medicine 

This list of medical schools withdrawing from the USNWR rankings includes 6 of the top 10 and 10 of the top 20 medical schools in the U.S. 

While each medical school has articulated specific reasons for withdrawing from the rankings, most have also echoed comments initially made by HMS, citing "perverse incentives" created by the ranking methodology and attesting that the ranking system "does not provide a fair, comprehensive overview of each university." USNWR's weighting of peer assessment surveys, postgraduate employment, and test scores is among the medical schools' objections. Those withdrawing from the ranking process believe these factors "prioritize prestige and institutional wealth, and incentivize schools to divert need-based aid." 

As Dean Daley originally stated, “As unintended consequences, rankings create perverse incentives for institutions to report misleading or inaccurate data, set policies to boost rankings rather than nobler objectives or divert financial aid from students with financial need to high-scoring students with means in order to maximize ranking criteria. Ultimately, the suitability of any particular medical school for any given student is too complex, nuanced, and individualized to be served by a rigid ranked list, no matter the methodology.”  

As a reference, the USNWR Medical School rankings are currently based on the following: 

  • Peer assessment (15%) 
  • Assessment by residency directors (15%) 
  • Federal research activity (30%) 
  • Median MCAT score (13%) 
  • Faculty resources (i.e., faculty-to-student ratio) (10%) 
  • Median undergraduate GPA (6%) 
  • Acceptance rate (1%) 

Why It Matters: 

Medical schools that have recently withdrawn from the USNWR rankings (and law schools that did so last year) have justified their actions by questioning the ranking methodologies and the metrics included or excluded. They have suggested that too much weight has been placed on test scores and grades, and not enough on other important factors, such as student population diversity, commitment to student and clinical professional wellbeing, and the affordability and value of education. They have therefore decided to stop submitting the self-reported data requested by USNWR and other evaluative entities. 

In contrast, hospitals and health systems that are scored by USNWR, the Centers for Medicare and Medicaid Services (CMS), and other reputable entities cannot withdraw from rankings because the entities that publish those rankings use publicly available data. Hospitals effectively cannot decide to opt out, despite any objections to the criteria or inputs. 

Currently, the leading rating/ranking organizations for hospitals and health systems are as follows: 

  • USNWR hospital rankings: These rankings evaluate hospital performance in 15 specialties. For 12 of the specialties, rankings are primarily determined by analyzing outcomes data along with a reputation score based on a survey of approximately 30,000 physicians who meet USNWR's criteria and respond to a survey sent through Doximity, a professional network for physicians. For the remaining 3 specialties, rankings are based only on the reputation score. 
  • CMS Overall Quality Star Ratings: These rankings incorporate metrics from the Hospital Inpatient Quality Reporting Program and the Hospital Outpatient Quality Reporting Program to determine star ratings for hospitals on a scale of 1 to 5.  
  • The Leapfrog Group: This annual list of top hospitals is based upon national, publicly available performance measures and includes a voluntary survey from participating hospitals. 

Many have argued that hospital ratings and rankings—based on publicly available data—are imperfect. Reasons cited include the lack of incorporation of patient acuity, the lack of accounting for qualifying outpatient procedures, and the lack of consideration of a multitude of other factors that should contribute to a hospital’s rating/ranking.  

Despite the limitations in the methodologies for hospital rankings, patients are increasingly looking to these top rankings when making healthcare decisions. A 2020 article published in Medical Care found that patients viewed hospital reputation to be the most important measure for choosing where to receive care, and a 2018 article in the Annals of Surgical Oncology showed that 85% of individuals about to receive cancer surgery would travel an hour to receive care at a top-ranked hospital rather than go to their local hospital. A study in the Journal of Health Economics found that the average hospital experiences a 5% fluctuation in patient volume that can be attributed to its USNWR ranking in a given year. Because of this, until ranking methodologies are refined, hospitals and health systems will need to contain costs, maintain or improve quality, and perform well on other factors incorporated in the ranking lists in order to attract discerning patients. 

More importantly, health systems should not get too lost in the rankings metrics. They should prioritize becoming true high reliability organizations (HROs)—optimizing patient experience and outcomes, improving workforce wellness and engagement, and ensuring the appropriate utilization of resources. Effectively demonstrating that their hospital or health system is an HRO will go farther than any rankings list. Providing highly reliable care that is top quality and safe will help organizations improve rankings and deliver the best care to their patient populations. 

Related Links: 
Modern Healthcare 

Top Medical Schools Reject Rankings as Measure of Success, Quality 

Fierce Healthcare 

Wave of Leading Medical Schools Pull out of U.S. News' Annual Rankings 


Editorial advisor: Roger Ray, MD, Chief Physician Executive.
