
Red Flags on Residency Applications and How to Overcome Them - Part I: Diving into the Data



In my work, many residency applicants have at least one red flag, which they often hesitantly disclose during our initial coaching session. It is important to understand exactly what these flags are and to know how to handle these potentially negative aspects of one's application professionally and competently.

 

I will address this subject in two parts.

  • Part I will provide data about red flags to ensure you understand how Program Directors perceive them.

  • Part II will provide a roadmap detailing how to address and overcome them in your application and interview. 

 

Red flags typically constitute the following:

 

  • Multiple USMLE/COMLEX attempts or low USMLE/COMLEX score(s)

  • Medical school issues related to failures, repeated coursework, unexpected breaks in one's academic record, or professionalism concerns

  • Year of graduation (YOG) more than a specified number of years before the current date

  • Time gaps (which may be related to financial issues, personal or family illness, extenuating circumstances in one's country [civil unrest/war], or other reasons)

 

I would also mention here that plagiarism on one's ERAS application and/or deceptive or fabricated statements could easily merit exclusion from consideration, although these are not generally listed as red flags. And it goes without saying that a prior NRMP Match violation or a misdemeanor or felony conviction (generally DUI or other drug-related offenses) would be highly likely to land your application in the reject pile.

 

Other red flags relate to applicants who were terminated from or resigned from their programs and are re-applying for residency; these situations are uncommon, always involve very personal circumstances, and will not be addressed here.

 

THE DATA

 

What does the available data tell us about how program directors view red flags? Fortunately, there is some valuable information to share. Understanding the weight program directors give to these factors is not intended to discourage any of you as match aspirants but rather to highlight the challenges red flags present. I'll later reinforce the importance of appropriately addressing them in one's residency application.

 

USMLE/COMLEX, Medical School Performance and LORs/MSPE

 

The NRMP's 2020 Program Director Survey (1) provides the following information regarding failed attempts on the USMLE and COMLEX exams. Program directors were asked whether they would consider applicants with first-attempt failures. While the results varied by test, for USMLE Step 1/2 and COMLEX Level 1/2, most program directors (61-72%) were willing to consider an applicant with a first-attempt failure at least some of the time. Interestingly, they were less likely to consider those with a failed USMLE Step 3 or COMLEX Level 3 exam for residency training in their programs. IMGs sometimes take USMLE Step 3 as a strategy to offset a non-competitive Step 2 CK result. It is therefore important for applicants with such red flags who are considering this tactic to fully recognize that a subsequent Step 3 or COMLEX Level 3 failure would likely further reduce their residency opportunities.


Internal Medicine residency program directors' screening practices were examined by Angus et al. in Academic Medicine (2020)(2). The red flags considered the most important exclusion criteria for interview selection were a failure on Step 2 CK and unprofessional behavior noted in a candidate's LORs or MSPE. Of the program directors who responded (n=231), 47% indicated that they "absolutely would not invite" an applicant with a failed attempt on Step 2 CK, and 68% said the same of candidates with concerns about professional behavior and judgment.


However, there is another way to interpret these data. One can take away the message that 53% of program directors might still grant an interview to an applicant with a first-attempt failure on USMLE Step 2 CK. Similarly, although much less likely, a professionalism lapse would not merit exclusion by all program directors; 32% might still consider such an applicant.

Such information signals a willingness of program directors to look past certain data points on an applicant's file, especially in programs that have embraced a more comprehensive holistic approach to residency application review. 

 

A narrative review (Hartman et al.)(3) examined the most common and highest-rated factors used to select applicants for interviews. The presence of a failed USMLE/COMLEX exam was used by 70% of program directors as an interview selection criterion and was rated 4.5 out of 5.0 in overall importance. Other factors relevant to this discussion were professionalism and ethics concerns (used by 65%; importance 4.5) and a prior match violation (37%; 4.8).

 

In a questionnaire sent to Internal Medicine program directors (Garber et al.)(4), respondents who employed filters were asked to indicate which one of several predefined criteria was the most common data point used to sort/filter applicants. While the Step 2 CK score was the most commonly selected filter (32%), a failing clerkship grade in medicine was listed by only 9% of PDs.

 

The authors of another report (Hemrajani et al., 2023)(5) found that "flags" from either the interview or file review were statistically significant (p <.003; OR 2.82) in terms of their association with subsequent performance-related difficulties during residency. Although such flags were not explicitly defined, they note that typical areas of concern include course/clerkship failures, legal issues, and portfolio inconsistencies.

 

In a study of Anesthesiology program directors from 2020 (Vinagre et al.)(6), 84% considered red flags very important when selecting an applicant for an interview. The four most common red flags were failure of USMLE exams, failure of a course or clinical rotation, unexplained gaps in education or time away from school, and any felonies or other criminal history.

 

Strausser et al.(7) evaluated the importance program directors assign to applicant factors and found that passing the USMLE was considered "extremely or very important" by 88% of respondents. An applicant's grades on core clerkships were also deemed highly important, and it is reasonable to infer that failing grades or repeated rotations would raise strong concerns.

 

While this is not an exhaustive list of studies, the findings are considerably consistent. Furthermore, at least in terms of test performance, ample literature supports that USMLE scores predict residency performance, at least in certain domains.


SCORES AS PREDICTORS OF RESIDENCY PERFORMANCE / SPECIALTY BOARD PASS RATES

 

Sharma et al. (8) examined post-match multimodal residency performance (standardized testing, individual clinical assessments). In multivariate analysis, they found that the USMLE Step 2 score was "the most predictive of performance in all residency performance domains measured." In contrast, USMLE Step 1 scores were associated only with subsequent in-training examination scores.

 

In a meta-analysis of 80 studies with 41,704 participants, Kenney et al. (9) found that "standardized examination performance and medical school grades showed the strongest associations with current measure of doctor performance."

 

However, the authors of a more recent 2023 study published in the Journal of Graduate Medical Education (Lipman et al.)(10), which aimed to summarize the evidence linking elements of the ERAS application with selection and training outcomes, concluded that "the association of Step 1 and Step 2 CK scores with performance metrics are widely mixed."

 

Finally, each program director harbors their own beliefs about standardized test scores and their association with graduates' subsequent ability to pass the specialty boards.

For instance, Rayamajhi et al. (11) found that USMLE Step 1 (before it became pass/fail), Step 2 CK, and Step 3, along with the in-service exam taken during the third year of residency, predicted results on the Internal Medicine Certifying Exam (ABIM).

 

The Accreditation Council for Graduate Medical Education (ACGME) sets specialty-specific cut-offs for the percentage of graduates who must pass their specialty boards (a rolling average over a specified number of years). Failure to achieve these benchmarks can have ramifications for programs, including ACGME citations, and may factor into rank order list decisions.

 

YEAR OF GRADUATION (YOG) and TIME GAPS

 

In terms of graduation year, programs that exclude older applicants indicate this clearly in the AMA's FREIDA™ residency database by specifying their YOG requirements, and programs can use a candidate's YOG as a screening criterion through ERAS. While older candidates usually apply to programs that do not list YOG restrictions or that have more liberal inclusion criteria, these cut-off dates are not always set in stone. I will tell you that I have coached candidates with YOG > 10 years who did not meet the prescribed cut-offs yet still interviewed and matched.

 

While literature on the impact of a "gap year" between undergraduate education and medical school is readily available, I could find no comparable data on intervals between medical school and residency application in the US. Applicants with sizeable gaps are almost all IMGs, each with a different story. The FREIDA™ database indicates which programs will consider those with gaps. As with YOG, in my experience, these rules are generally not written in stone.

 

I trust that this has been useful. 

 

Stay tuned for Part II, in which I will provide specific, detailed information on how to effectively respond to any red flags your application may harbor.

 

_____________________________

 

(1)  National Resident Matching Program. Results of the 2020 NRMP Program Director Survey. Washington, DC: NRMP; 2020.

(2)  Angus SV, et al. Internal Medicine Residency Program Directors' Screening Practices and Perceptions About Recruitment Challenges. Acad Med. 2020;95(4):582-589.

(3)  Hartman ND, Lefebvre CW, Manthey DE. A Narrative Review of the Evidence Supporting Factors Used by Residency Program Directors to Select Applicants for Interviews. J Grad Med Educ. 2019;11(3):268-273.

(4)  Garber AM, Kwan B, Williams CM, Angus SV, Vu TR, Hollon M, Muntz M, Weissman A, Pereira A. Use of Filters for Residency Application Review: Results from the Internal Medicine In-Training Examination Program Director Survey. J Grad Med Educ. 2019;11(6):704-707.

(5)  Hemrajani R, et al. Association of Interview and Holistic Review Metrics with Resident Performance-Related Difficulties in an Internal Medicine Residency Program. J Grad Med Educ. 2023;15(5):564-571.

(6)  Vinagre R, et al. Red Flags, Geography, Exam Scores, and Other Factors Used by Program Directors in Determining Which Applicants Are Offered an Interview for Anesthesiology Residency. Cureus. 2020;12(11):e11550.

(7)  Strausser et al. Importance of residency applicant factors based on specialty and demographics: a national survey of program directors. BMC Med Educ. 2024;24:275.

(8)  Sharma et al. USMLE Step 2 CK: Best Predictor of Multimodal Performance in an Internal Medicine Residency. J Grad Med Educ. 2019;11(4):412-419.

(9)  Kenney S, et al. Association between residency selection strategies and doctor performance: a meta-analysis. Med Educ. 2013;47:790-800.

(10)  Lipman JM, et al. A Systematic Review of Metrics Utilized in the Selection and Prediction of Future Performance of Residents in the United States. J Grad Med Educ. 2023;15(6):652-668.

(11)  Rayamajhi S, Dhakal P, Wang L, et al. Do USMLE Steps and ITE score predict the American Board of Internal Medicine Certifying Exam results? BMC Med Educ. 2020;20:79.

 
