A Strategy to Help Improve Family Medicine In-Training Examination Scores
By Lisa Weiss, MD, MEd, Sumira Koirala, MD
Because of below-average performance, residents at St. Elizabeth Boardman Family Medicine Residency Program wanted a way to increase their in-training examination (ITE) scores. Research has shown that answering questions daily keeps the testing process fresh and may increase ITE scores.1 Additionally, creating new questions forces further inquiry into a specific topic for both the resident generating the question and the residents answering it.2 Answering questions daily also reminds residents of the importance of continual reading to enhance their knowledge.3 As such, we hypothesized that daily practice would be more beneficial for our residents than bunched learning. Therefore, we instituted the presentation of a daily ITE question plus a resident-created question as part of our morning report.
The goal of this educational intervention was to raise overall ITE scores above the national average. At morning report, a resident presents a question from the previous year's ITE on a screen at the end of the patient care presentations. The residents, faculty, and medical students in attendance provide an answer along with their rationale. The correct answer is then shown and explained, with subsequent open discussion. The same format is followed for an additional, related question written by the presenting resident. The ITE questions are reviewed in their original order on the examination, and the residents sign up ahead of time to present each day's question. About 10 minutes of morning report is allotted for presentation and discussion.
Before the intervention, the mean ITE score for the residency was 28 points below the national average. After participating in this activity for nine months, the mean score was 27 points above the national average.
Several theories have been put forth about why the introduction of a daily ITE question review may have improved exam performance. The residents felt that using the ITE questions gave them good depth and breadth of topics to review in preparation for the next year's exam. They also felt that the question format kept them engaged in the presentation and discussion. Creating related questions further engaged the presenting resident and improved his or her reasoning and test-taking skills.4 It also added a question for group discussion, thereby furthering everyone's knowledge2,4 and allowing the presenting resident to highlight current patient-specific topics from the clinical setting.
Finally, they felt that the question and discussion format enhanced not only short-term learning, but also retention. They appreciated the immediate feedback from the faculty on the newly created question and the provided explanation, and subsequently reported that they recalled these discussions when taking their actual ITE.5
Many factors, such as increased study time, other changes in the curriculum, the presence or absence of life stressors, and previous knowledge base, may also influence ITE scores.6
Variable resident attendance and a busy morning report agenda are the two main limitations. Depending on the day, some residents are absent because of scheduling conflicts with their clinical rotations. Several times per month, the hospital census is high, leaving no time to review an ITE question. Despite these limitations, all residents encounter the ITE questions at some point during the week, and the frequency of exposure increases from the first to the third year.
It is helpful to have a faculty lead who provides ongoing feedback and monitors the process to ensure that the residents are participating regularly in the activity.
1. Chang D, Kenel-Pierre S, Basa J, et al. Study habits centered on completing review questions result in quantitatively higher American Board of Surgery In-Training Exam scores. Journal of Surgical Education. 2014;71(6):127-31. https://doi.org/10.1016/j.jsurg.2014.07.011
2. Schullo-Feulner A, Janke K, Chapman S, et al. Student-generated, faculty-vetted multiple-choice questions: Value, participant satisfaction, and workload. Currents in Pharmacy Teaching & Learning. 2014;6(1):15-21. https://doi.org/10.1016/j.cptl.2013.09.019
3. Kim JJ, Kim DY, Kaji AH, et al. Reading habits of general surgery residents and association with American Board of Surgery In-Training Examination performance. JAMA Surgery. 2015;150(9):882-9. https://doi.org/10.1001/jamasurg.2015.1698
4. Gooi A, Sommerfeld C. Medical school 2.0: How we developed a student-generated question bank using small group learning. Medical Teacher. 2015;37(10):892-896. https://doi.org/10.3109/0142159X.2014.970624
5. Roediger HL, Agarwal PK, Kang SHK, Marsh EJ. Benefits of testing memory: Best practices and boundary conditions. In: Davies GM, Wright DB, editors. Current Issues in Applied Memory Research. New York, NY; 2010. pp. 13-49. https://doi.org/10.4324/9780203869611
6. Godellas C, Hauge L, Huang R. Factors affecting improvement on the American Board of Surgery In-Training Exam (ABSITE). Journal of Surgical Research. 2000;91(1):1-4. https://doi.org/10.1006/jsre.2000.5852