Other Publications

Education Columns

Direct Observation: What It Is and How to Effectively Perform It

Stephen Wilson, MD, MPH, and Gretchen Shelesky, MD, MS, University of Pittsburgh Medical Center St Margaret Family Medicine Residency

The purpose of residency education is to train physicians to care effectively for patients. Though resource intensive, Direct Observation (DO) is a terrific tool for formative and summative feedback that is part of the Accreditation Council for Graduate Medical Education (ACGME) toolbox for evaluation. This article describes and discusses DO, then makes suggestions on how to effectively perform it.

DO involves carefully and purposefully watching and listening to a resident work through a patient encounter. The encounter may be with either a real or simulated patient; we prefer real. Observable skills include history taking, physical examination, and procedural skills.

DO has been shown to benefit both teachers and learners: it identifies otherwise unrecognized deficiencies, allowing remediation where needed;1,2 serves as a needs assessment before implementing a curriculum;3 provides more reliable formative feedback than self-assessment;4 and increases learner confidence.5 Further, teachers become aware of what actually happened instead of hearing only a filtered report from the learner.

DO readily allows direct assessment of key ACGME core competencies in real time, under actual circumstances: patient care, interpersonal and communication skills, practice-based learning and improvement, and professionalism. The competencies of medical knowledge and systems-based practice are also readily, if indirectly, assessable. Thus, DO is a six-tool assessment tool. For those in osteopathic training, DO is particularly effective for assessing osteopathic manipulative medicine (OMM) skills.

While a physician is in the best position to globally assess all of the competencies simultaneously, other medical professionals can readily observe and describe specific skills, eg, behavioral scientists, clinical pharmacists, nurses, senior residents.

If DO is so valuable, why is it not used more often? It is time intensive, and time is money. Initially, residents are nervous. Some stay that way, but over time most acclimate to DO, or at least accept or expect it as part of the program’s landscape. Start small, with one to two DOs per resident per year, or one every 6 months. More can be done for residents about whom there are questions, concerns, uncertainty, or signs of educational difficulty. In our program, each resident’s advisor and the behavioral science faculty do most of the DO; senior residents also perform DO on interns providing inpatient care.

Step 1: Use a structured tool to guide observation and feedback. This provides baseline standardization among observers. Have the skills to be observed segregated by competency. The residents should have access to the feedback/evaluation tool: a good tool will help the learner learn as well as the evaluator evaluate. The Mini-CEX is an example of a tool. We use our own tool developed from patient interviewing textbooks and experience.

Step 2: Let the resident know when DO will occur, describe its purpose, and explain that you will be a “fly on the wall”—except on the floor, larger, and with only two feet. Promise a debriefing. This adds structure and transparency and can help decrease anxiety, even if just a little.

Step 3: Ask the resident to inform the patient and obtain his/her permission for your presence.

Step 4: Position yourself unobtrusively such that you can adequately observe the resident and the patient without intruding on the doctor-patient space.

Step 5: Be quiet unless the resident asks you to do something or to provide assistance. Be observant and jot a few notes. The stereotypical “medical stuff” is only part of what you are there to observe.

Step 6: Debrief. Before you jump into sharing your observations and insights, first ask the resident how things went: what went well and what she/he feels could have been done differently. Then use the review tool to specifically review the encounter, identifying skills that were effectively used before touching on areas for improvement. Close with a summary, acknowledge the anxiety of DO, thank the resident for the opportunity to watch him/her in action, and offer a final chance for any questions.

To fully complete the DO process, place the completed tool, with your summary, into the resident’s file.

Using DO early in residency has helped us discover both deficiently skilled and highly skilled residents sooner. We have found that DO with feedback helps ameliorate the initial loss of comfort and the insecurity that many interns experience, which can increase the risk of burnout.

Do Direct Observation. Do it often, until it becomes a part of what your program does. The more you do it, the better you get, the more valuable the information, and the less anxiety-provoking it is for residents. Pick a DO evaluation tool—your faculty may want to modify it to your needs—and use it, a lot. In our experience, you will be enlightened by what you learn; you will become a better teacher and clinician, because in many ways you begin to reflect on and directly observe yourself.

References
1. Cydulka RK, Emerman CL, Jouriles NJ. Evaluation of resident performance and intensive bedside teaching during direct observation. Acad Emerg Med 1996;3:345-51.

2. Li JT. Assessment of basic physical examination skills of internal medicine residents. Acad Med 1994;69:296-9.

3. Holmboe ES, Hawkins RE. Practical guide to the evaluation of clinical competence. Mosby Elsevier, 2008:30-1.

4. Brewster LP, Risucci DA, Joehl RJ, et al. Comparison of resident self-assessment with trained faculty and standardized patient assessments of clinical and technical skills in a structured educational module. Am J Surg 2008;195:1-4.

5. Chen W, Liao S, Tsai C, Huang C, Lin C, Tsai C. Clinical skills in final-year medical students: the relationship between self-reported confidence and direct observation by faculty of residents. Ann Acad Med Singapore 2008;37:3-8.