How do medical schools assess their students’ clinical competence prior to high stakes board exams?
How can medical school faculty prepare students to solve new clinical puzzles? And how do they do so safely?
In this case study, we examine how the Southern Illinois University School of Medicine addresses these challenges, starting in the first year of its curriculum.
Assessing clinical competency and critical thinking
Every medical school aims to produce good clinicians: medical professionals who are clinically competent to assess, diagnose, and treat patients. Board exams set a threshold that clinical trainees must meet to obtain a license to practice. But how do medical schools examine their students along the way, prior to licensure testing?
One way is to administer exams at key points in the students’ academic career — exams designed to reveal strengths and gaps in that student’s critical thinking and clinical competence.
“Critical thinking, pretty much, means thinking about things in a careful manner: considering as many perspectives as possible, looking at one’s assumptions and making sure there are no contradictions between those assumptions, and thinking through a problem carefully, making sure one considers as much about the situation as possible.”
Dr. Cris Anderson of the Southern Illinois University School of Medicine says exams must go beyond assessing whether the student knows the purpose of a particular drug or how a disease process manifests itself. Assessments must examine the student’s ability to reason through clinical problems they may never have encountered before.
Problem solving and practice
One way of assessing both knowledge and how knowledge is applied is through problem-based learning; that is, challenging the student to solve a clinical riddle, tracking the student’s decisions, and comparing those decisions to what’s expected of practicing professionals. Medical schools use simulations of all kinds to accomplish this type of evaluation in a safe and repeatable manner.
That’s exactly what DxR Development’s Clinical Competency Exam software is designed to do. And it’s why Dr. Anderson is a power user, administering six CCX cases per year to SIU’s first-year medical students.
“The software is constructed in a logical fashion. The way students encounter the exam, it’s like, ‘OK, you’ve seen the door data. Now you need to come up with a differential (diagnosis).’ I mean, that’s what a doctor’s going to do. It’s like, you go up to the door when you’re in your office and the nurse has already put somebody in the room, has made a little note, ‘this person is coming in with shortness of breath’ or whatever it is. And so the way the software is constructed, it encourages students to develop good habits for their eventual patient care situations where they’re learning the chief complaint, considering various differentials, knowing what they need to find out about the patient to rule those potential diagnoses in or out, and then the opportunity to order ancillary testing, hopefully judiciously… and then proceeding through with the full patient work-up.
In year one, we don’t specifically do management, but they do that as soon as they get to year two, and when our students get to year three, they have a senior clinical competency exam, which I write a case for and grade. So I can see how the management section also helps the student develop good habits for taking care of patients.”
Standardized vs. Virtual Patients
Dr. Anderson couples the CCX software with one of two representations of real patients: a) a ‘virtual’ patient, with the history and physical exam information presented in the context of a computer program; or b) a standardized patient, who plays the role of the patient in a practice clinical encounter. Fidelity — the realism of the patient encounter — is higher with a standardized patient. CCX allows evaluators to include feedback from the standardized patient in the evaluation of student performance. But using standardized patients also requires more set-up and administrative work. Virtual patients represented on screen are always available and consistent, a feature that kept SIU’s exams moving forward, even when students weren’t allowed on campus due to COVID-19 closures.
Revealing Clinical Reasoning
Students taking a CCX exam document their findings and their work-up of the virtual or standardized patient within the software, for comparison with the instructor’s standards. Anderson says even early in the first year of medical school, patterns emerge that reveal the student’s performance in reasoning through problems.
“…the ‘patient’ is gonna have a chief complaint, so ‘Does the student know a reasonable differential for that particular chief complaint?’ Then, based on the potential diagnostic items, the differential items that they have in the system, the key findings let us know, ‘Do they know the clinical definitions of those items?’”
Dr. Anderson says CCX also offers a valuable feature that requires students to correlate their hypotheses with findings from the history and physical exam.
“The Findings/Hypothesis correlation part really helps. When they select a particular differential item, they choose the items from their key findings, and only the items that support or contradict that diagnosis. For the key findings list as a whole, each item on that list should be a key finding for at least one of the diagnostic items on the differential. But when they do the Findings/Hypothesis correlation part, they should have just the few key findings that actually pertain to the differential item they have chosen. That’s why I like that particular feature so much.”
Supporting Clinical Choices
CCX also helps instructors evaluate whether students appropriately use physical and history findings to focus their clinical investigation.
“…when they do their interim DxJ (diagnostic justification), it’s sort of like, ‘Do they really know how much they can rule in or out based just on history or physical exam?’ So that’s really important, because that’s a key feature that helps students minimize unnecessary ancillary testing later in the case.”
The Big Picture
Faculty members can use CCX’s powerful reporting tools to map student strengths and weaknesses in key areas. By the time Dr. Anderson’s students complete a handful of CCX cases in their first year, faculty see clear evidence of students’ emerging clinical competence in multiple areas of practice.
“I think the more care people put into generating suitably complex cases, and the more they encourage the students to go through the exams and realize there’s a reason for each section of the exam… this is training you to be a good physician: good reasoning skills, having a high percentage of nailing the diagnosis, and having a good treatment outcome. If care goes into making good cases, and care goes into encouraging the students all along the way to use the software to its advantage and to their advantage, it should work. It should turn out good docs.”
Sound clinical reasoning forms the foundation of clinical competence. That’s why faculty members like Dr. Anderson rely on the Clinical Competency Exam software from DxR Development Group.