The public trusts the medical education system to graduate clinically competent physicians. Yet the clinical reasoning skills of medical students are not systematically and uniformly examined during the clinical years of training. Traditional assessments such as ward evaluations, written examinations, and even simulated patient encounters (using patient-actors) do not robustly measure clinical reasoning and diagnostic skills. This was quite evident to me after nearly 7 years in charge of medical student training in internal medicine at the Virginia Commonwealth University School of Medicine.
Here is an article published in Academic Medicine exploring the diagnostic justification (DXJ) abilities (clinical reasoning) of senior medical students. All seniors in the classes of 2011 (n = 67) and 2012 (n = 70) at a Midwestern university were required to take and pass a 14-case standardized patient examination prior to graduation. For nine cases, students were required to write a free-text response indicating how they used patient data to move from their differential to their final diagnosis. The DXJ scores were compared with traditional standardized patient case checklist (SCCX) scores.
Although using SCCX and DXJ scores led to the same pass-fail decision in the majority of cases, discrepancies occurred. In discrepant cases, students would fail based on the DXJ score but pass based on the SCCX score, suggesting that deficiencies in diagnostic reasoning may go unnoticed by traditional checklist-based assessment.
We are exploring the use of simulation technologies to simultaneously assess patient care and clinical management skills in a standardized, uniform fashion. We are using newer simulation tools, e.g., high-fidelity mannequins (iSTAN), to assess students' diagnostic and management skills in real time, based on their responses to changes in the mannequin-patient's clinical signs, data, and pathophysiologic state. Think of it as a flight simulator for medical trainees.