msJAMA
April 5, 2000

High-Stakes Medical Performance Testing: The Clinical Skills Assessment Program


JAMA. 2000;283(13):1748. doi:10.1001/jama.283.13.1748-JMS0405-4-1

On July 1, 1998, the Educational Commission for Foreign Medical Graduates (ECFMG) implemented its Clinical Skills Assessment (CSA) as a new requirement for graduates of foreign medical schools seeking certification for entry into an accredited residency training program in the United States. The CSA is a day-long practical examination designed to assess graduates' ability to gather and interpret clinical data and to communicate effectively with patients and health professionals in English.

From the start, the CSA was conceived as only 1 of several assessment elements leading to certification by the ECFMG. Others include passing scores on the United States Medical Licensing Examination (USMLE) Steps 1 and 2, completion of an English comprehension test, and graduation from a medical school listed in the World Health Organization directory.

The CSA developed as a result of widespread concern among medical educators that basic clinical skills, including medical history-taking, physical examination, and doctor-patient communication, were not being adequately addressed in undergraduate medical education.1 Within the United States, the Liaison Committee on Medical Education (LCME) responded to this concern by incorporating requirements on the teaching and assessment of basic clinical skills into its criteria for accreditation of US schools.

However, no international equivalent of the LCME existed to develop similar curriculum requirements for medical schools outside of the United States. Educators were concerned that medical schools around the world were variable in quality and not always subject to outside review.2 Since neither the ECFMG nor any other agency could directly influence schools' curricula, the CSA was developed as a tool for evaluating educational outcomes.

To develop a process that yielded a reliable and fair assessment of clinical skills, a number of preliminary studies were undertaken under the auspices of the ECFMG. Two of the larger-scale studies involved both US medical students in their third and fourth years and graduates of foreign medical schools at varying levels of certification. The first, conducted in Baltimore, Md, involved a consortium of 6 schools; the second, conducted in Philadelphia, Pa, involved 2 local schools.

The introduction of this assessment is a milestone in US medical evaluation in that it is the first time basic clinical skills, including interpersonal skills and spoken English proficiency, are being assessed in a high-stakes environment. To ensure that the test is administered in a standardized manner, the CSA is offered only at the ECFMG headquarters in Philadelphia. The CSA uses standardized patients (SPs), many of whom are actors, to portray cases to the candidates and to score the encounters. SPs are recruited not for the presence of physical findings but for their ability to portray cases and score accurately; each receives 15 to 20 hours of training on assessing interpersonal skills and English proficiency, plus 6 hours of training for each case he or she portrays.

The ECFMG now has had experience with 8383 candidates who presented themselves for assessment at its center from July 1, 1998, through January 31, 2000. While the test is administered throughout the year, results are standardized by analyzing and equating the mean scores received on each case-SP combination. Analysis shows that 96.9% of these candidates received a passing decision. (CSA reports only pass/fail decisions, not numerical scores.)

Evidence suggests this high pass rate reflects considerable self-selection on the part of the certification candidates. Describing their experience in questionnaires distributed after the examination, 80% of candidates reported making special preparations, including clinical observerships in the United States. Candidates also reported prescreening their spoken English proficiency by taking the Test of Spoken English, and only 3 candidates reported having scored below the level suggested in the CSA orientation manual as a prudent cutoff. By contrast, the trial runs that were used to set CSA examination standards involved test takers who had no investment in the results.

Successful candidates must meet or exceed standards for 2 separate components of the CSA: an Integrated Clinical Encounter (ICE) component assessing ability to gather data and compose a clinical note, and a Doctor-Patient Communications (COM) component based on spoken English proficiency and interpersonal skills. To date, 80% of those failing the CSA have done so on the COM side. Of the cohort described, 92% reported that SP portrayals in the test were realistic, and 90% felt that the SP format was appropriate.

The most widely heard complaint made by graduates of foreign medical schools regarding the CSA is that it is unfair for US medical graduates to be exempt from taking a similar national assessment examination. The National Board of Medical Examiners is currently piloting a Standardized Patient Examination for possible incorporation into the USMLE series, and, if adopted, this would be open to graduates of foreign schools in lieu of a separate CSA.

References
1. Sutnick AI, Stillman PL, Norcini JJ, et al. ECFMG assessment of clinical competence of graduates of foreign medical schools. JAMA. 1993;270:1041-1045.
2. Institute of Medicine. The Nation's Physician Workforce: Options for Balancing Supply and Requirements. Washington, DC: National Academy Press; 1996.