The National Board of Medical Examiners is developing a computerized patient simulation examination (CBX). During 1989 and 1990, major field tests have provided data with which to evaluate the CBX cases and scoring procedures.
CBX provides an uncued, unstructured, interactive patient management problem. The examinee types his treatment requests into the computer. The requested actions are matched against 3000 recognized actions. Scoring keys specify actions that a scoring committee felt were essential to do, or to avoid, in the assessment and management of the patient. Partial credit scoring is used to reward timeliness, sequencing and appropriateness. Each patient case is calibrated separately (currently by MSTEPS, soon to be done by FACETS). The missing data algorithms in these calibration programs enable the use of items that do not apply to all examinees. Examinee ability estimates obtained from each patient case are averaged to provide an examinee's final measure.
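The partial credit scoring and per-case averaging described above can be sketched as follows. This is a minimal illustration, not the NBME's implementation: the Rasch partial credit model form is standard, but the threshold values, case measures, and category meanings here are invented for demonstration.

```python
import math

def pcm_probabilities(theta, item_difficulty, thresholds):
    """Rasch partial credit model: probability of each score category
    (0..m) for an examinee of ability `theta` (logits) on one scored action.
    `thresholds` are the step difficulties relative to the item."""
    # Accumulate the log-odds of each successive category over category 0
    cum = [0.0]
    for tau in thresholds:
        cum.append(cum[-1] + (theta - item_difficulty - tau))
    exps = [math.exp(c) for c in cum]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical action scored 0/1/2 (e.g., not done / done late / done promptly)
probs = pcm_probabilities(theta=0.5, item_difficulty=0.0, thresholds=[-1.0, 1.0])

# Final examinee measure: the average of the per-case ability estimates
case_measures = [0.8, 0.3, 1.1, 0.6, 0.2, 0.9, 0.5, 0.7]  # eight cases, illustrative values
final_measure = sum(case_measures) / len(case_measures)
```

Calibrating each case separately, as the article describes, means each case yields its own ability estimate on a common logit metric before the estimates are averaged.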
Since Fall 1989, CBX examinations in Obstetrics/Gynecology and Surgery have been provided to interested medical schools along with their purchased paper-and-pencil subject examinations. Over 415 examinees have been tested in Ob/Gyn and over 300 in Surgery.
The reliability of the examination is estimated from the consistency of the eight case scores. With an average inter-case correlation of .35, the Ob/Gyn examination has a reliability of .80 and the Surgery examination .75.
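If each examination is treated as the sum of eight parallel case scores, the Spearman-Brown prophecy formula applied to the reported average inter-case correlation roughly reproduces the Ob/Gyn figure. The source does not state that this was the estimation method, so the sketch below is an assumption:

```python
def spearman_brown(avg_inter_case_r, n_cases):
    """Reliability of a composite of n parallel parts, given their
    average inter-part correlation (Spearman-Brown prophecy formula)."""
    return n_cases * avg_inter_case_r / (1 + (n_cases - 1) * avg_inter_case_r)

# Average inter-case correlation of .35 across eight cases
reliability = spearman_brown(0.35, 8)  # ~0.81, in line with the reported .80
```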
Rasch analysis is essential in validating CBX. During the key-validation process the scoring keys are reevaluated by a scoring committee. The committee works through the case as though it were an examinee, reviews a Rasch-generated map of the item difficulties, and then considers items that 1) misfit, 2) are unexpectedly difficult, or 3) show no scoring in middle categories. Often the source of misfit can be identified in the definition of the item, e.g., the key required one action to be requested after another, but the high scorers requested both at the same time. In addition, unanticipated actions are reviewed for possible inclusion in the key, usually as a risk to be avoided.
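The committee's three review criteria amount to a simple flagging pass over the calibrated key items. The sketch below shows one way such a pass might look; the item names, field names, and cutoff values are all hypothetical, not taken from the CBX system.

```python
# Hypothetical calibrated key items; infit is a Rasch mean-square fit
# statistic, difficulty is in logits, and mid_category_counts holds the
# number of examinees scored in each intermediate partial-credit category.
items = [
    {"name": "order CBC",         "infit": 0.9, "difficulty": -0.5, "mid_category_counts": [12, 7]},
    {"name": "delay transfusion", "infit": 1.8, "difficulty":  0.2, "mid_category_counts": [5, 3]},
    {"name": "consult surgery",   "infit": 1.0, "difficulty":  2.9, "mid_category_counts": [0, 0]},
]

def flag_for_review(item, misfit_cutoff=1.5, difficulty_cutoff=2.0):
    """Return the committee's review reasons that apply to one item."""
    reasons = []
    if item["infit"] > misfit_cutoff:
        reasons.append("misfit")
    if item["difficulty"] > difficulty_cutoff:
        reasons.append("unexpectedly difficult")
    if sum(item["mid_category_counts"]) == 0:
        reasons.append("no scoring in middle categories")
    return reasons

flagged = {it["name"]: flag_for_review(it) for it in items if flag_for_review(it)}
```

An item flagged for any of the three reasons would go to the committee for redefinition or removal from the key, as the article describes.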
Scoring Computerized Patient Simulations. E. Julian. Rasch Measurement Transactions, 1990, 4:3, p. 116