Greetings,
The computer is a fascinating and all-powerful machine that can be programmed to do just about anything. For certification, the question is how the computer can be programmed to make the examination more valid and reliable.
Mary E. Lunz, Ph.D., Executive Director
|
Alternative Item Formats For Computer Based Testing
|
For certification, we commonly think of multiple choice items because they are generally accepted, easy to administer, familiar from the development perspective, easy to score, and convenient for testing large samples of candidates. This article explores some alternatives to standard four- or five-option multiple choice items that can be delivered with computer based testing. The assets and liabilities of each format are noted. All of these formats are available at Pearson VUE.
Video clips can be used to show organ function or patient interaction; they have both action and sound. It is easy to incorporate video clips into a standard multiple choice exam. They must meet a standard video format (MPEG-1). Sound, if any, is heard through earphones.
Medical animation requires the development of graphic images associated with the questions or the actions/manipulations being tested. The graphics can be programmed so that the candidate controls the actions and behavior of the characters. Developing the graphics can be expensive and requires a great deal of work by the subject matter experts. The candidate may also have to learn how to operate the controls that make the graphic images perform the tasks being tested.
Short answer responses allow candidates to offer their thoughts and ideas on a specific topic or question. The candidate explains the rationale for his/her judgments, selected procedures, or problem solving strategies. The difficulty with short answer responses is that they must be graded by a content expert unless a long, complex, and expensive scoring algorithm is constructed and validated. No such scoring algorithm has been developed to date.
Extended multiple choice items allow the inclusion of many possible responses (e.g., 10-20), based on the theory that candidates require more knowledge, skill, and judgment to sort through all of the alternatives. Often candidates are allowed to choose all appropriate responses for this type of item. In practice, however, we have often found that four or five responses are frequently selected while the others are not selected by any candidate.
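Scoring a choose-all-that-apply item raises a design choice the article does not settle: all-or-nothing credit versus partial credit. As a hedged sketch (neither rule is prescribed by the article, and the option numbering is hypothetical), both can be written in a few lines:

```python
# Two standard ways to score a choose-all-that-apply (extended multiple
# choice) item. Options are numbered 0..n_options-1; the item content and
# answer key below are hypothetical.

def score_exact(selected, key):
    """All-or-nothing: 1 only if the selected set matches the key exactly."""
    return int(set(selected) == set(key))

def score_partial(selected, key, n_options):
    """Partial credit: fraction of options classified correctly
    (selected when keyed, or left unselected when not keyed)."""
    selected, key = set(selected), set(key)
    correct = sum(1 for opt in range(n_options)
                  if (opt in selected) == (opt in key))
    return correct / n_options

# Hypothetical 10-option item with key {2, 5, 7}:
print(score_exact([2, 5, 7], [7, 5, 2]))        # 1 (order does not matter)
print(score_partial([2, 5], [2, 5, 7], 10))     # 0.9 (only option 7 missed)
```

The choice between the two rules affects reliability: partial credit extracts more information per item, while all-or-nothing scoring is stricter about the judgment the item is meant to test.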
Hot spot technology allows candidates to respond by clicking on points in an image. The mouse cursor becomes a crosshair when it is over a hot-spot image, and a marker is placed at the center of the crosshair when the candidate clicks the mouse button. A simple example is asking the candidate to click on the capital of Texas on a map showing many Texas cities; this parallels clicking on the section of an x-ray that identifies the patient's problem.
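Behind a hot-spot item, scoring reduces to testing whether the candidate's click lands inside a region the item author marked as correct. A minimal sketch, assuming rectangular scoring regions and hypothetical pixel coordinates for the Texas-map example:

```python
# Minimal hot-spot scoring sketch: the item author defines one or more
# scoring regions on the image; the candidate's click coordinates are
# checked against them. Region names and coordinates are hypothetical.

def in_region(click, region):
    """Return True if the (x, y) click falls inside a rectangular region
    given as (left, top, right, bottom) in image pixels."""
    x, y = click
    left, top, right, bottom = region
    return left <= x <= right and top <= y <= bottom

def score_hotspot(click, correct_regions):
    """Score 1 if the click lands in any correct region, else 0."""
    return int(any(in_region(click, r) for r in correct_regions))

# Hypothetical item: "Click on the capital of Texas" on a state map image.
austin_region = (410, 520, 470, 560)   # pixel bounds drawn around Austin
print(score_hotspot((432, 541), [austin_region]))  # 1: inside the region
print(score_hotspot((100, 100), [austin_region]))  # 0: clicked elsewhere
```

The same logic generalizes to the x-ray case: the author draws a region around the pathology, and any click inside it is credited.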
Zoom technology uses the pearsonvue:scalefactors to allow a candidate to zoom into and out of content within an image. For example, an image is displayed and the candidate scrutinizes it, zooming (or scaling) the image between a predefined set of zoom levels. This would be appropriate for microscopic slides. There is, however, the danger that the image may become pixelated when zoomed to a very high scaling factor.
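The "predefined set of zoom levels" amounts to stepping through an ordered list of scale factors and clamping at both ends. A hedged sketch (the level values here are hypothetical, not Pearson VUE's actual scale factors; capping the top level is one way to limit pixelation):

```python
# Sketch of zooming between a predefined set of scale factors, clamped so
# the candidate cannot zoom past either end of the list. The levels below
# are hypothetical; the real scale factors are set per item.

ZOOM_LEVELS = [0.5, 1.0, 2.0, 4.0]  # hypothetical predefined scale factors

def zoom(current, direction):
    """Move one step up (+1) or down (-1) the list of allowed zoom levels,
    staying within the list's bounds."""
    i = ZOOM_LEVELS.index(current)
    i = max(0, min(len(ZOOM_LEVELS) - 1, i + direction))
    return ZOOM_LEVELS[i]

print(zoom(1.0, +1))  # 2.0
print(zoom(4.0, +1))  # 4.0: already at maximum, so it stays put
print(zoom(0.5, -1))  # 0.5: already at minimum
```

Choosing the top level so the source image still has adequate resolution at that scale is what keeps a microscope slide readable rather than pixelated.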
Finally, almost any type of visual can be used in exams. More creative use of visuals could enhance the knowledge and skills tested.
|
What can be done with computer programming is limited only by our imaginations and budgets. The use of alternative item formats raises potential issues of scoring, expert commitment, and programming cost. The question is "what is gained vs. what is lost?" While the enhanced computer administration formats may increase validity, what might be lost with regard to reliability or accuracy of candidate measurement? Is the financial investment worth the long-term convenience of less travel and "away" time for both candidates and examiners? For example, if examiners score short answer responses at home instead of administering an oral examination at a specified place and time, will there be an impact on candidate outcomes when examiner colleagues are no longer available for professional and social interaction?
|