February 2009
There is frequently a question about the number of items needed to cover the content and ensure candidate separation reliability.  Our simple study revealed that the quality of the items is as important as, or more important than, the absolute number of items on the test in achieving satisfactory reliability.
Mary E. Lunz, Ph.D.

Test Length and Test Reliability for Multiple Choice Examinations

Some believe that longer multiple choice tests tend to be more reliable because more items automatically reduce the error of measurement.  Indeed, a sufficient number of items must be included to cover the content areas tested; however, other factors also contribute to how efficiently a test measures and separates candidate ability.

The indication of reliability is candidate separation reliability, calculated as (SD² - SE²) / SD² using Rasch logit measures, where SD is the standard deviation of the candidate ability measures and SE is their root mean square standard error.  This reliability index is appropriate because the goal of any certification examination is to distinguish between those candidates who are worthy of passing and those who are not.  The better the test items distinguish among candidate abilities, the less measurement error there is in the examination and the higher the candidate separation reliability, regardless of the number of items.  For example, deleting poorly performing items makes the test shorter, but also increases the accuracy of measurement because the items producing the most error are eliminated from the examination.
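The calculation above can be sketched in a few lines. This is a minimal illustration of the separation reliability formula, not code from the study; the candidate measures and standard errors below are hypothetical values invented for the example.

```python
def separation_reliability(measures, standard_errors):
    """Candidate separation reliability: (SD^2 - SE^2) / SD^2,
    where SD^2 is the variance of the ability measures and SE^2 is
    the mean squared standard error of those measures."""
    n = len(measures)
    mean = sum(measures) / n
    sd2 = sum((m - mean) ** 2 for m in measures) / n   # observed variance
    mse = sum(se ** 2 for se in standard_errors) / n   # mean error variance
    return (sd2 - mse) / sd2

# Hypothetical candidate ability measures (logits) and standard errors
measures = [-1.2, -0.5, 0.0, 0.4, 0.9, 1.6, 2.1]
errors = [0.30, 0.25, 0.24, 0.24, 0.25, 0.28, 0.35]
print(round(separation_reliability(measures, errors), 2))  # prints 0.93
```

Note that the index depends only on the spread of the measures relative to their error, not on the item count directly: shrinking the standard errors (better items) raises reliability just as surely as lengthening the test.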

In this simple study, we compared test length with test reliability. Six different exams were included. The table shows the tests in reliability order, along with the number of items in each test.  The number of items and reliability do not appear to be related: a test of 211 items had a reliability of .93, while another test of 233 items had a reliability of .78.  It appears that the quality of the items is as important as, or even more important than, the absolute number of items on the test. Of course, a reasonable number of items must be included to ensure accurate measurement and content coverage.
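The point can be checked mechanically: ranking exams by length need not reproduce their ranking by reliability. The sketch below uses only the two exams quoted above (the labels "A" and "B" are arbitrary placeholders, not the study's exam numbers).

```python
# (number of items, candidate separation reliability) for the two
# exams cited in the text; labels are placeholders for illustration.
exams = {"A": (211, 0.93), "B": (233, 0.78)}

longest = max(exams, key=lambda k: exams[k][0])        # most items
most_reliable = max(exams, key=lambda k: exams[k][1])  # highest reliability
print(longest, most_reliable)  # prints: B A
```

The longer exam is not the more reliable one, which is the study's central observation.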

   Table of Test Length and Reliability

   Exam      N of items    Reliability
   Exam 1
   Exam 2
   Exam 3
   Exam 4
   Exam 5
   Exam 6
Measurement Research Associates, Inc.
505 North Lake Shore Dr., Suite 1304
Chicago, IL  60611
Phone: (312) 822-9648     Fax: (312) 822-9650
