Intrasubtest Scatter in Neuropsychology

Intrasubtest scatter (ISS) is a tendency toward unusual patterns of response to the items within a subtest (Wechsler, 1958). In Rasch terms, it manifests as misfit to the Rasch model. The items on the Wechsler scales are ordered from easiest to most difficult using data from the standardization sample, so an inconsistent pattern consists of failing some of the easier items while passing harder ones, or of isolated failures within long runs of correct responses.
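
To make the idea concrete, the sketch below uses the standard dichotomous Rasch model to compare two response strings with the same raw score: one that follows the item ordering and one that shows scatter. The item difficulties, the ability value, and the response strings are invented purely for illustration.

    import numpy as np

    # Illustrative item difficulties (logits) for a subtest ordered from
    # easiest to hardest, and a hypothetical examinee ability.
    difficulties = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5])
    ability = 0.5

    # Dichotomous Rasch model: probability of a correct response.
    p = np.exp(ability - difficulties) / (1 + np.exp(ability - difficulties))

    # Two response strings with the same raw score (5 correct):
    consistent = np.array([1, 1, 1, 1, 1, 0, 0, 0])  # passes easy, fails hard
    scattered  = np.array([0, 1, 0, 1, 1, 1, 0, 1])  # fails easy, passes hard

    # Standardized residuals; large squared residuals flag surprising responses.
    for label, x in [("consistent", consistent), ("scattered", scattered)]:
        z = (x - p) / np.sqrt(p * (1 - p))
        print(label, "mean squared residual:", round(float(np.mean(z ** 2)), 2))

Run as written, the scattered string yields a mean squared residual several times larger than the consistent one, even though both earn the same subtest raw score.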

Neuropsychological assessment and diagnosis often involve detecting and measuring off-task or unusual responses in individuals suspected of cognitive dysfunction. An unusual item response pattern requiring neuropsychological interpretation occurs when an individual of high ability shows a tendency to fail easy items. Such a response pattern may indicate cognitive inefficiency, difficulty with recall of specific information, or variable levels of arousal or attention. Almost all neurodiagnostic interpretation of ISS has been conducted on adult populations using the Wechsler Adult Intelligence Scale (WAIS) or its revision (WAIS-R). In a review of their own and previous research, Mittenberg et al. (1989) concluded that scatter is associated with diffuse rather than focal neurological damage, which results in either random loss of stored information or variable levels of arousal or attention. Surprisingly, the diagnostic utility of ISS has received comparatively little investigation, despite frequently repeated recommendations that item scatter be interpreted as a qualitative indicator of cognitive dysfunction.

Our own study investigated the clinical utility of ISS in children experiencing attentional and information-processing difficulties. The WISC-R item responses of 100 children who had received cranial irradiation treatment (with its risk of brain damage) for acute lymphoblastic leukaemia (ALL) were compared with those of 100 healthy children. The degree to which subjects in each group responded to items as predicted from an estimate of their ability was analyzed with the Rasch partial-credit program Quest (Adams & Khoo, 1993). The person Infit mean-square was the critical indicator.
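
For readers unfamiliar with the statistic, a minimal sketch of the conventional person Infit mean-square for dichotomous items follows: the information-weighted ratio of squared residuals to model variances. Quest computes this statistic (and its partial-credit generalization) internally; the function name and example values here are illustrative only.

    import numpy as np

    def person_infit_mnsq(responses, probs):
        """Information-weighted (Infit) mean-square fit for one person.

        Sum of squared score residuals divided by the sum of model
        variances. Values near 1.0 match Rasch-model expectations; values
        well above 1.0 signal unexpected failures and successes, i.e. scatter.
        """
        responses = np.asarray(responses, dtype=float)
        probs = np.asarray(probs, dtype=float)
        variance = probs * (1 - probs)          # binomial variance per item
        return ((responses - probs) ** 2).sum() / variance.sum()

    # Expected success probabilities for one examinee on eight ordered items:
    probs = [0.92, 0.88, 0.82, 0.73, 0.62, 0.50, 0.38, 0.27]
    print(person_infit_mnsq([1, 1, 1, 1, 1, 0, 0, 0], probs))  # about 0.5: fits
    print(person_infit_mnsq([0, 1, 0, 1, 1, 1, 0, 1], probs))  # about 1.9: scatter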

Since brain damage affects performance differentially, variation in subject ability across the eight Wechsler subtests correctly classified 60% of the ALL cases and 70% of the controls. ISS person misfit statistics derived from the same eight subtests correctly classified 61% of the ALL patients and 69% of the controls. Combining ability variation across subtests with misfit within subtests correctly classified 71% of the ALL cases and 76% of the controls. That the scatter misfit statistics were as successful as subtest ability variation in identifying group membership supports the conclusion that ISS is diagnostically useful for cases with relatively normal intellectual profiles.
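
The article does not detail the classification procedure, so the sketch below simply shows how a two-feature classification of this kind could be set up, using the spread of subtest ability estimates and the mean Infit mean-square per child as predictors. The discriminant method, the simulated feature values, and the group separations are assumptions made for illustration, not the study's data or results.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    n = 100  # children per group, as in the study design

    # Simulated per-child summary features (invented numbers):
    #   column 0: SD of ability estimates across the eight WISC-R subtests
    #   column 1: mean person Infit mean-square across the eight subtests
    controls = np.column_stack([rng.normal(0.6, 0.15, n), rng.normal(1.0, 0.15, n)])
    patients = np.column_stack([rng.normal(0.8, 0.20, n), rng.normal(1.2, 0.20, n)])

    X = np.vstack([controls, patients])
    y = np.array([0] * n + [1] * n)  # 0 = control, 1 = ALL patient

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("in-sample classification rate:", lda.score(X, y))

Either feature on its own, or the two together, can be passed to the classifier in the same way, mirroring the three comparisons reported above.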

With the widespread and increasing use of large-scale testing programs for school-age students, and with computerized scoring of these test protocols, item scatter scores (misfit statistics) can become a screening measure for students with unrecognized cognitive dysfunction. Our research indicates that unusual response patterns on reading, spelling and arithmetic tests will prove to be useful pathognomonic signs. A useful next step would be to investigate item response variability on the Wide Range Achievement Test (WRAT; Jastak & Jastak, 1992, RMT 8:4 403-404) as a pathognomonic sign.


Adams, R.J., & Khoo, S. (1993). Quest: The Interactive Test Analysis System. Melbourne: ACER.

Jastak, J., & Jastak, S. (1984). The Wide Range Achievement Test. Wilmington, DE: Jastak.

Mittenberg, W., Thompson, G.B., Schwartz, J.A., Ryan, J.J., & Levit, R. (1991). Intellectual loss in Alzheimer's dementia and WAIS-R intrasubtest scatter. Journal of Clinical Psychology, 19, 420-423.

Wechsler, D. (1958). Measurement and Appraisal of Adult Intelligence (4th ed.). Baltimore: Williams and Wilkins.


Intrasubtest scatter in paediatric neuropsychology. Godber T, Anderson V. … Rasch Measurement Transactions, 1996, 9:4 p.469


