Comments on
Mark Wilson's 2012 Psychometric Society Presidential Address

"Seeking a Balance Between the Statistical and Scientific Elements in Psychometrics", July 2012, Lincoln, NE

In his recent Presidential Address to the International Meeting of the Psychometric Society, Mark Wilson contrasted statistical and scientific themes in psychometrics in terms of the history of his own work, with the larger goal of identifying scientific aspects of psychometrics that would distinguish it from statistical modeling. A paper based on this presentation is forthcoming in Psychometrika in early 2013.

Early in his career, Mark developed the Saltus model of discontinuous development, which was and remains a highly innovative and effective guide to measuring cognitive growth. The work was done in isolation, however; the model is also fairly complex, and its development was not informed by input from anyone engaged with the substantive practicalities of research in cognitive development. Thus, as Mark pointed out, his publications in this area are rarely cited and the model has had little or no impact.
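For readers unfamiliar with the model, a sketch of its general form, as it is commonly written (the notation here is illustrative, not necessarily Wilson's own), may suggest where the complexity lies. For a person n at developmental stage h responding to an item i belonging to item class k,

\[
P(X_{ni} = 1 \mid \theta_n, h) = \frac{\exp(\theta_n - \delta_i + \tau_{hk})}{1 + \exp(\theta_n - \delta_i + \tau_{hk})},
\]

where \theta_n is the person measure, \delta_i the item difficulty, and \tau_{hk} the saltus parameter expressing the added advantage (or disadvantage) that persons at stage h have on items of class k. Because stage membership is itself latent, the observed responses are modeled as a mixture over the stages, and it is these extra stage-by-class parameters together with the latent grouping that make the model considerably more demanding than an ordinary Rasch analysis.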

That situation is quite different from the more recent work Mark has been doing, in which engagement with substantive content experts is an essential ingredient. Now, the models and construct theories implicitly or explicitly included in curricular outcomes assessments are articulated, developed, and applied in collaboration with experts in the substantive area. As Mark pointed out, no dumbing down of model complexity is necessary in this context, as many projects naturally entail multiple constructs that are manifest at multiple levels of organization and/or with multiple facets, and that are assessed via many different types of items or performance rating schemes, some of which may involve testlets or item bundles and their local dependencies. Mark offered one of these new collaborations as an example. A middle and high school statistics and data modeling curriculum developed at Vanderbilt University involves a number of separate but interdependent strands. To understand just what was intended for the assessment and to formulate a plan adequate to the needs of both instruction and accountability:

A. detailed theoretical maps of each construct's levels and sublevels were laid out;

B. items were designed to express each level of each construct, with an eye to the interrelations between those constructs;

C. the scoring of the items, distractors and mistaken responses was set up to inform individualized instructional applications;

D. the measurement model appropriate to the overall assessment system was then applied to a pilot data set (a minimal sketch of this step follows the list); and

E. the new information on system performance was then used to revise the construct map(s), the item design, the outcome space, and the model for ongoing applications.

Though Wilson did not mention it, readers of his 2005 Constructing Measures text will recognize here the four phases of that book's systematic assessment methodology. And completing a round or two through the process certainly sets the stage for iterating through it once again, with the intention of taking the construct sublevels to a new level of specificity capable of affording predictive control over item design and scoring, as is suggested in his 2004 book with De Boeck on explanatory models.
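As an illustration of step D alone, here is a minimal, hypothetical sketch of fitting a basic dichotomous Rasch model to a small pilot data set by joint maximum likelihood, written in Python with numpy. The function and the toy data are invented for the example; the Vanderbilt assessment itself, with its multiple constructs, polytomous rubrics, and item bundles, calls for much richer models and dedicated software.

import numpy as np

def fit_rasch_jmle(X, n_iter=100, tol=1e-6):
    """Joint maximum likelihood estimation for the dichotomous Rasch model.

    X : (persons x items) array of 0/1 responses with no missing data and no
        person or item at a perfect or zero score (JMLE cannot estimate those).
    Returns person measures and item difficulties in logits, with the
    difficulties centered at zero to fix the scale.
    """
    X = np.asarray(X, dtype=float)
    theta = np.zeros(X.shape[0])   # person measures
    delta = np.zeros(X.shape[1])   # item difficulties

    for _ in range(n_iter):
        # Model probability of success for every person-item pair
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
        info = p * (1.0 - p)       # Fisher information per response

        # Newton-Raphson steps, damped to one logit per iteration for stability
        theta_step = np.clip((X - p).sum(axis=1) / info.sum(axis=1), -1.0, 1.0)
        delta_step = np.clip((p - X).sum(axis=0) / info.sum(axis=0), -1.0, 1.0)

        theta = theta + theta_step
        delta = delta + delta_step
        delta -= delta.mean()      # anchor the scale at mean item difficulty

        if max(np.abs(theta_step).max(), np.abs(delta_step).max()) < tol:
            break
    return theta, delta

# Toy "pilot" data: rows are students, columns are dichotomously scored items
pilot = np.array([
    [1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 0, 0, 0, 0],
])
theta, delta = fit_rasch_jmle(pilot)
print("Person measures (logits):  ", np.round(theta, 2))
print("Item difficulties (logits):", np.round(delta, 2))

The damped Newton updates and the centering of difficulties are there only to keep the sketch self-contained; operational work would rely on established Rasch software and the multidimensional and explanatory extensions noted above, though the underlying logic of estimate, examine, and revise is the same.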

In conclusion, Wilson raised again the question of just how psychometrics is to be more than a specialized branch of statistics if it does not capitalize on the practical opportunities for measurement it has created for itself in education, psychology, and other fields. There seems to be great potential for integrating qualitative substantive theory and practice with quantitative methods and modeling. Perhaps a recognizable new paradigm is now in the process of forming.

William P. Fisher, Jr.


Mark Wilson's Psychometric Society Presidential Address. William P. Fisher, Jr. … Rasch Measurement Transactions, 2012, 26:2 p. 1364



