We are pleased to announce the long-awaited release of ConQuest Version 2.0.
ConQuest is a computer program for fitting item response (Rasch) and latent regression models. It provides a comprehensive and flexible range of item response models that allow you to examine the properties of performance assessments, traditional assessments, and rating scales. ConQuest also offers up-to-date psychometric methods, including multifaceted item response models, multidimensional item response models, latent regression models, and the drawing of plausible values.
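For readers new to this family of models, the simplest member, the dichotomous Rasch model, gives the probability that person n with ability θ_n succeeds on item i with difficulty δ_i as

\[
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)} .
\]

The multifaceted, multidimensional, and latent regression models mentioned above extend this basic form; the notation here is the conventional textbook one and is not specific to ConQuest's own parameterization.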
ConQuest is available with both a graphical user interface (GUI) and a simple command-line, or console, interface. The GUI version is available for all Windows platforms. ConQuest comes with a comprehensive manual of more than 200 pages that includes tutorials on the various types of Rasch analysis it supports.
ConQuest version 2.0 incorporates many enhancements to the 1998 version 1.0. These include plots of item characteristic curves, user-defined fit statistics, estimation of population characteristics such as percentages above a cut-point on a scale, and a more user-friendly interface.
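To illustrate what an item characteristic curve plot displays, here is a minimal sketch in Python. It is not ConQuest code or output; it simply traces the dichotomous Rasch model probability across the ability scale for a few items with made-up difficulties.

import numpy as np
import matplotlib.pyplot as plt

# Dichotomous Rasch model: probability of a correct response given
# person ability (theta) and item difficulty (b), both in logits.
def rasch_probability(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta = np.linspace(-4, 4, 200)      # ability range in logits
for b in (-1.0, 0.0, 1.5):           # illustrative (made-up) item difficulties
    plt.plot(theta, rasch_probability(theta, b), label=f"item difficulty {b}")

plt.xlabel("Ability (logits)")
plt.ylabel("Probability of a correct response")
plt.title("Item characteristic curves under the dichotomous Rasch model")
plt.legend()
plt.show()

Population summaries such as the percentage of persons above a cut-point are, analogously, obtained by evaluating the estimated ability distribution above a chosen point on this same logit scale.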
A 30-day trial copy of ConQuest and the Version 2.0 PDF manual can be downloaded from http://www.assess.com/.
David J. Weiss, President
Assessment Systems Corporation
Journal of Applied Measurement
Volume 8, Number 4. Winter 2007
Nonequivalent Survey Consolidation: An Example From Functional Caregiving. Nikolaus Bezruczko and Shu-Pi C. Chen
Mindfulness Practice: A Rasch Variable Construct Innovation. Sharon G. Solloway and William P. Fisher, Jr.
Substance Use Disorder Symptoms: Evidence of Differential Item Functioning by Age. Kendon J. Conrad, Michael L. Dennis, Nikolaus Bezruczko, Rodney R. Funk, and Barth B. Riley
A Monte Carlo Study of the Impact of Missing Data and Differential Item Functioning on Theta Estimates from Two Polytomous Rasch Family Models. Carolyn F. Furlow, Rachel T. Fouladi, Phill Gagné, and Tiffany A. Whittaker
Investigation of 360-Degree Instrumentation Effects: Application of the Rasch Rating Scale Model. John T. Kulas and Kelly M. Hannum
Rasch Measurement of Self-Regulated Learning in an Information and Communication Technology (ICT)-rich Environment. Joseph N. Njiru and Russell F. Waugh
Understanding Rasch Measurement: The Saltus Model Applied to Proportional Reasoning Data. Karen Draney
Richard M. Smith, Editor
JAM web site: www.jampress.org
JAM library recommendation form
Journal of Applied Measurement
Volume 9, Number 1. Spring 2008
Strategies for Controlling Item Exposure in Computerized Adaptive Testing with the Partial Credit Model. Laurie Laughlin Davis and Barbara G. Dodd
A Multidimensional Rasch Analysis of Gender Differences in PISA Mathematics. Ou Lydia Liu, Mark Wilson, and Insu Paek
An Exploration of Correctional Staff Members' Views of Inmate Amenities: A Scaling Approach. Elizabeth Ehrhardt Mustaine, George E. Higgins, and Richard Tewksbury
Measuring Job Satisfaction in the Social Services Sector with the Rasch Model. Eugenio Brentari and Silvia Golia
Comparing Screening Approaches to Investigate Stability of Common Items in Rasch Equating. Alvaro J. Arce-Ferrer
Estimation of the Accessibility of Items and the Confidence of Candidates: A Rasch-Based Approach. A. A. Korabinski, M. A. Youngson, and M. McAlpine
Binary Items and Beyond: A Simulation of Computer Adaptive Testing Using the Rasch Partial Credit Model. Rense Lange
Richard M. Smith, Editor
JAM web site: www.jampress.org
JAM library recommendation form
Mathematics Education Research Journal
Volume 18, Number 2, 2006
Research in Mathematics Education and Rasch Measurement. Rosemary Callingham and Trevor Bond
A Case of the Inapplicability of the Rasch Model: Mapping Conceptual Learning. Kaye Stacey and Vicki Steinle
A Longitudinal Study of Student Understanding of Chance and Data. Jane Watson, Ben Kelly and John Izard
Applying the Rasch Rating Scale Model to Gain Insights into Students' Conceptualization of Quality Mathematics Instruction. Kelly Bradley, Shannon Sampson and Kenneth Royal
Easier Analysis and Better Reporting: Modeling Ordinal Data in Mathematics Education Research. Brian Doig and Susie Groves
Modeling Mathematics Problem Solving Item Responses Using a Multidimensional IRT Model. Margaret Wu and Raymond Adams
Surveying Primary Teachers about Compulsory Numeracy Testing: Combining Factor Analysis with Rasch Analysis. Peter Grimbeek and Steven Nisbet
Free download from:
http://www.merga.net.au/node/41?volume=18&number=2
Figure from Wu and Adams (2006).
Figure. Comparison of dependence in marine communities. From Cinner, J., Sutton, S., and Bond, T. (2007). Socioeconomic thresholds that affect use of customary fisheries management tools. Conservation Biology, 21(6), 1603-1611.
Book: Assessing and Modeling Cognitive Development in School:
Intellectual Growth and Standard Setting
JAM Press, http://www.jampress.org/ (click on: JAM Press Books), is pleased to announce this new book, which presents a series of papers examining cognitive modeling in assessment, with a particular emphasis on standard setting. The papers present the most up-to-date information on modeling student learning using multivariate IRT models, progress variable mapping, value-based approaches, content trajectories, on-line tutoring records, and vertically articulated performance standards.
The No Child Left Behind (NCLB) legislation has encouraged a keen interest in standard setting. At the same time, there has been a steady increase in the use of cognitive models to understand student performance; these models characterize the patterns of problem solving that a student uses to answer the test items he or she faces in an assessment. This book combines the two interests, giving the reader an overview of the current literature as well as the issues that remain unresolved. It frames the standard-setting problem as one of characterizing the expert student's problem-solving strategies and differentiating them from those of the inexpert student. The result is a view of standard setting and student progress that looks very different from the one traditionally used in psychometrics.
This book is based on the well-received conference of the same name held on the University of Maryland campus on October 19 and 20, 2006.
The titles and authors of the eleven chapters are as follows:
1. A Prospective, Progressive, and Predictive Approach to Standard Setting
Isaac I. Bejar, Henry I. Braun, and Richard J. Tannenbaum, Educational Testing Service
2. Vertically Articulated Performance Standards: An Exploratory Study of Inferences about Achievement and Growth
Steve Ferrara, Gary W. Phillips, Paul L. Williams, Steven Leinwand, Shannon Mahoney, and Stephan Ahad, American Institutes for Research
3. Using On-line Tutoring Records to Predict End-of-Year Exam Scores: Experience with the ASSISTments Project and MCAS 8th Grade Mathematics
Brian W. Junker, Carnegie Mellon University
4. Non-Linear Unidimensional Scale Trajectories through Multidimensional Content Spaces: A Critical Examination of the Common Psychometric Claims of Unidimensionality, Linearity, and Interval-Level Measurement
Joseph A. Martineau, Michigan Department of Education; Dipendra Raj Subedi, Michigan State University; Kyle H. Ward, Michigan Department of Education; Tianli Li, Yang Lu, Qi Diao, Feng-Hsien Pang, Samuel Drake, Tian Song, Shu-Chuan Kao, Yan Zheng, and Xin Li, Michigan State University
5. Item Response Theory and Longitudinal Modeling: The Real World is Less Complicated than We Fear
Marty McCall and Carl Hauser, Northwest Evaluation Association
6. A Culture of Remembering: Contexts of Mathematical Development and their Implications for Assessment and Standard-Setting
Christopher A. Correa and Kevin F. Miller, University of Michigan
7. Estimating Gain in Achievement when Content Specifications Change: A Multidimensional Item Response Theory Approach
Mark D. Reckase, Michigan State University, and Tianli Li, ACT, Inc.
8. Implementing Cognition-Based Learning Goals in Classrooms: The State Role
Mark Moody, Hillcrest and Main, Inc., William D. Schafer, University of Maryland, and Lani Seikaly, Hillcrest and Main, Inc.
9. A Value-Based Approach for Quantifying Student's Scientific Problem Solving Efficiency and Effectiveness Within and Across Educational Systems
Ron Stevens, IMMEX Project, UCLA
10. Once You Know What They've Learned, What Do You Do Next? Designing Curriculum and Assessment for Growth
Dylan Wiliam, Institute of Education, University of London
11. Using Progress Variables to Map Intellectual Development
Cathleen A. Kennedy and Mark Wilson, University of California, Berkeley
Notes & Quotes. Rasch Measurement Transactions, 2008, 21:3