
A Statics Concept Inventory: Development and Psychometric Analysis

This paper is a first major report on the development of an assessment capability that we hope will ultimately be a powerful tool valued by instructors.  Its roots are in the recognition that assessment of important knowledge is a key to better learning, and that conceptual knowledge in particular is critical to the deep understanding that enables transfer of knowledge to new circumstances.  The most relevant antecedent is the work to develop the Force Concept Inventory (Halloun and Hestenes).  Just as the developers of the Force Concept Inventory chose their questions and answer selections based on extensive interviews with students, this test was grounded in a cognitive model of aspects of student learning.  While studies of conceptual understanding are more widespread in science education, engineering subjects, including Statics, have received relatively little attention.  One extensive study of errors and misconceptions in Statics led to a conceptual framework for the subject; that study forms important background to the Statics Concept Inventory.  This effort is also part of a recent stream of work to develop conceptual tests for other subjects in engineering, science, and mathematics.

The theoretical framework for this research views learning as resulting from the acquisition of knowledge, which consists of a large number of components.  Some components are abstract, conceptual knowledge, which can be applied in a variety of situations.  Although one needs metacognition to draw upon knowledge appropriately, evidence that individual components of conceptual knowledge are present is also of interest.

Choices of methodology had to be made with respect to the design of the test questions, the administration of tests, and the analysis of the results.  Questions were initially designed primarily by one developer based on the conceptual framework for Statics alluded to above.  Questions were then subjected to close scrutiny by a few very insightful colleagues at other institutions, with frequent feedback on alternative versions of questions.  Test questions also reflect input from instructors of students taking the test.  Revisions of the test over several years have been based on item analyses of results.

Administration of the test was originally done with pencil and paper in class.  For the past year, the same test has also been administered on the web (the same questions as the pencil-and-paper version, but offered in random order).  Moving the test outside the classroom reduces the burden on instructors, likely increasing participation.  In each case, the local instructor controls the timing, notifies the students of the test, and sets any ground rules.  Students are often given an incentive to take the test (in the form of homework credit), although their particular scores do not generally have any impact on their grades.  Students from small regional colleges to highly selective, top-ranked research universities have taken the test.  Since the period reported on in the paper, over 2500 additional students have taken the Statics Concept Inventory.

Results are analyzed with an eye both to drawing conclusions about students' conceptual understanding and to revising the test questions.  Analyses include looking at overall scores, at subsets of questions focusing on a single concept, and at individual questions.  Research questions we have sought to address include: Is the test valid (a term with many meanings) and reliable, and does it discriminate appropriately among students?  Is the test difficulty appropriate for a broad range of students?  In what senses are sub-scores on individual concepts meaningful?  Are aspects of test scores correlated with other measures of Statics performance?
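To make the item-analysis step concrete, the sketch below computes overall scores, a sub-score for one concept, and standard difficulty and discrimination indices for each question from a matrix of right/wrong responses.  The matrix size, the concept-to-item grouping, and all values are illustrative assumptions; this is not the analysis code used in the study.

```python
import numpy as np

# Hypothetical response matrix: rows are students, columns are items;
# entries are 1 (correct) or 0 (incorrect). Real data would come from
# actual test administrations; this random matrix is illustrative only.
rng = np.random.default_rng(0)
responses = (rng.random((200, 27)) > 0.5).astype(int)

# Overall score per student.
total_scores = responses.sum(axis=1)

# Sub-score on a single concept, assuming (hypothetically) that
# items 0, 9, and 18 all target that concept.
concept_items = [0, 9, 18]
concept_scores = responses[:, concept_items].sum(axis=1)

# Item analysis: difficulty (proportion correct) and discrimination,
# here the corrected item-total (point-biserial) correlation between
# each item and the total score over the remaining items.
for j in range(responses.shape[1]):
    rest = total_scores - responses[:, j]
    r = np.corrcoef(responses[:, j], rest)[0, 1]
    print(f"item {j:2d}: difficulty {responses[:, j].mean():.2f}, "
          f"discrimination {r:+.2f}")
```

In an analysis of this kind, items with near-zero or negative discrimination would be candidates for the sort of revision described above.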

Based on the results, we concluded that the test as currently constituted was reliable; concurrent validity was established through correlations with grades that are based on exams (more recent studies compare inventory scores directly with exam scores).  The prevalence of certain misconceptions was revealed.
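For reference, reliability of a dichotomously scored test is commonly estimated with the KR-20 coefficient (Cronbach's alpha specialized to 0/1 items), and concurrent validity by correlating inventory totals with an external measure such as exam scores.  The sketch below shows both computations on hypothetical data; none of the numbers come from the paper.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """KR-20 reliability coefficient for a 0/1 response matrix
    (rows = students, columns = items)."""
    k = responses.shape[1]
    p = responses.mean(axis=0)               # proportion correct per item
    item_var = (p * (1 - p)).sum()           # sum of item variances
    total_var = responses.sum(axis=1).var()  # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical data: correctness depends on a latent ability, so the
# items correlate and reliability is non-trivial.
rng = np.random.default_rng(1)
ability = rng.normal(size=300)
p_correct = 1.0 / (1.0 + np.exp(-ability[:, None]))
responses = (rng.random((300, 27)) < p_correct).astype(int)

totals = responses.sum(axis=1)
exam = 60 + 5 * ability + rng.normal(scale=5, size=300)  # hypothetical exam scores

print(f"KR-20 reliability: {kr20(responses):.2f}")
print(f"Correlation of inventory totals with exam scores: "
      f"{np.corrcoef(totals, exam)[0, 1]:.2f}")
```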

We envision the test questions evolving modestly in the future, with additional insights from instructors.  We also seek to develop improved means of analyzing and interpreting test results.  Future work on this project will be guided by the goal of providing valuable, actionable input to instructors and students to improve learning.

Recent work on the Statics Concept Inventory has been supported by the NSF ROLE Program.

Author 1: Paul Steif steif@cmu.edu

Author 2: John Dantzler crs@censeoresearch.com

Article Link: www.asee.org

