Validating the Diagnostic Capability of the Concept Assessment Tool for Statics (CATS) with Student Think-Alouds

Conference

2013 ASEE Annual Conference & Exposition

Location

Atlanta, Georgia

Publication Date

June 23, 2013

Start Date

June 23, 2013

End Date

June 26, 2013

ISSN

2153-5965

Conference Session

Misconceptions

Tagged Division

Educational Research and Methods

Page Count

13

Page Numbers

23.1352.1 - 23.1352.13

DOI

10.18260/1-2--22737

Permanent URL

https://strategy.asee.org/22737

Paper Authors

Dana Denick Purdue University, West Lafayette

Dana Denick is a PhD student in the School of Engineering Education at Purdue University. Dana holds a BS in Mechanical Engineering from Bucknell University, MA in Physics Education from the University of Virginia and MS in Library and Information Science from Drexel University. Her research interests include conceptual understanding of engineering science and information literacy for engineering.

Aidsa I. Santiago-Román University of Puerto Rico, Mayaguez Campus

Aidsa I. Santiago-Román is a tenured Assistant Professor in the General Engineering Department at the University of Puerto Rico, Mayaguez Campus (UPRM). Dr. Santiago earned a BA and an MS in Industrial Engineering from UPRM and a Ph.D. in Engineering Education from Purdue University. Before attending Purdue University, she had been an engineering instructor for about 10 years. Her primary research interests are investigating students' understanding of difficult concepts in engineering science, especially for underrepresented populations, and she also works on the implementation of best practices at UPRM.

James W. Pellegrino University of Illinois, Chicago

Ruth A. Streveler Purdue University, West Lafayette

Ruth A. Streveler is an Associate Professor in the School of Engineering Education at Purdue University. Dr. Streveler has been the Principal Investigator or co-Principal Investigator of ten grants funded by the US National Science Foundation. She has published articles in the Journal of Engineering Education and the International Journal of Engineering Education and has contributed to the Cambridge Handbook of Engineering Education Research. She has presented workshops to over 500 engineering faculty on four continents. Dr. Streveler's primary research interests are investigating students' understanding of difficult concepts in engineering science and helping engineering faculty conduct rigorous research in engineering education.

Louis V. DiBello University of Illinois at Chicago

Abstract

Engineering concept inventories have the potential to be used as diagnostic instruments that can provide instructors with information about student understanding of key concepts, which in turn can be used to guide classroom instruction and improve student learning. Doing so requires evidence of the validity of assumptions regarding the concepts and errors assessed by individual items, together with techniques for extracting diagnostic information.

In developing CATS, Steif [1] drew upon his own and others' previous research regarding key concepts and common misconceptions that students demonstrate in reasoning about statics problems. Both essential conceptual knowledge and common errors have been mapped to individual CATS items based on experts' predictions of how students may reason about a given problem situation. Prior multivariate psychometric analyses of inventory and item performance have supported these mappings, including work that proposed a matrix of cognitive attributes applicable to each CATS item [2].

This paper describes the results from a verbal protocol study eliciting students' reasoning about key concepts ostensibly required to solve 14 CATS items, with the goal of amassing evidence to extend the instructional value of CATS. The research questions guiding this study were:

• How is students' thinking about key concepts and skills in statics represented in verbal descriptions of their reasoning while solving CATS items?

• How does students' thinking align with the presumed set of skills and errors underlying the design and proposed interpretation of CATS items?

This study was guided by the related work of Minstrell [3-4] and his colleagues on facets of understanding, in which the diagnosis of students' conceptual understanding may help instructors design more targeted and meaningful instruction. Qualitative research methods, specifically verbal think-aloud protocols, were identified to further validate proposed models of cognitive skills and student errors. In the present study, students were prompted to explain their line of reasoning for selected CATS items and to describe why alternate responses were not selected. The previously identified skills and errors were then used as part of an analytic coding scheme that allowed for the emergence of additional concepts and errors beyond those originally posited for each item.

Based on student responses, it appears that the expert-generated model of knowledge and skills may be sufficient overall, although individual skills may align with specific CATS items differently than expected. Also, some evidence indicates that the common errors associated with CATS should include two additional errors related to misconceptions of moment.

The findings of this study promise several broader impacts. First, they provide evidence of student thinking as a means of validating the diagnostic capability of CATS. Second, the information provided by these studies will inform and enhance the interpretation of student performance on CATS. Third, some of the findings may indicate aspects of the current CATS that may be considered for modification, including instances of CATS questions, multiple-choice options, concept descriptions and mappings, and common student error descriptions and mappings. Finally, the identification of trends in how students conceptualize statics problems, as provided in this study, may prove generally useful to inform statics instruction.

References

1. Steif, P.S. and J.A. Dantzler, A statics concept inventory: Development and psychometric analysis. Journal of Engineering Education, 2005. 94(4): p. 363-371.

2. Santiago-Román, A.I., et al., The development of a Q-matrix for the Concept Assessment Tool for Statics, in 2010 ASEE Annual Conference and Exposition. 2010. Louisville, KY: American Society for Engineering Education.

3. Minstrell, J., Student thinking and related assessment: Creating a facet-based learning environment, in Grading the Nation's Report Card: Research from the Evaluation of NAEP. 2000. p. 44-73.

4. Minstrell, J. and E. van Zee, Using questioning to assess and foster student thinking, in Everyday Assessment in the Science Classroom, J.M. Atkin and J.E. Coffey, Editors. 2003, National Science Teachers Association Press: Arlington, VA. p. 61-73.

Denick, D., & Santiago-Román, A. I., & Pellegrino, J. W., & Streveler, R. A., & DiBello, L. V. (2013, June), Validating the Diagnostic Capability of the Concept Assessment Tool for Statics (CATS) with Student Think-Alouds Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. 10.18260/1-2--22737

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015