
Evaluation of Humans and Software for Grading in an Engineering 3D CAD Course


Conference: 2019 ASEE Annual Conference & Exposition

Location: Tampa, Florida

Publication Date: June 15, 2019

Start Date: June 15, 2019

End Date: June 19, 2019

Conference Session: Engineering Design Graphics Division Technical Session 1 - Current Issues

Tagged Division: Engineering Design Graphics

Page Count: 33

DOI: 10.18260/1-2--32764

Permanent URL: https://strategy.asee.org/32764

Download Count: 842


Paper Authors


Anthony P. Garland, Clemson University


Dr. Anthony Garland is a Post-Doc at Sandia National Labs. Previously, he was a lecturer at Clemson University in the General Engineering department where he taught engineering graphics for four years. His research interests include developing computational tools to assist engineering teachers.


Sarah Jane Grigg, Clemson University


Dr. Sarah Grigg is a senior lecturer in General Engineering at Clemson University. She is a human factors design engineer specializing in process improvement and error mitigation across various contexts, including engineering education, healthcare, and transportation. She received Ph.D., M.S., and B.S. degrees in Industrial Engineering, a Certificate in Engineering Education, and a Master's degree in Business Administration from Clemson University.


Abstract

In skill-building courses such as an introductory 3D CAD course, instructors typically provide many assignments for students to practice and improve their 3D modeling skills. Frequent and accurate assessments give students the opportunity to identify errors and address deficiencies more efficiently, promoting quicker acquisition of the skill. In an ideal learning environment, students would receive feedback at every class meeting, but that can be a daunting task because grading 3D CAD homework is difficult and time-consuming. The objective of this work is to compare human and software grading of students' 3D CAD files and quantify the speed, accuracy, and effectiveness of each. A statistical analysis was performed on 5200 models from three different assignments to compare the two modes of grading. A better understanding of the different grading practices enables resource allocation based on the strengths of humans and computers, resulting in a more efficient combination of resources. The results show that Graderworks© software (GW) was more accurate and repeatable than human graders (TAs) in quantitative comparisons: evaluating material, volume, shape, and sketches. TAs often made a few clerical errors per assignment that limited the effectiveness of the file management structure and of subsequent calculations from manually entered fields such as the name or username. However, a single change in the learning management software's file naming convention led to a large-scale clerical error, causing similar frustrations with automated grading. Still, one of the biggest challenges we have experienced with human grading is the high variability in graders' speed and accuracy; an ANOVA test showed that error rates differ between TAs at a statistically significant level. TAs are effective at providing informative feedback that gives direction for improving the model, but it is a time-consuming process. At this time, the software is not able to offer substantial and specific feedback to students on how to improve, so we recommend using the computational grading tools in conjunction with human graders. Using the software to prioritize which files need TA feedback, namely those with similarity scores below a threshold value, may lead to a more efficient and effective use of resources and a quality feedback loop.
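The following is a minimal sketch, not code from the paper, of the two analyses the abstract describes: a one-way ANOVA comparing per-TA error rates, and triaging models for human feedback when the software's similarity score falls below a threshold. The per-TA data, file names, and the 0.85 cutoff are hypothetical placeholders; only the `scipy.stats.f_oneway` call is a real library API.

```python
# Hedged sketch of the abstract's two analyses; all data below is hypothetical.
from scipy import stats

# Hypothetical per-assignment error rates for three TAs.
ta_error_rates = {
    "TA1": [0.02, 0.05, 0.03, 0.04],
    "TA2": [0.08, 0.10, 0.07, 0.09],
    "TA3": [0.01, 0.03, 0.02, 0.02],
}

# One-way ANOVA: do mean error rates differ across TAs?
f_stat, p_value = stats.f_oneway(*ta_error_rates.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Error rates differ between TAs at a statistically significant level.")

# Triage: route only low-similarity models to TAs for detailed feedback,
# letting the software auto-grade the rest.
SIMILARITY_THRESHOLD = 0.85  # hypothetical cutoff
similarity_scores = {"model_a.sldprt": 0.97, "model_b.sldprt": 0.62}
needs_ta_review = [name for name, score in similarity_scores.items()
                   if score < SIMILARITY_THRESHOLD]
print("Files needing TA feedback:", needs_ta_review)
```

In this division of labor, the software handles the repeatable quantitative checks (material, volume, shape, sketches), while TA time is concentrated on the minority of models that deviate most from the reference.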

Garland, A. P., & Grigg, S. J. (2019, June), Evaluation of Humans and Software for Grading in an Engineering 3D CAD Course. Paper presented at 2019 ASEE Annual Conference & Exposition, Tampa, Florida. 10.18260/1-2--32764

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2019 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.