The Influence of Grading Bias on Reinforced Concrete Exam Scores at Three Different Universities

Conference

2018 ASEE Annual Conference & Exposition

Location

Salt Lake City, Utah

Publication Date

June 23, 2018

Start Date

June 23, 2018

End Date

July 27, 2018

Conference Session

Course Structuring for Effective Student Engagement

Tagged Division

Civil Engineering

Page Count

15

DOI

10.18260/1-2--31115

Permanent URL

https://strategy.asee.org/31115

Paper Authors

Benjamin Z. Dymond, University of Minnesota Duluth (orcid.org/0000-0002-4752-3445)

Ben Dymond earned his B.S. and M.S. degrees in Civil Engineering at Virginia Tech before completing his Ph.D. in Civil Engineering at the University of Minnesota Twin Cities. He is currently an assistant professor of structural engineering at the University of Minnesota Duluth.

Matthew K. Swenty, P.E., Virginia Military Institute

Matthew (Matt) Swenty earned his bachelor's and master's degrees in Civil Engineering from Missouri S&T and then worked as a bridge designer at the Missouri Department of Transportation. He obtained his Ph.D. in Civil Engineering at Virginia Tech and worked at the Turner-Fairbank Highway Research Center on concrete bridge research. He is currently an associate professor of Civil Engineering at the Virginia Military Institute (VMI). He teaches engineering mechanics and structural engineering courses at VMI and enjoys working with students on bridge-related research projects and with the ASCE student chapter.

Chris Carroll, Saint Louis University (orcid.org/0000-0001-9250-8503)

Dr. Carroll is an Assistant Professor in the Department of Civil Engineering at Saint Louis University. His experimental research interests focus on reinforced and prestressed concrete, while his engineering education research interests focus on experiential learning at both the university and K-12 levels. Dr. Carroll serves as a voting member on ACI Committee S802 - Teaching Methods and Educational Materials and is Chair of the Career Guidance Committee for the ASCE - St. Louis Section. He has eight years of formal experience with K-12 engineering education.

Abstract

Grading student exams fairly and effectively remains a challenge for many professors. Consistency among students on the same exam can be maintained by using grading rubrics, grading the same question for all students at the same time, and giving similar questions each semester. However, natural tendencies and preferences still affect how an individual professor grades. The objective of this research was to quantitatively assess how professors' grading biases influenced exam scores in the same upper-level course offered at multiple universities.

The course selected for analysis was an introduction to the design of reinforced concrete structures, a common course in many civil engineering curricula. Three professors at three different universities taught similar topics using their own teaching styles and methods. During the semester, the same exam questions were posed to the students at each university. To understand how grading biases propagated throughout the exam questions, each of the professors re-graded the questions from all three universities at the conclusion of the course, after the student identifiers were removed. The final scores from each professor were compared for 35 unique questions completed by up to 57 total students. Prior to re-grading the exams, the point value for each question was agreed upon, but there was no common grading rubric and no communication among the professors about grading methodology.

Differences were measured among the scores assigned by each professor. Statistical differences in the scores for each question were assessed using Tukey's method to compare individual means in the analysis of variance. In approximately 66% of the individual problems, the Tukey method revealed no statistically significant difference in the grades (p-values > 0.05). Half of the problems with statistically different grades (p-values < 0.05) had point totals of five or less (on a 100-point exam), which meant they were short-answer or simple problems. While the differences in grading were minor on most individual problems, these differences accumulated for each grader across the exam. Summing each professor's average grade over all 35 problems indicated a difference in total grades that ranged from 2 to 7 percentage points. This study showed that, while grading bias does occur, there were few statistically significant differences among the three professors on most questions.
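For readers unfamiliar with the statistical procedure named above, the sketch below shows one way a one-way analysis of variance followed by Tukey's pairwise comparison of means could be run in Python with SciPy and statsmodels. It is illustrative only: the score arrays, grader labels, and choice of software are assumptions made for the example, not the authors' data or analysis.

# Minimal illustrative sketch: comparing the scores three graders assigned to the
# same exam question using one-way ANOVA and Tukey's HSD. The numbers below are
# hypothetical placeholders, not data from the study.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical scores (out of 10) assigned by each grader to the same six responses.
grader_a = np.array([8.0, 7.5, 9.0, 6.0, 8.5, 7.0])
grader_b = np.array([7.5, 7.0, 9.0, 5.5, 8.0, 6.5])
grader_c = np.array([8.5, 8.0, 9.5, 6.5, 9.0, 7.5])

# One-way analysis of variance across the three graders for this question.
f_stat, p_value = f_oneway(grader_a, grader_b, grader_c)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Tukey's HSD pairwise comparison of the grader means at alpha = 0.05.
scores = np.concatenate([grader_a, grader_b, grader_c])
graders = ["A"] * len(grader_a) + ["B"] * len(grader_b) + ["C"] * len(grader_c)
print(pairwise_tukeyhsd(scores, graders, alpha=0.05).summary())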

Dymond, B. Z., Swenty, M. K., & Carroll, C. (2018, June). The Influence of Grading Bias on Reinforced Concrete Exam Scores at Three Different Universities. Paper presented at the 2018 ASEE Annual Conference & Exposition, Salt Lake City, Utah. 10.18260/1-2--31115

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2018 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015