Comparison Of Two Peer Evaluation Instruments For Project Teams

Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

FPD5 - Teaming and Peer Performance

Tagged Division

First-Year Programs

Page Count

19

Page Numbers

13.315.1 - 13.315.19

DOI

10.18260/1-2--3437

Permanent URL

https://peer.asee.org/3437

Download Count

383

Paper Authors

Kerry Meyers, University of Notre Dame

Kerry L. Meyers is an Associate Professional Faculty Member and Co-coordinator of the First-Year Engineering Program at the University of Notre Dame.

Stephen Silliman, University of Notre Dame

Dr. Silliman is a Professor of Civil Engineering and Geological Sciences and an Associate Dean of Engineering at the University of Notre Dame. His research focuses on groundwater hydraulics and includes educational and research initiatives in developing countries.

Matthew Ohland, Purdue Engineering Education (orcid.org/0000-0003-4052-1452)

Matthew W. Ohland is an Associate Professor and Director of First-Year Engineering in the School of Engineering Education at Purdue University and is the Past President of Tau Beta Pi, the engineering honor society. He received his Ph.D. in Civil Engineering with a minor in Education from the University of Florida in 1996. Previously, he served as Assistant Director of the NSF-sponsored SUCCEED Engineering Education Coalition. In addition to this work, he studies peer evaluation and longitudinal student records in engineering education.

Leo McWilliams, University of Notre Dame

Dr. Leo H. McWilliams has been a Course Coordinator for the First-Year Engineering Program at the University of Notre Dame since 2001. Prior to joining Notre Dame in this capacity, he worked as a principal engineer at Honeywell International. Dr. McWilliams received his B.A. in economics, B.S.E.E., M.S.E.E. and Ph.D. from the University of Notre Dame.

Tracy Kijewski-Correa, University of Notre Dame

Dr. Kijewski-Correa is the Rooney Assistant Professor in the Department of Civil Engineering and Geological Sciences at the University of Notre Dame. Her research focuses on structural dynamics, wind effects on structures, and the use of novel sensors to assess structural health and integrity.

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

The College of Engineering at the University of Notre Dame has used a paper-pencil instrument for peer evaluations since 2005 as part of the assessment of project team efforts (typically 4-5 students per team) in its First-Year Engineering Course. The College was considering moving from paper-pencil peer evaluations to an online, behaviorally based evaluation instrument, CATME1. The instructors at Notre Dame conducted a comparative study of student feedback on these two instruments during the fall 2007 semester, when the students in the first-year course (~380) were divided into two groups of approximately equal size, one group using the paper-pencil instrument and the other using CATME. After completing peer evaluations for a seven-week course project, the students were required to complete a survey recording their reaction to the instrument they used in terms of perceived simplicity, comfort, confidentiality, usefulness of feedback, and overall experience. Comparison of the survey results provided insight into the relative merits and drawbacks of the two administrations. Several of the follow-up survey questions comparing the instruments did not show statistically significant differences in the sample means. In spite of the confounding of instrument design and administration method, useful results emerged. The largest differences in the student survey results appeared in the areas of feedback and overall experience, both of which were rated higher for CATME. Student confidence in instructor confidentiality (keeping their comments confidential) was high for both instruments, though slightly higher for the paper-pencil instrument. Because student perception of the quality of the feedback is critical to both rater accuracy and the student learning experience, this study enabled the College to make a data-driven decision to use the CATME instrument in future offerings of the first-year course.

Introduction

College students, regardless of their field of study, commonly work collaboratively in groups on course assignments. The benefits of collaborative learning have been well documented2,3,4 and are rarely disputed. However, collaboration can make it difficult to evaluate the work of individual students. For example, how can instructors ensure that all students contribute appropriately to the completion of a project? There are often concerns about hitchhiking, a phenomenon in which a student does not contribute adequately toward the project goals and allows teammates to do the majority of the project work. A disconnect arises because the instructor is typically not present for much of the time the group spends working on a project outside of class, yet the instructor must assign individual course grades. Social dominance is another potential issue, in which one student takes over a project and does not allow other group members to contribute meaningfully toward project goals. Given these challenges, finding an effective method to assess and assign individual contributions to group work is a topic of much research and debate within the education community, with substantial attention paid to the benefits, and possible limitations, of peer evaluation methods5,6,7.

Meyers, K., & Silliman, S., & Ohland, M., & McWilliams, L., & Kijewski-Correa, T. (2008, June), Comparison Of Two Peer Evaluation Instruments For Project Teams Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--3437

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015