
Refinement And Initial Testing Of An Engineering Student Presentation Scoring System

Conference

2010 Annual Conference & Exposition

Location

Louisville, Kentucky

Publication Date

June 20, 2010

Start Date

June 20, 2010

End Date

June 23, 2010

ISSN

2153-5965

Conference Session

Innovative Teaching and Assessment Tools

Tagged Division

Educational Research and Methods

Page Count

11

Page Numbers

15.1021.1 - 15.1021.11

DOI

10.18260/1-2--16824

Permanent URL

https://strategy.asee.org/16824

Paper Authors

Tristan Utschig, Georgia Institute of Technology

Dr. Tristan T. Utschig is a Senior Academic Professional in the Center for the Enhancement of Teaching and Learning and is Assistant Director for the Scholarship and Assessment of Teaching and Learning at the Georgia Institute of Technology. Formerly, he was Associate Professor of Engineering Physics at Lewis-Clark State College. Dr. Utschig has regularly published and presented work on a variety of topics, including assessment instruments and methodologies, using technology in the classroom, faculty development in instructional design, teaching diversity, and peer coaching. Dr. Utschig completed his PhD in Nuclear Engineering at the University of Wisconsin – Madison. His technical expertise involves analysis of thermal systems for fusion reactor designs.

Judith Norback, Georgia Institute of Technology

Dr. Norback is a faculty member and the Director of Workforce and Academic Communication in the Stewart School of Industrial and Systems Engineering at Georgia Tech. Before arriving at Tech, she headed her own firm, where she conducted research and curriculum development on basic and communication skills for the U.S. Department of Labor, the National Skill Standards Board, and a number of universities. Since 2000, her research has focused on workforce communication skills needed by practicing engineers. She has also co-taught Senior Design, Technical Communication, and Introduction to Statistics; coordinated activities in the Workforce Communication Lab; and authored communication instruction for undergraduate engineers. Her research has been sponsored by the Alfred P. Sloan Foundation, NSF, the Engineering Information Foundation, and other sources.


Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract.

Refinement and Initial Testing of an Engineering Student Presentation Scoring System

Abstract

We have previously created and beta-tested a workforce-relevant, research-based scoring system for use with engineering student presentations across multiple contexts. Since then, we have systematically validated, refined, and tested the rubric in a five-step process described in some detail in this paper. First, we tested the face validity and usability of the instrument by collecting additional feedback in focus groups and interviews with faculty with expertise in scoring system design, faculty with experience in engineering design projects that involve student presentations, and additional faculty from a variety of backgrounds. Second, we used this feedback to reduce overlap and complexity in the scoring system items. Third, teaching assistants and the researchers used the scoring system items to provide feedback to approximately 140 students on presentations in a senior design course. Fourth, we made additional modifications and simplifications to the system based on the insights gained from the TA feedback process. Fifth and finally, three raters applied the resulting scoring system to several videotaped student presentations to check for inter-rater reliability and evidence of construct validity. Based on the methodology above, we reduced the instrument from 36 items to 19 items. These items include using concrete examples and details familiar to the audience; consistently referring to how key points fit into the big picture; using graphics that are visually appealing, easy to understand, and helpfully labeled; and effectively combining energy, inflection, eye contact, and movement; among others. This paper includes a description of the process used to create the instrument, a description of the instrument, the supplemental teaching guidelines under development, and a discussion of the materials’ potential for use across many engineering contexts.
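The fifth step above checks agreement among three raters. The paper does not say which statistic was computed; one common choice for multi-rater agreement on a fixed rating scale is Fleiss’ kappa, and the short Python sketch below is purely illustrative of that kind of check. The function name, the 1-5 rubric scale, and the example scores are hypothetical rather than taken from the study.

from collections import Counter

def fleiss_kappa(ratings, categories):
    # ratings: one list per presentation holding each rater's score (hypothetical data)
    # categories: every possible rubric score, e.g. range(1, 6) for a 1-5 scale
    n_items = len(ratings)
    n_raters = len(ratings[0])
    counts = [Counter(item) for item in ratings]              # raters per category, per item
    p_j = {c: sum(ct[c] for ct in counts) / (n_items * n_raters) for c in categories}
    agree = [(sum(ct[c] ** 2 for c in categories) - n_raters)
             / (n_raters * (n_raters - 1)) for ct in counts]  # per-item observed agreement
    p_bar = sum(agree) / n_items                              # mean observed agreement
    p_e = sum(p ** 2 for p in p_j.values())                   # agreement expected by chance
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: three raters each score four presentations on one rubric item.
scores = [[4, 4, 5], [3, 3, 3], [5, 4, 4], [2, 3, 2]]
print(round(fleiss_kappa(scores, range(1, 6)), 3))            # 0.308, i.e. fair agreement

Values near 0 indicate agreement no better than chance and values near 1 indicate near-perfect agreement; whatever statistic the authors used, the same layout of scores (one row per presentation, one entry per rater) is the natural input for such a reliability check.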

Introduction

At Georgia Tech, with funding from the Engineering Information Foundation and approval from the Institutional Review Board for research with human subjects, we have created and beta-tested a workforce-relevant, research-based scoring system for use with engineering student presentations across multiple contexts. The scoring system is designed to enhance students’ presentation skills so they can perform better in class, get a better job, and move quickly up the career ladder. In addition, the system addresses ABET outcomes-assessment needs for communication skills, can improve the reliability and validity of faculty scoring of engineering student presentations, and serves as a tool to help match instruction to the assessment and evaluation of student performances involving engineering communication.

In this paper we cover three aspects: the scoring system itself, its development, and its use. First, we describe the current version of the system along with examples from the supplemental teaching guidelines for professors and teaching assistants to use when instructing, assessing, and evaluating engineering student presentations in any university. Together, these tools provide a basis for delivering presentation instruction, even by instructors who are not experts in

Utschig, T., & Norback, J. (2010, June). Refinement And Initial Testing Of An Engineering Student Presentation Scoring System. Paper presented at 2010 Annual Conference & Exposition, Louisville, Kentucky. 10.18260/1-2--16824

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2010 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015