
Developing Performance Criteria And Rubrics For Biomedical Engineering Outcome Assessment


Conference

2006 Annual Conference & Exposition

Location

Chicago, Illinois

Publication Date

June 18, 2006

Start Date

June 18, 2006

End Date

June 21, 2006

ISSN

2153-5965

Conference Session

Design in the BME Curriculum and ABET Assessment

Tagged Division

Biomedical

Page Count

13

Page Numbers

11.444.1 - 11.444.13

DOI

10.18260/1-2--411

Permanent URL

https://strategy.asee.org/411



Paper Authors


Kay C Dee, Rose-Hulman Institute of Technology


Kay C Dee is an Associate Professor of Applied Biology and Biomedical Engineering at Rose-Hulman Institute of Technology. Her educational research interests include learning styles, teaching faculty about teaching, student evaluations of teaching, and assessment. Her teaching portfolio includes courses on: biology; biomaterials; cell-biomaterial interactions; cell and tissue mechanics; bioethics, science fiction, and tissue engineering; interdisciplinary engineering problem-solving; and teaching engineering.



Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Developing Performance Criteria and Rubrics for Biomedical Engineering Outcome Assessment

Abstract

This paper describes the use of criteria and rubrics to evaluate student achievement of biomedical engineering program outcomes. As a case-study example, a process for creating and using performance criteria and rubrics for program assessment is presented, and the evolution of a sample biomedical engineering-related rubric is described. This paper also includes a “how-to” section to help faculty develop and critique their own performance criteria and rubrics.

Introduction – Accreditation as Motivation for Performance-Centered Assessment

All engineering programs seeking accreditation from ABET (the major accrediting body for university and college engineering programs within the United States) are required to define the skills students will possess by the time they graduate, and to provide evidence that program graduates possess the set of skills and knowledge designated by ABET. In other words, using ABET terminology, all engineering programs seeking ABET accreditation must define and measure student achievement of Program Outcomes, and through this process must demonstrate that their students attain an ABET-designated set of abilities (criteria lettered “a” through “k”) [1]. Biomedical Engineering programs must additionally demonstrate that their graduates have: “an understanding of biology and physiology, and the capability to apply advanced mathematics (including differential equations and statistics), science, and engineering to solve the problems at the interface of engineering and biology; the ability to make measurements on and interpret data from living systems, addressing the problems associated with the interaction between living and non-living materials and systems.”[1]

Curriculum maps or topic lists that show how Program Outcomes or ABET criteria are addressed across a curriculum, or within given courses, can demonstrate that certain types of material are presented to students, but they do not provide evidence that students have mastered the designated skills. Moreover, “Student self-assessment, opinion surveys, and course grades are not, by themselves or collectively, acceptable methods for documenting achievement of outcomes,”[2] since these assessments provide evidence either of student opinions or of generalized student achievement across a potentially broad area of study. Programs seeking ABET accreditation must use an assessment strategy that demonstrates the level of student achievement of clearly defined, designated criteria. Ideally, the assessment strategy can also be logically coordinated across the program as a whole; provide feedback that is informative as well as easily organized and interpreted; and facilitate reflection and improvement on multiple levels, from specific, focused areas of the program to a broad, holistic overview of the program. To meet these needs, we have chosen to assess our students’ achievement of Program Outcomes with performance criteria and associated evaluative rubrics.
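As a concrete illustration of how rubric scores of this kind might be recorded and aggregated across a program, consider the following minimal Python sketch. The level names, the PerformanceCriterion class, and the fraction_at_or_above helper are hypothetical conveniences invented for this example and are not drawn from the paper.

    from dataclasses import dataclass, field

    # Hypothetical four-level rubric scale; a real rubric would define
    # observable descriptors for each level of each performance criterion.
    LEVELS = ["unacceptable", "developing", "acceptable", "exemplary"]

    @dataclass
    class PerformanceCriterion:
        outcome: str       # e.g., an ABET outcome letter such as "b"
        description: str   # the observable skill being rated
        scores: list = field(default_factory=list)  # one level index (0-3) per student

        def fraction_at_or_above(self, target: str = "acceptable") -> float:
            """Fraction of rated students at or above the target rubric level."""
            threshold = LEVELS.index(target)
            return sum(score >= threshold for score in self.scores) / len(self.scores)

    # Example: six student ratings on one criterion tied to outcome "b".
    criterion = PerformanceCriterion("b", "Interprets data from living systems")
    criterion.scores = [3, 2, 2, 1, 3, 2]
    print(f"{criterion.fraction_at_or_above():.0%} at or above the target level")  # 83%

Organizing rubric results this way is one possible route to the coordinated, easily interpreted program-level feedback described above, since the same aggregation can be repeated for every criterion and every Program Outcome.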

Dee, K. C. (2006, June), Developing Performance Criteria And Rubrics For Biomedical Engineering Outcome Assessment Paper presented at 2006 Annual Conference & Exposition, Chicago, Illinois. 10.18260/1-2--411

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2006 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.