
Apples And Oranges? A Proposed Research Design To Examine The Correspondence Between Two Measures Of Engineering Learning

Conference

2008 Annual Conference & Exposition

Location

Pittsburgh, Pennsylvania

Publication Date

June 22, 2008

Start Date

June 22, 2008

End Date

June 25, 2008

ISSN

2153-5965

Conference Session

Assessment

Tagged Division

Educational Research and Methods

Page Count

9

Page Numbers

13.207.1 - 13.207.9

DOI

10.18260/1-2--4064

Permanent URL

https://peer.asee.org/4064

Paper Authors

Patrick Terenzini, The Pennsylvania State University
Distinguished Professor of Education and Senior Scientist in the Center for the Study of Higher Education

Lisa Lattuca, Pennsylvania State University
Associate Professor of Education and Senior Research Associate in the Center for the Study of Higher Education

Matthew Ohland, Purdue Engineering Education (orcid.org/0000-0003-4052-1452)
Associate Professor and Director of First-Year Engineering in the Department of Engineering Education

Russell Long, Purdue University
Director of Project Assessment

Abstract
NOTE: The first page of text has been automatically extracted and included below in lieu of an abstract

Apples and Oranges? A Proposed Research Design to Examine the Correspondence Between Two Measures of Engineering Learning

Abstract

In 2004, ABET commissioned Engineering Change, a study of the impact of Engineering Criteria 2000 (EC2000) on the preparation of undergraduates for careers in engineering. One legacy of that study is a database of EC2000-specific self-reported student learning outcomes at 40 institutions, including precollege characteristics and engineering program outcomes for more than 4,300 graduates of the class of 2004. A second dataset, the Multiple-Institution Database for Investigating Engineering Longitudinal Development (MIDFIELD), compiles institutional data, including demographic and academic transcript records and Fundamentals of Engineering (FE) scores, from nine universities covering 1987 to 2005. In this paper, we propose a design for combining data from the two databases to assess the correspondence between the self-reported student learning outcome measures in the Engineering Change study and the MIDFIELD dataset's information on program-level performance on the FE examination, the only objective test of students' engineering knowledge.

Introduction

Throughout its history, U.S. higher education has been mindful of questions about educational quality and institutional accountability. Formal accreditation mechanisms emerged in the early 20th century. Although the public has periodically engaged in these discussions, those who fund higher education – state and federal government, business and industry, and philanthropic foundations – have wielded the greatest influence.1 Financial accountability is a dimension of these concerns, but the evaluation and assessment of educational effectiveness has emerged over the past two decades as an important corollary.

The current period of emphasis on accountability in the U.S. began in the 1980s and is roughly contemporaneous with expressions of heightened concern about the quality of engineering education programs and practices. The pressure for greater accountability, and the national conversations about the appropriate metrics for judging and ensuring educational quality that ensued, influenced the policy context for these discussions and the deliberations of accreditors. The Council for Higher Education Accreditation (CHEA), which recognizes individual accrediting agencies, now endorses assessment of student learning outcomes as one dimension of accreditation. Its endorsement, however, followed changes in the accreditation criteria in many regional and professional agencies that had already reduced their emphasis on quantitative measures of available resources and mandated that judgments of educational effectiveness be based on measurable outcomes.2 Today, the higher education community generally accepts the need for assessment data to inform decision-making and acknowledges the need for rigorous methods that can provide this information to programs, colleges and universities, accreditation agencies, and state and federal governments.

This paper proposes a research design for a study of the correspondence between two publicly available assessment tools: the Fundamentals of Engineering (FE) examination and the student

Terenzini, P., Lattuca, L., Ohland, M., & Long, R. (2008, June), Apples And Oranges? A Proposed Research Design To Examine The Correspondence Between Two Measures Of Engineering Learning. Paper presented at 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. 10.18260/1-2--4064

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2008 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015