
The Use of Mixed Methods in Academic Program Evaluation


Conference: 2021 Fall ASEE Middle Atlantic Section Meeting

Location: Virtually hosted by the section

Publication Date: November 12, 2021

Start Date: November 12, 2021

End Date: November 13, 2021

Page Count: 15

DOI: 10.18260/1-2--38449

Permanent URL: https://strategy.asee.org/38449

Download Count: 293


Paper Authors


Michael B. O'Connor, P.E., New York University (orcid.org/0000-0001-5317-3392)


Michael O'Connor, a retired Professional Civil Engineer (Maryland and California) and M.ASCE, is a member
of the ASCE committees on Developing Leaders, History and Heritage, the Civil Engineering Body of
Knowledge (CEBoK), and Engineering Grades. Michael has been a practicing Civil Engineer with over
50 years of engineering, construction, and project management experience split equally between the public
and private sectors. Programs ranged from the San Francisco Bay Area Rapid Transit District's 1990s
expansions in the East Bay and at SFO Airport, valued at $3 billion, to the New Starts program for the Federal
Transit Administration, with over a hundred projects and $85 billion in construction value. At the latter, he
also acted as source selection board chairman and program COTR for more than $200 million in task-order
contracts for engineering services. Working for the third-largest transit agency in the United States, the Los
Angeles County MTA, Michael managed bus vehicle engineering for $1 billion in new acquisitions and
post-delivery maintenance support for 2,300 vehicles with some of the most complex technology (natural
gas engines and embedded systems) in the US transit industry in the 1990s. Michael also has extensive
experience as an instructor at New York University (five years), Howard University (four years), and
California State University-San Francisco (ten years).



Abstract

The Accreditation Board for Engineering and Technology (ABET) and the Middle States Commission on Higher Education (MSCHE) accredit engineering degree programs. Their accreditation efforts assure the public that programs successfully prepare their graduates to enter critical STEM fields in the global workforce. Engineering degree programs in higher education institutions must be supported by an ongoing assessment and evaluation process. These assessments yield quantitative and qualitative data that feed evaluation processes integrating learning-attainment goals with all assessment data. Engineering degree programs present unique organizational and logistical challenges in meeting accreditation requirements, such as integrating qualitative and quantitative data and information in a way that facilitates continuous improvement and supports inferences about student achievement of learning objectives. Programmatic success depends on using the results of programmatic evaluation in an ongoing program-control process, commonly known as "continuous improvement." This control capability requires the ability to detect deviations from the program baseline or learning outcomes and to intervene promptly to correct the problem.

In 2020, the XXX, XXX school of engineering was preparing for an accreditation visit, part of which involved preparing a self-study report. A survey of other programs' reports indicated that a common approach to integrating data and information was to apply weights to combine the various data elements (quantitative and qualitative) and then set a "target" value signifying an acceptable level of student attainment. A criticism of this approach is that it is not sufficiently granular to detect early problems or trends and, more importantly, does not adequately support corrective action by the program. The XXX BSCE program did not choose this integration method. Instead, it chose a method often used in the Federal government, where data sources are highly diverse and of varied quality. The Federal government, like accreditation agencies, seeks to integrate all assessment data and learning objectives to make valuable inferences about programmatic success. The methodological approach in those Federal contexts is to focus on broad themes and longitudinal studies. The result is programmatic decisions based on trends with pre-established trigger points that signal the need to intervene, an approach well adapted to an environment such as ABET/MSCHE accreditation efforts.

A literature search indicates this approach is known in other disciplines as mixed methods. While mixed methods offer rich potential, there has been little research on specific applications to problems such as academic program evaluation. The significant finding of this paper is that XXX's approach to academic program evaluation is innovative and constitutes a new application of mixed methods relevant to the engineering education community. The paper also presents recommendations for applying the mixed-methods approach in an environment where engineering degree programs conduct assessments that must meet ABET and MSCHE requirements.
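For concreteness, the two integration strategies contrasted in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the paper's implementation: every function name, weight, score, and trigger value below is an assumption chosen for the example.

    from statistics import mean

    def weighted_attainment(scores: dict[str, float], weights: dict[str, float]) -> float:
        """Approach 1 (common practice): combine normalized scores (0-1)
        from each assessment instrument into a single weighted index,
        judged against one acceptability target."""
        total_weight = sum(weights.values())
        return sum(scores[k] * weights[k] for k in scores) / total_weight

    def trend_trigger(history: list[float], trigger: float, window: int = 3) -> bool:
        """Approach 2 (the paper's trend-based alternative, sketched):
        track one outcome over successive assessment cycles and signal
        when the moving average falls below a pre-established trigger point."""
        if len(history) < window:
            return False  # too few cycles to infer a trend yet
        return mean(history[-window:]) < trigger

    # Illustrative data only: one cycle's instrument scores and weights.
    scores = {"exam": 0.82, "design_rubric": 0.74, "exit_survey": 0.68}
    weights = {"exam": 0.5, "design_rubric": 0.3, "exit_survey": 0.2}
    print(round(weighted_attainment(scores, weights), 2))  # 0.77, vs. a single target such as 0.70

    # One student outcome tracked over three assessment cycles, trigger at 0.70.
    outcome_history = [0.74, 0.68, 0.63]
    print(trend_trigger(outcome_history, trigger=0.70))  # True: initiate corrective action

The design difference is visible in the signatures: the weighted index collapses one cycle's data into a single number compared against one target, while the trigger function consumes a longitudinal history, which is what allows it to detect a deteriorating trend before an aggregate target is ever missed.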

O'Connor, M. B. (2021, November), The Use of Mixed Methods in Academic Program Evaluation. Paper presented at the 2021 Fall ASEE Middle Atlantic Section Meeting, virtually hosted by the section. 10.18260/1-2--38449

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2021 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015