A Framework for Evaluation of Large Online Graduate-Level Courses for Professional Learners

Conference

2020 ASEE Virtual Annual Conference Content Access

Location

Virtual Online

Publication Date

June 22, 2020

Start Date

June 22, 2020

End Date

June 26, 2020

Conference Session

CPDD Session 1 - Generating Intellectual Excitement for Professional Learners

Tagged Division

Continuing Professional Development

Page Count

12

DOI

10.18260/1-2--34004

Permanent URL

https://peer.asee.org/34004

Download Count

795

Paper Authors

Kerrie A. Douglas, Purdue University-Main Campus, West Lafayette (College of Engineering), ORCID: orcid.org/0000-0002-2693-5272

Dr. Douglas is an Assistant Professor in the Purdue School of Engineering Education. Her research is focused on improving methods of assessment in large learning environments to foster high-quality learning opportunities. Additionally, she studies techniques to validate findings from machine-generated educational data.

Hillary E. Merzdorf, Purdue University-Main Campus, West Lafayette (College of Engineering)

Hillary E. Merzdorf is a PhD student in Engineering Education at Purdue University. Her research interests are in assessment of digital engineering learning environments, evaluation of educational technology, and the ethical use of student data.

Abstract

Massive open online course (MOOC) platforms have evolved from offering primarily free or low-cost courses to partnering with industries and universities to offer credentials, advanced degrees, and professional education. As more engineering schools and corporations develop partnerships with MOOC providers, there is a need for frameworks to guide how evaluation is conducted in the 'massive' environment. However, researchers have criticized traditional evaluation metrics as unsuitable for MOOC environments. The purpose of this paper is to present an evaluation framework for large online graduate-level engineering courses. The framework addresses this need with a comprehensive plan for evaluating practices and outcomes in MOOCs. Adapted from Guskey's (2000) professional development evaluation process, this framework examines learners' satisfaction and value alongside performance, as well as pedagogies to support learning, application of content, and integration of the course with long-term institutional goals. We present the five levels of criteria, metrics, and data sources and discuss their application to evaluating MOOCs. The five levels of evaluation criteria are: 1) Learner Satisfaction, 2) Learner Outcomes, 3) Pedagogical Practices, 4) Learner Use, and 5) Broader Impacts.

Douglas, K. A., & Merzdorf, H. E. (2020, June), A Framework for Evaluation of Large Online Graduate-Level Courses for Professional Learners. Paper presented at 2020 ASEE Virtual Annual Conference Content Access, Virtual Online. 10.18260/1-2--34004

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2020 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015