
A Performance Assessment Framework for Measuring Online Student Learning Outcomes


Conference: 2013 ASEE Annual Conference & Exposition

Location: Atlanta, Georgia

Publication Date: June 23, 2013

Start Date: June 23, 2013

End Date: June 26, 2013

ISSN: 2153-5965

Conference Session: Online Learning

Tagged Division: Computers in Education

Page Count: 19

Page Numbers: 23.88.1 - 23.88.19

DOI: 10.18260/1-2--19102

Permanent URL: https://peer.asee.org/19102

Download Count: 620


Paper Authors


Petronella A. James-Okeke, Morgan State University


Craig J. Scott, Morgan State University


Yacob Astatke, Morgan State University


Dr. Yacob Astatke completed both his Doctor of Engineering and B.S.E.E. degrees at Morgan State University (MSU) and his M.S.E.E. at Johns Hopkins University. He has been a full-time faculty member in the Electrical and Computer Engineering (ECE) department at MSU since August 1994 and currently serves as the Associate Chair for Undergraduate Studies. Dr. Astatke is the winner of the 2012-2013 American Society for Engineering Education (ASEE) Mid-Atlantic Region Distinguished Teacher Award. He teaches courses in both analog and digital electronic circuit design and instrumentation, with a focus on wireless communication. He has more than 15 years of experience in the development and delivery of synchronous and asynchronous web-based course supplements for electrical engineering courses. Dr. Astatke played a leading role in the development and implementation of the first completely online undergraduate ECE program in the State of Maryland. He has published over 40 papers and presented his research work at regional, national, and international conferences. He also runs several summer camps for middle school, high school, and community college students to expose them to and increase their interest in pursuing science, technology, engineering, and mathematics (STEM) fields. Dr. Astatke travels to Ethiopia every summer to provide training and guest lectures on the use of mobile laboratory technology and pedagogy to enhance the ECE curriculum at five different universities.


Jumoke Oluwakemi Ladeji-Osias, Morgan State University (ORCID: orcid.org/0000-0002-8645-696X)


Dr. Jumoke Ladeji-Osias is Associate Professor and Associate Chair for Graduate Studies in the Department of Electrical and Computer Engineering at Morgan State University. She earned her B.S. in electrical engineering from the University of Maryland, College Park and a Ph.D. in biomedical engineering from Rutgers, The State University of New Jersey. She coordinates the departmental graduate program and teaches both undergraduate and graduate courses in computer engineering, primarily in the design of digital hardware systems. She is the PI for Scholars in Engineering (SiE), an NSF S-STEM scholarship program for undergraduate and Master’s students. She is a member of the Morgan team that is developing online laboratory courses for undergraduate students. Her research expertise is in algorithm optimization for FPGA implementation, and her research group has developed a novel biologically inspired image fusion algorithm. She has over 35 journal and conference publications.


LaDawn E. Partlow, M.Eng., Morgan State University


Kofi Nyarko, Morgan State University


Abstract

Trends in higher education over the past ten years show that enrollments in online courses and online degree programs have grown substantially faster than overall higher education enrollment, yet there are few if any Accreditation Board for Engineering and Technology (ABET) accredited programs that are completely online. Coupled with the need for innovative approaches to offering laboratory courses online is the need for assessment tools that allow the collection and analysis of course outcomes and objectives in a seamless manner. The philosophy driving this process is to allow the instructor to focus on course outcomes via embedded problems and laboratory exercises, while program outcomes are derived from a mapping between the two levels. This session presents the assessment of online course student learning outcomes using a rubric-based performance assessment framework (Searchlight Performance Assessment™) developed at our institution. Participants who attend this session will get a firsthand view of how student assessment data is embedded, captured, and analyzed using Searchlight Performance Assessment™, and how the data is aggregated and used to inform department progress.

This session reviews online course assessments for two academic years. The performance assessment framework offers the means to perform program assessments by graphing both direct and indirect measures of course outcomes. At the program level, tabular and graphical outputs are created with drill-down properties that allow one to determine the source of problematic results. Program outcomes are mapped to each course with associated performance criteria, and the performance criteria measure specific outcomes through the use of electronic rubrics. Mapping and the entry and setup of course and program outcomes are accomplished by parsing Microsoft Word formatted course syllabi and program annual report forms. Course outcomes are assessed through embedded questions and exercises during the academic term. At the end of the term, the instructor completes a Faculty Course Assessment Report (FCAR). To ensure continual improvement as part of the assessment process, adjustments are made at the course level to improve the instructional strategies employed for each class and are noted in the individual Searchlight FCAR reports.

As one example, students in a circuits class were having difficulty identifying various approaches to solving problems and applying engineering concepts to problem solving. In another example, it was found that students needed both better professional development and better education plans. In both cases, specific course revisions and teaching strategies were adopted and employed with the help of the performance assessment framework, and improvements in both areas were measured and sustained within one academic term. In summary, the Searchlight assessment tool facilitates easier, data-driven, and effective analysis of both program and course outcomes, especially as applied to courses offered in an online environment.
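The abstract describes rolling embedded rubric scores on course outcomes up to program outcomes, with drill-down back to the source of problematic results. The sketch below is purely illustrative and is not the Searchlight Performance Assessment™ implementation; the course names, outcome labels, rubric scale, and mapping structure are all assumptions introduced only to show how such an aggregation with drill-down might work.

```python
# Illustrative sketch only -- NOT the Searchlight Performance Assessment(TM) tool.
# Shows, under assumed data structures, how rubric scores on course outcomes
# could be aggregated to program outcomes while keeping drill-down traceability.

from collections import defaultdict

# Hypothetical rubric scores: course -> course outcome -> student scores (0-4 scale assumed)
course_scores = {
    "Circuits I": {"CO1": [3, 4, 2, 3], "CO2": [1, 2, 2, 3]},
    "Electronics": {"CO1": [4, 3, 3, 4]},
}

# Hypothetical mapping from (course, course outcome) to program outcomes
outcome_map = {
    ("Circuits I", "CO1"): ["PO-a"],
    ("Circuits I", "CO2"): ["PO-e"],
    ("Electronics", "CO1"): ["PO-a", "PO-k"],
}

def aggregate_program_outcomes(scores, mapping):
    """Average rubric scores per program outcome and record the contributing
    (course, course outcome) pairs so weak results can be traced back."""
    rollup = defaultdict(list)    # program outcome -> all contributing scores
    sources = defaultdict(list)   # program outcome -> contributing (course, CO) pairs
    for (course, co), program_outcomes in mapping.items():
        values = scores.get(course, {}).get(co, [])
        for po in program_outcomes:
            rollup[po].extend(values)
            sources[po].append((course, co))
    summary = {po: sum(v) / len(v) for po, v in rollup.items() if v}
    return summary, dict(sources)

summary, sources = aggregate_program_outcomes(course_scores, outcome_map)
for po, avg in sorted(summary.items()):
    print(f"{po}: mean rubric score {avg:.2f}  (drill-down: {sources[po]})")
```

In a sketch like this, the drill-down step simply follows the stored (course, course outcome) pairs behind a low program-outcome average, which mirrors the drill-down behavior the abstract attributes to the framework's tabular and graphical outputs.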

James-Okeke, P. A., & Scott, C. J., & Astatke, Y., & Ladeji-Osias, J. O., & Partlow, L. E., & Nyarko, K. (2013, June), A Performance Assessment Framework for Measuring Online Student Learning Outcomes Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. 10.18260/1-2--19102

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015