Automated Assessment of Systems Engineering Competencies

Conference

2018 ASEE Annual Conference & Exposition

Location

Salt Lake City, Utah

Publication Date

June 23, 2018

Start Date

June 23, 2018

End Date

June 27, 2018

Conference Session

Systems Engineering Division Technical Session 4 – Systems Thinking Integration and Systems Engineering Skills Evaluation

Tagged Division

Systems Engineering

Page Count

13

DOI

10.18260/1-2--29840

Permanent URL

https://peer.asee.org/29840

Paper Authors

Peizhu Zhang Stevens Institute of Technology (School of Systems & Enterprises)

Peizhu Zhang is a Ph.D. candidate at Stevens Institute of Technology specializing in systems engineering. His research interests include systems engineering, competency assessment, software engineering, and serious games. He has over 10 years of experience in the design and development of software systems. Mr. Zhang holds a B.S. in software engineering from Beijing University of Technology as well as an M.S. in computer science from Stevens Institute of Technology.

Jon Patrick Wade Stevens Institute of Technology (School of Systems & Enterprises)

Jon Wade, Ph.D., is a professor of practice at the Jacobs School of Engineering at the University of California at San Diego, where he is the director of convergent systems engineering, designing transdisciplinary education and research programs oriented around the fundamental principles of contemporary closed-loop systems engineering design. Previously, Dr. Wade was a research professor in the School of Systems and Enterprises at Stevens Institute of Technology, where he also served as the chief technology officer of the Systems Engineering Research Center (SERC) UARC. His industrial experience includes serving as executive vice president of Engineering at International Game Technology (IGT), senior director of Enterprise Server Development at Sun Microsystems, and director of Advanced System Development at Thinking Machines Corporation. His research interests include complex systems, future directions in systems engineering research, and the use of technology in systems engineering and STEM education. Dr. Wade received his S.B., S.M., E.E., and Ph.D. degrees in electrical engineering and computer science from the Massachusetts Institute of Technology.

Abstract

Systems engineering and technical leadership (SETL) is a multidisciplinary practice that is as much an art as a science. While a traditional model of education can teach the fundamental body of knowledge, it is not until this knowledge is put into practice in an integrated, real-world environment that a systems engineer can develop the necessary insights and wisdom to become proficient. Organizations and enterprises not only need to improve the existing workforce to keep up with the demands of the workplace, but also require a better approach to assessing and evaluating the competencies and learning of prospective and practicing systems engineering practitioners. Learning assessment is a critical component of accelerated learning: it is imperative to understand individual learning and the efficacy of the various learning experiences, both to determine the capabilities of the learner and to enable the continual improvement of the learning experience itself.

This paper describes a set of Automated Learning Assessment Tools (ALATs) that measure a subject's proficiency in a set of systems engineering competencies, and the efficacy of simulated learning experiences, through analysis of the data recorded throughout the learners' participation in a simulation experience. The vehicle used is the Systems Engineering Experience Accelerator, a new approach to developing the systems engineering and technical leadership workforce aimed at accelerating experience assimilation through immersive, simulated learning situations in which learners solve realistic problems. A prototype technology infrastructure and experience content has been developed, piloted, and evaluated. Traditionally, learning assessment has been done through examinations and expert reviews of students' work, which requires substantial effort. In addition, most approaches emphasize comparing learners' performance against that of experts rather than evaluating the actual learning of the individuals. Though simulation has been widely adopted in systems engineering education, it has yet to be used to assess learner competencies and learning performance in systems engineering and technical leadership. The ALATs described in this paper address these issues. The paper describes the evaluation of the capabilities of these tools through their performance in a number of pilot studies. Evidence of systems engineering competencies and learning trajectories is analyzed, compared, and contrasted from the perspectives of the learner's performance, behaviors, self-evaluation, and, finally, expert assessments. The limitations and strengths of the various approaches are discussed. Finally, areas of future research in pilot studies and learning assessment tool capabilities are described.
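The abstract does not spell out how such tools turn recorded simulation data into competency scores; that is left to the full paper. Purely as an illustrative sketch, and not the authors' actual ALAT design, the general technique of scoring logged learner decisions against an expert baseline and tracking the trend across simulation phases might look like the following (the Decision structure, the baseline values, and the scoring rule are all assumptions):

```python
# Hypothetical sketch: scoring learner decisions from a simulation log
# against an assumed expert baseline. All names and values are invented
# for illustration; the paper's ALATs are described in the full text.

from dataclasses import dataclass

@dataclass
class Decision:
    phase: int        # simulation phase in which the decision was made
    competency: str   # e.g. "risk management"
    value: float      # learner's chosen parameter (e.g. budget fraction)

# Assumed expert baseline: per-competency (target, tolerance) pairs.
EXPERT_BASELINE = {
    "risk management": (0.70, 0.15),
    "stakeholder analysis": (0.50, 0.10),
}

def score_decision(d: Decision) -> float:
    """Score one decision in [0, 1] by its distance from the expert target."""
    target, tolerance = EXPERT_BASELINE[d.competency]
    error = abs(d.value - target)
    return max(0.0, 1.0 - error / (2 * tolerance))

def learning_trajectory(log: list[Decision]) -> dict[int, float]:
    """Average score per phase; a rising trend suggests learning."""
    by_phase: dict[int, list[float]] = {}
    for d in log:
        by_phase.setdefault(d.phase, []).append(score_decision(d))
    return {phase: sum(s) / len(s) for phase, s in sorted(by_phase.items())}

if __name__ == "__main__":
    log = [
        Decision(1, "risk management", 0.30),
        Decision(2, "risk management", 0.55),
        Decision(3, "risk management", 0.68),
    ]
    print(learning_trajectory(log))  # per-phase scores rising toward 1.0
```

In this hypothetical scheme, a rising per-phase average would be read as evidence of a positive learning trajectory, while the remaining gap from the expert baseline would indicate competency deficits still to be addressed.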

Zhang, P., & Wade, J. P. (2018, June), Automated Assessment of Systems Engineering Competencies Paper presented at 2018 ASEE Annual Conference & Exposition , Salt Lake City, Utah. 10.18260/1-2--29840

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2018 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference.