Implementation and Results of a Revised ABET Assessment Process

Conference

2013 ASEE Annual Conference & Exposition

Location

Atlanta, Georgia

Publication Date

June 23, 2013

Start Date

June 23, 2013

End Date

June 26, 2013

ISSN

2153-5965

Conference Session

ABET Accreditation, Assessment and Program Improvement in ECE.

Tagged Division

Electrical and Computer

Page Count

21

Page Numbers

23.694.1 - 23.694.21

DOI

10.18260/1-2--19708

Permanent URL

https://strategy.asee.org/19708

Download Count

407

Paper Authors

biography

Diane T. Rover, Iowa State University

Diane Rover is a Professor in the Department of Electrical and Computer Engineering at Iowa State University. She has served on the Engineering Accreditation Commission of ABET since 2009. From 2006-2009, she served on the IEEE Committee on Engineering Accreditation Activities. Since 2009, she has held the positions of secretary/treasurer, program chair, chair-elect, and chair of the ASEE ECE Division. From 2000-2008, she led the Academic Bookshelf column as a senior associate editor for the ASEE Journal of Engineering Education. Dr. Rover was Associate Dean for Academic and Student Affairs in the College of Engineering from 2004-2010. Prior to that, she served as associate chair for undergraduate education in the Department of Electrical and Computer Engineering from 2003-2004. She began her academic career at Michigan State University. She received the B.S. in computer science in 1984, and the M.S. and Ph.D. in computer engineering in 1986 and 1989, respectively, from Iowa State University. Her teaching and research have focused on embedded computer systems, reconfigurable hardware, integrated program development and performance environments for parallel and distributed systems, visualization, performance monitoring and evaluation, and engineering education. Dr. Rover is a 2012 ASEE Fellow and a member of the IEEE Computer Society, the IEEE Education Society, and ASEE.

biography

Douglas W. Jacobson, Iowa State University (ORCID: orcid.org/0000-0002-6835-4687)

Doug Jacobson is a University Professor in the Department of Electrical and Computer Engineering at Iowa State University. He is currently the director of the Iowa State University Information Assurance Center, which has been recognized by the National Security Agency as a charter Center of Academic Excellence for Information Assurance Education. He teaches network security and information warfare and has written a textbook on network security. For a non-technical audience, he co-authored a book on security literacy and has given numerous talks on security. His current funded research targets robust countermeasures for network-based security exploits and large-scale attack simulation environments, and he is the director of the Internet-Scale Event and Attack Generation Environment (ISEAGE) test bed project. He has given over 75 presentations in the area of computer security and has testified before the U.S. Senate Committee on the Judiciary on security issues associated with peer-to-peer networking. He has served as an ABET program evaluator representing IEEE for five years. He is a Fellow of the IEEE and received the IEEE Educational Activities Board Major Educational Innovation Award in 2012 for his work in teaching information assurance to students of all ages.

biography

Ahmed E. Kamal, Iowa State University

Ahmed E. Kamal received a B.Sc. (distinction with honors) and an M.Sc. both from Cairo University, Egypt, and an M.A.Sc. and a Ph.D. both from the University of Toronto, Canada, all in Electrical Engineering in 1978, 1980, 1982 and 1986, respectively. He is currently a professor of Electrical and Computer Engineering at Iowa State University.
Kamal's research interests include wireless sensor networks, cognitive radio networks, optical networks, and performance evaluation. He is a Fellow of the IEEE and a senior member of the Association for Computing Machinery. He is an IEEE Communications Society Distinguished Lecturer for 2013 and 2014. He received the best paper award of the IEEE Globecom 2008 Ad Hoc and Sensor Networks Symposium, and the best paper award for papers published in Computers and Control in IEE journals in 1993.
Kamal serves on the editorial boards of the IEEE Communications Surveys and Tutorials, the Elsevier Computer Networks (COMNET) journal and the Elsevier Optical Switching and Networking journal. He was named the COMNET Editor of the Year in 2008.
Kamal was the chair or co-chair of the Technical Program Committees of a number of conferences, including the Optical Networks and Systems Symposia of IEEE Globecom 2007 and IEEE Globecom 2010, and the Cognitive Radios and Networks Symposium of IEEE Globecom 2012. He will also serve as the co-chair of the IEEE Globecom 2014 Cognitive Radios and Networks Symposium. He served as the chair of the Computer Systems section in the Computer Science Grants Evaluation group of the Natural Sciences and Engineering Research Council of Canada. He is currently the secretary of the IEEE Communications Society Technical Committee on Transmission, Access and Optical Systems.

biography

Akhilesh Tyagi, Iowa State University

Akhilesh Tyagi is an associate professor of computer engineering at Iowa State University. He has also been with the Computer Science Department at Iowa State University, the Laboratory for Computer Science at MIT, and the Computer Science Department at UNC-Chapel Hill. He teaches classes in embedded systems and computer architecture. He received his Ph.D. in Computer Science from the University of Washington in 1988.

Abstract

The electrical and computer engineering programs at a public land-grant university were reviewed by the Engineering Accreditation Commission of ABET during fall 2012. This paper will describe the department's revisions to its process of assessing student outcomes since the last visit, in light of the current criteria for accrediting engineering programs and in the interests of efficiency and sustainability.

The revised process involves a larger number of faculty members in specific ways. Having a critical mass of faculty involved ensures that the expectations of Criterion 6 are met, which states, in part, that faculty must be qualified to develop and implement processes for the evaluation, assessment, and continuing improvement of the program, its educational objectives and outcomes. At least two faculty members in a program were deeply involved in the assessment process. These core faculty either attended ABET workshops to enhance their knowledge or were trained as program evaluators. Expanding the core group builds a foundation on which to sustain assessment and improvement efforts over time. In addition to the core expert group, other faculty members were enlisted for specific assessment and evaluation tasks. This had multiple benefits, including spreading the workload among the faculty, sharing the responsibility for program improvement, and creating greater awareness of how to assess student learning. There were also various challenges, which will be addressed in the paper. The result, however, is that the program faculty as a whole are better positioned to understand, support, and use the assessment process.

The revised process also takes a multilevel approach that leverages existing assessment tools and best practices. Data are collected from different types of measurements at three different levels. Informal feedback is routinely obtained from student surveys, student forums, and comments by faculty and students. Direct, formal measurements tap into four sources: senior design, a required portfolio class, a small number of required courses before the senior year, and OPAL surveys administered every semester by the college and completed by employers of students on internships. The levels provide a range of information. Level 1 assessment uses high-level information from a cross-section of students in the program that can be used to identify trends and potential problems; it is done frequently, automatically, and with little overhead. Level 2 assessment uses senior-level information from all students in the culminating capstone courses, where students demonstrate attainment of outcomes through senior design projects and other summative information in portfolios. Level 3 assessment uses sophomore- and junior-level information from students in selected required courses; student learning is assessed using rubrics and assignments that focus on specific outcomes of interest. This level is finer grained and more specific than the others, is done less frequently, and provides a more in-depth examination of a student outcome earlier in the program, at the time the student is learning about it. The multilevel approach supports efficient data collection while also providing sufficient data to make decisions.

Curriculum changes resulting from the evaluation of program educational objectives and student outcomes will be presented.
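
To make the multilevel idea concrete, the following is a minimal, hypothetical sketch (not drawn from the paper) of how rubric scores collected at the three levels might be tallied per student outcome and compared against an assumed attainment target. The record layout, level labels, outcome names, and the threshold are all illustrative assumptions.

```python
# Hypothetical illustration of aggregating multilevel assessment data.
# The record format, level labels, outcome names, and the 3.0 target on a
# 1-4 rubric scale are assumptions for this sketch, not taken from the paper.
from collections import defaultdict
from statistics import mean

# Each record: (assessment level/source, student outcome, average rubric score)
records = [
    ("level1_survey",    "outcome_b", 3.2),
    ("level2_capstone",  "outcome_b", 3.5),
    ("level2_portfolio", "outcome_c", 2.8),
    ("level3_course",    "outcome_b", 2.6),
    ("level3_course",    "outcome_c", 3.1),
]

TARGET = 3.0  # assumed attainment threshold


def summarize(records, target=TARGET):
    """Average scores per outcome and per level, flagging outcomes below target."""
    by_outcome = defaultdict(list)
    by_level = defaultdict(list)
    for level, outcome, score in records:
        by_outcome[outcome].append(score)
        by_level[level].append(score)

    outcomes = {
        o: {"average": round(mean(s), 2), "meets_target": mean(s) >= target}
        for o, s in by_outcome.items()
    }
    levels = {lvl: round(mean(s), 2) for lvl, s in by_level.items()}
    return outcomes, levels


if __name__ == "__main__":
    outcomes, levels = summarize(records)
    for outcome, stats in sorted(outcomes.items()):
        flag = "" if stats["meets_target"] else "  <-- review"
        print(f"{outcome}: {stats['average']}{flag}")
    print("Per-level averages:", levels)
```

A summary of this kind only flags outcomes for faculty review; the evaluation and continuous-improvement decisions described in the paper remain a faculty judgment.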

Rover, D. T., Jacobson, D. W., Kamal, A. E., & Tyagi, A. (2013, June), Implementation and Results of a Revised ABET Assessment Process. Paper presented at 2013 ASEE Annual Conference & Exposition, Atlanta, Georgia. 10.18260/1-2--19708

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2013 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015