Natural Human-Computer Interface Based on Gesture Recognition with YOLO to Enhance Virtual Lab Users’ Immersive Feeling

Conference: 2024 ASEE Annual Conference & Exposition
Location: Portland, Oregon
Publication Date: June 23, 2024
Start Date: June 23, 2024
End Date: July 12, 2024
Conference Session: Simulations and Virtual Learning
Tagged Division: Computers in Education Division (COED)
Page Count: 13
Permanent URL: https://sftp.asee.org/47791

Paper Authors

Momina Liaqat Ali

Zhou Zhang, Middle Tennessee State University (ORCID: orcid.org/0000-0003-4599-4339)

I have been an Associate Professor in the Department of Engineering Technology at Middle Tennessee State University since August 2022. Before taking this position, I was an Assistant Professor at the CUNY New York City College of Technology from August 2017 to August 2022. I earned my Ph.D. in Mechanical Engineering at the Stevens Institute of Technology, where I received the James Harry Potter Award for outstanding performance in the doctoral program; my research there focused on robotics and virtual reality in engineering education. I hold a master's degree in Electrical Engineering from Southeast University and a bachelor's degree in Mechanical Engineering from Southwest Jiaotong University. I have over seven years of industrial experience as an electrical and mechanical engineer, along with extensive experience teaching interdisciplinary courses spanning Mechanical Engineering, Electrical Engineering, and Computer Science.

Abstract

Hand tracking and gesture recognition are rapidly developing fields with many applications in human-computer interaction (HCI). These technologies enable computers to recognize and respond to hand movements and gestures, creating a more natural and intuitive interface. With the increasing popularity of augmented reality (AR) and virtual reality (VR) devices, the demand for advanced hand tracking and gesture recognition is growing. The purpose of this research is to study the current state of the art in hand tracking and gesture recognition and to develop new and improved techniques for HCI applications using ‘You Only Look Once’ (YOLO) models, with the goal of strengthening users’ sense of immersion in the virtual world. The results will be applied in a virtual electrical power lab integrated with a learning management system, and the implementation will be evaluated with surveys administered before and after the classes. The research will contribute to advancing these technologies by developing new and improved hand tracking and gesture recognition algorithms and integrating them into HCI applications.
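The abstract describes a YOLO-based gesture recognition pipeline but this record does not include implementation details. As a minimal sketch of what such a pipeline might look like, assuming the Ultralytics YOLO package and OpenCV as the toolchain, the example below detects hand gestures in webcam frames; the weights file "gesture_yolo.pt" and the gesture class names are hypothetical placeholders, not the authors' implementation.

    # Minimal sketch (assumed toolchain): detect hand gestures in webcam frames
    # with an Ultralytics YOLO model. "gesture_yolo.pt" is a hypothetical
    # custom-trained gesture model, not the authors' released weights.
    import cv2
    from ultralytics import YOLO

    model = YOLO("gesture_yolo.pt")   # hypothetical gesture-detection weights
    cap = cv2.VideoCapture(0)         # default webcam

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame, verbose=False)      # run detection on one frame
        for box in results[0].boxes:               # each detected hand/gesture
            conf = float(box.conf[0])
            label = model.names[int(box.cls[0])]   # e.g., "grab", "point"
            if conf > 0.5:
                # In a virtual lab, the label would be mapped to an interaction
                # command (select, grasp, rotate, etc.); here we just print it.
                print(f"{label}: {conf:.2f}")
        cv2.imshow("gesture recognition", results[0].plot())  # annotated frame
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()

In a system like the one the abstract describes, each recognized gesture class would be forwarded to the virtual electrical power lab as an interaction command rather than printed to the console.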

Ali, M. L., & Zhang, Z. (2024, June), Natural Human-Computer Interface Based on Gesture Recognition with YOLO to Enhance Virtual Lab Users’ Immersive Feeling. Paper presented at 2024 ASEE Annual Conference & Exposition, Portland, Oregon. https://sftp.asee.org/47791

ASEE holds the copyright on this document. It may be read by the public free of charge. Authors may archive their work on personal websites or in institutional repositories with the following citation: © 2024 American Society for Engineering Education. Other scholars may excerpt or quote from these materials with the same citation. When excerpting or quoting from Conference Proceedings, authors should, in addition to noting the ASEE copyright, list all the original authors and their institutions and name the host city of the conference. - Last updated April 1, 2015