Posted on 2025-04-02, 13:35. Authored by Ivan Garcia, Viviana Moya, Andrea Pilco, Piero Vilcapoma, Leonardo Guevara, Robert Guamán-Rivera, Oswaldo Menéndez, Juan Pablo Vasconez
Recognizing human intentions so that robots can collaborate with humans has been a high-interest research topic in recent years. In particular, non-verbal communication methods offer useful approaches for achieving fluid and natural interaction between humans and robots. This article investigates the role of hand gesture recognition in human-robot interaction (HRI) scenarios in which a human communicates with a robot using hand gestures. In particular, we emphasize the training and calibration of YOLOv2 (You Only Look Once) for real-time hand gesture recognition. We use YOLOv2 to detect five distinct hand gestures (index finger, full open hand, fist, thumb up, and peace sign). Then, we program a 6 degrees of freedom (DoF) educational manipulator robot to perform a specific task based on the recognized gesture. We explore the creation of user-specific models, which achieve a mean average accuracy of up to 99.02%, and user-general models, which reach a mean average accuracy of up to 90.52%. The outcomes of this research could significantly influence the HRI field and herald new developments that enhance the efficiency and fluidity of non-verbal human-robot communication.
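The abstract describes a two-stage pipeline: a YOLOv2 detector classifies one of five hand gestures per frame, and the recognized label is then dispatched to a manipulator task. The sketch below illustrates only the second stage as a minimal table-driven dispatch in Python; the gesture-to-task pairing and all function names (point_to_target, open_gripper, etc.) are hypothetical assumptions for illustration, not the authors' implementation, and the YOLOv2 detector itself is replaced by a simulated stream of labels.

# Illustrative sketch of the gesture-to-task dispatch described above.
# The mapping and action names are assumptions; the paper's actual
# YOLOv2 detector and robot interface are not reproduced here.

from typing import Callable, Dict, Optional

def point_to_target() -> None:
    print("Manipulator: moving toward pointed target")

def open_gripper() -> None:
    print("Manipulator: opening gripper")

def close_gripper() -> None:
    print("Manipulator: closing gripper")

def confirm_task() -> None:
    print("Manipulator: executing queued task")

def return_home() -> None:
    print("Manipulator: returning to home pose")

# One manipulator task per recognized gesture (hypothetical pairing);
# the five labels match the gestures listed in the abstract.
GESTURE_TO_ACTION: Dict[str, Callable[[], None]] = {
    "index_finger": point_to_target,
    "open_hand": open_gripper,
    "fist": close_gripper,
    "thumb_up": confirm_task,
    "peace_sign": return_home,
}

def dispatch(gesture: Optional[str]) -> None:
    """Run the task bound to a recognized gesture; ignore unknown or empty labels."""
    action = GESTURE_TO_ACTION.get(gesture or "")
    if action is not None:
        action()

if __name__ == "__main__":
    # Simulate per-frame detector outputs (in practice, YOLOv2 class labels).
    for label in ["fist", "peace_sign", None, "thumb_up"]:
        dispatch(label)

A table-driven mapping like this keeps the gesture-to-task binding in one place, which is convenient when recalibrating user-specific versus user-general models: only the detector changes, while the dispatch table stays fixed.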
Publication Title
Advanced Research in Technologies, Information, Innovation and Sustainability: 4th International Conference, ARTIIS 2024, Santiago de Chile, Chile, October 21–23, 2024, Revised Selected Papers, Part II
Volume
2346 (Communications in Computer and Information Science, CCIS)
Rights
Subject to Springer Nature Terms of Use for accepted manuscripts of subscription articles, books and chapters (https://www.springernature.com/gp/open-science/policies/accepted-manuscript-terms)