Artificial Intelligence-Based Hand Gesture Detection for Smart HCI Systems
Keywords:
Artificial intelligence, Convolutional Neural Networks (CNNs), Embedded system, Gesture recognition, Human-Computer Interaction (HCI), Image processing, OpenCV, Touchless control

Abstract
Gesture recognition has emerged as a significant component of modern Human-Computer Interaction (HCI), offering a more natural and intuitive mode of communication between users and machines. With the advancement of Artificial Intelligence (AI), particularly in machine learning and deep learning, gesture recognition systems have evolved from basic rule-based systems to sophisticated models capable of recognizing complex gestures in real time. The integration of AI enables systems to learn from large datasets, adapt to varying user behaviours, and improve their accuracy over time. This paper explores how AI enhances gesture recognition, focusing on the technologies, algorithms, and architectures employed. We investigate both sensor-based and vision-based approaches, highlighting the importance of pre-processing, feature extraction, and classification in creating effective gesture recognition systems. Experimental results show that AI-powered models achieve high recognition accuracy across diverse gesture sets. Applications in healthcare, robotics, gaming, and smart environments further illustrate the practical impact of these systems. Despite this progress, challenges such as variable lighting conditions, occlusion, and computational cost remain. This research provides a comprehensive overview of AI-driven gesture recognition systems and discusses future directions for making them more robust, scalable, and accessible. The findings demonstrate the potential of AI to transform HCI, enabling seamless and contactless interactions that enhance user experience and open doors to innovative applications across multiple industries.
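To make the three-stage vision pipeline named in the abstract concrete, the following is a minimal, hedged sketch (not the paper's actual implementation): pre-processing reduces a frame to normalised grayscale, feature extraction mean-pools it into a coarse intensity grid, and a nearest-centroid rule stands in for the trained CNN classifier the paper describes. All function names, the grid descriptor, and the synthetic "open_hand"/"fist" frames are illustrative assumptions; a real system would use OpenCV for capture/segmentation and a CNN for classification.

```python
import numpy as np

def preprocess(frame):
    """Pre-processing: RGB frame -> grayscale, normalised to [0, 1]."""
    gray = frame.mean(axis=2)          # naive channel average as grayscale
    return gray / 255.0

def extract_features(gray, grid=4):
    """Feature extraction: mean-pool the image into a grid x grid descriptor."""
    h, w = gray.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = gray[i * h // grid:(i + 1) * h // grid,
                        j * w // grid:(j + 1) * w // grid]
            feats.append(cell.mean())
    return np.array(feats)

def classify(features, centroids):
    """Classification: nearest centroid over known gesture classes
    (a stand-in for the CNN classifier discussed in the paper)."""
    dists = {label: np.linalg.norm(features - c)
             for label, c in centroids.items()}
    return min(dists, key=dists.get)

# Toy data: two synthetic "gestures" distinguished only by brightness.
open_hand = np.full((64, 64, 3), 200, dtype=np.uint8)  # bright frame
fist      = np.full((64, 64, 3), 40,  dtype=np.uint8)  # dark frame
centroids = {
    "open_hand": extract_features(preprocess(open_hand)),
    "fist":      extract_features(preprocess(fist)),
}
query = np.full((64, 64, 3), 190, dtype=np.uint8)
print(classify(extract_features(preprocess(query)), centroids))  # open_hand
```

In practice the centroid table would be replaced by a model trained on a labelled gesture dataset, but the sketch shows how the pre-processing, feature-extraction, and classification stages compose.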