Real-Time Sign Language Translation using the KNN Algorithm
Keywords:
American Sign Language (ASL), Effective communication, Graphical User Interface (GUI), Implementation, Key-point

Abstract
Addressing the imperative of bridging communication barriers faced by the deaf and non-verbal communities, this project centres on advancing automated American Sign Language (ASL) recognition through key-point detection-based methodologies. A comprehensive analysis of the model's efficacy is conducted, employing rigorous testing methodologies and metrics such as F1 score, precision, and recall to ascertain optimal performance. By delving into the nuances of ASL recognition, the project seeks to improve the accuracy and reliability of machine learning models in deciphering sign language gestures. In addition, a user-friendly graphical user interface (GUI) enables seamless interaction, allowing users to engage with the system and generate predictions using the best-performing machine learning algorithm. Through this endeavour, the aim is not only to enhance accessibility for the deaf and non-verbal communities but also to foster inclusivity by providing a platform for effective communication between individuals who use ASL and those who rely on verbal communication. This interdisciplinary approach merges technological innovation with social responsibility, paving the way for a more inclusive and connected society.
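The title names the KNN algorithm, and the abstract names the evaluation metrics F1 score, precision, and recall. Purely as a hedged illustration of how such an evaluation might look — the synthetic key-point data, the 21-landmark layout, and the use of scikit-learn are all assumptions for this sketch, not details taken from the paper — a minimal pipeline could be:

```python
# Illustrative sketch only: classify synthetic hand key-point vectors
# with k-nearest neighbours and score with precision, recall, and F1.
# The data layout (21 landmarks x 2 coordinates) is a hypothetical
# stand-in, not the paper's actual feature representation.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import precision_score, recall_score, f1_score

rng = np.random.default_rng(0)

# Each "sign" class is a tight cluster of key-point vectors
# around a random class centre in 42-dimensional feature space.
n_classes, n_per_class, n_features = 5, 60, 42
centres = rng.normal(size=(n_classes, n_features))
X = np.vstack(
    [c + 0.15 * rng.normal(size=(n_per_class, n_features)) for c in centres]
)
y = np.repeat(np.arange(n_classes), n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
pred = knn.predict(X_test)

# Macro-averaged metrics treat every sign class equally.
print("precision:", precision_score(y_test, pred, average="macro"))
print("recall:   ", recall_score(y_test, pred, average="macro"))
print("f1:       ", f1_score(y_test, pred, average="macro"))
```

In a real system the feature vectors would come from a key-point detector applied to camera frames rather than from a random generator, but the classification and scoring steps would follow the same shape.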