Lee, Teck Junn (2024) Automated hand gesture recognition for enhancing sign language communication. Final Year Project, UTAR.
Abstract
This paper introduces a novel approach to enhancing communication between individuals who are deaf or hard of hearing and those unfamiliar with sign language. The project addresses this challenge by developing a mobile application that uses the smartphone camera, coupled with a deep learning model, to interpret hand gestures and provide real-time contextual information to users. The choice of platform reflects the widespread adoption of smartphones and the practical applicability of mobile applications in real-life scenarios. Furthermore, the paper proposes a methodology built on Google's MediaPipe, which outperforms traditional approaches to deep learning model development such as transfer learning with pre-trained object detection models. Of particular importance is the seamless integration of the deep learning model with the mobile application, enabling real-time detection and recognition within the app.
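The abstract does not include code, but a MediaPipe-based gesture pipeline of the kind it describes typically extracts hand landmarks and feeds them to a separate classifier. The following is a minimal sketch of that idea using the MediaPipe Hands solution; the model file, label list, and classifier architecture are hypothetical placeholders, not the project's actual implementation.

```python
# Sketch of a landmark-based gesture pipeline (assumptions: a pre-trained
# Keras classifier on 21 hand landmarks; file name and labels are placeholders).
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf

mp_hands = mp.solutions.hands

model = tf.keras.models.load_model("gesture_classifier.h5")  # hypothetical model
LABELS = ["hello", "thanks", "yes", "no"]                    # hypothetical labels

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1,
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            # Flatten the 21 (x, y, z) landmarks into a 63-dim feature vector.
            features = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
            probs = model.predict(features[np.newaxis, :], verbose=0)[0]
            print(LABELS[int(np.argmax(probs))])
cap.release()
```

Classifying lightweight landmark vectors rather than raw frames is what makes this approach attractive for on-phone, real-time use compared with full object-detection models.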