UTAR Institutional Repository

Real-time hand gesture recognition system to interpret sign language

Chong, Siow Yen (2023) Real-time hand gesture recognition system to interpret sign language. Final Year Project, UTAR.

PDF — Download (3950Kb)

    Abstract

    Sign language plays a major part in communication among individuals with hearing impairments, giving them a medium through which to participate in society. Through sign language, they can access real-world news and important information; its significance is evident when sign language interpreters appear on screen alongside the news anchor in daily news broadcasts, in addition to closed captions. The central component of sign language is the hand gesture, used to communicate words, phrases, and ideas, with each gesture carrying a specific meaning. Hand gestures in sign language can be divided into two main types: static gestures, which do not involve movement, such as fingerspelling the alphabet or a thumbs-up; and dynamic gestures, hand movements that involve a change of hand shape or position, such as waving and pointing. Applications of hand gestures in sign language include communication, education, interpretation, accessibility in visual form, and cultural events, where lyrics, plays, and musicals are translated into sign language that is easily understood by individuals with hearing impairments. At present, most hearing people are unaware of sign language and do not take the time to learn it, leading to miscommunication and poor understanding between the hearing and deaf communities; they also view sign language as less important than spoken language. It is expected that a real-time sign language interpretation tool would educate these people and increase their acceptance of the culture of individuals with hearing impairments. This System To Interpret Sign Language aims to bridge the communication gap between the hearing and individuals with hearing impairments. It is intended as a complement rather than a replacement for existing sign language interpreters, enhancing communication and understanding between the communities. To achieve this, a tool is developed to ease communication with individuals with hearing impairments: sensors and cameras detect the hand gestures, which are then translated into captions. Several studies on sign language recognition systems, built with techniques including OpenCV, OpenCV with MediaPipe, and the Leap Motion Controller (LMC) with a trained Convolutional Neural Network (CNN), are discussed further in the literature review. The project follows five major steps: pre-processing, feature extraction, segmentation and dimension reduction, classification, and model evaluation. The system is built with OpenCV and MediaPipe, together with a neural network model using TensorFlow and the Keras API. The results demonstrate that the system can detect 10 letters (A, E, H, I, L, N, O, S, T, U) and 7 vocabulary words (Best, Birthday, Please, Happy, Hearing, Like, Feel), with accuracies of 96% and 53% respectively. This project serves as a valuable tool in fostering communication and understanding between these communities.
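
    As a rough sketch of the pipeline the abstract describes (MediaPipe hand-landmark extraction feeding a Keras classifier over webcam frames), the following minimal Python example flattens the 21 detected hand landmarks into a feature vector and overlays the predicted letter as a caption. The network architecture, confidence threshold, and training details here are illustrative assumptions, not the thesis's exact implementation.

    import cv2
    import mediapipe as mp
    import numpy as np
    import tensorflow as tf

    # Static-gesture classes reported in the abstract.
    CLASSES = ["A", "E", "H", "I", "L", "N", "O", "S", "T", "U"]

    def build_model(num_classes: int) -> tf.keras.Model:
        # Hypothetical dense network over flattened (x, y, z) landmark
        # coordinates; the thesis's actual architecture may differ.
        model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(63,)),  # 21 landmarks * 3 coords
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(64, activation="relu"),
            tf.keras.layers.Dense(num_classes, activation="softmax"),
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def run(model: tf.keras.Model) -> None:
        cap = cv2.VideoCapture(0)
        hands = mp.solutions.hands.Hands(max_num_hands=1,
                                         min_detection_confidence=0.7)
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                lm = result.multi_hand_landmarks[0].landmark
                features = np.array([[p.x, p.y, p.z] for p in lm]).flatten()
                probs = model.predict(features[np.newaxis, :], verbose=0)[0]
                cv2.putText(frame, CLASSES[int(np.argmax(probs))], (10, 40),
                            cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)
            cv2.imshow("Sign language interpreter", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
        cap.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        # An untrained model predicts arbitrarily; in practice it would first
        # be fitted on recorded landmark vectors labelled with each gesture.
        run(build_model(len(CLASSES)))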

    Item Type: Final Year Project / Dissertation / Thesis (Final Year Project)
    Subjects: Q Science > QA Mathematics > QA76 Computer software
    Divisions: Lee Kong Chian Faculty of Engineering and Science > Bachelor of Science (Honours) Software Engineering
    Depositing User: Sg Long Library
    Date Deposited: 25 Nov 2023 02:07
    Last Modified: 25 Nov 2023 02:07
    URI: http://eprints.utar.edu.my/id/eprint/6092
