Lee, Xiao Xu Alexis (2024) Mobile application for sign language learning with real time feedback. Final Year Project, UTAR.
Abstract
With over 70 million deaf people worldwide, sign languages serve as a means of communication and connection within Deaf communities. However, the limited accessibility of sign language education poses barriers to social inclusion and awareness. This project proposes an innovative mobile application for interactive sign language learning intended to benefit both Deaf individuals and individuals with hearing loss globally. The app aims to deliver structured courses progressing from basic vocabulary to advanced grammar, supported by diverse learning materials such as video demonstrations, quizzes and exercises. A major innovation of this project is the integration of computer vision and machine learning for real-time sign recognition and feedback during signing exercises. Machine learning models built with MediaPipe and deep learning will analyse users' hand motions and provide corrections to improve their technique. Overall, this project strives to transform sign language learning through assistive technologies, and the mobile application aspires to deliver innovative tools that empower Deaf individuals and those with hearing loss worldwide to connect across social barriers.
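As a rough illustration of the recognition pipeline the abstract describes, the sketch below shows how MediaPipe's hand-tracking solution can extract per-frame hand landmarks from a camera feed; the flattened landmark vector would then be scored by a trained deep-learning classifier to drive the real-time feedback. This is an assumed outline only, not the project's actual implementation, and the classifier step is left as a hypothetical placeholder.

```python
# Minimal sketch (assumed, not the project's code): MediaPipe hand tracking
# feeding a sign classifier. Landmarks are extracted per frame and flattened
# into a 63-dimensional feature vector; the trained model that would consume
# it is hypothetical and therefore only indicated in a comment.
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands

def extract_landmarks(frame_bgr, hands):
    """Return a flat (21*3,) array of hand landmarks, or None if no hand is found."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    results = hands.process(rgb)
    if not results.multi_hand_landmarks:
        return None
    lm = results.multi_hand_landmarks[0].landmark  # first detected hand
    return np.array([[p.x, p.y, p.z] for p in lm], dtype=np.float32).flatten()

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        features = extract_landmarks(frame, hands)
        if features is not None:
            # A trained deep-learning model (hypothetical here) would map the
            # landmark vector to a sign label and confidence score, which the
            # app would use to show corrective feedback to the learner.
            pass
        cv2.imshow("sign-practice", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```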
Item Type: Final Year Project / Dissertation / Thesis (Final Year Project)
Subjects: H Social Sciences > H Social Sciences (General); H Social Sciences > HB Economic Theory; T Technology > T Technology (General); T Technology > TD Environmental technology. Sanitary engineering
Divisions: Faculty of Information and Communication Technology > Bachelor of Computer Science (Honours)
Depositing User: ML Main Library
Date Deposited: 23 Oct 2024 14:01
Last Modified: 23 Oct 2024 14:01
URI: http://eprints.utar.edu.my/id/eprint/6655