Lim, Yueh Sheng (2022) Application of intelligent gesture control to simultaneously control multiple remote devices. Final Year Project, UTAR.
Abstract
The famous superhero Iron Man has two well-known icons: the Iron Man suit and the super-intelligent artificial assistant J.A.R.V.I.S., which Tony Stark can command either with his voice or by waving his hand through the air. Although hand gesture control over a hologram display may still be difficult to achieve, hand gesture technologies are already mature, and controlling our daily devices with hand gestures is achievable and practical. This project shows how to control one or more devices using only a webcam and our hand gestures. The hand detection model used in this project is MediaPipe Hands, a high-fidelity hand and finger tracking solution. Unlike other object detection models, MediaPipe Hands marks and reports the coordinates of 21 landmarks on the hand, including every joint of the fingers. We designed a series of tools that detect and define hand gestures using these coordinates. After designing a hand gesture, the next step is to design an application and wire in the designed gestures as controls. We designed two applications to demonstrate hand gesture control: driving mode and precision mode. Driving mode demonstrates the immersive and interactive character of hand gesture control, while precision mode demonstrates the precision that hand gestures can achieve. The device used for the demonstration is a wireless robot car. This report demonstrates the capability and practicality of hand gesture control and evaluates its potential and limitations.
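To illustrate how gestures can be defined from the 21 landmark coordinates that MediaPipe Hands reports, the following is a minimal sketch of the general idea (not the project's actual tool). It assumes landmarks arrive as 21 normalized (x, y) pairs in MediaPipe's indexing (wrist at 0, fingertips at 4, 8, 12, 16, 20) and classifies a few gestures by comparing each fingertip to its PIP joint; the `fake_hand` helper is a hypothetical stand-in for real model output.

```python
# MediaPipe Hands landmark indices for the four non-thumb fingertips and
# their PIP (middle) joints. The thumb is skipped here because it folds
# sideways and would need an x-axis test instead.
FINGER_TIPS = {"index": 8, "middle": 12, "ring": 16, "pinky": 20}
FINGER_PIPS = {"index": 6, "middle": 10, "ring": 14, "pinky": 18}

def extended_fingers(landmarks):
    """Return the fingers whose tip lies above its PIP joint.

    landmarks: list of 21 (x, y) pairs in normalized image coordinates,
    where y grows downward (hand assumed upright in the frame).
    """
    return [name for name, tip in FINGER_TIPS.items()
            if landmarks[tip][1] < landmarks[FINGER_PIPS[name]][1]]

def classify(landmarks):
    """Map the set of raised fingers to a coarse gesture label."""
    up = extended_fingers(landmarks)
    if not up:
        return "fist"
    if up == ["index"]:
        return "point"
    if len(up) == 4:
        return "open_palm"
    return "unknown"

def fake_hand(raised):
    """Synthetic landmarks for testing: raised fingertips sit above the PIP."""
    pts = [(0.5, 0.8)] * 21
    for name, tip in FINGER_TIPS.items():
        pts[FINGER_PIPS[name]] = (0.5, 0.5)
        pts[tip] = (0.5, 0.3) if name in raised else (0.5, 0.7)
    return pts
```

In a real pipeline the `fake_hand` list would be replaced by the per-frame landmark list from the MediaPipe Hands solution, and the gesture labels would drive the application (e.g. steering the robot car in driving mode).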