Scholarly Open Access Journal, Peer-Reviewed and Refereed, Impact Factor 8.14 (calculated by Google Scholar and Semantic Scholar | AI-Powered Research Tool), Multidisciplinary, Monthly, Indexed in all major databases & metadata, Citation Generator, Digital Object Identifier (DOI)
We have developed a robotic vehicle controlled entirely by hand gestures; no remote control is required. You simply hold your hand up in front of a camera, and the car responds in real time.

Here is how it works. Google's MediaPipe detects 21 key points on your hand via a webcam. Those landmarks are mapped to five basic commands: FORWARD, BACKWARD, LEFT, RIGHT, and STOP. The commands are then sent over the air to the car's "brain", an ESP32 microcontroller that drives the motors.

We also added an important safety feature: an ultrasonic sensor mounted at the front of the car acts as its eyes and brakes the car automatically whenever it detects an obstacle in the direction of travel, regardless of whether you are commanding the car forward. For added interactivity, a mini MP3 player provides audio feedback while driving; for example, the car announces "forward" as it moves forward. While moving, the car also streams telemetry, such as distance travelled and the last command received, to a cloud database (Firebase). The system was designed with privacy in mind: no video frames are stored or transmitted.

In testing, the car felt very responsive, reacting within approximately 150 milliseconds of a gesture. It captures hand movements at 25 frames per second and is quick enough to avoid obstacles in approximately 200 milliseconds. We also designed for durability, adding internal protection so the car continues to function properly even if the network connection drops momentarily.
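The abstract does not publish the exact landmark-to-command mapping or the obstacle-braking threshold, so the sketch below is a hypothetical illustration. It assumes the classifier looks at the index fingertip's direction relative to the wrist (MediaPipe landmark indices 8 and 0, in image coordinates where y grows downward), and that the ultrasonic override forces a STOP below an assumed 20 cm clearance; both rules and all parameter names are our assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two decision steps described in the abstract:
# (1) map MediaPipe's 21 hand landmarks to one of five drive commands,
# (2) let the ultrasonic sensor override FORWARD when an obstacle is near.
# Landmark indices follow MediaPipe Hands; the classification rule itself
# (fingertip direction relative to the wrist) is an illustrative assumption.

WRIST, INDEX_TIP = 0, 8  # MediaPipe hand-landmark indices

def classify_gesture(landmarks, dead_zone=0.1):
    """landmarks: 21 (x, y) pairs normalised to [0, 1], image coordinates.

    Returns "FORWARD", "BACKWARD", "LEFT", "RIGHT", or "STOP".
    """
    wx, wy = landmarks[WRIST]
    tx, ty = landmarks[INDEX_TIP]
    dx, dy = tx - wx, ty - wy
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "STOP"  # fingertip near the wrist: closed fist
    if abs(dy) >= abs(dx):
        # Image y axis points down, so "up on screen" means dy < 0.
        return "FORWARD" if dy < 0 else "BACKWARD"
    return "RIGHT" if dx > 0 else "LEFT"

def drive_command(gesture, obstacle_cm, min_clearance_cm=20):
    """Safety override: refuse FORWARD when the ultrasonic reading is short.

    The 20 cm clearance is an assumed value for illustration only.
    """
    if gesture == "FORWARD" and obstacle_cm is not None \
            and obstacle_cm < min_clearance_cm:
        return "STOP"
    return gesture
```

In this sketch the ESP32 would receive the result of `drive_command`, so the gesture pipeline can stay simple while the safety rule is enforced in one place.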
Finally, we designed the car to cost under ₹4500 (about $55 USD) to build, making it a low-cost, easy-to-replicate project for anyone who wants to learn about IoT, embedded systems, computer vision, and human-robot interaction.
Keywords:
Hand Gesture Detection, MediaPipe, ESP32, Ultrasonic Sensor, Firebase, Smart Automobile, Human-Computer Interaction, Embedded IoT, Edge Computing, Robotics Education.
Cite Article:
"Hand-Gesture Smart Automobile: ESP32-Powered Robotic Vehicle Controlled by Real-Time Hand Gestures via MediaPipe", International Journal for Research Trends and Innovation (www.ijrti.org), ISSN:2455-2631, Vol.10, Issue 11, page no.a469-a476, November-2025, Available: http://www.ijrti.org/papers/IJRTI2511055.pdf
Downloads:
000225
ISSN:
2456-3315 | IMPACT FACTOR: 8.14 (calculated by Google Scholar) | ESTD YEAR: 2016