In this project, we focused on closely replicating human hand gestures. Each finger of the robotic arm can be controlled independently, which gives us the freedom to reproduce any gesture. Some key insights from the thesis are as follows:
- Leveraged the MediaPipe framework, known for its robust landmark detection, to identify key landmarks on the hand.
- Developed custom algorithms that track changes in each finger's apparent length over time to detect bending.
- Used the Python serial module to establish a communication channel between the Python-based gesture recognition system and the Arduino microcontroller that drives the robotic arm's servo motors. This integration enables real-time data transmission, letting the Arduino translate detected gestures into motor commands quickly and accurately.
- Assembled and calibrated the components of the robotic arm, ensuring precise alignment and functionality.
By connecting the motors to the Arduino and fine-tuning their responses, we achieved a high degree of fidelity in replicating human hand movements, realizing the full potential of our Gesture Control Robotic Arm.
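The finger-tracking insight above can be sketched as a small helper. This is a minimal illustration, not the project's actual algorithm: it assumes MediaPipe's standard 21-landmark hand model (wrist at index 0, fingertips at 4/8/12/16/20) and uses a common heuristic, treating a finger as extended when its tip lies farther from the wrist than its PIP joint.

```python
import math

# MediaPipe hand model: 21 landmarks, wrist = 0.
# (tip, pip) landmark index pairs per finger.
FINGERS = {
    "thumb": (4, 2),
    "index": (8, 6),
    "middle": (12, 10),
    "ring": (16, 14),
    "pinky": (20, 18),
}

def dist(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def finger_states(landmarks):
    """Return {finger: True if extended}.

    Compares tip and PIP distances from the wrist: an extended
    finger's tip is farther from the wrist than its PIP joint,
    while a curled finger folds the tip back toward the wrist.
    `landmarks` is a sequence of 21 (x, y) points.
    """
    wrist = landmarks[0]
    return {
        name: dist(landmarks[tip], wrist) > dist(landmarks[pip], wrist)
        for name, (tip, pip) in FINGERS.items()
    }
```

In the real pipeline these (x, y) points would come from MediaPipe's per-frame hand landmark output; the resulting per-finger states map directly onto servo commands.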
Complete Documentation on this Project
- Install these modules: numpy, pandas, mediapipe, OpenCV, serial, serialDevice (the serial module is provided by the pyserial package).
- Run gesture_recognition.py. This opens a window fed from your built-in camera; if you are using an external camera, change "cameraNo" to 1.
- Connect the motors to the Arduino pins as specified in the ArduinoSerial file.
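The Python-to-Arduino link described above can be sketched as follows. This is an illustrative sketch, not the project's exact protocol: the `<a,b,c,d,e>` framing, the port name, and the 9600 baud rate are assumptions that would need to match whatever the ArduinoSerial sketch actually parses.

```python
def encode_angles(angles):
    """Frame five servo angles (0-180) as one ASCII line, e.g.
    b'<90,45,0,180,90>\n'. The '<...>' framing is an assumption;
    match it to what the Arduino-side parser expects."""
    if len(angles) != 5 or not all(0 <= a <= 180 for a in angles):
        raise ValueError("expected five angles in the range 0-180")
    return ("<" + ",".join(str(int(a)) for a in angles) + ">\n").encode("ascii")

def send_angles(port, angles, baud=9600):
    """Open the serial port (e.g. 'COM3' on Windows, '/dev/ttyUSB0'
    on Linux -- hypothetical names) and write one frame."""
    import serial  # pyserial; imported here so the module loads without it
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(encode_angles(angles))
```

Sending a complete frame per update keeps the Arduino side simple: it can read until the closing '>' and split on commas to recover the five servo targets.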