A real‑time gesture‑to‑music system combining Python-based hand‑gesture recognition with a LabVIEW front end that translates detected gestures into dynamic soundscapes.
- Capture & Preprocess
  - Live video feed from the webcam
  - Hand landmark detection via MediaPipe Hands
- Classification
  - Features extracted → Random Forest model classifies 15 distinct gestures
  - Gesture label exported to `gesture_output.txt` (see the Python sketch after this list)
- LabVIEW Integration
  - LabVIEW VI polls the text file
  - Maps each gesture to musical parameters (pitch, tempo, effects)
  - Generates adaptive audio in real time
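Below is a minimal sketch of what the Python side (`gesture_to_file.py`) might look like. It assumes the Random Forest was saved with joblib and takes the flattened (x, y, z) coordinates of the 21 MediaPipe hand landmarks as its feature vector; the actual script in the repo may differ in those details.

```python
# Minimal sketch of the capture → classify → export loop (assumptions noted below).
import cv2
import joblib
import mediapipe as mp

# Assumption: the trained Random Forest was saved with joblib under this name
model = joblib.load("random_forest_gesture.pkl")
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)  # live webcam feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # MediaPipe expects RGB input; OpenCV delivers BGR frames
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark
        # Assumption: features are the 21 landmarks' (x, y, z) coordinates, flattened to 63 values
        features = [coord for lm in landmarks for coord in (lm.x, lm.y, lm.z)]

        # Classify the gesture and hand the label off to LabVIEW via the text file
        label = model.predict([features])[0]
        with open("gesture_output.txt", "w") as f:
            f.write(str(label))

    cv2.imshow("Gesture capture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The LabVIEW VI then only needs to read `gesture_output.txt` on a timed loop and map the current label onto pitch, tempo, and effect parameters.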
- Python 3.8+
- mediapipe
- scikit-learn
- opencv-python
- LabVIEW 2021 or later
```bash
pip install mediapipe scikit-learn opencv-python
```
- Clone repo
- Train or load `random_forest_gesture.pkl` (a minimal training sketch follows this list)
- Open the LabVIEW project `GestureMusic.vi`
- Run `python gesture_to_file.py`
- Launch LabVIEW VI
- Perform hand gestures in front of the camera → listen to responsive music
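If you train the model yourself, a minimal sketch is shown below. It assumes a hypothetical `gesture_data.csv` holding one sample per row, with 63 landmark features followed by an integer gesture id in the last column; adapt it to however your landmark data is actually stored.

```python
# Minimal sketch of producing random_forest_gesture.pkl from recorded landmark features.
import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical dataset: 63 landmark features per row, integer gesture id in the last column
data = np.loadtxt("gesture_data.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1].astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")

# Save under the filename the rest of the project expects
joblib.dump(model, "random_forest_gesture.pkl")
```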
- Fork the repo & create feature branches
- Submit PRs with clear descriptions
- Report issues in GitHub Issues
MIT © Your Name