---
license: mit
title: Arpeggiator
sdk: static
emoji: 🦀
colorFrom: purple
colorTo: pink
short_description: Hand-controlled arpeggiator, drum machine, and visualizer
---
# Hand Gesture Arpeggiator
Hand-controlled arpeggiator, drum machine, and audio-reactive visualizer. Raise your hands to raise the roof!
An interactive web app built with Three.js, MediaPipe computer vision, Rosebud AI, and Tone.js.
- Hand #1 controls the arpeggios (raise hand to raise pitch, pinch to change volume)
- Hand #2 controls the drums (raise different fingers to change the pattern); see the sketch below for how these mappings might work
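A minimal illustrative sketch of the gesture-to-sound idea (not the app's actual code; helper names like `playArpNote` and `drumPatternIndex` are hypothetical, and it assumes MediaPipe hand landmarks plus a loaded Tone.js):

```javascript
// Illustrative only: map MediaPipe hand landmarks (21 points, normalized x/y in [0, 1],
// with y increasing downward) to an arpeggio note, a volume level, and a drum pattern.
const synth = new Tone.Synth().toDestination();
const scale = ["C4", "D4", "E4", "G4", "A4", "C5", "D5", "E5"]; // example scale

// Hand #1: raise hand to raise pitch, pinch (thumb-index distance) to change volume.
function playArpNote(landmarks) {
  const wrist = landmarks[0];
  const thumbTip = landmarks[4];
  const indexTip = landmarks[8];

  // Higher hand => smaller y => higher step in the scale.
  const step = Math.min(scale.length - 1, Math.floor((1 - wrist.y) * scale.length));

  // Pinch distance mapped roughly onto -30 dB .. 0 dB.
  const pinch = Math.hypot(thumbTip.x - indexTip.x, thumbTip.y - indexTip.y);
  synth.volume.value = -30 + Math.min(pinch * 4, 1) * 30;

  synth.triggerAttackRelease(scale[step], "16n");
}

// Hand #2: count raised fingers (fingertip above its middle joint) to pick a drum pattern.
function drumPatternIndex(landmarks) {
  const fingers = [[8, 6], [12, 10], [16, 14], [20, 18]]; // index, middle, ring, pinky
  return fingers.filter(([tip, pip]) => landmarks[tip].y < landmarks[pip].y).length;
}
```

Note that Tone.js requires `Tone.start()` to be called from a user gesture before any audio can play, so a real page would gate playback behind a click or tap.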
[Video](https://youtu.be/JepIs-DTBgk?si=4Y-FrQDF6KNy662C) | [Live Demo](https://collidingscopes.github.io/arpeggiator/) | [More Code & Tutorials](https://funwithcomputervision.com/)
<img src="assets/demo.png">
## Requirements
- Modern web browser with WebGL support
- Camera access enabled for hand tracking
## Technologies
- **MediaPipe** for hand tracking and gesture recognition
- **Three.js** for audio-reactive visual rendering
- **Tone.js** for synthesizer sounds
- **HTML5 Canvas** for visual feedback
- **JavaScript** for real-time interaction
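For orientation, here is a hedged sketch of how a MediaPipe Hands pipeline is typically wired to a webcam in the browser (using the `@mediapipe/hands` and `@mediapipe/camera_utils` bundles); the actual wiring in this project may differ:

```javascript
// Sketch of a browser hand-tracking loop; not taken from this repository.
const hands = new Hands({
  locateFile: (file) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${file}`,
});
hands.setOptions({ maxNumHands: 2, minDetectionConfidence: 0.7 });
hands.onResults((results) => {
  // results.multiHandLandmarks: up to two arrays of 21 landmarks each,
  // which can drive the arpeggio, drum, and visualizer controls.
});

const video = document.querySelector("video");
const camera = new Camera(video, {
  onFrame: async () => { await hands.send({ image: video }); },
  width: 640,
  height: 480,
});
camera.start();
```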
## Setup for Development
```bash
# Clone this repository
git clone https://github.com/collidingScopes/arpeggiator
# Navigate to the project directory
cd arpeggiator
# Serve with your preferred method (example using Python)
python -m http.server
```
Then navigate to `http://localhost:8000` in your browser.
## License
MIT License
## Credits
- Three.js - https://threejs.org/
- MediaPipe - https://mediapipe.dev/
- Rosebud AI - https://rosebud.ai/
- Tone.js - https://tonejs.github.io/
## Related Projects
I've released several computer vision projects (with code + tutorials) here:
[Fun With Computer Vision](https://www.funwithcomputervision.com/)
You can purchase lifetime access and receive the full project files and tutorials. I'm adding more content regularly 🪬
You might also like some of my other open source projects:
- [3D Model Playground](https://collidingScopes.github.io/3d-model-playground) - Control 3D models with voice and hand gestures
- [Threejs hand tracking tutorial](https://collidingScopes.github.io/threejs-handtracking-101) - Basic hand tracking setup with Three.js and MediaPipe computer vision
- [Particular Drift](https://collidingScopes.github.io/particular-drift) - Turn photos into flowing particle animations
- [Video-to-ASCII](https://collidingScopes.github.io/ascii) - Convert videos into ASCII pixel art
## Contact
- Instagram: [@stereo.drift](https://www.instagram.com/stereo.drift/)
- Twitter/X: [@measure_plan](https://x.com/measure_plan)
- Email: [[email protected]](mailto:[email protected])
- GitHub: [collidingScopes](https://github.com/collidingScopes)
## Donations
If you found this tool useful, feel free to buy me a coffee.
My name is Alan, and I enjoy building open source software for computer vision, games, and more. This would be much appreciated during late-night coding sessions!
[Buy me a coffee](https://www.buymeacoffee.com/stereoDrift)