
1st, Motivation and First Steps
It’s incredible how far an engineer will go to solve a simple problem, but maybe it’s not about the problem itself, or the solution, but
With the development environment up and running, the next step in the pipeline is being able to detect and interpret static gestures, the basis of this
Now I have a Neural Network that infers static gestures. In order to keep this development process as tidy as possible, I created a new
Now things start getting interesting: it’s time to put it all together and start recognizing meaningful gestures. After some study on usage scenarios, and trying lots of
This whole project is meant to be run on a Raspberry Pi, with the heaviest Neural Network, the handLandmarks NN, accelerated by a Coral USB
Installing and running MediaPipe and HandCommander on a development machine. Follow these instructions to install my fork of MediaPipe and HandCommander: $ git clone -b lisbravo_01
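The clone command in the excerpt above is cut off. A minimal sketch of the install flow it implies might look like the following; note that the repository URL and the Bazel build target are assumptions for illustration, not confirmed by the source:

```shell
# Clone the MediaPipe fork on its lisbravo_01 branch
# (repository URL is an assumption -- substitute the actual fork URL)
git clone -b lisbravo_01 https://github.com/lisbravo/mediapipe.git
cd mediapipe

# MediaPipe desktop examples are built with Bazel; this target name is
# illustrative, not taken from the source
bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 \
    mediapipe/examples/desktop/hand_tracking:hand_tracking_cpu
```

Building MediaPipe from source requires Bazel and OpenCV to be installed first; see the upstream MediaPipe installation guide for the prerequisites on your platform.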
This project was conceptualized to be run on a Raspberry Pi, with the heaviest neural networks, palm and landmark, offloaded to a Coral USB accelerator.