6th, Installation & Running

Installing and running MediaPipe and HandCommander on a development machine

Follow these instructions to install my fork of MediaPipe and HandCommander:

$ git clone -b lisbravo_01 https://github.com/lisbravo/mediapipe.git

$ cd mediapipe

$ git clone https://github.com/lisbravo/myMediapipe.git

$ sudo apt-get update

$ sudo apt-get install libmosquittopp-dev 

If this is the first time you are installing MediaPipe, you may need additional libraries like OpenCV; please check MediaPipe’s official documentation: https://mediapipe.readthedocs.io/en/latest/install.html#installing-on-debian-and-ubuntu
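For reference, the OpenCV-related Debian/Ubuntu packages listed in that documentation look roughly like this (double-check against the linked page, since the exact list may change between MediaPipe versions):

# OpenCV development packages, as listed in the MediaPipe install docs
$ sudo apt-get install libopencv-core-dev libopencv-highgui-dev \
    libopencv-calib3d-dev libopencv-features2d-dev \
    libopencv-imgproc-dev libopencv-video-dev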

Once installed, the fastest way to test it is to run HandCommander on the dev machine and have it send commands to a Raspberry Pi. To set up the basic Raspberry Pi environment, please check my post on that matter: https://www.deuxexsilicon.com/2020/04/07/5th-to-pie-part-1/

Note that you don’t need an actual IR emitter to test HandCommander, just an MQTT broker to publish messages to, so you can even use a broker on the local machine and work without a Raspberry Pi.
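For example, a quick way to get a local broker is Mosquitto (the package names below are the standard Debian/Ubuntu ones, not something specific to this project):

# Install the Mosquitto broker plus its command-line clients
$ sudo apt-get install mosquitto mosquitto-clients

# The broker usually starts as a service right away; if not, run it
# in the foreground on the default port 1883
$ mosquitto -v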

You will probably need to set the IP of the MQTT broker in HandCommander. Get the IP of the Raspberry Pi with the ifconfig command; then, on the PC, modify the file myMediapipe/graphs/dynamicGestures/dynamic_gestures_cpu.pbtxt. Look for the broker IP, which will probably be at or near the end:

node {
  calculator: "MqttPublisherCalculator"
  input_stream: "MQTT_MESSAGE:message"
  node_options: {
    [type.googleapis.com/mediapipe.MqttPublisherCalculatorOptions] {
      client_id: "HandCommander"
      broker_ip:  "xx.xx.xx.xx"
      broker_port: 1883
      #user: user          #optional
      #password: password  #optional
    }
  }
} 

(replace xx.xx.xx.xx with your broker’s IP) 
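Before compiling, it’s worth sanity-checking the broker from the PC; this step is my own suggestion rather than part of the original workflow, and the topic name is just an example:

# On the Raspberry Pi: find its IP address
$ ifconfig        # or: hostname -I

# On the PC: publish a test message to the broker (same IP as above)
$ mosquitto_pub -h xx.xx.xx.xx -p 1883 -t test/handcommander -m "hello"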

Then compile the project:

$ ./bldDynamicGestures.sh 

This will take a while, and if everything goes well, you should see something like:

INFO: Elapsed time: 412.303s, Critical Path: 271.64s
INFO: 1876 processes: 1875 linux-sandbox, 1 local.
INFO: Build completed successfully, 2006 total actions 
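In case you’re wondering what the script wraps: a MediaPipe desktop CPU build is normally a single Bazel invocation along these lines. The flags are the standard MediaPipe ones, but the target label here is only an assumption; check bldDynamicGestures.sh for the real one:

# Typical MediaPipe CPU build; the target label below is illustrative only
$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 \
    myMediapipe/examples/desktop/dynamicGestures:dynamic_gestures_cpu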

In my case, I’m developing on a VMware Ubuntu image running on Windows, as explained in https://www.deuxexsilicon.com/2020/03/16/1st-motivation-and-first-steps/, so in order to test HandCommander you need to stream the webcam from Windows to the VMware machine; just follow the instructions in that blog entry.

Now you can start HandCommander; the script is:

$ ./rnDynamicGestures.sh 

Don’t worry if you see a lot of decode errors at first:

[h264 @ 0x7fd22c027900] non-existing PPS 0 referenced
[h264 @ 0x7fd22c027900] decode_slice_header error
[h264 @ 0x7fd22c027900] non-existing PPS 0 referenced
[h264 @ 0x7fd22c027900] decode_slice_header error 

That just means that ffmpeg has not seen a keyframe yet; keyframes carry the SPS and PPS information.

On the other hand, if you are running Linux natively, just modify the rnDynamicGestures.sh script to use your webcam as input.
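If you are not sure which device your webcam is, you can list the available capture devices first (a generic Linux tip, not something from the original script; v4l2-ctl comes from the v4l-utils package):

# List V4L2 capture devices and their /dev/videoN nodes
$ v4l2-ctl --list-devices

# Or simply check which video device nodes exist
$ ls /dev/video*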

Now you should see a video window, and if you make a valid gesture, it should be recognized; the terminal from which you launched the program will show output messages such as which category the current gesture belongs to, the MQTT messages being published, and so on.
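To confirm that the gestures actually reach the broker, you can subscribe to all of its topics from any machine that can reach it; the wildcard subscription below is my suggestion, with xx.xx.xx.xx again standing in for your broker’s IP:

# Print every message published to the broker, together with its topic
$ mosquitto_sub -h xx.xx.xx.xx -t '#' -v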

Good work! In the next chapter I’ll show how to cross-compile HandCommander so it can run autonomously on the Raspberry Pi.