Using Leap Motion to control Poppy Robots

The Leap Motion is an infrared camera which detects your hands, giving you access to all of their geometric data.
It is very useful for injecting controlled values into a robotic system.

I created this topic because version 3.0 of the software (named Orion) has just arrived, and it is worth it!!
Some false detections that used to occur have now been eliminated.


Is there any API available for Linux?

Yes, the API is available for Linux … but not yet on the Raspberry Pi.
I know the API is not open source, but it is very efficient and responsive.

I gave feedback on the Myo device, which was not very good. The Leap Motion, for me, is THE tool for the roboticist. It is very reliable, very precise and very transparent, with a very efficient developer team. You can inject every geometric parameter of your hand into a robot with less than 1 ms of delay…
It is not only useful for puppet control, but also a great tool for robot debugging.
Moreover, it is very easy to give control to children :slight_smile:
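To give an idea of what "injecting a geometric parameter of your hand into a robot" can look like, here is a minimal sketch. The mapping function and its ranges are hypothetical; the streaming part assumes the Leap Motion SDK v2 Python bindings (the `Leap` module, installed with the SDK, not from PyPI) and a robot joint to drive, so it is only a sketch, not a tested implementation.

```python
def palm_height_to_angle(y_mm, y_min=100.0, y_max=400.0,
                         angle_min=-90.0, angle_max=90.0):
    """Linearly map a palm height in millimetres to a joint angle in
    degrees, clamping to the sensor's useful range.
    (All range values here are illustrative, not from the SDK.)"""
    y = max(y_min, min(y_max, y_mm))
    t = (y - y_min) / (y_max - y_min)
    return angle_min + t * (angle_max - angle_min)


def stream_palm_to_joint(set_joint_angle):
    """Continuously read the palm height from a Leap Motion and push the
    mapped angle to a robot joint. Requires the device and the Leap SDK v2
    Python bindings; not called in this sketch."""
    import Leap  # provided by the Leap Motion SDK install
    controller = Leap.Controller()
    while True:
        frame = controller.frame()              # latest tracking frame
        if frame.hands:
            y = frame.hands[0].palm_position.y  # millimetres above the device
            set_joint_angle(palm_height_to_angle(y))
```

With Poppy's `pypot` library, `set_joint_angle` could simply write to a motor's `goal_position`; the same mapping works for any joint and any hand parameter the SDK exposes.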


Oh, that’s very exciting! Very good news! Given the operational range, I’m not surprised that the Raspberry Pi is not yet supported: this tool is not designed to be embedded (though it could serve as a sensor). On their blog, it seems to be planned, or perhaps already done.

I’ve seen a VR headset fitted with a Leap Motion for hand detection, for example. In your experience, what is the effective range? Do you have ideas for applications?

Yes, I know they plan to do it. I am waiting… I plan to use a Raspberry Pi 3 as a “non-embedded” station to avoid using my laptop during shows (in case of a big crash, for instance).
For the VR headset, the detection range is 60–80 cm. It works very well when you mount the device on your head, but keep in mind that there is a USB cable which can restrict your movements.
I plan to mount the Leap Motion on a Google Cardboard, since the Oculus is too expensive (and not yet available).