Leap Motion with the Raspberry Pi

Robots can work where humans would find it inconvenient or dangerous; in fact, that is one of the reasons people build them. For example, in areas where high levels of nuclear radiation would be fatal to a human being, a robot can work happily. Science fiction movies have exploited this idea several times: a robot mimics the hand movements of its human controller, who watches and manipulates it from a safe distance. Now, with a few motion-controlled servos, a Leap Motion controller, and the Raspberry Pi (RBPi), the tiny single-board computer, you too can build a robot that mirrors the movement of your hands, even when you are sitting on the opposite side of the Earth.

The project involves two sets of servos, one mirroring each of your hands. A Leap Motion controller captures the motion of your arms and sends the appropriate instructions to the RBPi, which drives the servos through a PWM driver board. Two 8×8 RGB LED matrices, one attached to each servo assembly, react to the movement of each finger on your hands. The Leap Motion controller communicates with the RBPi via PubNub data streams.

The project uses an RBPi Model B+, a Leap Motion controller with the Leap Motion Java SDK, four Tower Pro micro servos, the Adafruit PWM servo driver, and an optional display case.

The Leap Motion controller is a powerful device, equipped with three infrared LEDs and two monochromatic IR cameras. The cameras capture the movement of your hands, and the Leap Motion software publishes their attributes to a channel via PubNub. The Leap Motion SDK exposes pitch, yaw, and roll directly, decomposing the movement of each hand into these three attributes.
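As a rough illustration, the sketch below uses the Leap Motion Java SDK (v2-era API) to read the pitch, yaw, and roll of each detected hand from the latest frame. The SDK class and method names are real, but the surrounding class is only an assumed way of organizing the code, not taken from the original project.

import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Hand;

// Minimal sketch: pull pitch, yaw and roll (in radians) out of the latest frame.
public class HandAttributes {
    public static void printHands(Controller controller) {
        Frame frame = controller.frame();           // most recent tracking frame
        for (Hand hand : frame.hands()) {
            float pitch = hand.direction().pitch(); // rotation around the X-axis
            float yaw   = hand.direction().yaw();   // rotation around the Y-axis
            float roll  = hand.palmNormal().roll(); // rotation around the Z-axis
            String side = hand.isLeft() ? "left" : "right";
            System.out.printf("%s hand: pitch=%.2f yaw=%.2f roll=%.2f%n",
                    side, pitch, yaw, roll);
        }
    }
}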

To achieve real-time mirroring, Leap Motion publishes the attribute messages roughly twenty times per second, sending information about each arm and each of your fingers to PubNub. Because the RBPi subscribes to the same channel, it can parse these messages to control the servos and the RGB LED matrices.
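For reference, a publishing call with the 3.x-era PubNub Java SDK might look roughly like the following; the channel name, keys, and message layout are placeholders rather than the original project's values.

import org.json.JSONException;
import org.json.JSONObject;
import com.pubnub.api.Callback;
import com.pubnub.api.Pubnub;

// Rough sketch: bundle one hand's attributes into JSON and publish it to a channel.
public class LeapPublisher {
    private final Pubnub pubnub = new Pubnub("your-publish-key", "your-subscribe-key");

    public void publishHand(String side, float pitch, float yaw, int extendedFingers) {
        try {
            JSONObject message = new JSONObject();
            message.put("hand", side);
            message.put("pitch", pitch);
            message.put("yaw", yaw);
            message.put("fingers", extendedFingers);
            pubnub.publish("leap2pi", message, new Callback() {
                @Override
                public void successCallback(String channel, Object response) {
                    // message delivered; the RBPi subscriber picks it up from here
                }
            });
        } catch (JSONException e) {
            e.printStackTrace();
        }
    }
}

On the RBPi, a matching subscribe call on the same channel receives these JSON messages and hands the pitch, yaw, and finger counts over to the servo and LED code.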

To start, open a Java IDE and create a new project. You will find a guide for the Leap Motion Java SDK here. Then install the PubNub Java SDK. Make your main class implement Runnable, which allows all the Leap activity to run in its own thread.
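A minimal skeleton for that arrangement might look like the following; the class names and the listener wiring are assumptions, but the Runnable-in-a-thread pattern and the Controller/Listener classes come straight from the Leap Motion Java SDK.

import com.leapmotion.leap.Controller;
import com.leapmotion.leap.Frame;
import com.leapmotion.leap.Listener;

// Runnable wrapper so all the Leap Motion work lives in its own thread.
public class LeapMirrorTask implements Runnable {

    // Listener whose onFrame() fires whenever the controller has new tracking data.
    static class MirrorListener extends Listener {
        @Override
        public void onFrame(Controller controller) {
            Frame frame = controller.frame();
            // TODO: extract pitch/yaw here and publish them to PubNub
        }
    }

    @Override
    public void run() {
        Controller controller = new Controller();
        Listener listener = new MirrorListener();
        controller.addListener(listener);
        try {
            // Keep this thread alive; the SDK invokes the listener callbacks.
            Thread.currentThread().join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            controller.removeListener(listener);
        }
    }

    public static void main(String[] args) {
        new Thread(new LeapMirrorTask(), "leap-mirror").start();
    }
}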

Every second, Leap Motion captures nearly 300 frames. Each frame carries a large amount of information about the hands, such as the number of fingers currently extended and hand attributes such as pitch and yaw. To simulate the motion of a hand, one servo mirrors its pitch while the other mirrors its yaw; pitch is the rotation around the X-axis and yaw is the rotation around the Y-axis. Both servos sweep through 180 degrees, so the resulting assembly mimics most of the movements your hands make.
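One way to turn a pitch or yaw reading into a servo angle is to clamp the value to ±90 degrees and shift it into the servo's 0 to 180 degree sweep. The arithmetic below is a sketch under that assumption; the function name and clamping range are not taken from the original project.

// Map a Leap Motion angle in radians onto a 0-180 degree servo sweep.
public final class ServoMath {
    private ServoMath() {}

    public static double radiansToServoDegrees(double radians) {
        double degrees = Math.toDegrees(radians);     // e.g. pitch or yaw
        // Hands rarely rotate beyond +/-90 degrees; clamp so the servo is
        // never asked to move outside its physical range.
        degrees = Math.max(-90.0, Math.min(90.0, degrees));
        return degrees + 90.0;                        // shift to 0..180
    }
}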

Leap Motion outputs the pitch and yaw values in radians. The RBPi is responsible for converting these radians into degrees, and finally into a PWM pulse length between 150 and 600 ticks for driving the servos.
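The 150 and 600 figures are tick counts on the PWM driver's 12-bit (0 to 4095) cycle rather than a frequency. A small sketch of this final conversion step might look like the following; the constants and the linear mapping are assumptions based on the commonly used servo range for the Adafruit driver, not values quoted from the original project.

// Convert a 0-180 degree servo angle into a pulse length for the PWM driver,
// where roughly 150 ticks is the servo's minimum position and 600 its maximum.
public final class PulseMath {
    private static final int SERVO_MIN = 150;  // tick count out of 4096
    private static final int SERVO_MAX = 600;  // tick count out of 4096

    private PulseMath() {}

    public static int degreesToPulse(double degrees) {
        double clamped = Math.max(0.0, Math.min(180.0, degrees));
        return (int) Math.round(SERVO_MIN + (clamped / 180.0) * (SERVO_MAX - SERVO_MIN));
    }
}

The resulting tick count is what you would hand to whatever PWM driver library runs on the RBPi; Adafruit's Python library for this board, for instance, accepts such a value through its setPWM call.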