Leap Motion Controller Review: Hand Passes and Games for Leap Motion

The Leap Motion device appeared on the market not long ago, but it has already won a user base that continues to grow. The range of applications for this technology is wide and limited only by the developer's imagination; most often the device is used in gaming and advertising.

Minimum system requirements:

Windows® 7 or Windows® 8

AMD Phenom ™ II or Intel® Core ™ i3

2 GB RAM

USB 2.0 Port

Internet Connection

Leap Motion works only on desktop platforms: Windows, Mac OS, and Linux. Judging by information on the official forum, the developers do not yet plan to support mobile platforms; I suspect this is because mobile devices are not yet powerful enough for Leap Motion's processing.

The device's field of view is 120 degrees in depth (along the Z axis) and 150 degrees in width (along the X axis).

The maximum tracking height (along the Y axis) is 25 cm and can be changed in the settings of the software supplied with the device.

(photo: leapmotion.com) Leap Motion workspace

Before starting development for a new device, it is always worth understanding how it works. The better you understand the technical details, the easier it is to design information systems around the device. This knowledge lets you not only anticipate situations in which the device is physically impossible to use, but also expand its range of applications by adapting the hardware to specific tasks.

The Leap Motion device in question is technically not something super complicated. Inside there are two infrared cameras and three powerful infrared LEDs.

Leap Motion inside

The principle of operation is simple: infrared (IR) LEDs illuminate the hands, and the infrared cameras capture them, passing the images to the Leap Motion software for processing. At the software level, mathematical algorithms extract the contours of the hands and track the coordinates of the fingers. Starting with SDK 2.0, Leap Motion can also identify the component parts of the hand: the algorithm determines the bones of the hand and wrist and tracks their movement in space. This opens new horizons for expanding the set of recognizable gestures.
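As an illustration of what the SDK 2.0 skeletal tracking exposes, here is a minimal sketch, assuming the LeapSDK 2.x C++ API (Leap.h and the libLeap library on the build path); it simply polls one frame and prints the joints of every finger bone. This is not the article's own code, just a rough example of the bone data described above.

// minimal sketch, assumes LeapSDK 2.x C++ API (Leap.h, libLeap)
#include <iostream>
#include "Leap.h"

int main()
{
    Leap::Controller controller;                 // connects to the Leap Motion service
    while (!controller.isConnected()) {}         // crude wait; a real app would use a listener
    Leap::Frame frame = controller.frame();      // latest tracking frame
    for (Leap::Hand hand : frame.hands()) {
        for (Leap::Finger finger : hand.fingers()) {
            // each finger exposes four bones: metacarpal, proximal, intermediate, distal
            for (int b = 0; b < 4; ++b) {
                Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
                Leap::Vector start = bone.prevJoint();   // joint closer to the wrist, in mm
                Leap::Vector end   = bone.nextJoint();   // joint closer to the fingertip, in mm
                std::cout << "bone " << b << ": ("
                          << start.x << ", " << start.y << ", " << start.z << ") -> ("
                          << end.x << ", " << end.y << ", " << end.z << ")\n";
            }
        }
    }
    return 0;
}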

How Leap Motion works

On forums and in personal conversations many people ask whether Leap Motion or Kinect can be used as a thermal imager. Most assume that any IR camera can see the heat radiated by the human body, and this is a big misconception. An IR camera and a thermal imager are quite different devices. Both are built on the principle of sensing invisible infrared radiation, but the spectrum they perceive is different. Thermal imager sensors are made from different, more sensitive materials that can detect the IR waves emitted even by bodies at low temperatures, which is exactly why thermal imagers are so expensive.

If you want to use such a device as a thermal imager, the fair question is: what temperature range do you want to monitor?
Wien's displacement law lets us calculate the wavelength at which a body's thermal emission peaks. Since this article is about a different topic, I will not go through the calculations and will just give a few computed wavelengths (a quick check against Wien's formula follows the list):

- Human body: 9300 nm

- Fire: 3000 nm
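For readers who want to verify these numbers, here is the one-line calculation behind them, my own rough estimate assuming about 310 K for skin and on the order of 1000 K for the cooler part of a flame:

\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2.898 \times 10^{-3}\ \text{m·K}

\lambda_{\max}^{\text{body}} \approx \frac{2.898 \times 10^{-3}}{310} \approx 9.3 \times 10^{-6}\ \text{m} \approx 9300\ \text{nm}

\lambda_{\max}^{\text{flame}} \approx \frac{2.898 \times 10^{-3}}{1000} \approx 2.9 \times 10^{-6}\ \text{m} \approx 3000\ \text{nm}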

The infrared cameras of Leap Motion and Microsoft Kinect therefore cannot sense the temperature of the human body, but they can see the flame of a lighter or the filament of an incandescent lamp.

Let's do an experiment. Take a Microsoft Kinect sensor and cover its infrared emitter tightly with a metal cap. Then launch a program that simply displays the image from the infrared camera on the screen. The screen is black, because there is no infrared radiation in the room. But as soon as a lighter is lit, the infrared radiation from its flame appears on the screen.

Let's close the infrared emitter with a metal plug

Nothing appears on the screen

Let's take a lighter

Infrared radiation from the lighter is displayed on the screen

The same happens if you aim the infrared camera at a table lamp: the lamp itself is visible, but its light is not seen reflected from a hand or from the surrounding objects.

Infrared radiation on the screen from an energy-saving lamp

The light of the energy-saving lamp clearly falls on the hand, yet the hand does not show up on the screen

Optical spectrum

The sensor of any digital camera is sensitive to infrared radiation, so special IR filters are used to narrow the perceived spectrum; such a filter is present in virtually every digital camera.
To see this for yourself, take a TV remote control, point your webcam or phone camera at it and press a button: on the screen you will see the IR diode glowing, even though its radiation is invisible to the human eye. Because of the camera's IR filter the glow looks dim, but if you remove the filter, the light will practically blind the sensor.

Leap Motion Infrared Illumination

Infrared filters are classified into several types:

- IR-cut filters, which block only infrared radiation. They are found in most modern photographic equipment: webcams, cameras, mobile phone cameras and so on. The filter sits between the lens and the sensor. Visually it can be recognized by its tint: it may look blue, pink or another color, and in the light it can shimmer with all the colors of the rainbow, depending on which part of the spectrum it has to cut off.

- IR-pass filters, which transmit only infrared radiation and cut off the visible spectrum. Such a filter is usually black. Filters of this type are used in thermal imagers and are popular with photographers for the technique called "infrared photography"; in everyday life the closest analogue is dark sunglasses. This is the kind of filter used in the Leap Motion device we are considering: its black glossy top surface is the IR-pass filter.

Thus, the combination of an infrared camera and a pass filter lets Leap Motion perceive only a narrow infrared band.

Capturing a frame from Leap Motion cameras

However, even though infrared illumination and physical filtering prepare the images well for further software processing, the device cannot be used in direct sunlight or next to other sources of infrared light. If Leap Motion sits near a window and direct sunlight hits it, the device will report that it cannot recognize the image.

Hand recognition is fast enough, but the speed depends on the power of the computer, which actually processes the data received from the two cameras.

Among the disadvantages is the inability to recognize gestures that require turning the hand edge-on toward the device.

Leap Motion does not recognize the gesture well

In this position the hand coordinates become confused and jitter on the screen, which makes precise actions difficult. I certainly would not use Leap Motion to control a bomb-disposal robot 😉

Also Leap Motion does not recognize gestures in which two hands are connected together.

(photo: anijoin.by) Gestures that Leap Motion does not recognize

The update of the Leap Motion libraries to version two was a pleasant surprise: it is immediately noticeable that the recognition algorithms have become more accurate.

If you came to this article looking for an answer to the question "Is Leap Motion technology worth spending time on?", my answer is: definitely yes!
Today Leap Motion is an excellent and affordable tool for contactless control of software or devices in which control is based on the simplest hand gestures. Leap Motion opens new horizons for software developers. There is no doubt that in the future this device will evolve and new versions will take into account and fix all current errors and shortcomings.

I hope the information was useful for you.

Price: 4990 rubles.

Package contents: the device itself and two USB cables.

Appearance.

When I first picked up the box with the Leap Motion I had been given, a single thought was spinning in my head: what on earth is this? The conversation about this market novelty half a year earlier had completely slipped my mind, and no Russian instructions were included, so I had to turn to Google, which gave me roughly the following. :)

In theory. In practice, of course, everything is somewhat different. :)

The device itself looks roughly like the picture, though its dimensions are somewhat larger: about 8 cm long and 3 cm wide. Thanks to the rubberized base it should sit firmly on the table, although mine always ends up slightly askew :) It connects to the computer with a USB 3.0 cable, sits between the monitor and the keyboard, and works fine on Windows 7.

Many people write about some special tests when connecting a device, but I didn't have anything like that, except for registration in Leap Motion Airspace, but more on that below.

The Leap Motion software.

After the system recognized the foreign device and happily announced that it was ready for operation, I solemnly waved my hand in front of the monitor, and ... the miracle did not happen. :)

I had to go to the site https://airspace.leapmotion.com/ and download Leap Motion Airspace from there - a program that is a typical service for distributing games and other software created for this device.

I downloaded, installed, registered, installed the software, started testing. The first program, Orientation, demonstrates the most spectacular capabilities of Leap Motion. It was rather unusual to see your own upper limbs in this form:

Beyond that, the Airspace Store offers games, programs for creating various objects, drawing tools, music makers, and other equally interesting gizmos. Many are paid, of course, but among the free ones there is plenty to play with.

Perhaps the most unpleasant discovery was that the software is split between Mac and Windows, so owners of the "wrong" system can only look longingly at some of the programs.

To control the computer itself with gestures, go to the Computer Controls section and download Touchless from there, for Mac or Windows respectively.

Configuring Leap Motion for normal operation.

Then the fun begins. :) At first it took me five minutes just to hit an icon on the desktop. The feeling was like using a computer for the very first time: everything is scary, things click and nothing works. :) So at some point the poking at the screen turned bewilderment into disappointment.

Then it dawned on me that it would be a good idea to look into the settings, if there were any. They turned up in the notification area as a compact Gestures icon. Right-click it (oh, what bliss when everything works right away, as it should) and select Open the Gestures control panel.

In the Gestures window that opens, set a precise pen and touch sensitivity. Then go to the Pen Options tab, select the Double-tap / Double-click line and press Settings.

In the new window, move the Speed and Distance sliders all the way toward "more" and test the result in the small window with a door: if everything is done correctly and you have gotten used to the double tap, the door will open. Repeat the same procedure on the Touch tab.

The Leap Motion Controller was named one of the top ten devices of the year by Time magazine. It belongs to the glorious family of new-generation controllers such as the Wii Remote and PlayStation Move, but its closest relative is the Xbox Kinect. Unlike the latter, Leap Motion reacts exclusively to hand movements and tracks even the fastest motions of hands and fingers up to 200 times more accurately. This device brings us yet closer to real virtual reality - to a natural interface between man and machine. Hurray, comrades!

Leap Motion Controller

After the release of the Kinect sensor, and on the wave of its success, other contactless control devices began to appear. Kinect laid the groundwork for this market: investors saw the prospects and understood the point of investing in gesture-control devices. The most significant and successful of them, however, has been the Leap Motion Controller. Like its progenitor, it is based on motion-capture technology. The device connects to a USB port and is no larger than a pair of folded sticks. Technically, to capture the projection of the user's hands in space, the Leap device uses two optical sensors (cameras) and an infrared light source (the developers do not rule out changing the number of cameras in future versions). The device is placed working surface up next to the screen, creating the feeling that the objects on the screen are being controlled by your hands. Once connected, a virtual inverted pyramid forms above it, with its apex at the device. The most effective range extends from 25 to 600 mm above the controller, with a field of view of 150 degrees. Within this pyramid Leap Motion "sees" all movements and passes them to the software, which converts the data and signals into coordinates and messages. The software can recognize both simple gestures (virtual touches, presses) and complex extended movements: scaling, moving, rotating, drawing various geometric shapes. The device itself therefore performs no calculations or transformations, leaving everything to the host software, which removes image noise and builds models of the hands and fingers (pointers). With the origin at the center of the device, Leap interprets the coordinate axes as follows: negative X lies to the left of the device and positive X to the right; the Y coordinate grows upward and has no negative values, since Leap "sees" objects starting about 25 mm above it; positive Z points toward the user, negative Z toward the screen.
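To make these coordinate conventions concrete, here is a small sketch, again assuming the LeapSDK 2.x C++ API, that reads the palm position in millimetres and also converts it to normalized [0..1] coordinates through the SDK's InteractionBox, which is convenient when mapping hands onto the screen. It is an illustration, not code from the article.

// sketch: reading palm coordinates, assumes LeapSDK 2.x C++ API
#include <iostream>
#include "Leap.h"

void printPalms(const Leap::Controller &controller)
{
    Leap::Frame frame = controller.frame();
    Leap::InteractionBox box = frame.interactionBox();   // the usable volume above the device
    for (Leap::Hand hand : frame.hands()) {
        Leap::Vector pos = hand.palmPosition();          // millimetres, origin at the device centre
        // x: negative = left of the device; y: always positive, grows upward; z: positive = toward the user
        Leap::Vector norm = box.normalizePoint(pos);     // each component clamped to [0..1]
        std::cout << (hand.isLeft() ? "left" : "right")
                  << " palm at (" << pos.x << ", " << pos.y << ", " << pos.z << ") mm, "
                  << "normalized (" << norm.x << ", " << norm.y << ", " << norm.z << ")\n";
    }
}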

Leap Motion SDK

The Leap Motion SDK is developing surprisingly quickly, and new versions come out with enviable regularity: in its relatively short history a full-fledged second version of the tools has already appeared, along with modifications to it. To be precise, those modifications are still in beta; we will use the latest SDK version available at the time of writing, since each new release brings visible improvements - additional capabilities for tracking the skeleton (the "bones" of the hands). As you might expect, the Leap Motion SDK works on all common platforms: Windows, OS X, Linux. Since lately I mostly work on a Mac (and I'm editing this article on an EEE PC with Win XP, and I'm fine - Ed.), my story will, with some reservations, refer to that operating system. If you don't use it, don't despair: the Leap Motion SDK is cross-platform, and you can easily adapt the material in this article to any supported operating system.

Ready to work hard!

To start working with the Leap Motion controller, register on the device manufacturer's website and download the LeapDeveloperKit_2.1.1+21671_mac.tar archive from the Downloads section. After unpacking it you will find a folder containing the bundle Leap_Motion_Installer_skeleton-release_public_mac_x64_2.1.1+21671_ah1704.dmg (a disk image for OS X) with the device drivers and demo applications. Next to the bundle is the LeapSDK directory, which includes all the libraries and APIs needed to develop applications for the Leap Motion device, along with documentation and samples. Besides the demos, the bundle contains Airspace Home, a kind of client for the Leap Motion app store: you can upload your apps to it and sell them, as on other digital distribution platforms. The main difference between the second version of the SDK and the first is the new system for tracking the "skeleton" of the upper limbs: it processes additional information about the bones of the hands and fingers, can predict the location of bones invisible to the device, and builds hand models even when the limbs are only partially visible.

First, install the bundle content (I'm sure it has the same name under Windows, only with the exe extension). The installation program itself, which is inside the image, is called Leap Motion.pkg, it starts the installation process of all of the above.


Figure: 2. Installing the program

After the installation of the Leap Motion software is completed, the driver will automatically start, which will "settle" in the form of a daemon in the menu bar (top right). Three new applications will appear in "Programs": the driver itself, the Leap Motion Orientation demo program (I recommend starting with it) and Airspace. If the controller was not previously connected, it's time to do it. The icon (in the menu bar) will be highlighted in green. Clicking on it will open a menu containing five items. The first item Launch Airspace launches the window client of the same name. By default, it contains seven demos and two links leading to the Airspace Store and the developer community. Each of the demos showcases the capabilities of Leap Motion.

The next menu item, Settings, opens the device configuration window, which has four tabs. On the General page the basic settings are made: allow or forbid the device to interact with web applications that support Leap Motion (looking ahead, I will note that such a capability exists and uses HTML5 + JavaScript); enable or disable delivery of device signals to applications running in the background; automatically send device statistics; enable, if needed, the power-saving mode; adjust the minimum height above the device at which it "sees" hands and fingers (pointers); and agree to automatic updates. The Tracking page has two options related to the device's tracking behavior. The next tab is devoted to diagnostics and troubleshooting: viewing the software log, a diagnostic visualizer, recalibrating the device, and restoring the default settings. The last tab simply shows information about the device and the software that serves it. Clicking the Visualizer item opens a demonstrator in which you can see how the device "sees" your limbs: move your hands over the device's active area and the application will display them in virtual space. The Pause Tracking button pauses tracking, and Quit shuts down the daemon.


When the Leap Motion software is installed, you can install the developer tools. I will assume you have current versions of the operating system and the development tools (Xcode). As I said above, after unpacking the archive the SDK folder sits next to the installation bundle; it contains documentation, examples, and header and object files for all officially supported languages. The Leap Motion SDK is originally written in C++, but thanks to SWIG it supports many common compiled and interpreted languages, including C# (with the .NET and Mono frameworks plus the Unity 3D engine), Objective-C, Java, Python, and JavaScript. SWIG is a free, open-source tool that generates glue code between C++ and other languages. For our experiments we will take C++ as the most native option. The client computer and the controller interact over a TCP connection that uses ports 6437, 6438, and 6439; for the device to work correctly, make sure they are not blocked by a firewall. The Leap Motion SDK lets you develop two kinds of applications: native (client) applications and WebSocket applications (web apps running in a browser). The former receive data from the controller through a dynamic library specific to the operating system, which connects to the device and provides a service to the upper level. The latter receive data as JSON messages from a WebSocket server on the local host; in this case the open-source JavaScript add-on LeapJS is used, and to control the device the application can send configuration messages back to it through the WebSocket server.

Coding for Leap Motion

Today we will focus on native apps for OS X, but thanks to the cross-platform tools, you can easily remake our programs for another supported operating system. We will not develop a console application that shows the coordinates passed by the controller, this is boring. We'll immediately dive into some serious code and write an application that displays a graphical representation.

Visualization

The Leap Motion SDK provides a wonderful way to get data from the controller, but it has nothing to do with graphics, so our path lies through additional tools. To display graphics from a native OS X application you have to use OpenGL. That thought is depressing: the level is too low, no single article would be enough, and you could simply fall asleep. So we will use a layer on top of OpenGL. From the wide range of such libraries I chose Cinder, a set of open-source libraries for image processing, graphics, sound, and computational geometry. As I said, Cinder is cross-platform, and the same code will run not only on desktop platforms but also on Apple smartphones and tablets; in the future the developers plan to widen the range of supported hardware and software platforms. The Cinder package also includes the TinderBox utility for generating a new project template. With it you can create a project with OpenGL, DirectX, or CocoaView (OpenGL) support, and each template can include support for the Box2D physics engine, the Cairo rendering library, the FMOD audio library, and the OpenCV computer vision library. For Apple devices you can generate a template that uses the geolocation and motion managers through the standard frameworks (Core Location, Core Motion). All of this is easily included at project creation time through the GUI, and the project can be generated for a specific development environment and target: Xcode (Mac), Xcode (iOS), VC 12/13 (WinRT). The upshot: this is more than an API library; it all looks like a cross-platform game engine. You can also create a local Git repository right away. In my humble opinion, Cinder will soon be the best cross-platform solution, even compared with Qt. Since Cinder relies heavily on boost, it is a good idea to update it to the latest version. Open your favorite console and first install Homebrew, the package manager for software that Apple neglects:

Ruby -e "$ (curl -fsSL https://raw.github.com/Homebrew/homebrew/go/install)"

After that, install boost 1.55 from this system: brew install boost. To work directly with Cinder, it is enough to download and unpack it, and to generate a project - use the TinderBox utility located in the tools subfolder.

Hands, fingers, space management

So, to warm up, let's create an application that displays what the sensor sees in a window. If you read my articles about Kinect, you may remember that we started there the same way - consider it a tradition. The stock TinderBox template for OpenGL suits us perfectly; we just need to add Leap Motion support to it. To do so, drag two files from the include subdirectory of the previously unpacked LeapSDK folder (see above) into the Xcode project tree: Leap.h and LeapMath.h. When the transfer completes, a dialog appears where you must specify how the files are added and linked to the project: check Destination -> Copy items into destination group's folder (if needed), check Folders -> Create groups for any added folders, and below mark the target to which the files are being added.

A dynamic library is also needed. Since the C++ compiler (LLVM) that ships with Xcode follows the C++11 standard, you must use a library built with it. Such a library exists: the OS X version is called libLeap.dylib and lives in the libc++ subdirectory of the lib subfolder of the LeapSDK directory. The library likewise has to be dragged into Xcode, going through the same dialog. Now Xcode must be told to use the library added to the project. In the project tree, click the project name (the top item) to open the project configuration and go to the Build Phases tab. In the upper left corner of the tab, click the plus sign and choose New Copy Files Build Phase from the menu that appears. A collapsed Copy Files panel appears at the bottom of the tab. Expand it, select Executables from the Destination drop-down list, and drag the dynamic library from the project tree into the empty file list below; the Copy only when installing checkbox should stay unchecked. The library is now linked to the project.

The next step is to let the sensor transmit the "raw" image data of what it sees: in the Leap Motion settings (the Settings item in the context menu of the device icon in the menu bar), on the General tab, check the Allow Images checkbox.

The template generated by TinderBox includes several folders, files, and the necessary frameworks. Since I named the project RawImagesApp, I added a RawImages.h header file. In it I put the includes for the Cinder and Leap headers, the using-declaration for the Leap namespace, and the declaration of the Leap Motion controller object - in fact, the central subject of our discussion. TinderBox also generated the project's source code, which serves as a good starting point for development. The cpp file contains the application's main class (in my case RawImagesApp), named after the project and inherited from Cinder's base class AppNative. The window is created with the CINDER_APP_NATIVE macro. The RawImagesApp class declares and implements the virtual functions of the base class. The setup function is called at application start and holds the initialization code: to receive "raw" graphics data, you must set a special sensor policy flag here by calling the controller's setPolicyFlags method and passing it the POLICY_IMAGES value. The update function is called every frame to update state; draw is called to redraw the content; mouseDown is called when a mouse button is pressed.
By default not all possible functions are present; for example, you can add prepareSettings, a function called before the window is created that lets you pass parameters to it. We will add it to make the window larger at creation and to set its refresh rate. The declaration inside the RawImagesApp class looks like this:

void prepareSettings(ci::app::AppBasic::Settings *settings);

and the implementation is like this:

void RawImagesApp::prepareSettings(Settings *settings)
{
    settings->setWindowSize(1024, 768);
    settings->setFrameRate(60.0f);
}

I'm sure no comments are needed here. Let's add an OpenGL texture to the main application class: gl::Texture tex; - we will need it for output. In the update function we will receive images from the sensor frame by frame, process them, and put them into the texture (see the source). On each frame we get a controller frame: Frame frame = controller.frame();. A Frame object contains all the other objects the controller produces information about; we just have to extract them. By the way, getting a frame this way - taking it from the controller yourself (polling the device) - is the simplest and most commonly used approach. All the intermediate cases are defined: if the new frame is not ready at the next poll, the old one is returned; if several frames become ready between polls, they are placed in the history. There is one more way to get frames, but we don't need it yet and will postpone it to the next section.

Having received the frame, we extract the images captured by the sensor: ImageList images = frame.images();. There are two of them, since the sensor has two cameras, so at every moment there are two pictures, and we process both in turn. First, the line const unsigned char *image_buffer = image.data(); gets the image data; at different moments the controller can return images that differ not only in content but also in size. The next line creates a Surface graphics object, part of the Cinder API. Its constructor takes four parameters: the width and height of the surface, whether to use an alpha channel, and the order of the color channels (the SurfaceChannelOrder::RGBA constant corresponds to the standard red, green, blue, alpha order, but there are others; GDI or Quartz, for example, use a different sequence). An iterator then walks over all pixels of the (so far empty) surface, and inside this loop the pixel colors are set. I decided to give the output image a reddish tint (like in DOOM :)), so the red channel of each pixel is set to the corresponding value from the image data and the remaining channels are cleared. After traversing the whole image we construct a texture object with gl::Texture, based on the surface passed as a parameter. If you render the texture to the screen now, it will be too small, so we scale it first: glScalef(2.0, 3.0, 0.0);. Now we draw it: gl::draw(tex);.
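As a condensed sketch of the update and draw steps just described, here is how they might look, assuming the Cinder 0.8.x AppNative API and LeapSDK 2.x; the member names controller and tex follow the RawImagesApp example from the text, but the details may differ from the author's full source.

// condensed sketch of RawImagesApp::update/draw, assumes Cinder 0.8.x and LeapSDK 2.x
void RawImagesApp::update()
{
    Leap::Frame frame = controller.frame();              // poll the latest frame
    Leap::ImageList images = frame.images();             // two IR images, one per camera
    if (images.count() == 0)
        return;
    Leap::Image image = images[0];                        // take one camera for brevity
    const unsigned char *data = image.data();             // 8-bit brightness values
    ci::Surface surface(image.width(), image.height(), true, ci::SurfaceChannelOrder::RGBA);
    ci::Surface::Iter it = surface.getIter();
    size_t i = 0;
    while (it.line()) {
        while (it.pixel()) {
            it.r() = data[i++];                            // IR brightness goes into the red channel
            it.g() = 0;                                    // clear the other channels for the "DOOM" tint
            it.b() = 0;
            it.a() = 255;
        }
    }
    tex = ci::gl::Texture(surface);                        // upload the surface as an OpenGL texture
}

void RawImagesApp::draw()
{
    ci::gl::clear(ci::Color::black());
    glScalef(2.0f, 3.0f, 1.0f);                            // enlarge the small sensor image
    if (tex)
        ci::gl::draw(tex);
}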

Figure: 5. What the Leap Motion Controller sees

Bones

In the next example we will display our hands in the machine's context, that is, draw them at the appropriate coordinates. This task is harder than the previous one, and LeapSDK still provides a fairly low-level interface, so to simplify things we will use existing work. The American programmer Stephen Schieberl, known as Ban the Rewind, has developed a pair of classes (Listener, inherited from Leap::Listener, and Device) that do all the routine work of handling and returning device state. Stephen has also put functions into the file that perform coordinate and matrix calculations, which lets us concentrate on higher-level work. These calculations are needed chiefly because, unlike the desktop coordinate system, where the Y axis grows from top to bottom, the Leap Motion origin (0, 0, 0) is in the lower left corner and Y grows from bottom to top; therefore Y coordinate values have to be inverted when used. Additional calculations on vectors and matrices are performed as noted above.

So let's create a new project the same way as the previous one, and additionally add the Cinder-LeapMotion.h and Cinder-LeapMotion.cpp files to it (see the materials for the article). The main application class gains new member variables: mDevice, a reference to the device (an object of the hand-written class); mFrame, of the Frame class we examined in the previous section; and mCamera, an object of Cinder's CameraPersp class. An onFrame method (a callback of the parent class) is also added; it takes a Frame object and makes it current by assigning it to the mFrame member. The setup method enables the drawing modes and line and polygon smoothing, initializes the camera (setting the field of view in the constructor parameters and the viewpoint in the lookAt method), and then creates an object of the hand-written Device class, which holds three necessary objects - Controller, Device (from the Leap namespace), and Listener - plus a mutex, which we cannot do without.

This brings us to the second way of receiving frames from the device: listening. Our device class inherits from the Listener class, which makes this possible; we receive frames from the controller at the rate at which it produces them. When the controller has a frame ready, the Listener class calls the onFrame method we have overridden and passes the frame to it as a parameter - the method mentioned above. Why do we need a mutex? With a listener (a callback), onFrame is called in multithreaded mode: each call runs in an independent thread. We therefore have to take care of thread safety at the moment a frame is received from the device, which is what the mutex is for. When listening, you can also ignore the arrival of a new frame (for example, if the previous one has not yet been processed) or add it to the history for later processing. Back in our code: once the object of our Device class has been created, a callback function is set on it.
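Here is a minimal sketch of the listener-based approach described above, assuming the LeapSDK 2.x C++ API and a C++11 compiler; the class and member names are illustrative, not Stephen Schieberl's or the author's exact code.

// sketch of the listener approach, assumes LeapSDK 2.x; names are illustrative
#include <mutex>
#include "Leap.h"

class FrameListener : public Leap::Listener {
public:
    // called by the SDK from its own thread whenever a new frame is ready
    void onFrame(const Leap::Controller &controller) override
    {
        std::lock_guard<std::mutex> lock(mMutex);   // protect the shared frame
        mFrame = controller.frame();
    }

    Leap::Frame currentFrame()
    {
        std::lock_guard<std::mutex> lock(mMutex);
        return mFrame;                               // Frame objects are cheap to copy
    }

private:
    std::mutex  mMutex;
    Leap::Frame mFrame;
};

// usage: the listener must outlive its registration with the controller
// Leap::Controller controller;
// FrameListener listener;
// controller.addListener(listener);
// ... later: controller.removeListener(listener);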

Redrawing

But the most interesting things happen in the redraw method. First come the preparatory steps: clearing the screen, setting the current camera matrices, enabling alpha blending and reading/writing of the depth buffer, and setting the drawing color. Then the actual drawing begins: we get the three-dimensional vectors of the elbow and wrist positions from the device and draw a line between those points with the gl::drawLine method. Next we get the list of fingers and iterate over the container holding them. In Leap Motion each finger has four bones (phalanges): distal, intermediate, proximal, and metacarpal. Although the last of these is missing from the thumb of a real human hand, here it is present but has zero length. In a nested loop over all the phalanges we get the coordinates of their various parts: start, center, end, direction. The coordinates are represented as vectors (Vec3f). Inside this inner loop the phalanx is drawn with the drawLine method, to which the found coordinates are passed. Additionally, a container of joints (knuckles) is assembled from the first phalanges. When the outer loop exits, lines are drawn connecting the fingers and forming the hands. That is where the fun of redrawing ends. Compile and run the program, hold your hands over the sensor, and the outlines of your limbs will appear in the window.
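The skeleton-drawing pass might look roughly like this, assuming Cinder 0.8.x and LeapSDK 2.x; toVec3f() is an assumed helper that converts a Leap::Vector into a ci::Vec3f with the Y axis flipped as discussed above, so this is a sketch rather than the article's actual draw method.

// sketch of the bone-drawing pass, assumes Cinder 0.8.x and LeapSDK 2.x;
// toVec3f() is an assumed helper converting Leap::Vector (with Y inverted) to ci::Vec3f
void drawHands(const Leap::Frame &frame)
{
    for (Leap::Hand hand : frame.hands()) {
        // forearm: a line from the elbow to the wrist
        Leap::Arm arm = hand.arm();
        ci::gl::drawLine(toVec3f(arm.elbowPosition()), toVec3f(arm.wristPosition()));

        for (Leap::Finger finger : hand.fingers()) {
            // four bones per finger: metacarpal, proximal, intermediate, distal
            for (int b = 0; b < 4; ++b) {
                Leap::Bone bone = finger.bone(static_cast<Leap::Bone::Type>(b));
                ci::gl::drawLine(toVec3f(bone.prevJoint()), toVec3f(bone.nextJoint()));
            }
        }
    }
}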

Outcome

Leap Motion is a revolutionary controller: it has not just replaced the touch screen but given us control over space, making the boundary between the real world and virtual reality even more transparent. At the software level we get a convenient programming interface that exposes all the sensor's capabilities. Cross-platform developer tools give access to the device from a variety of programming languages, both compiled and interpreted (so far only two of the latter, Python and JavaScript). The API also has a clean, understandable structure: at each moment the controller takes an image, forms a frame from it, and sends it up to the application, where the programmer, having parsed the frame, works with entities such as hands, fingers, pointers (tools), and more. Because the device has two cameras, it is often mounted on virtual reality glasses to create an augmented reality effect; this works thanks to the measured infrared brightness values in the images taken by the cameras and the calibration data needed to correct for the complex lens. In today's article we touched on building application solutions that interact with the device through the API. The topic is very broad, and far from everything could be covered: gestures, special movements, touch emulation, and much more were left out. All of that, as well as using the controller on Windows and on the web and integrating it with game and graphics engines, can become the subject of future articles. It all depends on you - write to us and ask for more :). In the meantime, good luck in all your endeavors, and see you on the pages of "Hacker"!

Leap Motion Controller Review

Date of publication: 18.08.2013

Last year a Leap Motion demo was shown, and it was hard for anyone to believe that this was not a prototype but a fully finished product to be released within six months. The footage shown by the developer looked a bit too fantastic, bringing to mind the film "Minority Report". And indeed, the miracle did not happen on schedule: the release was first postponed by four months, and then by a couple more. But the main thing, as they say, is the result, and the result is very, very good. This is not just a decent device, it is a genuinely amazing thing.

Concept

The idea behind Leap Motion is to make the interface more intuitive. And it doesn't matter if you are running an operating system or playing a game. And what could be more intuitive than gestures? Remember how cheerfully Tom Cruise worked with the Agency's database in the film "Minority Report":

Impressive, isn't it? Data is flung across the screen, grouped into clusters, and compared with such ease. With a keyboard and mouse it would take far longer to click through all of it.

"But that is a matter of the distant future and, anyway, the invention of a science-fiction writer!" you might say. Yet look around: many technologies of the modern world came out of fairy tales and science fiction - flying through the air at great speed, communicating at a distance with a miniature device that is in fact a powerful computer fitting in the palm of your hand, the Internet... the examples are many.

So Leap Motion has become something of a dream come true for Philip K. Dick, who wrote "The Minority Report" back in 1956:

No less impressive than the film clip above. And, you know, this thing really works! Maybe not as smoothly as in the video, and the feeling of working with Leap Motion is hard to convey in words, but I will try.

Small device in a big box

Considering that the controller is not much larger than a lighter and easily fits in the palm of your hand, it comes in a truly huge box. The packaging, though, is great:

Above is a dust jacket, under it is a box made of very hard and high quality cardboard. You lift the lid and you are invited to plunge into a completely new world:

And here is the hero of the review, sitting at ease in a plastic bath:

In addition to it, the box contains a couple of cables and a small instruction:

By the way, note the cables: one is 60 cm long, the other 150 cm, but the connector is the more interesting part:


It is not proprietary but a standard micro-USB 3.0 connector. To be honest, I don't understand why this option was chosen when the device itself works over USB 2.0 - at least that is what the official website says. A regular micro-USB connector would also be more compact, and a replacement cable is easier to find if the bundled one wears out.

Well, okay, what we have, we have. Let's take a closer look at the device itself.

Leap Motion is small, but weighty and tightly packed. The controller is made in an aluminum case with a glossy plastic panel on top and a rubber one on the bottom:


The build quality is excellent - a monolith: no creaks, no play, nothing moves. I like the rubber bottom: thanks to it the gadget sits securely on almost any flat surface.

Of the additional elements on the case, there is only a microUSB 3.0 connector and a status indicator:


Fancying myself smart enough to figure things out without instructions, I decided to connect the controller to the Mac right away and improvise, without bothering with the papers. I plugged it in and... nothing happened. Instructions are for cowards? No, friends, it is still better to read them - it saves both time and nerves.

How to make Leap Motion friends with a computer

All I actually had to do was go to the project's official website. The maker of the accessory urges you to do this as soon as you open the box and take the gadget out: the protective sticker on top clearly says "Activate at leapmotion.com/setup". Following the link, I was prompted to download the drivers:

By the way, the downloaded package is quite heavy, a full 80 MB, and after installation it takes up more than 200 MB:

I installed the drivers and software, and once connected to the computer the little box immediately came to life: the status LED lit up green and red dots appeared on top - the IR emitters:


Things were happening on the screen too: the software not only asked me to connect the gadget and position it properly, but also to clean the top panel of fingerprints so the device would work better. I don't know where it found any prints - I had taken the device out very carefully without touching the top panel - but I wiped it anyway, brushing off a few specks of dust:


Well, that's it, now I will touch the magic! But no, you still need to launch the Airspace companion app and register with the local app store:


I signed up, launched Airspace, and finally the future was within reach. Don't rush, though: first the demo applications have to download, and they are fairly large - tens of megabytes at the least, and usually well over a hundred.

While they download, it is worth going through the accessory's settings; there are not many of them.

In the tab with basic settings you can enable or disable the transmission of information from the sensor to web applications and programs running in the background, as well as adjust the height at which the tracking is carried out:



The Tracking tab configures the sensor's priority (speed, accuracy, or a balance of the two) and its operating scheme: whether to automatically track the orientation of hands in the air and whether to filter out unwanted objects in Leap Motion's range, such as shoulders and the head.

Well, all the demo programs are loaded, it's time to test the gadget in action.

Hand passes

When Leap Motion arrived at the editorial office, still in its packaging, it did not stir any particular emotions - we had waited too long and somehow burned out, I guess. Even while setting the device up I personally felt nothing special.


But as soon as I launched the first application and saw the device in action - doing things that until recently seemed incredible without touching keyboard, mouse, or screen, like rotating a DNA molecule just as in "Minority Report" or scattering schools of fish with a light wave of the palm - the wild delight and the feeling that the world is changing before your eyes came back.

The controller really does track palms, fingers, and their movement very accurately. Still, it takes some getting used to. First, don't make overly sharp movements: Leap Motion's performance is high, but don't expect miracles, and with sudden movements hands often slip out of the sensor's range.

The second important point is finding the most comfortable tracking height, which is configured in the options I mentioned above. During testing the standard value of 20 cm suited me best, but I mostly interacted with the gadget while standing; if you are sitting, it is worth lowering the height a little or using automatic tracking.


As for applications, there are already plenty for Leap Motion, both paid (prices are roughly at Apple App Store levels) and free. The company behind the device has in effect opened its own app store, which many third-party developers have already embraced - fortunately, the SDK has been available for over a year.



There are games, there are just cool programs that demonstrate the capabilities of the controller, there are applications for creating music, teaching, etc. There is software for both Windows and OS X.

For the most part it is still a toy, but an incredible one that stirs a storm of emotions. I rejoiced like a child when the virtual world reacted to the movements my hands made in this one - when ripples spread across water at a touch, when the crown of a tree with glowing leaves swayed, when you steer a virtual paddle in a 3D Arkanoid and hit the ball as if holding a real racket in your hands... it is all very addictive.


Let a child try playing with Leap Motion and you will not pull them away from the computer for a long time. Children need no explanations - they pick everything up on the fly, and soon enough a three-year-old is briskly slicing ropes in Cut the Rope and feeding Om Nom candy.

Leap Motion has a great future. For now it is more a toy than a serious tool or a replacement for mouse and touchpad, but it really works. More importantly, developers have gotten behind the device, and that is what matters most: without software even the coolest gadget is just a dead pile of metal, plastic, and silicon. Programs are the soul of a device, and Leap Motion's soul is broad, kind, and promising.

Last October, Leap Motion, which is vying for the virtual reality market with its motion controller and its Oculus Rift mount, teamed up with IndieCade for a competition of gesture-based apps with a $75,000 prize pool. By now 20 semi-finalists have been selected from all the entrants, 14 of them chasing victory with virtual reality projects, and today we will tell you about these daredevils.

Aboard the lookinglass

A sci-fi space adventure from Henry Hoffman with unique gameplay mechanics where your hands can control the past and the future. Activate remotes, solve puzzles and peer into the harsh truth.

Corridor 17

A fusion of endless runner and first-person shooter from Studio 17. You are a pilot and must dodge treacherous traps and unexpected obstacles by shooting armed robots with a cannon controlled by Leap Motion. Oculus Rift DK2 required.

ElementL: Ghost Story

As a Taoist monk, you were sent to rid the village of the evil spirits that inhabit the bamboo forest. In this adventure from Kevin Tsang and his team, you will use the magic of the elements with your own hands.

Gooze

In the abandoned ruins, you must solve puzzles as you travel from hall to hall and hide from monsters. Daniel Wiedemann's prototype for the game is based on his impressions of a real-life location near Berlin.

Haunet

This futuristic 3D VR puzzle game from VRARlab lets you use your hands to control a world of colored cubic structures. Any Oculus Rift will do.

Hollow

Feel like the Horseman from Washington Irving's The Legend of Sleepy Hollow. Irving's autumn palette comes to life in VR with realistic graphics powered by Unreal Engine 4. Oculus Rift is optional, but recommended.

Let’s Make Fried Rice

A culinary experiment in 3D menus and item tracking by Yusuke Ando. Cook fried rice and feed hungry customers! Virtual reality mode is currently only available for Windows.

Magicraft

Fantasy VR Magicraft from Storm Bringer Studios turns you into a magician in a world of witchcraft and the clanging of swords. Opponents have several levels, the gameplay promises to be interesting. Rift is optional but recommended.

Otherworld

With great graphics and an ambient soundtrack, Otherworld takes the player to a strange place full of spirits and mysteries waiting in the wings. The game was developed by a team of four: an artist, two musicians, and a programmer.

Press Bird To Play

A stupid bird has stolen a pirate dagger - help the pirate regain his pride. With minigames and atmospheric locations, Gerald Terveen's game is truly immersive. Requires Oculus Rift and Leap Motion located on the table.

Soundscape VR

Complete immersion in the world of music: mixing consoles and controllers in virtual reality, an abstract backdrop, soothing sounds - a creative zone detached from reality itself, from Sander Snake.

Tran;section

Ever dreamed of playing a game in a game? Then this project is for you. Sitting in a gloomy virtual office, you have to play the most ordinary platformer on a regular computer, but with one twist: the worlds of the game and the game-in-game are interconnected ... Better watch the video.

Weightless

Most of us are unlikely ever to go into outer space, but virtual reality is good precisely because it removes all restrictions. This meditative space adventure from Martin Schubert immerses you in work aboard an orbiting space station, collecting artifacts to a beautiful piano accompaniment.

World of comenius

An educational environment with information about the world around us and about ourselves, from Tomas Mariancik (Frooxius).

A small bonus from Leap Motion: the LEAP3DJAM promo code gives a 20% discount on the purchase of the VR Developer Bundle, which includes the controller itself and the mount for Oculus Rift, in