Custom shelving unit

Shelving unit in place

Photo: Thomas Winther

My friends Hugo and Anja needed a shelving unit with several sizing and fitting constraints. Having seen and liked the box design I used on my Little Bzzztrd synth/noisemaker, Hugo asked me if I could help them out. You can see the final result in the picture above; more details below.

The unit is made from the same material as the Little Bzzztrd boxes, Valchromat – a type of HDF (High Density Fibreboard). I deliberately made the joints such a tight fit that I had to hammer the parts together. This resulted in a sturdy construction in and of itself, but as an extra structural backup, I used glue and wood screws to hold the back wall on.

Getting the joints tight enough to achieve that hammer fit – while still being able to actually assemble the parts – required high precision when drawing the unit. There was a noticeable difference even with adjustments as small as 7 to 8 hundredths of a millimeter. Good thing I was using Rhinoceros, which makes working at this level of precision a breeze :-)

Even though Rhinoceros’ (Rhino among friends) primary domain is 3D modelling, I think its 2D capabilities are far superior to those of most dedicated 2D applications I’ve tried, especially when it comes to accuracy and ease of use. Here are some of the 2D parts in Rhino:

2D drawing of parts in Rhinoceros

After drawing the parts in 2D, I tested that everything fitted together as it should by extruding the 2D parts to 3D and assembling them in Rhino. Good thing I did, too – I discovered a couple of things that had to be changed!

3D extruded and assembled parts
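
Incidentally, that extrude-and-check step can also be scripted. Here’s a minimal RhinoCommon (C#) sketch of the idea – the method name is my own, and the thickness parameter stands in for the actual material thickness:

using Rhino;
using Rhino.DocObjects;
using Rhino.Geometry;
using Rhino.Input;

// Pick a closed 2D part outline and extrude it into a capped solid
void ExtrudeSelectedPart(RhinoDoc doc, double thickness)
{
    ObjRef outlineRef;
    var result = RhinoGet.GetOneObject(
        "Select part outline", false, ObjectType.Curve, out outlineRef);
    if (result != Rhino.Commands.Result.Success)
        return;

    // Extrude the planar outline to the material thickness (capped = solid)
    Extrusion solid = Extrusion.Create(outlineRef.Curve(), thickness, true);
    if (solid != null)
    {
        doc.Objects.AddExtrusion(solid);
        doc.Views.Redraw();
    }
}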

Next came milling out the parts – and then, with one assembly/hammering session and several rounds of sanding and painting, I went from this…

Shelving unit parts

Photo: Thomas Winther

…to the final unit in this post’s main picture.

Now, a couple of images highlighting the sizing and fitting constraints I mentioned earlier. The unit had to fit around this hatch without blocking it…

Shelving unit fitting snugly around service hatch

Photo: Thomas Winther

…under this control box and into the corner next to the doorway.

Shelving unit fitting under control box

Photo: Thomas Winther

And finally, painting these corner joints proved the most challenging part of the job:

Corner joint

Photo: Thomas Winther

Motion-controlled slideshow

Motion-controlled slideshow installation

Photo: Thomas Winther

I recently finished my work on a physical installation in the new finn.no store (“En slags butikk”). I’ve covered a couple of its more technical aspects in my two previous posts – here’s a more accessible overview of the installation :-)

The installation is basically a motion-controlled slideshow. Swipe your hand to the left over the Leap Motion sensor (mounted in the left wooden box in the picture), and the current picture slides left while a new picture slides in from the right. Swipe your hand to the right, and the opposite happens.

The pictures shown are from typical vacation destinations – the idea being that store visitors would snap selfies in front of the canvas, upload them to Instagram (tagged with #enslagsbutikk) and thereby get the chance to win an actual vacation.

Functionally simple enough, but not without challenges:

  1. The installation is located in a somewhat spacious room with lots of light that might confuse the Leap Motion
  2. The built-in gestures that come with the Leap Motion API don’t include a generic full-hand gesture, so I had to solve it without using those
  3. Given that the whole point was that people would take pictures of themselves in front of the canvas, the projection had to be done from the rear (to avoid shadows)
  4. Quite a bit of fiddling and optimization was needed to make motion detection reliable enough

The biggest issue behind point 4 above was that store visitors were swiping their hands too close to the Leap Motion sensor: swipes within 0 to 4 centimeters of the device went undetected.

Since this proximity blind zone is a physical limit of the Leap Motion sensor, the only way to make sure people kept their hands far enough away was to mount a physical barrier that still didn’t block too much of the sensor’s view. I ended up milling a simple frame on Jens Dyvik’s excellent CNC milling machine (a ShopBot) and mounting it around the sensor recess:

Standoff frame to keep people's swipes the required distance from the sensor

Photo: Thomas Winther

Although the main point of the frame is to keep swipes at a minimum distance from the sensor, it provides two more advantages:

  • It limits the sensor’s side view, reducing “false positives”, i.e. hand movements that the user didn’t intend as a swipe
  • It reduces light pollution from lighting sources in the room, improving the sensor’s frame rate

Summary time! Here’s what I did on this job:

  • Coded and otherwise built the slideshow app in Unity3D, hooking into Leap Motion’s Unity3D API
  • Specified how the Leap Motion sensor needed to be mounted to function well, including guidelines for the ambient lighting
  • Milled out a standoff frame to improve sensor accuracy
  • Set up a PC to run the app

Leap Motion full-hand gesture

After figuring out how to handle the Leap Motion’s light sensitivity issues, I started producing the actual code for my finn.no app.

The app functionality specification was rather simple: swipe a hand left or right to navigate backwards or forwards through a slideshow, respectively. Digging into the Leap Motion’s Unity3D API, I began to suspect the built-in gesture functionality wouldn’t be adequate, since the app would be exposed to the general public with little or no guidance available.

The API comes with four different gestures defined:

  • KeyTap
  • ScreenTap
  • Circle
  • Swipe

The last one, Swipe, sounded like it would be perfect for my needs. Regrettably, it’s named a bit inaccurately: what triggers it is a finger swipe – any finger on a detected hand doing a quick swipe, to be more precise.
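
For reference, this is roughly what using the built-in Swipe gesture looks like – a minimal sketch against the Leap Motion C# API of the time, with variable names of my own:

// Enable the built-in (finger) swipe recognition once, at startup
Controller controller = new Controller();
controller.EnableGesture(Gesture.GestureType.TYPE_SWIPE);

// Then, each frame, check for recognised swipe gestures
Frame frame = controller.Frame();
foreach (Gesture gesture in frame.Gestures())
{
    if (gesture.Type == Gesture.GestureType.TYPE_SWIPE)
    {
        SwipeGesture swipe = new SwipeGesture(gesture);
        // Direction is a unit vector; x > 0 means the finger moved right
        bool movedRight = swipe.Direction.x > 0;
    }
}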

I used this Swipe gesture in the first prototype for the app, but testing at the physical installation location made it clear that it was too easy to get wrong. It mostly detected swipes just fine when the user had at least one finger separated from the rest of the hand, but it struggled when that finger was a thumb, or when the hand was clenched or all the fingers were held together.

A good rule of thumb for any type of development is that if something can be used wrong, someone will eventually do it. Probably sooner than you think.

With this in mind, I did some more research into how I could get the Leap Motion to recognise a hand swipe rather than a finger swipe. After a nerve-wracking experience with a Unity3D plugin that didn’t turn out as expected, I found that the best way to implement this particular gesture was to not use the API’s Gesture class at all (!)

The solution was to use Motions instead. Here’s the basic description (from the Leap Motion docs) of what Motions are:

The Leap Motion software analyzes the overall motion that has occurred since an earlier frame and synthesizes representative translation, rotation, and scale factors.

In other words: if everything the Leap Motion is currently tracking moves in one direction, the resulting Motion translation shows you which direction that is. If you move one hand to the left, you’ll get a vector indicating this direction and the magnitude of the movement.

With this information in hand, all I had to do was accumulate the translation across a number of frames and define what magnitude of movement across those frames was enough to constitute a gesture.

Here’s some simplified sample code (C#) – with descriptive variable names and all:

// Assumes using Leap; at the top of the file. These are fields on the
// containing class; the frame-count threshold is an example value:
Controller LeapControllerThingy = new Controller();
int continuousFramesThatRightMovementWasDetected = 0;
int continuousFramesThatLeftMovementWasDetected = 0;
int numberOfContinuousUnidirectionalFramesNeededForGesture = 5; // example value
enum swipeDirection { None, Left, Right }

string checkForSwipe()
{
    // Overall translation of everything tracked, from the previous frame to this one
    Vector motionSinceLastFrame =
        LeapControllerThingy.Frame().Translation(LeapControllerThingy.Frame(1));

    // Count consecutive frames of rightward movement (x is in millimeters)...
    if (motionSinceLastFrame.x > 8.0f)
    {
        continuousFramesThatRightMovementWasDetected++;
    }
    else
    {
        continuousFramesThatRightMovementWasDetected = 0;
    }

    // ...and of leftward movement
    if (motionSinceLastFrame.x < -8.0f)
    {
        continuousFramesThatLeftMovementWasDetected++;
    }
    else
    {
        continuousFramesThatLeftMovementWasDetected = 0;
    }

    // Enough consecutive unidirectional frames? That's a swipe.
    if (continuousFramesThatLeftMovementWasDetected >=
        numberOfContinuousUnidirectionalFramesNeededForGesture)
    {
        return swipeDirection.Left.ToString();
    }
    else if (continuousFramesThatRightMovementWasDetected >=
        numberOfContinuousUnidirectionalFramesNeededForGesture)
    {
        return swipeDirection.Right.ToString();
    }
    else
    {
        return swipeDirection.None.ToString();
    }
}
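
In the actual app, checkForSwipe() gets polled once per rendered frame. Here’s a hypothetical Unity3D usage sketch, where slideshow, Previous() and Next() stand in for the real slideshow logic:

// Called by Unity once per rendered frame
void Update()
{
    string swipe = checkForSwipe();
    if (swipe == swipeDirection.Left.ToString())
    {
        slideshow.Previous(); // navigate backwards on a left swipe
    }
    else if (swipe == swipeDirection.Right.ToString())
    {
        slideshow.Next(); // navigate forwards on a right swipe
    }
}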

The Leap Motion’s sensitivity to lighting

Hand holding a Leap Motion

Photo: Leap Motion Press Kit

I recently had the pleasure of developing a Unity3D app, controlled by a motion detector (the Leap Motion), for the newly opened finn.no store in Oslo, Norway. The physical installation this app would be a part of was to be placed in a somewhat large room with multiple lighting sources, so I figured some research was due.

My first tests were disappointing, to say the least. The Leap Motion’s accuracy was, quite honestly, horrible: it kept losing track of my hands, could never discern more than one or two of my fingers at a time, and showed a really low frame rate when I checked the Leap Motion control panel diagnostics.

Once I replaced the incandescent light bulb above my desk with an LED bulb, though, things picked up. Both hands and fingers were tracked quite well and everything seemed more responsive. This impression was confirmed by the diagnostics: the frame rate had more than tripled.

The Leap Motion uses a small array of infrared LEDs to bounce infrared light off anything within its range, picks the reflections up with its sensors and applies some software magic to discern hands, fingers and pointy objects. In the right conditions, this works pretty well. There are, however, some pitfalls that will degrade the sensor’s performance, and the one with the greatest potential for trouble is light pollution from nearby lighting sources. Luckily, with a bit of care, this problem can mostly be avoided.

Different types of lighting disturb the Leap Motion to varying degrees. By far the two worst are incandescent (“regular”) and halogen bulbs: these give off only a small portion of their energy use as visible light, with the rest spent on heat and infrared light. LED bulbs and CFLs (compact fluorescent lamps), it turns out, not only use less energy but also give off far less infrared light, allowing your Leap Motion to work undistracted.

So, if you’re having trouble with your Leap Motion’s performance, try adjusting your lighting: replace incandescent bulbs with LED bulbs or CFLs, angle the light sources differently, or put up screens or other objects to block direct light from hitting the sensor surface. And keep it out of direct sunlight – there’s plenty of Leap Motion-distracting infrared in sunlight.