

I'm wondering, have you done something special, like additional filtering of finger and palm positions and rotations, to get such accurate and smooth hand detection? For some reason the Leap Motion device's data rate is limited to about 43 Hz on my computer, and that really hurts tracking in both my and other people's projects. However, your app works OK!

Btw, I'd add some transparent texture of dirt and speckles on windscreens to enhance 3D experience. ;)

Hi Ubojan!

What I did was, instead of using the physics system to detect the grabbing of the levers, I wrote my own. When grab strength is above 0.9 I check whether a lever is inside the hand area and, if so, the lever enters a grabbed state and adjusts its rotation to match the hand position, always lerping to keep things smooth. I personally like to avoid using physics unnecessarily, since the reactions of rigidbodies are too unpredictable.
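Roughly, the idea described above could look something like this. This is a minimal, engine-agnostic sketch in Python, not the actual project code: the threshold of 0.9 comes from the comment, but the names (`Lever`, `update_lever`), the 1D positions, the hand radius, and the position-to-angle mapping are all illustrative assumptions.

```python
GRAB_THRESHOLD = 0.9  # grab strength above this counts as a closed fist (from the comment)
LERP_FACTOR = 0.2     # hypothetical: fraction of remaining distance covered per frame

def lerp(a, b, t):
    """Linear interpolation between a and b (Unity's Mathf.Lerp equivalent)."""
    return a + (b - a) * t

class Lever:
    def __init__(self, position, rotation=0.0):
        self.position = position   # simplified to 1D for the sketch
        self.rotation = rotation   # degrees
        self.grabbed = False

def update_lever(lever, hand_position, grab_strength, hand_radius=0.1):
    """Called once per frame with the tracked hand data."""
    # Enter the grabbed state only when the fist closes over the lever.
    if grab_strength > GRAB_THRESHOLD and abs(hand_position - lever.position) < hand_radius:
        lever.grabbed = True
    elif grab_strength <= GRAB_THRESHOLD:
        lever.grabbed = False

    if lever.grabbed:
        # Lerp toward the hand-derived rotation instead of snapping to it,
        # which keeps the motion smooth even with noisy tracking.
        target = hand_position * 90.0  # hypothetical position-to-angle mapping
        lever.rotation = lerp(lever.rotation, target, LERP_FACTOR)
    return lever.rotation
```

Because the lever only ever moves a fraction of the way to the target each frame, tracking jitter gets averaged out without involving rigidbodies at all.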

Thanks for the textures tip. For this jam I just had time to make a quick prototype and the art side was hugely overlooked. :)



Cool! Adding another camera or two in Unity (like a picture-in-picture setup) might allow you to still see where your hands were in relation to the virtual HOTAS while looking forward (or elsewhere).

Hi, Kip! Thanks for the tip!

I thought about creating an always-visible UI element to indicate how much thrust and rotation were being applied to the ship, but didn't have time to do it.