The Leap Motion thread

Hi everybody!

Is there a way to improve Leap Motion tracking stability? The finger and hand positions I get in vvvv differ quite a lot from what the Leap visualizer software shows.
Does this depend on vvvv itself, or maybe on the driver version the plugin uses?

Sorry for the stupid questions.

I have a Leap and I didn't see that much difference.

You can try to damp the incoming data to smooth out quick changes, e.g. with a simple low-pass filter (sketch below).
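A minimal sketch of what that damping could look like outside the patch, assuming you get one raw fingertip position per frame as an (x, y, z) tuple; the `Smoother` class and the alpha value are just illustrative, not part of the Leap plugin:

```python
class Smoother:
    """Simple exponential (low-pass) filter:
    higher alpha = snappier but jumpier, lower alpha = smoother but laggier."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None  # last smoothed (x, y, z), None until first frame

    def update(self, raw):
        if self.state is None:
            self.state = raw  # initialize on the first sample
        else:
            # blend the new raw sample with the previous smoothed value
            self.state = tuple(
                self.alpha * r + (1.0 - self.alpha) * s
                for r, s in zip(raw, self.state)
            )
        return self.state


# usage: feed one raw fingertip position per frame
smoother = Smoother(alpha=0.25)
for raw_tip in [(10.0, 200.0, -5.0), (12.5, 198.0, -4.0), (30.0, 210.0, -2.0)]:
    print(smoother.update(raw_tip))
```

In vvvv itself a Damper or LowPass node on the position spreads does essentially the same thing.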

Well, Leap people say “the device will launch exclusively in the U.S. at Best Buy stores on May 19th” so I’m sure the version that they are planning to sell is more robust than the one they sent to developers. I’m interested to see the public reaction.

I'm more and more skeptical about this whole thing. They promised skeleton tracking, and they're still unsure about the raw point cloud, which is annoyingly unavailable. Fingers still get random IDs, so as a developer you can't tell which finger of the hand is actually pointing, and tool detection is still unstable too.
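One workaround people use for the unstable IDs is to ignore the IDs the SDK gives you and match fingertips to the previous frame yourself by nearest distance. This is only a rough sketch under that assumption; positions are plain (x, y, z) tuples and the `max_jump` threshold is a guess, not anything from the Leap API:

```python
import math


def match_fingers(prev, current, max_jump=40.0):
    """prev: dict label -> (x, y, z) from the last frame.
    current: list of (x, y, z) fingertip positions for this frame.
    Returns dict label -> position, reusing the old label of the
    closest previous finger and inventing a new label otherwise."""
    assigned = {}
    used = set()
    next_label = max(prev.keys(), default=-1) + 1
    for pos in current:
        best, best_d = None, max_jump
        for label, old in prev.items():
            if label in used:
                continue
            d = math.dist(old, pos)
            if d < best_d:
                best, best_d = label, d
        if best is None:          # nothing close enough: treat as a new finger
            best = next_label
            next_label += 1
        used.add(best)
        assigned[best] = pos
    return assigned


# usage: labels survive between frames as long as each tip only moves a little
frame1 = match_fingers({}, [(0, 0, 0), (50, 0, 0)])
frame2 = match_fingers(frame1, [(2, 1, 0), (51, -1, 0)])
print(frame2)
```

It won't tell you which anatomical finger it is, but at least the labels stop jumping around between frames.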

Just a side note @microdee: as the technology is probably not based on the Kinect way of "generating a point cloud and figuring out what it means", generating one may be pure overhead and not needed for the actual computations (that may be why it uses so little processing power).

I have mixed impressions of the Leap. On one hand, the device tracks fingers smoothly and precisely enough; on the other hand, I can't imagine a specific use for the Leap where it has no alternatives. Plus it's kinda buggy: the one I play with was broken one day for no obvious reason, and then fixed itself after a few days, lol.
Marketing, marketing everywhere.

Btw, here is a video of a test patch I did for playing with the Leap: https://vimeo.com/61709229