Help/Advice with Kinect b27.1

I’ve not used the kinect since pretty much this time last year, so I thought I’d give the new plugins a go, as I have a job that might need them. However, they seem to be really skittish.
The skeleton plugin: I like that you can select which bits you want to track, but the blobs really jump around. The only way the hand seems to track solidly is to have your hand far forward, in line with your body (the cactus pose, for example), and the blob will jump from hand to elbow to random…
The hand tracker I can’t get to give any tracking at all.
Last year I did http://vimeo.com/21935228
where you had to cactus pose, but the tracking was really good; you could get smooth arcs from the hands, and everything else really.
So the question is, what am I doing wrong? Which is the best version to use, and which drivers should I be using? (I think the help patch should say which version the plugin was written with, really…)
OpenNI: 1.5.2.23
PrimeSense XnVSkeletonGenerator: 1.5.2.21
PrimeSense SensorKinect: 5.1.0.25
Is what I’m using currently…
And as Matka has just shouted, anyone done a character animation with them yet?
I’ve never managed to get the skeleton (joints) nodes to work correctly with anything but the test model, let alone link up with the kinect (I last tried this time last year though!)
Thanks guys…

cat

Ah ok, the hand track: I finally opened the help patch, duh! And that works. Skeleton is still not right…

If you get a lot of jumping, try adjusting the smoothing factor on the skeleton node.
Try setting it to around 0.9, which will smooth all tracking data.
If that is still too jumpy, you might consider adding additional smoothing via a Damper node or similar.
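For reference, here is a minimal sketch (in C#, with illustrative names, not the plugin's actual code) of the kind of exponential smoothing these options apply: each new sample is blended with the previous output, so a factor of 0.9 keeps 90% of the old value and strongly damps sudden jumps.

```csharp
using System;

// Simple exponential smoothing, similar in spirit to the Smoothing
// Factor pin or an extra Damper node: each new sample is blended
// with the previous output. A factor of 0.9 keeps 90% of the old
// value, so jumps are heavily attenuated.
class JointSmoother
{
    readonly double factor;   // 0 = no smoothing, 0.9 = heavy smoothing
    double? last;             // last smoothed value, null before first sample

    public JointSmoother(double factor) { this.factor = factor; }

    public double Update(double sample)
    {
        last = last.HasValue
            ? factor * last.Value + (1 - factor) * sample
            : sample;
        return last.Value;
    }
}
```

The trade-off is latency: the heavier the smoothing, the further the output lags behind fast movements.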

I am also experimenting a lot with it and must say that the tracking has only gotten better. Also, the autocalibrate feature is really nice.

hei cat,

your drivers are right.
i’ve been mostly working with the handtracker lately and have to say it works quite well. in the helppatch, just place your hand on the white dot at a distance of ~77cm from the kinect and it should be detected, and the dot should follow your hand. does that not work?

Thanks Joreg, as I put in my later edit, I got the hand tracker working and it seems really good; hopefully I can make use of it for this project! It does seem pretty solid :)
The skeleton seems much worse than the first version though, and I was wondering why that is. It’s not just the jumping, which is pretty bad; it’s things like: if I put my hand further away from the camera than my elbow, it tracks my elbow instead of my hand. Before, once it had got your skeleton, it was completely solid, i.e. it knew that it couldn’t be your elbow as the bones and joints wouldn’t work. It’s kind of like it only tracks each point separately rather than as a whole skeleton. If the points were just moving within a few cm of where they should be it wouldn’t be a problem, but they’re jumping tens of cm…
Is this just to be expected now?
Is this because of the no need to pose? Can you go back to having to pose instead and get solid tracking?

Hi VVVVKinecters,

I can confirm that the best tracking performance I had was with the first Hierro Plug-in and then with the OpenNi generic Node .dll.

The skeleton tracking (especially arms) got worse with Phlegma’s and with the last plugin as well.

I made the attached patch to understand it better and did extensive testing with the brand new b27.1 and OpenNI.

Some Random Notes:

The User Node gives ID 0 and Position 0,0,0 at start even if no User has been detected (I would expect a NIL instead).

It doesn’t work if the Skeleton Node is disabled (0 texture, 0 Position, 0 ID).

It gives an ID even if the calibration is not finished (which would be correct), but sometimes a chair or a curtain is considered a person, and this makes the ID association (User -> Skeleton) fail. (Maybe because of my patching.)

The Hand Node and Gesture Node work very well.

The option for the Depth Texture could be very useful, but I don’t know how.
Ciant Particle, Calibrate Projector, Field for GPU Dottore’s Particles: I could imagine uses, but I can’t get those to work for now.

What about the User Node’s Adapt to RGB? Sometimes it works, but not now while I’m writing this post…

I’m a beginner in VVVV, so shame on me for the bottom part of the patch (Sift Mess)

Could this patch be a starting point to make an advanced help patch for UnskillfullVVVVKinectUser like me?

Thanks for this great piece of software!!!
VVVV is in my heart…

Kinect (Test).v4p (96.4 kB)

I just had a look at the sources for the OpenNI module, and (if I looked at the right versions) it seems that in the skeleton module the confidence values are not used.
This could be the source of the jumping joints.
Normally you only want to use confidences over 0.5,
so the positions should be wrapped in an if statement like this:

for (int i = 0; i < binSize; i++)
{
    var j = GetJoint(user, FJointIn[u][i]);
    var p = j.Position.Position;
    if (j.Position.Confidence > 0.5f)
    {
        FJointPositionOut[u][i] = new Vector3D(p.X, p.Y, p.Z) / 1000;

        var o = j.Orientation;
        FJointOrientationXOut[u][i] = new Vector3D(o.X1, o.Y1, o.Z1);
        FJointOrientationYOut[u][i] = new Vector3D(o.X2, o.Y2, o.Z2);
        FJointOrientationZOut[u][i] = new Vector3D(o.X3, o.Y3, o.Z3);
    }
}

fixed that.

this is not thought out very well. for now it needs both a depth node and an rgb node connected to the kinect-node for this to work.

aight, done as you suggested.

changes available in latest alpha.

@Joreg, i just checked the latest alpha but didn’t find a position confidence pin; is it possible to have it as a pin?
also i noticed in the version shipping with beta27.1 that if there is more than one user, the User node outputs a spread of 2 textures, and the first one is just empty. can you confirm?

also, both with the beta and the alpha, to get the skeleton data i need to run vvvv as administrator, else i only see the textures and get this error:

00:01:05 ERR : OpenNI.StatusException in OpenNI.Net: Can’t create any node of the requested type!

Stacktrace:
in OpenNI.UserGenerator.Create(Context context, Query query, EnumerationErrors errors)
in VVVV.Nodes.Skeleton2OpenNI.Evaluate(Int32 SpreadMax)

this happens both on win7 32 and 64 bit

there is no pin for the confidence; i implemented it as the mammoth suggested. there could be a confidence input though…

have to check about the user-node…

i don’t see a reason why you’d need to start vvvv as admin to not get this error. no idea…works for me without admin.

Hi Joreg,
Thanks!!!

Also I’ve another request/suggestion:

Sometimes you want to delete the tracking data, so you need a Reset pin that reinitializes the tracking in just the Skeleton and User nodes (but not the depth or rgb), so with no lag you can start from the beginning (NIL -> User -> Skeleton tracking data when calibrated…)

Hope I explained it well…
Thanks

you are talking about a Reset input on User (Kinect OpenNI) and Skeleton (Kinect OpenNI)?

Yes Joreg, just a reset that reinitializes Skeleton and User XYZ (to NIL).

Adapt to rgb pin can work as a resetter…
…But the User XYZ doesn’t restart with a NIL (I know you’ve already fixed this!)

Thanks

@sapo: the spreaded texture-out was an error. it is supposed to be only one texture with color-coded pixels.

@robe: i understand the need for a reset-input but i checked and didn’t find an option in the OpenNI api to reset the users. there is an option to reset calibration data, but after calling that, tracking only stops. no recalibration takes place…so i guess we’ll have to wait for an openni update…

also i now added a Confidence CutOff that allows you to choose a value between 0 and 1 manually.
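As a rough illustration of what such a cutoff does (names here are hypothetical, not the plugin's actual code): joint positions are only written through when the tracker's confidence meets the user-chosen threshold, otherwise the last good value is kept.

```csharp
using System;

static class ConfidenceGate
{
    // Sketch of a Confidence CutOff input gating joint updates.
    // positions/confidences are the current frame's per-joint data,
    // lastGood holds the previous accepted values, cutOff is the
    // user-chosen threshold between 0 and 1.
    public static double[] FilterByConfidence(
        double[] positions, double[] confidences,
        double[] lastGood, double cutOff)
    {
        var result = new double[positions.Length];
        for (int i = 0; i < positions.Length; i++)
            result[i] = confidences[i] >= cutOff ? positions[i] : lastGood[i];
        return result;
    }
}
```

With the cutoff at 0, every sample passes; raising it towards 1 trades responsiveness for stability.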

Thanks Joreg, that’s much better!
Sorry I’ve been so quiet, another job on; back to kinect now…
It’s still a little weird at times. For example, when it’s calibrating a new user it puts out coordinates; it would be good if it waited until it was tracking them.
I’ve also had it tracking me 2 or 3 times, as user 3 and 5 for example, when I walk out of shot and back in.
I’ve been trying to sift the output based on whether it’s tracking or calibrating, but it still seems a bit mixed up…
This is with confidence way up.
I’m working on an installation where people will be entering and leaving booths, so ideally when they disappear from view it should immediately forget them and start a new user…
Is this all NI-related or in the plugin?

ALSO

is there still a way to edit the config file to limit the tracking depth so I can remove bystanders?

@joreg:
some old versions of the plugin had a confidence output pin per joint. i liked that pretty much.
it’s useful if you just want to use the part of the body where tracking is solid.

I’m having quite some trouble with ghost skeletons that seem to be tracking a bit of my wall and don’t go away! I was hoping to use the hand tracking node, but it’s not fast enough (I’m doing a conductor-hero type game, and there’s much hand waving in 5/4…). I keep leaving the hand up in the air, or worse, attached to my face!
Some way of removing a user if they’re out of a certain area, or don’t move for a certain length of time, would help; this could be patched if there was a remove-user-x pin?
@Woei the confidence pin is spreadable and seems to work per joint…?

Hi Cat. I’ve been having the same problem when a real skeleton leaves the view but remains as a still skeleton for a while. My fix for this was to check whether any skeleton xyz was static (I think I had to check for >2 frames for some reason) and, if so, remove it from the spread.
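A minimal C# sketch of that frozen-skeleton check (names, epsilon and frame count are illustrative, not from the plugin): a skeleton whose joints haven’t moved for more than a couple of frames is treated as a ghost and dropped.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Drops "frozen" skeletons: if a user's joints have not moved for
// more than maxStatic consecutive frames, Update returns false and
// the skeleton can be removed from the spread.
class StaticSkeletonFilter
{
    readonly Dictionary<int, (double[] pos, int frames)> history = new();
    readonly double epsilon;   // max per-joint movement still counted as "static"
    readonly int maxStatic;    // frames of no movement tolerated before removal

    public StaticSkeletonFilter(double epsilon, int maxStaticFrames)
    {
        this.epsilon = epsilon;
        this.maxStatic = maxStaticFrames;
    }

    // Returns true while the skeleton should be kept.
    public bool Update(int userId, double[] jointPositions)
    {
        if (history.TryGetValue(userId, out var h) &&
            h.pos.Zip(jointPositions, (a, b) => Math.Abs(a - b)).All(d => d < epsilon))
        {
            history[userId] = (h.pos, h.frames + 1);
            return history[userId].frames <= maxStatic;
        }
        // Movement detected (or new user): remember positions, reset counter.
        history[userId] = ((double[])jointPositions.Clone(), 0);
        return true;
    }
}
```

The ">2 frames" threshold from the post corresponds to maxStaticFrames = 2 here.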

Shame about the reset pin. A physical workaround is to put your hand over the sensor for a second or two. This should reset both user and skeleton if things go awry

Same here, but that’s not much good for an unmanned installation with potentially hundreds of visitors a day though… I can foresee there being many stuck skeletons, and that makes me worry about performance issues…

The skeletons from real bodies that were in view and then become stuck do always disappear after 10-20s, at least in my case.

The other kind of stuck skeletons are the kind who attach to another object in view, a chair for example.

I get them fairly frequently when developing in an enclosed environment (shelves, tables, chairs around me), but they’ve never been an issue during an install, when I generally have a more open space surrounding the project.

I don’t think I’ve ever set up close to an opposite wall (while using skeleton tracking), but if you do get a false skeleton in that case, its position will probably be ‘on’ the wall and you can ignore it based on its z position.
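That z-position check could be patched, or sketched in C# like this (bounds are example values, not anything the plugin defines): any user whose z falls outside the expected interaction zone is ignored.

```csharp
static class ZoneFilter
{
    // Ignores false skeletons that attach to a back wall or furniture:
    // only users inside the near/far z range (in metres, matching the
    // plugin's /1000 scaling of OpenNI's millimetre positions) count.
    public static bool InInteractionZone(double z, double nearZ = 0.8, double farZ = 3.0)
    {
        return z >= nearZ && z <= farZ;
    }
}
```

A skeleton sitting "on" a wall at, say, z = 4m would then simply be skipped when sifting the spread.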