Texture formats in vvvv?

I tried making a plugin that outputs a 16-bit greyscale texture, but vvvv appears to ignore the format and treat it as 32-bit RGBA.

Does vvvv only handle RGBA textures, or am I doing something wrong?

Thanks!

vvvv handles all dx texture formats. are you sure your card/driver can handle your desired format?
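as a quick sanity check you could ask the driver whether it supports L16 textures before creating one. a minimal SlimDX sketch (the adapter ordinal and the X8R8G8B8 display format are assumptions, adjust to your setup):

using SlimDX.Direct3D9;

var d3d = new Direct3D();
bool l16Supported = d3d.CheckDeviceFormat(
    0,                      //adapter ordinal (assumed: primary adapter)
    DeviceType.Hardware,
    Format.X8R8G8B8,        //assumed display/adapter format
    Usage.None,
    ResourceType.Texture,
    Format.L16);            //the texture format we want to create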

Here’s what I’m trying to do. I’m creating the drawing surface as 16-bit grey; I expect my problem is in the step where that gets copied to the texture, because I don’t see how to create an output texture of a specific format. This patch needs a Kinect to work, but you can probably just look in the .cs file and see what I’m doing wrong. (I hope!)

OpenNISkeleton - Greyscale.zip (143.0 kB)

in the CreateTexture function simply use the following:

return new Texture(device, texW, texH, 1, Usage.None, Format.L16, Pool.Managed);

to get a 16-bit single-channel texture. you can now write the ushort* pDepth directly into the ushort* pDest.
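for reference, that copy could look something like the following sketch. it assumes the L16 texture created above; FTexture, texW, texH and pDepth are placeholder names standing in for the plugin's own fields, not the exact identifiers in the patch:

DataRectangle rect = FTexture.LockRectangle(0, LockFlags.None);
unsafe
{
    ushort* pDest = (ushort*)rect.Data.DataPointer; //pointer to the locked level
    int rowPitch = rect.Pitch / sizeof(ushort);     //Pitch is given in bytes
    for (int y = 0; y < texH; y++)
        for (int x = 0; x < texW; x++)
            pDest[y * rowPitch + x] = pDepth[y * texW + x]; //one 16-bit depth value per pixel
}
FTexture.UnlockRectangle(0);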

then in a pixelshader i simply use:

float4 col = tex2D(Samp, In.TexCd); //Samp takes the L16 texture from above
col.rgb = clamp(col.r*65535, MinDistance, MaxDistance) - MinDistance; //L16 samples as 0..1, so *65535 recovers the raw depth value
col.rgb /= (MaxDistance-MinDistance); //normalize to 0..1 within the range
col.rgb = 1 - col.rgb; //invert so near is white, far is black

to draw the range between MinDistance and MaxDistance from white to black.

Most excellent! Thanks, joreg.

Edit: I use a slightly different approach for the depth in the shader, as I want the out-of-range areas to go black, not white:

bool good = (Depth >= DepthMin) && (Depth <= DepthMax);
col.rgb = good * (DepthMax - Depth) / (float)(DepthMax - DepthMin);

This is an attempt to avoid using a conditional in the shader, which I have heard is a Bad Thing. Is that still true?