Colour of a pixel

Hello

Is it in some way possible to read the RGB values of a single pixel from the output of a renderer, or alternatively from a texture?

kind regards

Sune P
www.galakse.dk
www.urlyd.com

hi sunep

i think pipet will be your node…

Hi sebl

Thanks, it seems to be the right node to use.

but now I get this weird behaviour:

First: when I use pipet, it does not respond to blue, but sets the blue value to the same as the red value.

Another problem that arises is some sort of edge problem: when reading values close to the left or top edge of the texture, the picture looks like it is stretched.

I have included a patch that illustrates the problem: a box can be moved around using the mouse in one renderer, and the color of the box can be changed using the RGB node at the top.

The output of the renderer is transformed into a DX9 texture, and pipet reads values in a 15x15 pattern, displayed last.

I can’t figure out what I have done wrong. Any idea what the problem is?

regards

Sune

ai sune,

note that texture coordinates you put into pipet range from:
0/0 top left
to
1/1 bottom right

any coordinates out of this range will make the result look stretched.
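
in plain python, here is roughly the mapping i mean; just a sketch, the pixel-centre offset and the clamp are my own assumptions, not something pipet itself does:

```python
def pixel_to_uv(px, py, tex_width, tex_height):
    """Map integer pixel coordinates to pipet-style texture coordinates:
    0/0 is the top-left corner, 1/1 the bottom-right corner."""
    # sample at the pixel centre so a coordinate never lands exactly on an edge
    u = (px + 0.5) / tex_width
    v = (py + 0.5) / tex_height
    # clamp to the valid range; anything outside 0..1 is what makes
    # the sampled result look stretched
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return u, v

# e.g. the top-left pixel of a 100x100 texture
print(pixel_to_uv(0, 0, 100, 100))  # (0.005, 0.005)
```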

i am not sure about your red/blue problem though… what kind of texture do you have connected to the input of pipet? check its texture format via the Info (EX9.Texture) node.

I think that solved the stretch problem, thanks.

I have added the Info node and can see some info about the texture; I wonder if the 64-bit format has something to do with it. I have uploaded an example patch to illustrate the color problem.

Is there a special reason for selecting A16B16G16R16 as the format in the DX9Texture node?
If you change this to the default “No Specific”, everything seems to work fine.
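
For what it’s worth, A16B16G16R16 means 16 bits per channel, i.e. 64 bits per pixel instead of the usual 32. A rough Python sketch of what one such pixel holds, assuming the usual D3D convention that the format name lists channels from most to least significant bits (so R ends up in the lowest 16 bits of the little-endian word); this is only to illustrate the format, not how pipet reads it:

```python
import struct

def unpack_a16b16g16r16(pixel_bytes):
    """Split one 64-bit A16B16G16R16 pixel into normalised RGBA values.

    Assumes little-endian storage with R in the lowest 16 bits; each
    channel ranges 0..65535 instead of the 0..255 of an 8-bit format.
    """
    r, g, b, a = struct.unpack('<4H', pixel_bytes)
    return tuple(c / 65535.0 for c in (r, g, b, a))

# a fully saturated red pixel with full alpha
red = struct.pack('<4H', 65535, 0, 0, 65535)
print(unpack_a16b16g16r16(red))  # (1.0, 0.0, 0.0, 1.0)
```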

Markus

Cool. I feel a little stupid; I did some video filtering where the extra bits made the difference between looking trashy and nice, and the format was left over from that.

Thanks for the help and patience, it now works fine.

pipet is fixed for beta>14 to work with more texture formats

Hi,

I did some tests with sunep’s example patch.
How can I correlate the location of the quad on the screen with the number of the sampled slice?

Regards,

pipette.zip (6.0 kB)

We have a great node for that, called GridPick (2D).

edit: why did you pick a greyscale image to test this? ;)

file_pipette2.v4p (15.5 kB)
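
In case the underlying math helps: picking a cell in a 15x15 grid boils down to quantising a position in the -1..1 renderer space into a column/row index. A small Python sketch; the row-major numbering and the y-flip are my assumptions here, not necessarily the exact slice order GridPick (2D) outputs:

```python
def grid_index(x, y, cols=15, rows=15):
    """Map a position in -1..1 renderer space to a grid cell index,
    numbered row-major starting at the top-left cell."""
    # normalise from -1..1 to 0..1, flipping y so the top row becomes row 0
    u = (x + 1.0) / 2.0
    v = 1.0 - (y + 1.0) / 2.0
    col = min(int(u * cols), cols - 1)
    row = min(int(v * rows), rows - 1)
    return row * cols + col

print(grid_index(-1.0, 1.0))   # 0   -> top-left cell
print(grid_index(1.0, -1.0))   # 224 -> bottom-right cell of a 15x15 grid
```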

Thanks West!

BTW: my renderer window is 300 by 300 pixels (width and height) and my bitmap is 100 by 100 pixels. Why must I scale the quad with the bitmap by 2 to fill the renderer screen?

“why did you pick a greyscale image to test this?”
Hmm, question of taste ;)

You’re welcome :)

The DX9 renderer, for 2D stuff, has its coordinates set from -1,-1 to 1,1 (x and y axes). So 0,0 is exactly in the middle. That is why a scale of 2 fills up the entire screen (the difference from -1 to +1 is 2). And this is regardless of the screen size you are using.

And a filetexture by default fits exactly on a quad, that is, when you don’t have a transform connected to the Texture Transform pin.
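
Put as numbers, that is all the scaling comes down to. A tiny sketch using the 300-pixel renderer and 100-pixel bitmap from your post, assuming the quad’s default size of 1 unit and the texture fitting the quad exactly as described above:

```python
def quad_scale(content_pixels, renderer_pixels):
    """Scale for a quad so its content spans content_pixels of a renderer
    that is renderer_pixels wide, given the 2D space runs from -1 to +1
    (i.e. the renderer is always 2 units across)."""
    return 2.0 * content_pixels / renderer_pixels

print(quad_scale(100, 300))  # ~0.667: show the 100 px bitmap at its native size
print(quad_scale(300, 300))  # 2.0:    fill the whole 300 px renderer
```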

Yes, greyscale is nice, but for testing stuff with a pipet, I prefer a bit more color (especially on my crappy TFT where grey and white look alike ;) ).

Good luck m8!