this is to inform you that the latest alphas come with a new commandline argument: /dx9ex
starting vvvv with this option (on windows >= vista) brings you:
texture sharing can be used to share textures, e.g. between two instances of vvvv (without noticeable performance penalty), and even with any other software that also supports DX9Ex. check the helppatch of SharedTexture (EX9.Texture) for instructions.
apparently this can also work as a bridge to opengl, if someone wants to give this a try... and for those wondering: yes, this enables the kind of stuff on windows your mac friends have been bragging about for a while now using Syphon.
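for the curious, a rough sketch of what DX9Ex texture sharing looks like at the API level on the sender side (Windows-only, error handling trimmed; the function name is made up for illustration — vvvv's SharedTexture node does all of this for you):

```cpp
// Minimal sketch of DX9Ex cross-process texture sharing (sender side).
// Requires a Windows build environment with d3d9.h / d3d9.lib.
#include <d3d9.h>

HANDLE CreateSharedRenderTarget(IDirect3DDevice9Ex* dev,
                                UINT width, UINT height,
                                IDirect3DTexture9** outTex)
{
    HANDLE shared = nullptr;  // receives the cross-process share handle
    // On a D3D9Ex device, passing a non-null pSharedHandle makes the
    // texture shareable. Another process can open the same texture by
    // passing the received handle value back in via pSharedHandle.
    dev->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT,
                       outTex, &shared);
    return shared;  // hand this value to the consumer process
}
```

the consumer side calls CreateTexture with the same parameters and the received handle to get a second view onto the same GPU memory, which is why there is no copy and no noticeable performance penalty.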
another new thing (not related to dx9ex) is the possibility to render the depth of scenes directly, without an extra pixelshader pass. just set the Texture Format of a DX9Texture (EX9.Texture) to INTZ and you should get the depth of your scene rather than a colorbuffer. disclaimer: may not work on certain graphics cards, see here
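note that the values in such a depth texture are not linear in eye space. a small sketch of the math, assuming the standard D3D perspective projection (depth stored as z/w; function names are made up for illustration):

```python
def stored_depth(z_eye, near, far):
    """Depth value written to the buffer for eye-space depth z_eye,
    under the standard D3D perspective projection (z/w after projection).
    Maps near -> 0 and far -> 1, hyperbolically in between."""
    return (far / (far - near)) * (1.0 - near / z_eye)

def linear_depth(d, near, far):
    """Invert stored_depth: recover eye-space depth from a buffer value d."""
    return far * near / (far - d * (far - near))

# With near=0.1 and far=100, an object halfway into the frustum (z=50)
# already lands above 0.99 in the buffer -- most of the precision sits
# close to the near plane.
print(stored_depth(50.0, 0.1, 100.0))
```

this is why a raw INTZ readback looks almost uniformly white: to get metric distances back, run the values through a linearization like the one above.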
available now in latest alphas, enjoy.
does dx9ex also help to span a VideoTexture across multiple graphics cards?
no, not across multiple cards, only with non-spanned dualview setups.
s u p e r !
hey joreg, why is this cross-process texture sharing activated via commandline instead of being part of vvvv's default functionality?
since the feature is of rather special interest and we have the feeling it is not yet well tested in the wild with bigger projects, we thought as a first step we'd make it optional. if it turns out to be _the thing_ we can still make it the default in a later release.
Would this be possible from opengl->dx?
This would also be good for ofx processing or Avenue to v4...
But one way is better than no way!
Sounds great, I'll try and test soon, thanks guys :)
Is the depth evaluation radiometric from the current point of view or is it aligned to the z-axis?
depth is using the same camera as color.
@sven - aligned to Z of the camera view, and often not a linear response with respect to euclidean xyz world space
hell yeah thanks
@elliot - if it aint linear, what is it and why?
launching last alpha with the /dx9ex switch seems to break the kinect texture nodes! white renderer!!
tested with both microsoft & openni nodes
also, changing the depth mode of the Kinect (Microsoft) node throws some errors
@circuit: thanks. fixed for openni-nodes in latest alpha.
will it support unity3d?
@kyy921: you have to check with unity3d if it supports dx9ex texture sharing. vvvv does.
@joreg/electromeier: /dx9ex texture sharing works also on 2 graphic cards here ...
But I have to right-click the renderer node (close and open it) once. It's the same when both screens are connected to one graphics card (GeForce 770).
I don't use SharedTexture (EX9.Texture), but VideoOut (SharedMemory) and SharedMemory (EX9.Texture)
edit: Well, it seems I don't even need /dx9ex for these nodes. Sorry!
vj_raul: yes, in your case it works on multiple cards because the nodes you mentioned don't make use of gpu-texture sharing. they share via cpu-memory.