Fabric Synth Landscape installation with Kinect

Dear vvvvers,

I am working on a patch for an exhibition/installation in Bochum, Germany that opens January 26th, 2013 and stays open for a week. I am new to vvvv; I started teaching myself maybe three months ago. I come from an architecture background and use Grasshopper, and I also use Ableton (but not for this exhibition). I am working with vvvv to develop the installation further. The general idea is to develop an intelligent, or artificially intelligent, landscape that works as a soft infrastructure.

Here is a link to a past installation: http://www.programonline.de/disconnect.html
Alex Tsolakis and Bastian Wibrek have made three or four versions of this installation so far, in Berlin, Thessaloniki and Frankfurt.

It is a virtual synth / live visual patch driven by a fabric sculptural landscape as people interact with it. I have a VST synth working in vvvv with enough parameters. I know quite a few synths and VSTs, but of the list on the VST synth page, Synth1 is the only one that seems to work with beta28 or 29. For this exhibition Synth1 should do, I think: it is quite basic, but it has most of the feature categories one would expect from a synth.

Once the fabric sculpture / interactive landscape is installed, there needs to be a way to monitor the change in the fabric's shape as visitors push or pull it, so that the change generates audio and visuals. (My colleagues are designing the fabric shape at the moment.)

I am facing quite a challenge: how do I monitor the fabric ONCE it is installed, take the stretchy fabric's displacement from the Kinect, and channel it as + or - values to modulate sound via the VST synth from within vvvv? I have many ideas for visuals, but the first step is extracting the z displacement from the Kinect point cloud, with RULES such that if the z displacement is greater than x, one thing happens; if it is greater than y, something else happens; if the displacement is negative, something else again; and so on. A rough sketch of what I mean is below.
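To make the rule idea concrete, here is a minimal sketch in Python (a stand-in for the patch logic, not vvvv itself). The thresholds x and y, the event names, and the numpy-based depth frame are all placeholders I made up for illustration:

    import numpy as np

    baseline = None  # resting depth frame, captured once after installation

    def classify(disp_mm, x=30.0, y=80.0):
        # Map one signed z displacement (in mm) to a placeholder event name.
        if disp_mm > y:
            return "big_push"    # e.g. sweep a filter wide open
        if disp_mm > x:
            return "small_push"  # e.g. nudge the cutoff a little
        if disp_mm < -x:
            return "pull"        # negative displacement: a different sound
        return "rest"            # within the dead zone: do nothing

    def process(depth_frame):
        # depth_frame: 2D array of raw Kinect depths in mm, one value per pixel.
        global baseline
        if baseline is None:
            baseline = depth_frame.astype(np.float32)
            return "rest"
        disp = depth_frame.astype(np.float32) - baseline
        # Only the single strongest deformation is classified here; a real
        # patch would evaluate many sampled regions independently.
        idx = np.unravel_index(np.abs(disp).argmax(), disp.shape)
        return classify(float(disp[idx]))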

I have been studying patches by "mediadog" and "vjc4" (thanks! great examples and a good intro for me to vvvv's abilities), but I believe there is no patch there that does what my job requires at the moment: one that monitors displacements from an original geometry (a point cloud, I guess) and applies colors to the areas that are deformed or changed (see the sketch below). There will also be non-interactive sound and visuals, but my task is to deliver visuals and sound that trigger only when the fabric is altered, twisted, pushed, or pulled.
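For the coloring part, here is one possible mapping, again only a hedged Python sketch building on the signed displacement above; the max_mm scaling constant and the red-for-push / blue-for-pull choice are my own assumptions:

    import numpy as np

    def deformation_colors(disp, max_mm=100.0):
        # disp: 2D array of signed displacements in mm (current minus baseline).
        # Returns an RGB image: red where pushed, blue where pulled,
        # black where the fabric is at rest.
        t = np.clip(disp / max_mm, -1.0, 1.0)
        rgb = np.zeros(disp.shape + (3,), dtype=np.float32)
        rgb[..., 0] = np.clip(t, 0.0, 1.0)   # red channel: positive displacement
        rgb[..., 2] = np.clip(-t, 0.0, 1.0)  # blue channel: negative displacement
        return rgb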

Are there nodes I haven't learned about yet that could help in my situation? Thanks for reading!

fabric synth.v4p (75.5 kB)

What you get from the Kinect is a depth texture, and with OpenNI's raw depth you can read the exact distance from the device in mm, per pixel. You can convert that to usable data for sound control with the Pipet node (if you have a strong enough machine): it extracts the colors, you split them with RGB (Split), and there you go.
Remember not to sample every pixel with Pipet; that would be an insane resource hog. For your sound-control needs, 64 sampled points (8*8) will be enough.
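Roughly what that 8*8 sampling amounts to, as a Python sketch rather than an actual Pipet setup; the near/far working range of 500-4000 mm is my assumption for a Kinect v1:

    import numpy as np

    def sample_grid(depth_frame, nx=8, ny=8):
        # Average the depth frame down to an 8x8 grid of values (in mm),
        # analogous to sampling 64 evenly spaced Pipet points.
        h, w = depth_frame.shape
        cropped = depth_frame[:h - h % ny, :w - w % nx]  # make it divisible
        cells = cropped.reshape(ny, cropped.shape[0] // ny,
                                nx, cropped.shape[1] // nx)
        return cells.mean(axis=(1, 3))

    def to_control(values, near=500.0, far=4000.0):
        # Normalize mm depths into 0..1 values for synth parameters.
        return np.clip((values - near) / (far - near), 0.0, 1.0)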

Okay, thanks! It works for now. There's so much room for improvement, though. I want to tighten it up before I share it, but I do want to share it! It would be great if everybody had their own version of a fabric synth. For sampling original sounds this is pretty dope; I can't say I have ever heard anything like what I just heard. But now I really need a crazy synth that works from within vvvv. I should look into Max/MSP now, I guess…

By the way, thanks vjc4; part of this patch comes from me studying yours.
Which leads me to the next question: if you use part of someone's patch in a new patch, what is the right way to credit them?