Webcam to control audio files from folder

Hi all!

I’m having trouble setting up a project where I want to use webcam input (via a blob tracker) to select certain audio files within a folder (a certain color triggers a certain sound).

Reading the tutorials on this site and info on others, I think I need the GetSlice, Dir and FileStream nodes, but I can’t seem to connect them in a way that produces a working result.

Would anyone be so kind as to take a peek and give me some (any) advice on how to pull this off? It would be very much appreciated! I’ve added the project to this thread.

webcam to audio.v4p (31.4 kB)

Sorry for the late reply, but this means building your entire patch, so here are some ideas.

You have two sections to build and use: one for the webcam and one for the audio part, and you need something in between as a bridge.


First, think about the information you want to extract from your webcam. I assume X and Y location? (When a blob is at XY, play a corresponding file?) And also a trigger for whether there is any blob at all. So make a patch that outputs XY from your webcam.
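In pseudocode terms (vvvv itself is patched visually, so this is only a sketch of the logic, not anything you'd type into vvvv), the webcam section boils down to: find the bright/tracked pixels, output their centroid as XY, and output a trigger when a blob exists at all. A minimal stand-in for the blob tracker might look like this:

```python
# Hypothetical stand-in for a blob tracker: given a grayscale frame as a
# 2D list of 0-255 values, return the (x, y) centroid of the pixels above
# a threshold, plus a flag saying whether any blob was seen at all.

def track_blob(frame, threshold=128):
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:                     # no bright pixels: no blob, no trigger
        return None, None, False
    cx = sum(xs) / len(xs)         # centroid = mean position of blob pixels
    cy = sum(ys) / len(ys)
    return cx, cy, True
```

The "trigger" output is exactly the third value here: it goes true only while something is actually being tracked.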


The audio part: you have a folder and want to choose which track to play? Make a patch where you can select by number which track will play. And perhaps add a play/stop button.
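The audio section is conceptually just "list the files in a folder, pick one by index". Here is a small Python sketch of that idea (folder name and extensions are assumptions; in vvvv this is roughly what Dir + GetSlice do for you):

```python
import os

def list_tracks(folder):
    """Return audio files in the folder, sorted so index -> file is stable."""
    exts = (".wav", ".mp3", ".ogg")   # assumed extensions; adjust as needed
    return sorted(f for f in os.listdir(folder) if f.lower().endswith(exts))

def select_track(tracks, number):
    """Pick a track by number, wrapping around so any integer is safe."""
    if not tracks:
        return None
    return tracks[number % len(tracks)]
```

Sorting matters: without it the folder order can differ between systems, and your number-to-track mapping would shift around.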


The in-between part is the harder one: you need to convert that XY position to a number. Once you have the first two sections up and running, I will help you with that.
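As a taste of that bridge step: in the simplest case you only use the X coordinate and quantize it into as many slots as you have audio files. A minimal sketch (assuming X is normalized into a known range):

```python
def x_to_index(x, slots, lo=0.0, hi=1.0):
    """Quantize a coordinate in [lo, hi] into one of `slots` integer bins."""
    t = (x - lo) / (hi - lo)        # normalize to 0..1
    t = min(max(t, 0.0), 1.0)       # clamp stray tracker values
    return min(int(t * slots), slots - 1)
```

So with 4 files, the left quarter of the image plays file 0, the next quarter file 1, and so on.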

Good luck, and sorry for the vague answer, but I don’t have any clue what exactly you want to do, and these three steps are the best way to approach it. Is it a multi-touch-like idea?

Hi, thanks for your input! I’m just beginning to work with vvvv, so I want(ed) to keep it simple: indeed, XY coordinates that output a number corresponding to a specific file. So a webcam that tracks motion and translates it to sound, no fancy multi-touch panels or the like.

I could bypass the audio part by doing it with MIDI: load the audio files in another app (say Ableton Live). That also gives me more control over the sounds (effects etc.).

The problem remains: how do I convert XY data to something usable (in this case MIDI note output)? I’ll take a closer look into it…
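Computing the note number itself is simple once the XY values are quantized. A sketch of one possible layout (the grid size and the semitone-per-column / octave-per-row mapping are pure assumptions; base note 60 is middle C in the standard MIDI numbering):

```python
def xy_to_midi_note(x, y, cols=4, rows=4, base_note=60):
    """Map normalized XY (0..1) onto a MIDI note number.
    Columns step by a semitone, rows step by an octave (assumed layout)."""
    col = min(int(min(max(x, 0.0), 1.0) * cols), cols - 1)
    row = min(int(min(max(y, 0.0), 1.0) * rows), rows - 1)
    return base_note + row * 12 + col
```

The resulting number is what you would send as a note-on to Ableton Live (in vvvv, via a MIDI output node); the actual sending is a separate step.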

Triggering the Audio is not a problem.

First, isolate your XY values. If it is not multi-touch you only want one blob; perhaps sort the blobs and take the biggest? Dunno?
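The "take the biggest" idea in sketch form (blob format as (x, y, size) tuples is an assumption; a real tracker outputs these as separate spreads):

```python
def pick_biggest(blobs):
    """Given blobs as (x, y, size) tuples, keep only the biggest one,
    so a single-touch setup ignores smaller, spurious blobs."""
    if not blobs:
        return None
    return max(blobs, key=lambda b: b[2])
```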

Okay, so what do you want to do with that info? Is it like a big grid of squares, where every square (aka quad) is a different audio file? Then you need to figure out which quad the blob is intersecting. (Hint: the Intersect node, or GridPick ;) ) Or is it more like one big XY notebook-mouse-pad?
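For the grid-of-quads variant, the intersection test reduces to "which cell contains the point". A sketch of that math (assuming XY is normalized to 0..1; in vvvv, GridPick or Intersect would handle this for you):

```python
def grid_cell(x, y, cols, rows):
    """Return the (col, row) cell of a cols x rows grid containing the
    normalized point (x, y), or None if the point lies outside the grid."""
    if not (0.0 <= x < 1.0 and 0.0 <= y < 1.0):
        return None
    return int(x * cols), int(y * rows)

def cell_to_file_index(col, row, cols):
    """Flatten (col, row) into a single index for picking an audio file."""
    return row * cols + col
```

That single flattened index is exactly the bridge number discussed earlier: feed it to the file-selection part of the patch.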

edit: join the skype gang if you want more help ;)