
Kinect Hitboxes DX11

Credits: patched by Andrej Boleslavský for Colorbitor


Kinect Hitboxes tracker DX11 v3.2b

Tool for creating site-specific interactions using Kinect. The DX11 version utilizes the performance of GPU compute shaders. This tracker is a great replacement for all kinds of presence sensors, such as PIR detectors, IR gates, and ultrasonic sensors. Installation and setup take only a few minutes.

Demo image sequence included.

Sample projects

Little Boxes, Bego M. Santiago
Hopscotch³, dezInterzis
Composition for a drone, Mária Judová
Leafstrument, dezInterzis
ArtistTalk, Andrej Boleslavský

Credits and licence:

Patched and coded for Colorbitor by Andrej Boleslavský

Required software:

vvvv 45beta32 x86
vvvv addons 45beta32_1
Kinect for Windows Runtime v1.7
VVVV Packs DX11 b32 x86 by Vux

KinectVirtualHitBoxes v2.42b

A tool for triggering events using the depth image of the Kinect. You define a virtual box; when pixels from the point cloud enter that space, it triggers a message. Messages are sent over OSC or as keyboard keystrokes.
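The core test is simple enough to sketch in a few lines. Here is an illustrative pure-Python version (all names are hypothetical; the actual patch evaluates this per pixel on the GPU via compute shaders): count how many point-cloud points fall inside an axis-aligned box, and trigger once a threshold is crossed.

```python
def points_in_box(points, box_min, box_max):
    """Count the points of a cloud that lie inside an axis-aligned box.

    points  -- iterable of (x, y, z) tuples (Kinect camera space, metres)
    box_min -- (x, y, z) lower corner of the virtual box
    box_max -- (x, y, z) upper corner of the virtual box
    """
    return sum(
        all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))
        for p in points
    )

def hit(points, box_min, box_max, threshold=10):
    """Trigger only when enough points enter the box (filters depth noise)."""
    return points_in_box(points, box_min, box_max) >= threshold
```

Requiring a minimum point count rather than a single pixel is what makes the trigger robust against the Kinect's noisy depth edges.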

Included patch:

hopscotch³, a collaborative, musical hopscotch
as seen on http://vimeo.com/57388809

Tutorial on using it:


Credits and licence:

Patched for Colorbitor by Andrej Boleslavský, coding support Martin Zrcek, included sample by Nicola Pavone.

Required software:

vvvv 45beta29
vvvv addons 45beta29
PrimeSense SensorKinect


10.06.14 [12:47 UTC] by id144 | 2651 downloads

Older Revisions

17.01.13 [03:55 UTC] by gontz | 1329 downloads

guest 15/01/2013 - 15:59

missing CreatePointCloud2 (value) plugin

guest 16/01/2013 - 02:37

sorry but the plugin is still missing
anyway looks fun :)

gontz 17/01/2013 - 04:56

should work now.

metrowave 19/01/2013 - 04:55

Works well here, thank you.

synth 19/01/2013 - 20:47

Works here too, though I find it very hard to adjust it properly.
I have been playing for a few hours already and can't calibrate it correctly.
I almost line up the point cloud with the real image and move the boxes, but nothing happens after that; I can never achieve a hit.
Any calibration advice?

eglod 21/01/2013 - 23:46

Now, it looks interesting and I want to use it. But maybe there is some help file to find out how the program works? Is there a really working sample of the program? I would be very happy, because I am making music with the Kinect in a modern classical sense. I am hoping for an answer.

Thank You eglod

synth 23/01/2013 - 05:44

OK, either it is hard to align the camera to the real-world position of the Kinect, or I am uber lame... Anyway, I was thinking last night: is it possible to use some of the camera calibration techniques to align it?
I ask because I cannot do it on my own.
If you are using the depth stream of the Kinect to generate a point cloud, why not use the RGB stream for the camera calibration process? Just a thought; I might be way off in my dreams, but it could be possible, and it would make it easier for artists/users to adjust the system and use it for something creative.

id144 23/01/2013 - 17:45

hi synth,
what do you mean by calibrating camera?
kinect virtual hit boxes is meant for site-specific installations where you add interactivity to a defined space (virtual box) based on the presence of objects (people, animals, etc.).

it's meant as a replacement for traditional sensors such as infrared or ultrasonic proximity sensors, passive infrared motion detectors, light gates, etc.

back to your query about camera calibration: what you generally need to do is rotate and move the point cloud so that the significant features of the real space are aligned with the 3d coordinates (floor aligned to the XZ plane, a wall to the ZY plane, etc.). first you set the correct rotation of the point cloud, then the position... and then you are ready to set up the "boxes" and align them to the features of the physical space.
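As a rough illustration of that rotate-then-translate step (plain Python with hypothetical helper names, not part of the patch itself): first undo the sensor's tilt, then shift the cloud into place.

```python
import math

def rotate_x(p, angle):
    """Rotate a point around the X axis by `angle` radians."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (x, y * c - z * s, y * s + z * c)

def align_cloud(points, tilt, offset):
    """Rotate first (e.g. undo the Kinect's downward tilt so the floor
    becomes parallel to the XZ plane), then translate by `offset`."""
    return [tuple(a + b for a, b in zip(rotate_x(p, tilt), offset))
            for p in points]
```

The order matters: translating before rotating would swing the whole cloud around the origin by a different arc, which is why the comment above recommends setting rotation first, then position.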

see the sample projects using this tracking system

some other project suggestions:

  • games (such as bowling; the app could evaluate how many pins were knocked down)
  • behavioral experiments (rat labyrinths etc.)
  • lifted toilet lid detector (could save your relationship)
  • intelligent room light (light on when you are in the room, off when you are in the bed, different light when you sit behind the desk)
  • interactive visuals for parties ("put your hand up!")
  • simple window shop interactions (detect the presence of a visitor, add a few interactive buttons)
  • musical instruments (musical stairs, drawn musical interfaces, etc.)

gontz 24/01/2013 - 14:42

Hallo synth,

here is a tutorial for using it

synth 25/01/2013 - 19:26

Erm, as I said... I am super, ultra, uber lame. I totally forgot about the camera controls; that's why it was hard for me to adjust the point-cloud position.
About camera calibration techniques, I meant something like http://www.kimchiandchips.com/blog/?p=725 by Elliot Woods and Kyle McDonald.

Thank you for the tutorial video; it really showed everything that is supposed to happen.

id144 27/01/2013 - 18:29

Thanks Synth, comments and suggestions are always welcome as we'll keep on adding features and publishing updates. Could you elaborate further on your idea about camera calibration?

I believe that in the case of our point cloud, the 3PointsToPlan plugin would do the job of aligning it. By clicking on three points in the point cloud, it would get aligned to the axes.
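The idea behind a three-points-to-plane alignment can be sketched like this (illustrative Python; the function names are made up and are not the plugin's API): take the normal of the plane through the three clicked points and measure how far it deviates from the up axis.

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three clicked points."""
    n = cross(sub(p1, p0), sub(p2, p0))
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)

def tilt_to_floor(normal):
    """Angle between the plane normal and the Y (up) axis: the rotation
    needed to bring the clicked floor plane onto the XZ plane."""
    return math.acos(max(-1.0, min(1.0, normal[1])))
```

Clicking three points on the physical floor would yield its normal; rotating the whole cloud by `tilt_to_floor(normal)` then levels the floor automatically, replacing the manual rotation step.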

Noir 10/06/2014 - 13:19

thanks id144 for the dx11 version!

medodawod 28/08/2014 - 15:38

hi, the kinect pointcloud is not working?!

id144 19/09/2014 - 17:18

noir: welcome :)
medodawod: interestingly enough, i have the same problem now with a GTX 590, and no shader reports an error. what kind of graphics card do you have?

ERV 11/10/2015 - 21:50

hi all, I can't manage to find the Kinect node.
Can anyone send it to me?

id144 26/10/2015 - 15:35

Hi ERV, did you download the recent addonpack and dx11 pack?
Which version of vvvv, dx11, and addonpack are you using?

yochee 18/02/2017 - 22:06

thanks to all who developed it! just wanted to leave short notice that it works really well for me in vvvv_50beta35 :)

one question: is it possible to use multiple kinects with the hitboxes (maybe duplicate the kinect part and select another index.. integrate into the 3d view?!) ;D They don't have to overlap, I just would like to cover a bigger area side by side, and then send the OSC signals to Resolume layers etc.

id144 19/02/2017 - 16:07

hi @yochee!
thanks for the feedback :)

yes, with kinect v1 aka 360 it is possible to use multiple devices, though you need to adjust the patch a bit, as the output of the kinect node is not spreadable.

probably a faster way to do it is to launch hitboxes on two machines; another option is to launch vvvv in two different instances with the /allowmultiple command line parameter. each instance should use the same table of hitbox coordinates, but the transformation of the kinect position and the index should be different.
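For the OSC side of such a two-instance setup, each instance could tag its messages with its own device index so the receiver can tell them apart. A minimal sketch of hand-encoding such a packet (the /hitbox address and argument layout are invented for illustration; the patch's actual addresses will differ):

```python
import struct

def _osc_str(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_hit(device_index, box_id, state):
    """Encode a hypothetical /hitbox message as a raw OSC packet:
    three big-endian int32 args -- which Kinect, which box,
    and hit (1) or released (0)."""
    return (_osc_str("/hitbox") + _osc_str(",iii")
            + struct.pack(">iii", device_index, box_id, state))
```

The resulting bytes can be sent over UDP as-is; a receiver such as Resolume then routes on the address and the first argument.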

when using multiple kinects on a single machine, it's best to have them on two separate usb host controllers. i wonder how multiple kinects perform if you disable the color stream and connect them to the same usb 3.0 host controller.

kinect v1 devices may overlap, but the quality of the depth map decreases because they jam each other; on the other hand, this does not generate false positives.

gegenlicht 18/03/2017 - 16:37

Hey thanks for this great contribution.
I'm having issues where the Kinect "freezes" until I reset it; this happens somewhat randomly. 45beta34.2_x64

Its not USB or EnergySaving related. Any ideas?

id144 21/03/2017 - 06:50

@gegenlicht, welcome :)
Yes, it sounds like a USB-related issue. For a permanent installation I did in the past, I used a separate USB controller precisely because of this issue. As a hotfix, you may reset the Kinect when it has been idle for n frames.
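The "reset when idle for n frames" hotfix can be sketched like this (illustrative Python, not vvvv patch code): compare successive depth frames and flag a reset once they stop changing.

```python
class FreezeWatchdog:
    """Flag a reset when the depth frame hasn't changed for n frames.

    A frozen Kinect keeps delivering the identical frame, so a cheap
    per-frame hash comparison is enough to detect the condition.
    """
    def __init__(self, n=60):
        self.n = n          # consecutive identical frames before reset
        self.last = None    # hash of the previous frame
        self.count = 0      # identical frames seen so far

    def update(self, frame_bytes):
        """Feed one depth frame; returns True when a reset is due."""
        h = hash(frame_bytes)
        if h == self.last:
            self.count += 1
        else:
            self.last, self.count = h, 0
        return self.count >= self.n
```

In a patch, the equivalent is comparing the current depth frame with the previous one and driving the Kinect node's reset pin once the change counter exceeds n.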

vizmagician 30/05/2018 - 21:04

Is it possible to add more hitboxes?

yochee 08/11/2018 - 02:21

hey @vizmagician sure you can do that!
