
MINT Framework for muti-touch gestures


UPDATE: Documentation and detailed info are now online: http://mint.strukt.com/doku.html

MINT is a framework for adding multi-touch support to your applications.
MINT takes care of multi-touch inputs and provides you with gestures, so that you don't have to deal with their logic in your application. MINT comes with a set of standard gestures; for special gestures you can write your own scripts in Python. Gesture scripts can be used on all supported platforms and are interchangeable.
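To illustrate what such a gesture script could look like, here is a minimal sketch of a two-finger-tap detector in plain Python. MINT's actual scripting API is not documented in this post, so the class and callback names below are invented for illustration only:

```python
# Hypothetical sketch of a custom gesture script: a two-finger tap.
# The callback names (on_touch_down / on_touch_up) are assumptions,
# not MINT's real API.

class TwoFingerTap:
    """Fires when two touches go down and one lifts within a short window."""

    MAX_DURATION = 0.25  # seconds allowed between touch-down and touch-up

    def __init__(self):
        self.down_times = {}  # touch id -> timestamp of the touch-down
        self.fired = False

    def on_touch_down(self, touch_id, x, y, t):
        # Remember when this finger touched the surface.
        self.down_times[touch_id] = t

    def on_touch_up(self, touch_id, x, y, t):
        start = self.down_times.pop(touch_id, None)
        if start is None:
            return
        # A two-finger tap: one finger still down, this one lifted quickly.
        if len(self.down_times) == 1 and t - start <= self.MAX_DURATION:
            self.fired = True
```

The point of such scripts is that the tracking bookkeeping (which finger is which, how long it has been down) lives in the gesture, not in your application code.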


Every programmer who has had to develop a multi-touch application from scratch knows how difficult it can be to handle many simultaneous interactions. Not only are multiple fingers used while interacting, but multiple persons can also use an application at the same time. Gestures can even be combined, for example to rotate and scale an object at the same time.
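The combined rotate-and-scale case mentioned above comes down to a small piece of vector math: track the segment between two fingers across frames, and the change in its angle gives the rotation while the change in its length gives the scale. This is a generic sketch of that computation, not MINT's implementation:

```python
import math

def rotate_scale(prev_a, prev_b, cur_a, cur_b):
    """Derive rotation (radians) and uniform scale factor from two touch
    points moving from a previous frame (prev_a, prev_b) to the current
    frame (cur_a, cur_b). Each point is an (x, y) tuple."""
    # Vector between the two fingers in each frame.
    v_prev = (prev_b[0] - prev_a[0], prev_b[1] - prev_a[1])
    v_cur = (cur_b[0] - cur_a[0], cur_b[1] - cur_a[1])
    # Rotation: difference of the segment angles.
    angle = math.atan2(v_cur[1], v_cur[0]) - math.atan2(v_prev[1], v_prev[0])
    # Scale: ratio of the segment lengths.
    scale = math.hypot(*v_cur) / math.hypot(*v_prev)
    return angle, scale
```

For example, if the fingers start at (0, 0) and (1, 0) and end at (0, 0) and (0, 2), the segment has rotated by 90 degrees and doubled in length, so the function returns (pi/2, 2.0). A framework evaluates this per frame and applies both transforms to the touched object simultaneously.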

This can get quite complicated, and it is not very efficient to program the same interactions for every app, so Strukt looked for a framework capable of high-level gesture handling. At the time there were no usable solutions available, but a research grant from the Austrian Research Promotion Agency (FFG) made it possible for Strukt, in cooperation with the Technical University of Vienna, to develop a very powerful framework.

The framework itself is platform- and language-independent but of course we focused on an extra-nice vvvv implementation.

We are not ready for a public beta yet, but we plan to release the MINT Framework at NODE13!

Here is a short demo video explaining the basic functionality of the MINT Framework; more can be found on the MINT website:


ampop, Wednesday, Apr 25th 2012 | 8 comments
catweasel 25/04/2012 - 16:06

Cool, that's come on a bit! Can't wait to have a play :)

bilderbuchi 25/04/2012 - 17:43

Nice! 2 Questions:
1) No Linux version?
2) Which License?

sebl 25/04/2012 - 20:04

looks rock-solid, top!

alg 25/04/2012 - 20:42

finally! This will be so great for prototyping. Are you planning to open the sources? Can i easily extend VVVV plugins with my set of nodes, by using .net interfaces?

Alec 25/04/2012 - 21:41

Multi-cool! ;)

u7angel 26/04/2012 - 09:22

nice idea! but release in 9 months? sounds like an eternity ;)

ampop 27/04/2012 - 09:51

Hey evvverybody.
Thank you for the nice feedback.
To answer all your questions at once:

Yes, 9 months is a long time, but we need it to write documentation and fix bugs. There will be a closed beta; if you are interested, let me know.

A Linux version is possible; if there is demand, we can think about releasing it for Linux as well.

We haven't made any decision on the license model, but it will be similar to vvvv's license. There may be a few restrictions affecting long-term commercial use without a license, but there will be no locked features or other restrictions in functionality.

The MINT core will not be open source, but we want to give everybody the possibility to extend MINT with custom gestures.

I will post the complete paper as soon as possible and keep you all updated!

ampop 06/08/2012 - 13:01

Beta testers wanted:

