
MINT Framework for multi-touch gestures


UPDATE: Documentation and detailed info are now online: http://mint.strukt.com/doku.html

MINT is a framework for adding multi-touch support to your applications.
MINT takes care of multi-touch input and provides you with gestures, so you don't have to deal with their logic in your application. MINT comes with a set of standard gestures; for special gestures you can write your own scripts in Python. Gesture Scripts can be used on all supported platforms and are interchangeable.
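MINT's actual scripting API is not shown here, but as a generic illustration of the kind of logic a custom gesture script has to encapsulate, here is a minimal sketch (all names are hypothetical, not MINT's) that turns two raw touch points into a pinch/zoom scale factor:

```python
# Hypothetical sketch -- NOT MINT's scripting API -- of the math behind
# a two-finger pinch gesture, computed from raw touch points.
import math

def pinch_scale(start_points, current_points):
    """Return the scale factor of a two-finger pinch.

    start_points / current_points: [(x, y), (x, y)] for the two touches
    at the start of the gesture and right now.
    """
    def distance(pts):
        (x1, y1), (x2, y2) = pts
        return math.hypot(x2 - x1, y2 - y1)

    d0 = distance(start_points)
    d1 = distance(current_points)
    # Guard against both fingers starting on the same pixel.
    return d1 / d0 if d0 else 1.0

# Fingers move from 100 px apart to 200 px apart -> scale factor 2.0
print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)]))
```

The point of a framework like MINT is that this bookkeeping (tracking which touches belong to which gesture, across multiple users) lives in reusable gesture scripts rather than in every application.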


Every programmer who has had to develop a multi-touch application from scratch knows how difficult it can be to handle many simultaneous interactions. Not only are multiple fingers used while interacting, but multiple people can also use an application at the same time. Gestures can even be combined, for example to rotate and scale an object at the same time.
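The combined rotate-and-scale case mentioned above can be sketched in a few lines (a generic illustration, not MINT code): treating the vector between two fingers as a complex number, the ratio of its "now" value to its "start" value encodes both transforms at once.

```python
# Generic sketch of deriving simultaneous rotation and scale from two
# touch points -- not MINT's API, just the underlying math.
import cmath

def rotate_and_scale(p1_start, p2_start, p1_now, p2_now):
    """Return (rotation in radians, scale factor) for two moving touches."""
    v0 = complex(*p2_start) - complex(*p1_start)  # finger-to-finger vector, before
    v1 = complex(*p2_now) - complex(*p1_now)      # finger-to-finger vector, now
    ratio = v1 / v0
    return cmath.phase(ratio), abs(ratio)

# One finger stays at the origin, the other moves from (1, 0) to (0, 2):
# a 90-degree rotation and a 2x zoom in a single motion.
angle, scale = rotate_and_scale((0, 0), (1, 0), (0, 0), (0, 2))
```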

This can get quite complicated, and it is not very efficient to program the same interactions for every app, so Strukt looked for a framework capable of high-level gesture handling. At the time there were no usable solutions available, but a research grant from the Austrian Research Promotion Agency (FFG) made it possible for Strukt, in cooperation with the Technical University of Vienna, to develop a very powerful framework.

The framework itself is platform- and language-independent, but of course we focused on an extra-nice vvvv implementation.

We are not ready for a public beta yet, but we plan to release the MINT Framework at NODE13!

Here is a short demo video explaining the basic functionality of the MINT Framework; more can be found on the MINT website:


ampop, Wednesday, Apr 25th 2012 | 8 comments
catweasel 25/04/2012 - 17:06

Cool, that's come on a bit! Can't wait to have a play :)

bilderbuchi 25/04/2012 - 18:43

Nice! 2 Questions:
1) No Linux version?
2) Which License?

sebl 25/04/2012 - 21:04

looks rock-solid, top!

alg 25/04/2012 - 21:42

Finally! This will be so great for prototyping. Are you planning to open the sources? Can I easily extend VVVV plugins with my set of nodes by using .NET interfaces?

Alec 25/04/2012 - 22:41

Multi-cool! ;)

u7angel 26/04/2012 - 10:22

nice idea! but release in 9 months? sounds like an eternity ;)

ampop 27/04/2012 - 10:51

Hey evvverybody.
Thank you for the nice feedback.
To answer all your questions at once:

Yes, 9 months is a long time, but we need that time to create documentation and fix bugs. There will be a closed beta; if you are interested, let me know.

A Linux version is possible; if there is demand, we can think about releasing it for Linux as well.

We haven't made any decisions on the license model, but it will be similar to vvvv's license. There may be a few restrictions affecting long-term commercial use without a license, but there will be no locked features or other restrictions in functionality.

The MINT core will not be open source but we want to give everybody the possibility to extend MINT with custom gestures.

I will post the complete paper as soon as possible and keep you all updated!

ampop 06/08/2012 - 14:01

Beta testers wanted:

