Get your week pass for the full festival program from 26 June to 02 July 2017,
with artist talks, a full-day symposium, an international exhibition, performances, party nights,
the vvvv Keynode, with or without the vvvv coding workshop program included!
Would you rather get a day pass?
Those are coming soon, so stay tuned!
Looking forward to seeing you this summer in Frankfurt (Main),
your NODE team
This would basically be a maintenance and bugfix vvvv release, weren't there so much new VL stuff in it.
As mentioned previously, the main topic we are working on at the moment is importing .NET DLLs into VL. Think drag'n'drop and easy wrapper patches, all in visual-programming style. This needs some deeper changes in the code base and was deliberately not included in this release, but it will find its way into alpha builds soon for everyone to try out. After that we'll polish the VL workflow and libraries a bit more to have a shiny version for NODE17.
And of course more bits and pieces are waiting in line, keep your eyes on the devvvv blog.
Right now, we're looking for volunteers to help in this process, to run this year's festival edition and make NODE17 awesome!
Volunteers are an integral part of NODE. What about you, your friends, colleagues, digital enthusiasts and others? Want to get involved? Be part of our team, get a look behind the scenes, discover your skills, contribute your energy and enjoy the festival!
Click here for more information: https://nodeforum.org/journal/node17-call-for-volunteers/
Please send us an email with your area of interest or the duties that you would like to get involved with to
We are really excited to hear from you!
your NODE team <3
Following up on our ongoing call for NODE17 workshops, we're happy to start confirming proposals. Thanks, everyone, for your great submissions; we're still sorting through everything and waiting for a few more that have been promised behind the scenes. Here is a first batch, though, that we can already confirm:
Bookmark this blog post, as it will be updated in the coming weeks. So much more to come...
vl for vvvv users
vl for programmers
developing nodes for vl
Videoeffects and Compositing
As you may have noticed, we are back in our two-month release cycle and a new beta is on the horizon.
As we have also noticed, not many of you use alpha builds to test them against your latest and greatest projects. So here is a particularly fine alpha version that is our release candidate for beta35.5, scheduled for Monday.
Please give it a test run with a few patches and send us reports on any bug or problem you encounter. Testing is also the perfect excuse to miss any Easter obligation.
Some have reported seeing ~temp files being written on save. We could not reproduce the error here, but there is now an error pop-up to inform you when something goes wrong, and the exception that caused the problem will be copied to the clipboard. Open the projects that have that issue and paste the exception message into a new forum thread to help us track it down.
Here is something really great. The new Reactive category gives you tools to handle asynchronous events and background calculations, and even enables you to build your own mainloop that runs on a different CPU core. But let's start with a pragmatic explanation of what it is:
In a way, this isn't anything new. Event buses or your typical click events are really asynchronous event streams that you can observe and react to with side effects. Reactive is that idea on steroids. You are able to create data streams of anything, not just from click and hover events. Streams are cheap and ubiquitous; anything can be a stream: variables, user inputs, properties, caches, data structures, etc. For example, imagine your Twitter feed were a data stream in the same fashion that click events are. You could listen to that stream and react accordingly.
On top of that, you are given an amazing toolbox of functions to combine, create and filter any of those streams.
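To make the "everything is a stream" idea concrete, here is a minimal, self-contained sketch in plain Python. This is purely illustrative: VL's Reactive nodes are built on .NET's Rx, not on this code, and the `Stream` class here is invented for the example.

```python
# Minimal push-based event stream: observers subscribe, emit pushes values.
# Illustrative only; VL's Reactive nodes wrap .NET Rx Observables.

class Stream:
    def __init__(self):
        self._observers = []

    def subscribe(self, callback):
        self._observers.append(callback)

    def emit(self, value):
        for cb in self._observers:
            cb(value)

    def map(self, fn):
        # Derived stream: every incoming value is transformed by fn.
        out = Stream()
        self.subscribe(lambda v: out.emit(fn(v)))
        return out

    def where(self, predicate):
        # Derived stream: only values passing the predicate get through.
        out = Stream()
        self.subscribe(lambda v: predicate(v) and out.emit(v))
        return out

# Imagine this is a stream of (x, y) click positions:
clicks = Stream()
received = []
clicks.where(lambda p: p[0] > 100).map(lambda p: p[1]).subscribe(received.append)

clicks.emit((50, 1))    # filtered out, x <= 100
clicks.emit((150, 2))   # passes the filter; its y value (2) is collected
```

The point is the composition: once everything is a stream, filtering and transforming are just ordinary operators chained together.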
For a while now, vvvv and VL have used these so-called Observables to handle external events (e.g. mouse, keyboard) and asynchronous data. This happened mostly under the hood, and the actual operations for observables were hidden in VL.DevLib. The reason is that out of the box the operations do not go well together with the frame-based Update concept of VL, because they are intended to be called only once or when something has changed. But as of now we have wrapper nodes for the most common observable operations that do exactly that: listen for change and only rebuild the observables when necessary.
The go-to node for handling events is definitely ForEach Region (Stateful) in the Reactive category. This region lets you place any node inside and can also remember data between two events. There is also a version Keep that can filter out events using a boolean output. The region is very similar to the ForEach region for spreads, except that its inputs and outputs are event values over time instead of slices of a spread.
You can switch or merge event sources:
There are also filtering options with OfType or Where:
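A hedged sketch of what merging and type-filtering mean, using plain Python lists of timestamped values as stand-ins for streams (the helper names are invented; real observables push values asynchronously rather than being merged eagerly like this):

```python
# Merge interleaves events from several sources by time;
# OfType keeps only events of one message type.
import heapq

def merge(*timed_streams):
    """Each stream is a time-sorted list of (timestamp, value) pairs."""
    return [v for _, v in heapq.merge(*timed_streams)]

def of_type(events, typ):
    return [e for e in events if isinstance(e, typ)]

a = [(0, "a0"), (2, "a2")]   # e.g. key events
b = [(1, 1), (3, 3)]         # e.g. controller values
merged = merge(a, b)         # ["a0", 1, "a2", 3]
ints = of_type(merged, int)  # [1, 3]
```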
Other nodes include Skip, Delay, Delay (Selector), Scan, Switch, ...
If you want to leave the observable world and pass event values to the mainloop, use one of the three nodes HoldLatest, Sampler or S+H, which all behave a little differently depending on what you need:
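As an illustration of the general idea behind such nodes, here is a tiny HoldLatest-style sketch in Python: the event side writes the latest value from a background thread, and the mainloop reads it once per frame. These are assumed semantics for the example, not VL's actual implementation.

```python
# Bridge from an asynchronous event stream into a frame-based mainloop:
# the observable side writes, the mainloop reads the most recent value.
import threading

class HoldLatest:
    def __init__(self, initial=None):
        self._value = initial
        self._lock = threading.Lock()

    def on_event(self, value):
        # Called from the event/background thread.
        with self._lock:
            self._value = value

    def read(self):
        # Called once per frame from the mainloop.
        with self._lock:
            return self._value

hold = HoldLatest(0)
hold.on_event(42)        # an event arrives in the background
print(hold.read())       # the next mainloop frame sees 42
```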
It's also pretty easy to generate event sources of your own:
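One common way to do that is to wrap an ordinary callback-based API into a stream. A hypothetical sketch in Python; `FakeButton` and `from_callback` are invented for illustration and stand in for whatever callback-driven source you want to observe.

```python
# Turn a callback-based API into an event source that others can subscribe to.

class FakeButton:
    """Hypothetical widget exposing a single on_click callback."""
    def __init__(self):
        self._cb = None
    def on_click(self, cb):
        self._cb = cb
    def click(self):            # simulate a user click
        if self._cb:
            self._cb("clicked")

def from_callback(register):
    """Create a stream fed by a callback-based API; returns a subscribe fn."""
    subs = []
    register(lambda v: [cb(v) for cb in subs])
    return subs.append

button = FakeButton()
subscribe = from_callback(button.on_click)
events = []
subscribe(events.append)
button.click()
print(events)   # ['clicked']
```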
As general advice, only send values of type Record as event data, because they are thread-safe. If you send values of any Class type, be sure that you know exactly what you are doing.
Yep, totally possible, and it has useful applications. But I am just going to let this idea sink in for now...
The above just scratches the surface of what's possible with the reactive framework. If you want to know more, browse some of the following links:
The pragmatic Rx expert from the quote above:
2 minute introduction to Rx
Visual explanation of the observable operations:
Operator Reference with marble diagrams
Videos from the creator team. Note that IEnumerable is called Sequence in VL and Spread is also a Sequence:
Erik Meijer: Rx in 15 Minutes
Erik Meijer and Wes Dyer - Reactive Framework (Rx) Under the Hood 1
Erik Meijer and Wes Dyer - Reactive Framework (Rx) Under the Hood 2
Introduction to Rx
MIDI was released in 1982 and is one of the most successful hardware communication protocols in the world. The simple nature of the protocol makes it easy to implement and, even more importantly, easy for humans to understand.
This makes it a perfect example for the first event-based library in VL, built using the MIDI Toolkit developed by Leslie Sanford.
Instead of having all settings on one node, functionality is now split across separate nodes to allow arbitrary combinations.
Device nodes have an enum input for the input/output device driver you want to use. You can have many of them, even for the same driver; under the hood they will share the actual device driver resource. The driver is opened only when necessary, for example when there is an event sink listening to it.
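The sharing described above can be sketched as a reference-counted handle: the first node opens the driver, later nodes reuse the same handle, and the driver closes when the last node releases it. A simplified Python illustration; the driver name and the open/close logic are stand-ins, not the library's real code.

```python
# Reference-counted sharing of a device driver across many nodes.

class SharedDriver:
    _open_handles = {}   # driver name -> (handle, refcount)

    @classmethod
    def acquire(cls, name):
        if name in cls._open_handles:
            handle, count = cls._open_handles[name]
            cls._open_handles[name] = (handle, count + 1)
        else:
            handle = f"open:{name}"          # stand-in for a real open()
            cls._open_handles[name] = (handle, 1)
        return handle

    @classmethod
    def release(cls, name):
        handle, count = cls._open_handles[name]
        if count == 1:
            del cls._open_handles[name]      # last user: close the driver
        else:
            cls._open_handles[name] = (handle, count - 1)

a = SharedDriver.acquire("LoopMIDI")   # first node opens the driver
b = SharedDriver.acquire("LoopMIDI")   # second node reuses the handle
assert a == b
SharedDriver.release("LoopMIDI")
SharedDriver.release("LoopMIDI")       # driver closed after last release
```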
The dynamic device enum updates as soon as a MIDI device is connected to or disconnected from the machine, so no restart is required on configuration change:
MidiIn has one observable output for all MIDI messages received on the given device. MidiOut has one input that accepts an observable of MIDI messages to send to the given device.
Following the MIDI message structure, there are filters that allow you to select only the messages you are interested in, for example only MIDI clock messages, or messages on a specific MIDI channel:
For all MIDI message types there are specific nodes to read the message content or construct new messages. These mostly wrap the native methods of the MIDI Toolkit library.
You can process a MIDI message (in fact, any event) directly as it occurs. The new ForEach region in the Reactive category executes its patch for each event that is passed in; it can transform the event into a different message type and decide via the Keep output whether to pass the current event on.
This is part of a bigger programming paradigm that was also polished for the new MIDI nodes. Definitely check out the blog post on Reactive Programming.
At some point all the asynchronous input event handling in the background is done, and you want to leave the observable world and have the processed values in the mainloop. For that there are several options:
For super-easy controller value input there is ControllerState or NoteState:
For more advanced scenarios refer to the Reactive nodes HoldLatest, S+H or Sampler which provide ways to pass event values safely to the mainloop.
If you want to generate MIDI messages in the mainloop, there is also a simple node that generates controller message events:
For other messages, use the Reactive node ToObservable, which creates an event source that you can use to send events from the mainloop.
Since VL distinguishes between a single value and a spread of values, some nodes come in a 'plural' version to allow listening, for example, for multiple channels at once.
True to its spirit of community and experimentation, the Mapping Festival, dedicated to audiovisual arts and digital cultures, returns with a unique range of workshops for its 13th edition.
Both beginners and experts will once more be invited to explore a variety of subjects with an emphasis on new technologies and effervescent creativity, under the guidance of renowned specialists from all over the world.
The workshops will be spread over the three weeks of the festival, which is taking place from May 11 to May 28, 2017 in Geneva.
The deadline for registration is April 30, 2017.
Check out our website www.mappingfestival.com for more information and register via the dedicated form!
Introduction to projection mapping (12-17 yo) /// Saturday 13 – Sunday 14 May ///
Intended for teenagers between 12 and 17 years old, this workshop will teach them how to create their first projection mapping using the fun and simple HeavyM software.
HeavyM is a ready-to-use projection mapping software developed by Digital Essence, a young team based in France. HeavyM adapts your video content to volumes with an intuitive interface and can generate real-time graphic animations, with no special skills needed.
The founder of Digital Essence will guide the participants through their first experience with projection mapping and answer their questions.
Participants will also have the opportunity to build their own projection volumes.
Artefact /// Friday 19 – Sunday 21 May ///
Mickaël Lafontaine & Xavier Seignard
This workshop will teach participants how to scan 3D objects in order to create an interactive device combining 3D animation, robotics and projection mapping. They will design a robotic set connected to an interactive 3D stage, so that the rotation of the real object controls that of the scanned 3D object. This will lead us to conduct various projection mapping experiments on moving objects.
Interactive installations /// Tuesday 23 – Wednesday 24 May ///
Barthélémy Antoine-Loeff, Nicolas Bertrand
This workshop, which will take place over two half-days, aims to explore various processes for creating interactive installations. Different topics will be addressed, such as tangible interactions, animations, autonomous video mapping, and open-air video games. Each half-day will focus on a different kind of interactive installation, how to make it, and potential artistic ideas.
3D Mapping /// Tuesday 23 – Friday 26 May ///
László Zsolt Bordos, Viktor Vicsek, Ivó Kovács
Over the course of this four-day workshop, participants will first acquire basic knowledge of mapping techniques (2D, 2.5D, 3D), photometry, lenses, and content creation. They will then use software such as Photoshop, After Effects, 3ds Max, C4D, RealFlow and MadMapper to create content and ultimately work on its rendering and finalisation.
Introduction to TouchDesigner /// Thursday 25 May ///
This workshop offers an introduction to the software TouchDesigner, an audiovisual platform that will equip you with the tools you need to create stunning real-time content and rich user experiences. During this session, you will gain essential knowledge to start creating interactive media systems, architectural projections, live music visuals, or simply rapid-prototyping your latest ideas.
Use your face as interface /// Saturday 27 May ///
Popesz Csaba Láng
The human face is one of the most frequently tracked and recognised objects in computer vision today. By tracking a face and following its movements, a machine can interpret them to recognise emotions. Why not reverse the roles and control the machine with your face? Using Pure Data, a visual programming language for multimedia creation, you will literally control video playback with the movements of your face.
We are looking forward to seeing you there!
one of the more basic things any programming library has to support is the parsing and creation of XML data structures. since vl is based on .net we don't have to invent anything here but can make direct use of .net's XDocument, XElement and XAttribute datatypes. so we're happy to announce that in cooperation with dominikKoller we added xml/json support to vl:
so basically anything you could already do in vvvv, plus some more. and this is only what we brought to the surface for you. by using the underlying .net datatypes (XDocument, ...) directly, a pro-user will (later) easily be able to use the whole range of functions those datatypes provide for more advanced use-cases.
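for comparison, here is the same kind of round-trip in python's standard library. vl itself uses the .net types (XDocument etc.); this is only an analogy, and the sample document is made up:

```python
# Parse XML and JSON, then read values back out: the same basic
# operations the new vl nodes expose on top of .net's types.
import json
import xml.etree.ElementTree as ET

doc = ET.fromstring('<patch name="demo"><node id="1"/></patch>')
print(doc.get("name"))                 # demo
print([n.get("id") for n in doc])      # ['1']

data = json.loads('{"name": "demo", "nodes": [1, 2]}')
print(data["nodes"])                   # [1, 2]
```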
so, once again, something for the whole family..available in latest alphas now.