Blog
After NODE 2020, having seen all the wonderful things Stride and vvvv can do together, it was inevitable to fall head first into the adventure of bringing Stride and VL.OpenCV into a playful and seamless friendship.

I am happy to announce that as of version 1.2.0 of VL.OpenCV, you can effortlessly and painlessly:

Find your camera's position based on a pattern or marker

Need to know where your camera is and what it's looking at based on an Aruco marker or a chessboard calibration pattern?

Say no more:

Find camera position and rotation based on Aruco marker 2

And now from the outside:

Find camera position and rotation based on Aruco marker 2

Dizzy yet?
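For the curious, the linear algebra behind this camera-position recovery is compact. Here is a minimal sketch in Python (illustrative only, not the actual VL.OpenCV code; `camera_world_position` is a made-up helper name): solvePnP-style extrinsics give a rotation R and translation t that map world coordinates into camera coordinates, so the camera's own position in world space is -Rᵀt.

```python
# Illustrative sketch only -- not the VL.OpenCV implementation.
# Extrinsics from a solvePnP-style calibration satisfy: p_cam = R @ p_world + t.
# Inverting that gives the camera's position in world space: -R^T @ t.

def transpose(m):
    return [[m[j][i] for j in range(3)] for i in range(3)]

def matvec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def camera_world_position(R, t):
    """World-space camera position from extrinsic rotation R and translation t."""
    return [-x for x in matvec(transpose(R), t)]

# Camera turned 180 degrees around Y, pushed 2 units along its own z axis:
R = [[-1.0, 0.0, 0.0],
     [ 0.0, 1.0, 0.0],
     [ 0.0, 0.0, -1.0]]
t = [0.0, 0.0, 2.0]
print(camera_world_position(R, t))  # the camera sits 2 units along world +Z
```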

Estimate the pose of an Aruco marker to create AR applications

Bring 3D objects and animations into your image using Aruco markers to create augmented reality projects:

AR Teapot

Calibrate a projector

Remember this beauty? It helps you figure out the position and characteristics of your projector in your 3D scene.

Calibrate projector

And reproject

Once you know where your projector and the spectator are, you only need to worry about the content. 3D projection mapping made easy!

Reproject

Not bad huh?

So there you have it boys and girls, 3D computer vision based adventures for all! Head to your local nuget distributor and grab a copy while it's still hot.

A big thank you to motzi, gregsn and tebjan for their invaluable help as well as to many others who contributed one way or another.

And as always, please test and report.

Keep your cameras calibrated kids!

Happy holidays!

Changelog

Added Stride compatible versions of
  • SolvePnP
  • ApplyNearAndFar
  • Perspective
  • ExtrinsicsToViewMatrix
  • ExtrinsicsToProjectionMatrix
New and improved help patches
  • Calculate a camera position using Aruco
  • Calculate a camera position using SolvePnP
  • Estimate the pose of Aruco markers
  • Calibrate a projector and reproject
  • Calibrate a camera
Bug fixes for
  • EstimatePose
  • FindChessboardCornersSB
  • VideoIn nodes
  • VideoPlayer nodes
  • CalibrateCamera
  • Others
New nodes
  • VideoIn nodes with lower level access to the device index, enabling use of previously unsupported devices
General cleanup
Improved documentation
Moved beta OpenCV to DX11 transformation nodes to a separate document (\vvvv\nodes\vl\VVVV.OpenCV.vl)
ravazquez, Wednesday, Dec 16th 2020

Here is,

another addition to the series of things that took too long. But then, they also say it is never too late... VL has shipped with OSC and TUIO nodes from the beginning, but frankly, they were a bit cumbersome to use. So here is a new take on working with these two ubiquitous protocols:

OSC

Receiving OSC messages

To receive OSC messages you place an OSCServer node, which you configure with the IP and port you want to listen on. Its Data Preview output immediately shows whether it is receiving any OSC messages at all.

Then use OSCReceiver nodes to listen to specific addresses. Either specify the address manually or hit the "Learn" input to make the node listen to the address of the next OSC message it receives.

Note that the OSCReceiver is generic, meaning it'll connect to whatever datatype you want to receive. Supported typetags are:

  • i: Integer32, h: Integer64
  • f: Float32, d: Float64
  • s: String, c: Char
  • r: RGBA color
  • b: blob byte[]
  • T: true, F: false

In case of multiple floats, you can also directly receive them as vectors. And this works on spreads of the above types and even on tuples, in case you're receiving a message consisting of multiple different types.
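For those curious what actually travels over the wire, here is a hedged sketch of how an OSC message with typetags is laid out (illustrative only; the VL nodes handle all of this for you, and `encode_message` is a made-up helper, not part of any vvvv API):

```python
import struct

# Illustrative sketch of the OSC wire format -- the VL nodes do this for you.

def _osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a multiple of 4 bytes."""
    b = s.encode("utf-8") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode_message(address: str, *args) -> bytes:
    typetags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):            # check bool before int!
            typetags += "T" if a else "F"  # booleans live in the tag string only
        elif isinstance(a, int):
            typetags += "i"
            payload += struct.pack(">i", a)  # big-endian Integer32
        elif isinstance(a, float):
            typetags += "f"
            payload += struct.pack(">f", a)  # big-endian Float32
        elif isinstance(a, str):
            typetags += "s"
            payload += _osc_string(a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _osc_string(address) + _osc_string(typetags) + payload

msg = encode_message("/fader/1", 0.5)
print(msg)  # padded address, then ",f", then the four float bytes
```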

Sending OSC messages

To send OSC messages you first need an OSCClient which you configure with a ServerIP and Port. Then you're using SendMessage nodes to specify the OSC address and arguments to send. Again note that the "Arguments" input is generic, so you can send any of the above types, spreads of those and even tuples combining different types!

By default, vvvv collects all the data you send and sends it out in bundles once per frame. For optimal use of the UDP datagram size (depending on your network), you can even specify the maximum bundle size on the OSCClient node.
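To illustrate the bundling idea (this is not the actual OSCClient implementation; `split_into_bundles` is a hypothetical name): already-encoded messages can be greedily packed into bundles that stay under a maximum datagram size, accounting for the 16-byte bundle header and the 4-byte size prefix each element carries.

```python
# Illustrative only -- the real OSCClient node bundles internally.
# An OSC bundle is "#bundle\0" (8 bytes) plus an 8-byte timetag, and each
# contained message is prefixed by a 4-byte size field.

BUNDLE_HEADER = 16   # "#bundle\0" + timetag
SIZE_PREFIX = 4      # per contained message

def split_into_bundles(encoded_messages, max_datagram_size):
    """Greedily pack encoded messages into bundles under max_datagram_size."""
    bundles, current, size = [], [], BUNDLE_HEADER
    for m in encoded_messages:
        need = SIZE_PREFIX + len(m)
        if current and size + need > max_datagram_size:
            bundles.append(current)
            current, size = [], BUNDLE_HEADER
        current.append(m)
        size += need
    if current:
        bundles.append(current)
    return bundles

# Four 20-byte messages with a 100-byte limit fit as 3 + 1:
print([len(b) for b in split_into_bundles([b"x" * 20] * 4, 100)])  # [3, 1]
```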

These are the basics. There are a couple more things, which are demonstrated in the howto patches!

TUIO

Receiving TUIO data

For receiving TUIO data you use a TUIOClient node, which you configure with the IP and port you want to listen on. The client already returns a spread of cursors, objects and blobs that you can readily access.

Sending TUIO data

For sending TUIO data you use a TUIOTracker node, which you configure with a ServerIP and Port. Then you give it a spread of cursors, objects and blobs to send out.


Available for testing now, in latest 2020.3 previews!

joreg, Wednesday, Dec 9th 2020

Previously on vvvv: vvvvhat happened in October 2020


November!

A seemingly calm month, but it is boiling under the covvvvers: First, you notice that we continue to update the 2020.2 release with bugfixes. The latest release is vvvv gamma 2020.2.4.

Then, as mentioned previously, we're currently mostly focused on getting a stable 2020.3 out, which will include VL.Stride, our shiny new 3d engine. Best of all: you can follow our daily progress by downloading the preview releases. They already come with tons of help and demo patches. Give it a spin!

Colors for nothing and shadows for free!

If you haven't yet, watch the recording of the last vvvv meetup where, among others, texone shows us how he already takes VL.Stride to the next level...

And finally done: the completely reworked, easy-to-use OSC and TUIO nodes, which will show up in one of the coming previews soon!

Contributions

Two new ones:

A little teaser:

And some new works in progress:

Learning material

Gallery

塵-The Dust by 杜鹏程

Jobs


That was it for November. Anything to add? Please do so in the comments!

joreg, Friday, Dec 4th 2020

Previously on vvvv: vvvvhat happened in August 2020


So once again, where were we...

oh, NODE20 just happened and never mind if you missed it, you can still access the recordings of all workshops!

If you haven't noticed yet, the latest previews for vvvv gamma now include VL.Stride, the fancy new 3d engine. We're quite happy with the feedback so far. Things mostly seem to work as expected. We're now focusing on making this preview into the first 2020.3 stable release including VL.Stride. But 3d is not all, we've also included a few other goodies in the 2020.3 branch, which are summarized in a separate blog post with the juicy title: vvvv - The Tool.

vvvv gamma 2020.2 is out and also the latest previews for beta41 already include the 2020.2.x branch of VL!

Contributions

Some new:

Updates:

Quite a few new works in progress:

Gallery

Visuals Reel by c nissidis

Jobs


That was it for September and October. Anything to add? Please do so in the comments!

joreg, Thursday, Nov 5th 2020

Probably,

the biggest NODE so far, in terms of reach. At least if you want to believe the viewing numbers on the videos of the daily streams. This time the whole world was able to participate, not only a handful of privileged people able to come to Frankfurt. What an undertaking to run a pop-up TV station for 7 days alongside a 2-track, 9-hours-a-day workshop program...

On behalf of the whole team that made this edition possible, vvvv wants to thank david and Jeanne Charlotte Vogt, directors of NODE20 - Second Nature, for pulling the strings. Once again very well done, chapeau!

The team was huge and a lot of different things happened over the course of this week, too numerous to recap here. So in this blogpost I want to particularly summarize the vvvv focused parts and highlight the members of the vvvv community who helped make NODE20 possible.

The keyvisual by artist duo a_a_a_a and mburk calmly meandered, in the form of an AR manifestation, through most of the studio broadcasts.

TV Shows

You should watch them all: 7 days of quality panels and discussions around this year's topic "Second Nature". But then, as promised, the following is a listing of the more vvvv-related shows for your viewing pleasure:

  • The Emergency Broadcast Studio - Behind the Scenes, where readme talks about how he and bjoern took on the brave task of running the AR studio on VL.Stride, which was still very much in the works when they started... A must-watch to get an impression of the effort behind the studio setup. Credits also go to kopffarben, who helped run the actual shows in the end!
Backstage at the EBS

Workshops

Massive thanks go to the whole team of The NODE Institute who ran the workshop and streaming operations on the ground together with: ravel, sebescudie, Rayment, katzenfresser and Ben Schiek.

And of course to every single one of the 26 workshop hosts and co-hosts who took the time to bring their knowledge to all of us: andresc4, Anna Meik, antokhio, baxtan, domj, dottore, elias, everyoneishappy, gregsn, Gene Kogan, hayden, idwyr, joreg, jule, kleinkariert, lasal, Maria Heine, Marian Dziubiak, motzi, ravazquez, sebl, sunep, Takuma, tonfilm, untone, vux.

The good news: even though NODE is over, you can still access the recordings of all workshops!

Support

NODE is a community effort. Everyone is chipping in what they can. So finally I want to list a few companies without whose continued support in the form of material or human resources, NODE20 would not have been possible:


vvvv takes a deep bow in front of everyone mentioned. I sincerely hope I didn't forget anyone's contribution, but am well aware that this is not unlikely. So in case I missed someone, please let me know so I can add the info here!

After NODE is before the next NODE.
Back to work!

joreg, Thursday, Nov 5th 2020

vvvvolks. What a node!

The complete team and I are slowly recovering. NODE was a blast and we can be incredibly proud to have made it happen under the 2020 conditions. I do believe the hybrid approach has real future potential. Heads are already spinning with ideas for what the next NODE could look like.
To get some structured feedback we have setup this survey for all participants:
https://forms.gle/tMiVEMkQL89NMnKu8
You can help make NODE better by filling it out. Thank you so much!

The recordings

Many have asked us for the workshop recordings. And here is some good news:

  • For those who attended NODE20: please go ahead and login to NODE20 on Talque to review the workshops there.
  • For those who did not attend but want to learn the same: We have bundled all the recordings into one course at the NODE Institute, which you can buy for only 50 Euros. 30 workshops, 90 hours of learning. It's massive!

Here is the story behind the decision: When we announced the festival in July/August, it was clear that we had to give ticket owners exclusive access to the recordings afterwards to actually get them on board the festival. Otherwise, we assumed, many could have chosen to simply wait until the festival was over and watch the public recordings. The festival would not have worked at all.

Now, after the festival, it feels a bit unnatural to hide the recordings from curious new people. Why not ride the wave of attention we created? Selling the recordings became an option. It would also help close a financial gap in the overall festival budget. After some talks with the hosts about how to handle this fairly, we came to the conclusion that we will split the income between the instructors and the festival. This feels natural, as the Institute's idea is to help sustain the community and help instructors earn revenue for their educational work. The income does not go to the vvvv group but to all community instructors.

Love goes to all of the instructors, organizers and contributors. We are all deeply thankful for their effort and contribution. ravel, sebescudie, Rayment, katzenfresser and Ben Schiek, andresc4, Anna Meik, antokhio, baxtan, domj, dottore, elias, everyoneishappy, gregsn, Gene Kogan, hayden, idwyr, joreg, jule, kleinkariert, lasal, Maria Heine, Marian Dziubiak, motzi, ravazquez, sebl, sunep, Takuma, tonfilm, untone, vux, readme, bjoern, kopffarben and more vvvv people in the program.

Thank you !
David for The NODE Institute and Festival Team

david, Wednesday, Nov 4th 2020

The long wait is over!

vvvv gamma 2020.3 public previews now include VL.Stride, the new 3d rendering library, based on the opensource Stride 3d engine. You be the judge, but spoiler: this is rather huge!

Massive thanks go out to all early accessors who helped us uncover and fix countless bugs that you no longer have to run into. So this is also thanks to them. You're welcome!

Status

All of the basics are now in place. Find your favorite among these:

  • Primitives: Plane, Box, Sphere, Cylinder, Cone, Capsule, Donut and Teapot
  • Instancing: Via spread of transforms, other entities or buffers
  • Lights: Ambient, Directional, Point, Spot, Projector, Skybox
  • Shadows: On by default, configurable in quality/resource-consumption
  • Materials: Highly configurable PBR workflow through a large set of nodes, incl. easy normalmapping, displacements,...
  • Textures: From file, video, Spout, Skia, HTML or renderer
  • TextureFX: Basic selection available, more to come
  • Texture Feedback: Yes please!
  • Texture Readback: Absolutely, think pipet,...
  • ShaderFX: Experimental nodes to patch shaders visually
  • PostFX: Highly configurable via set of nodes, think: depth-of-field, bloom, ambient-occlusion, ...
  • Dynamic Meshes: Generate meshes using vertex- and indexbuffers on CPU
  • Shaders: Pixel, Vertex, Geometry, Compute. Write your own using full syntax highlighting in VisualStudio with hot-reload
  • Assets from file: Load textures and models directly from file
  • Assets from Stride Game Studio: Prepare assets and complete scenes in game studio
  • Windowing: Easy handling of multiple windows (and cameras)
  • Misc: Render Skia and HTML content directly onto the screen (ie. no texture-pass needed)

To give you an idea, here is a random collection of screenshots of what early accessors have created with this already.

Still missing

To give you a heads-up, here are things you might expect already but are yet to come:

  • Loading models does not yet bring in all their materials and animations. To get a model's materials showing automatically, you need to load it as an AssetModel via an extra Stride project. Animations are not yet supported.
  • Simple Text Rendering: for now best done via Skia or HTMLRenderers
  • Physics nodes are available in the Experimental section, meaning you can use them, but we still want to give them one more round of refinement
  • VR support is still to come. Stride supports it, we just haven't exposed it properly yet
  • Corresponding functionality for what you'd do with the most popular contributions InstanceNoodles, DX11.Particles and FieldTrip is still missing

And then some more, but the above should be the most obvious ones you'll stumble upon.

How to get started?

Open the Helpbrowser (F1) and check out the explanations, howtos and examples. Remember their preview status, i.e. they are not yet in their best shape. But they should help you find your way.

And if you really got nothing better to do in the week of October 2nd to 8th, then consider joining us for NODE20 where we have the following series of workshops dedicated to getting you started with VL.Stride:

A ticket costs 50€ and gives you full access to all of the above and many more workshops and their recordings.

Pro-tip: This is a so-called no-brainer!

Thanks

A couple of people believed in the development of VL.Stride from the beginning and substantially supported its development. We bow before you:

joreg, Thursday, Oct 1st 2020

bonjourbonjour,

Following tests from a few months ago regarding publishing your shiny VL nugets with Github Actions, we now have a dedicated action ready to be used!

Nuget what?

For more information on nugets, please refer to this section of the Gray Book.

Github what?

Github actions are small scripts with a specific purpose, allowing you to automate tasks on your repos. They are the building blocks of what's called a workflow: you chain several actions one after the other in your own small script and decide under which condition the workflow is triggered (a new commit on master, a new tag, etc.).

There are already tons of actions allowing you to do all sorts of things from creating issues to running unit tests, and even make phone calls with a predefined text!

The PublishVLNuget action

This action allows you to easily do the following tasks:

  • Build your Visual Studio solution, if your plugin has one
  • Download a package icon from an external URL if you don't want to commit it to your repo every time
  • Pack your nuget either using a nuspec or a csproj file
  • Publish it to nuget.org (or any other feed)

In other words, you just have to push your code/patches and nuspec to github, and the script takes care of bringing it to nuget.org for you.

You can find the list of input parameters the action expects here.

How do I setup a workflow?

To get started with workflows, head over to the Github documentation.
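As a starting point, a complete workflow file wrapping the action could look like this (file name, trigger and paths are just one possible setup, not prescribed by the action; the `nuspec` path mirrors the example below):

```yaml
# .github/workflows/publish.yml -- one possible setup, adapt to your repo
name: Publish to NuGet
on:
  push:
    branches: [ master ]
jobs:
  publish:
    runs-on: windows-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Publish VL Nuget
        uses: vvvv/PublishVLNuget@1.0.28
        with:
          nuspec: deployment/VL.MyPlugin.nuspec
          nuget-key: ${{ secrets.NUGET_KEY }}
```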

Use cases

The action can adapt to many different scenarios. Let's cover three cases so you get an idea of how the action works, and how to adapt it to your needs.

My plugin does not have a visual studio solution

So your plugin solely consists of one (or many) VL documents and some help patches, plus your nuspec file that describes how your package will be structured.

(...) 
     - name: Publish VL Nuget
       uses: vvvv/PublishVLNuget@1.0.28
       with:
         nuspec: deployment/VL.MyPlugin.nuspec
         nuget-key: ${{ secrets.NUGET_KEY }}

Here, we are just using the nuspec and nuget-key inputs of the action.

My project has a Visual Studio solution and no nuspec file

Your csproj file can also describe how your nuget will be packed. In that case, simply omit the nuspec input. Note that if you provide a nuspec file anyway, it will take priority over your csproj.

I want to push to another nuget feed

By default, the action will push your package to nuget.org. You can simply use the nuget-feed input to push to a different feed.

Regarding external icons

I want to use an external icon, and my plugin has a nuspec file

The icon must be downloaded to an existing folder in your repo. We suggest you simply download it to its root:

(...)
- name: Publish VL Nuget
  uses: vvvv/PublishVLNuget@1.0.28
  with:
    (...)
    icon-src: https://wwww.url.to/nugeticon.png
    icon-dst: ./nugeticon.png

Paths in the workflow file are relative to the root of the repo!

Here, we ask the Github Action to download the icon from our URL and place it at the root of the repo.

Then, in the files section, your nuspec file must reference the icon from where the action downloads it (src attribute) and place it wherever you like (target attribute), making sure target matches where the metadata section expects it.

(...)
    <metadata>
        (...)
        <icon>icon\nugeticon.png</icon>
    </metadata>
    <files>
        (...)
        <file src="..\nugeticon.png" target="icon\" />
    </files>
(...)
Paths in the nuspec file are relative to where the file itself is placed!

I want to use an external icon and my plugin does not have a nuspec file

You can set up an icon for your project inside Visual Studio. The tricky part is that you'll have to specify a path to a file that does not exist yet, since the action will take care of downloading it later on. This can feel weird, since Visual Studio's UI gives you a Browse button to pick a file. Simply fill in the path manually to match the icon-dst property of your workflow file (i.e. where the action will place the downloaded icon).

For instance, your workflow file would look like this:

(...)
- name: Publish VL Nuget
  uses: vvvv/PublishVLNuget@1.0.28
  with:
    csproj: src\Whatever\Whatever.csproj
    icon-src: https://wwww.url.to/nugeticon.png
    icon-dst: ./deployment/nugeticon.png
    nuget-key: ${{ secrets.NUGET_KEY }}

and your VS package settings:

Thanks for reading, hope you'll enjoy using this one! If you are stuck or want more details, don't hesitate to shout in the comments or in the forums.

Cheeeerz

sebescudie, Friday, Sep 25th 2020

Hello everyone,

Introduction

I'd like to give you an update on the toolkit that vvvv has always been. While vvvv beta can be described as a dynamic system, mutating while you mold your patches, vvvv gamma and its workhorse VL are of a different kind. With VL we embraced features like

  • static typing with its ability to detect errors early,
  • .Net DLL import opening a universe of possibilities,
  • user-defined data types that interplay with those defined by others,
  • compilation with its ability to export an app as an executable...

In short, we embraced robust software development strategies that at first seem to contradict the idea of a playful development toolkit that lets you mold your app. We went for compiled patches, running and hot-swapping them while you are building them.

But we envisioned vvvv to be both

  • the playful toolkit you fell in love with
  • combined with the promises of a compiled language

While my last blog post was about the language, let's focus on the toolkit this time.

Toolkit

Let's have a look at some features that allow you to interact with the VL runtime, the system that runs your patch while it allows you to edit it. The features here empower you to enrich the patching experience. We understand that these improvements need to "trickle up" into the libraries and only thereafter will have an effect for all users.

So the following is probably mostly interesting for advanced users and library developers.

Tracking Selection within the Patch Editor

You can now react to a selection within the patch editor. The more libraries do this, the more playful the environment gets. We still have to figure out all the use cases, but here is a list of what's possible already:

  • separate the core functionality from its Editor UI. Imagine a TimeLine node that is decoupled from the timeline editor.
  • an Inspector for nodes or pads
  • a Preview like this:
preview nodes
  • even the help browser itself uses the feature to provide help for the selected node.

And there is more:
You can get a Live Element for a certain Pin or Pad.

  • Copy the permanent identity of the element into the clipboard by CTRL-SHIFT-I (I stands for Identity).
  • GetLiveDataHubForSerializedID hands you the pin or pad.

This is useful for cases where you always want to inspect a specific pin or pad of some patch, which can be helpful for debugging.

Let the Patch Editor navigate to a Patch

When a Skia Renderer is your active window, Ctrl-^ lets you jump to the patch in which it is used. This is handy when you've opened a bunch of help patches and want to see the help patch that is responsible for the output.

You can use the node ShowPatchOfNode to do the same trick.

Tooltips for your own data type

Here you can see a custom tooltip for a user-patched type "Person".

A european

You can now patch your own tooltip with RegisterViewer. This way the patching experience will be so much more fun. We're in the process of adding more and more viewers for our types.

Runtime Warnings

Up to now, we had

  • Red elements: static errors (e.g. a node that can't be found). These errors make the compiler ignore certain parts of your program, as they are currently in development. The rest still runs (something C# and others just can't do).
  • Orange socks on links: static warnings, i.e. potential problems. Something to look at when searching for a bug.
  • Pink nodes: runtime errors. A problem that was only detected at runtime and is severe enough that the system can't work as planned. Some patches don't run as planned. There are different ways to handle these; they point you at problems at runtime, but they can be painful.

And now we introduce to you:

  • Orange nodes, runtime warnings: they show you a problem at runtime, but one that doesn't harm your system the way pink nodes do. Library developers can put warnings on their nodes in order to communicate to the user that something is slightly off.

You can try it yourself by using the Warn or the Warn (Reactive) node.

just a reminder

The warning will show up not only on the Warn node, but also on the applications of your patch.

S&R nodes

Sometimes it's just convenient to be able to send data from one patch to another without feeding the data through pins. We now have send and receive nodes, as known from beta.
Features:

  • The channel can be anything. It doesn't have to be a string.
  • They come with several warnings, e.g. for when no sender or many senders are on a channel that a receiver is listening to.

Descriptive Tree patching

Some libraries focus on a simple idea:
Let the user build an object graph that describes what they want in a declarative manner, and the library will do the hard work of following that description.

a tree

Examples for this approach are

  • VL.Stride
  • VL.Elementa
  • to some extent VL.Skia

VL.Stride and VL.Elementa have in common that they focus on a very specific type of object graph: a tree made out of entities and components.
Libraries like these can now talk to the user and enforce that they build not just any kind of graph, but a tree-shaped graph (where one child doesn't have many parents).

VL.Stride uses TreeNodeParentManagers, Warn nodes and S&R nodes internally to deliver this feature:

no tree. we get a runtime warning
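The tree constraint itself is simple to state in code. Here is a sketch (not VL's actual TreeNodeParentManager logic; `is_tree` is a made-up helper): given a parent-to-children mapping, every node may have at most one parent and no node may be its own ancestor.

```python
# Sketch of the tree constraint (not VL's actual TreeNodeParentManager code).
# children_of maps each parent to the list of its children.

def is_tree(children_of):
    parents = {}
    for parent, children in children_of.items():
        for child in children:
            if child in parents:   # a child with two parents -> not a tree
                return False
            parents[child] = parent
    for node in parents:           # walking up must never revisit a node
        seen = set()
        while node in parents:
            if node in seen:
                return False       # cycle
            seen.add(node)
            node = parents[node]
    return True

scene  = {"root": ["a", "b"], "a": ["c"]}               # a proper tree
shared = {"root": ["a", "b"], "a": ["c"], "b": ["c"]}   # "c" has two parents
print(is_tree(scene), is_tree(shared))  # True False
```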

You'll very soon be able to inspect those patches.

Help patches to all those topics will show up in the CoreLib API section (at the bottom of the listing).

We hope you'll enjoy these ways of integrating with the development environment.
Thank you and we'll see you soon!

yours devvvvs

gregsn, Wednesday, Sep 23rd 2020

Helo evvvveryone!

Are you teaching or studying vvvv in an educational institution? Want to join NODE20 with a group of students? Please get in touch, we want to offer you a discount!

  • Email edu@vvvv.org
  • Tell us about your institution and how many students want to join
  • We'll get back to you with an offer

NODE20 features 25+ online vvvv workshops on various topics covering the needs of beginners and advanced users. This means the week of October 2nd to 8th will be a very good moment to divvvve deep. Besides, there'll also be a rather fine conference program you'll not want to miss.

vvvv in education

You may say we're biased, but we believe that vvvv is one of the more suitable ways to get people in touch with topics like creative coding, generative design, computer graphics, interaction design, data visualization, computer vision, physical computing, machine learning and similar. This is even more true for the all-new vvvv gamma. Why? Here is a list of pros and cons with a focus on use in education:

Pros

  • It is free for non-commercial/educational use without any restrictions or registration
  • It is quick and easy to install
  • Its visual live-programming approach allows you to get to results and iterate quickly
  • It comes with a HelpBrowser that makes it easy to find tutorials and documentation on various topics
  • It is easily extendable with custom nodes by writing standard C# code or using almost any .NET nuget
  • All of its libraries are open-source, meaning they can be explored, learned from and extended
  • It uses industry-standard programming concepts known from object-oriented programming that, once understood, can also be applied in other programming languages
  • The core developers and a fine community of users offer direct support via chat and the forum, basically 24/7/365

Cons (status fall 2020)

  • It runs on Windows only
  • It cannot export to mobile, the web or microcontrollers

Have more? Let us know in the comments.

Hope to see you at NODE!

joreg, Monday, Sep 21st 2020
