Weird behavior of Decay (Animation) node if it's in a subpatch that isn't being evaluated every frame

Hi, I’m observing some rather unexpected behavior when I put a Decay (Animation) node into a subpatch, and only let that subpatch evaluate every couple of frames.

It seems like the Decay node determines the step size it needs to take by dividing the decay by the current FPS of the vvvv instance.
This makes it actually decay much slower than specified if the subpatch isn’t evaluated every frame.

I’d expect that if I specify one second of decay time, it will decay for exactly one second, no matter what my frame rate is, taking variable-size steps if required. It seems to do that if it gets evaluated every frame, but it doesn’t seem to take into account that this isn’t the case here.
Whatever it does seems to be based on the number of times it’s evaluated and the mainloop framerate, not on any kind of wallclock time.

See attached patch for details, and feel free to request further information if you need any.

weird-decay.zip (2.4 kB)

If you stop the evaluation of the subpatch, how is Decay supposed to know that time is passing? If Decay is supposed to go on with its own calculations while Evaluate = false, then what would be the point of Evaluate?

Every four frames.

Yes, that’s how it works. In your patch the Decay output pin decreases its value by 1/fps. Raise the MainLoop framerate to 20 and you’ll see the step becomes 0.05.
I’d say this is pretty common in framerate-based software (that’s why 60fps games have smoother animations than 30fps ones).
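
In rough code, that frame-based stepping would look something like this (just a sketch based on the behavior described above - the names and the step formula are illustrative, not actual vvvv internals):

```python
# Sketch of the frame-based stepping (not actual vvvv code): the step depends
# only on the configured framerate, so skipping evaluations makes the decay
# run slower than the specified decay time.
def frame_based_decay(value, mainloop_fps, decay_time=1.0):
    step = 1.0 / (decay_time * mainloop_fps)   # 0.10 at 10fps, 0.05 at 20fps
    return max(value - step, 0.0)

value = 1.0
for frame in range(10):            # one second of frames at a 10fps mainloop
    if frame % 4 == 0:             # subpatch only evaluated every 4th frame
        value = frame_based_decay(value, mainloop_fps=10)
print(value)                       # roughly 0.7 instead of 0 after one second
```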

It does decay for one second; it’s just that since you command it to stop calculating every now and then - or rather, it only gets to calculate every 0.40s - and before it can reach 0 you feed it another 1 to decay from, it never reaches 0.
If you just let Decay work on every frame vvvv runs, you would get the expected behavior. But you set the vvvv framerate to 10fps and then lower the subpatch fps further by setting Evaluate to false.

I’d say you didn’t patch this, meaning that if you want such a thing you should probably patch something that returns that result. Since you’re actually stopping the subpatch’s work, you should let it finish its job before feeding it a new value - for example by setting the LFO Period to 4.40s if you want it to go from 1 to 0 in a 10fps environment, which is 0.40 * 11 (11 being the number of steps to go from 1 to 0 with a 0.10 step) - or have a look at the attached patch.
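
The numbers above, written out (following the step counting in this post; the names are just placeholders):

```python
# Worked version of the arithmetic above (hypothetical helper names):
eval_interval = 4 / 10.0      # subpatch evaluated every 4th frame at 10fps -> 0.40s
steps = 11                    # steps counted here to go from 1 to 0 with a 0.10 step
print(eval_interval * steps)  # 4.4 -> the suggested LFO Period of 4.40s
```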

Hope this helps.

weird-decay_mod.v4p (42.2 kB)

I’m not sure if you fully understood what I mean.

The Decay node takes a decay time in seconds, not frames. If it were a frame count, I’d have expected this behavior.
However, it does seem to adapt to the actual evaluation rate of the main patch, i.e. if that drops below the configured frame rate, the decay compensates by making a bigger step.
This further supports my understanding that it’s trying to make the decay take the specified wallclock time, not some number of evaluations guessed from the configured FPS.
This leads me to the assumption that if I don’t let it evaluate, by putting it into a subpatch with disabled evaluation, it will of course keep its output value while it’s disabled, but when it can evaluate again it will make a bigger step to get “back on track”.

What I’m trying to do here is let a whole bunch of subpatches tick synchronously with the DMX bus that they output data to, while having a higher mainloop framerate for timing resolution reasons (especially tap sync precision). I would not expect that to affect this kind of time constant.
An LFO, for example, behaves as I’d expect in this situation: if I only evaluate it occasionally, it makes bigger steps to compensate and meets its time constant. I’d expect Decay to do the same; the two nodes behaving so differently seems inconsistent at best.
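
A rough sketch of the compensating behavior I mean (illustrative only, not the actual LFO implementation): the phase advances by the real elapsed time since the last call, so rare evaluations simply take bigger steps.

```python
import time

class WallclockLFO:
    """Illustrative time-based LFO sketch, not the actual node code."""
    def __init__(self, period=1.0):
        self.period = period
        self.phase = 0.0                 # 0..1, like the standard Output
        self.cycles = 0                  # whole cycles completed
        self.last_eval = time.monotonic()

    def evaluate(self):
        now = time.monotonic()
        dt = now - self.last_eval        # real time since the last evaluation
        self.last_eval = now
        self.phase += dt / self.period   # bigger dt -> bigger step
        self.cycles += int(self.phase)   # carry over completed cycles
        self.phase %= 1.0
        return self.phase, self.cycles
```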

Why didn’t you post a patch showing two nodes behaving differently - i.e. putting an LFO inside the subpatch along with the Decay node? That would be very helpful.
BTW, this made me think of a bug in LFO instead :D. If one stops evaluation of a patch containing an LFO for a few seconds, at re-evaluation you’ll find that Cycles “jumps” by many values.
I also tried LinearFilter and Damper, though these wouldn’t fit your needs: LinearFilter reaches 0 in 12 frames (4 + 1 with Evaluate = false + 4 + 1 with Evaluate = false + 2, to reach 1 second), and Damper in 12 frames as well. So, more inconsistencies.
I still think that stopping evaluation is not the best choice in your scenario, and that you could, for example, reset “something” in your patch.
Just saying: Integrate (Differential) would behave the same way as Decay.

Hope this helps.

fixed in upcoming release. sorry for your trouble.

I think this should be noted somewhere in the docs for plugin developers. I was not aware of this difference between “evaluate” (as in evaluate a function at time X) and “enabled” (as in “stop doing something”).

This means that for every node there needs to be a definition of whether it is time-based (i.e. it can be evaluated at any point in time, including the future) or previous-value-based (i.e. it can only be advanced stepwise), and then plugin devs need to decide which of the two their node is…

And, of course, there is a grey area here - what about Integrate, for which both definitions seem valid: take the last value and add something vs. integrate over the elapsed time and add the result. And then Newton etc., which computationally need the last result to go on, or would basically have to do an internal integrate to be consistent with Damper/Decay… headache :)
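
To make the distinction concrete, here is how the two definitions might look for an Integrate-like node (a sketch with my own naming, not the vvvv API):

```python
# Illustrative sketch, my own naming - not the vvvv API.
import time

def integrate_stepwise(state, rate, frame_time):
    """Previous-value-based: one fixed step per evaluation; skipped frames are lost."""
    return state + rate * frame_time

class IntegrateTimeBased:
    """Time-based: steps by real elapsed time, so skipped frames are caught up."""
    def __init__(self):
        self.state = 0.0
        self.last_eval = time.monotonic()

    def evaluate(self, rate):
        now = time.monotonic()
        self.state += rate * (now - self.last_eval)
        self.last_eval = now
        return self.state
```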

@gregsn: what’s the official standpoint on that? As h99 points out, there are already a lot of inconsistencies there. Maybe also something to think about for vvvv50…

Of course, fortunately, I think not a lot of people are using Evaluate for such computation thingies :)

I would like to ask this too:

I would say that every node that lets the user work with time should treat time in a respectful manner: it flows - no matter whether the node is evaluated or not.
Every node that doesn’t work with time should work frame-based (fewer calls slow down the animation).

Time-based nodes:
An LFO (with a period of one second) that we didn’t check for 10 seconds should tell us that it cycled 10 times in the meantime.
An LFO with a period of one millisecond should also spin several times in the background from frame to frame in a standard 60fps use case.

A user should be right in assuming that a time-based animation can be imagined as a continuous curve (as known from a timeline) that can be sampled at any point in time - most importantly for the future, and typically using a global frame time.

The standard Output of an LFO and the Cycles output may be seen as one linear curve split into its fractional and whole parts.
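
In code, that view could be sketched like this (illustrative only, not actual vvvv code): sample the curve at any global time t and split the result into whole and fractional parts.

```python
import math

# Illustrative sketch of the "one linear curve" view, not actual vvvv code.
def sample_lfo(t, period=1.0):
    x = t / period              # the underlying linear curve
    cycles = math.floor(x)      # Cycles output: the whole part
    output = x - cycles         # standard Output: the fractional part, 0..1
    return output, cycles

print(sample_lfo(0.0))    # (0.0, 0)
print(sample_lfo(10.0))   # (0.0, 10) - 10 seconds later, 10 more cycles reported
```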

All filter nodes like newton, deniro, oscillator, linearfilter, damper internally work that way anyway. They somehow model curves - each in its own way - but all with the idea that one function “sample” will be able to output a position for any time in the future.

Simple filter nodes build up these curves whenever an input changes; advanced filter nodes let you choose. But they all follow the idea that filtering is just sampling a static filter curve that was built at some point in the past.
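
As a sketch of that idea (a simplified linear filter, not the actual node code): the curve is rebuilt whenever the input changes and can then be sampled at any time, no matter how often the node is called.

```python
import time

class LinearFilterCurve:
    """Simplified illustration of a curve-sampling filter, not the actual LinearFilter."""
    def __init__(self, filter_time=1.0):
        self.filter_time = filter_time
        self.start_time = time.monotonic()
        self.start_value = 0.0
        self.target = 0.0

    def set_target(self, new_target):
        # rebuild the curve, starting from the currently sampled position
        self.start_value = self.sample()
        self.start_time = time.monotonic()
        self.target = new_target

    def sample(self):
        # sampling works for any point in time
        t = (time.monotonic() - self.start_time) / self.filter_time
        t = min(max(t, 0.0), 1.0)
        return self.start_value + (self.target - self.start_value) * t
```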

Decay is very old and was designed a bit differently. It didn’t work with the global frame time, but with the global delta frame time, which explains the unexpected behavior.
I’ll tell you how I fixed it, as it might be interesting for implementers of time-based nodes.
I just computed a local delta time - the time difference between the individual calls - and used this local delta time wherever the global delta time was used before. That way I didn’t need to rethink the whole node.
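
In sketch form (illustrative Python, not the actual node code), the fix boils down to this:

```python
import time

class LocalDeltaDecay:
    """Illustrative sketch of the fix: measure a local delta between consecutive
    calls instead of using the host's global frame delta, so skipped evaluations
    are compensated with a proportionally bigger step."""
    def __init__(self):
        self.last_eval = None

    def local_delta(self):
        now = time.monotonic()
        dt = 0.0 if self.last_eval is None else now - self.last_eval
        self.last_eval = now
        return dt

    def evaluate(self, value, decay_time=1.0):
        # use the local delta wherever the global frame delta was used before
        return max(value - self.local_delta() / decay_time, 0.0)
```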