
contributions vs. the addonpack

on optical-flow elliotwoods commented:
"i think the AddonPack is a bit of a failed concept. It works for some users but I personally hate the idea of dumping all available addons into my vvvv folder."

just to clarify: we are aware that the situation with contributions and the addonpack is not perfect. the ideal solution would of course be a fullblown package-managing-system that handles:

  • dependencies
  • multiple sources
  • experimental/stable versions
  • a convenient browser
  • automatic download of missing plugins

so.. but until we have such a tool we think the contribution/addonpack duality works quite well. here is why:

without an addonpack we'd have the forum full of posts like:
"get this patch, works with beta26, needs contribution A version 2, B version 1 and contribution C version 3 (the one you find on myblog.com/contribC, the one on vvvv.org/xyz is outdated!)"

an enduser (and we all are!) should not need to know about addons. when first encountering a node it is not important for a user to know whether it is an addon or a native node. if he is interested though, he can easily find out via the author-tag in the nodebrowser and learn more about a specific addon.

the addonpack is a single download for the enduser that makes sure he gets working versions of all addons and their dependencies for a specific vvvv release. true, this adds a bit of a startup-lag but we think this is a good tradeoff for potentially reducing the problems of missing or out of date addons. no fiddling around with individual addons, just get the pack, don't touch it and you should be safe 90% of the time.

now not every contributor wants to deal with github so we introduced a second standardized way to contribute addons, the contributions. here it is easy for everyone to upload stuff. also the integration of downloads into one's vvvv installation is easy: make a directory "contributions", say on your desktop, reference that directory in vvvvs root and put the downloads in there. done.

of course the contributions bear the risk of becoming out of date but again that is an accepted tradeoff in order to make it possible/easier for more people to contribute.

that being said, ideally all addons would be developed in a fork of the vvvv sdk. that way it is very convenient for fellow coders to test/contribute to your stuff by simply pulling your feature-branches. in order to get an addon tested by people not familiar with github it makes perfect sense to upload binaries to the forum and get them discussed.

when an addon is ready for primetime all a developer has to do in order to get it included in the addonpack is then to send a pull-request to the main vvvv-sdk repository. once accepted we can all be sure that a version of all plugins working with a specific release will be available for all users with a single download.

and somewhere over the rainbow when we have the packaging-system we can stop distributing an addonpack and have the system work directly with git to always serve you the freshest experimental/stable versions of only the specifically requested addons (potentially even from different repositories, not only the vvvv-sdk). see? easy as cake.

till then, thanks for all your great work.
your devvvvs.

joreg, Monday, Nov 14th 2011 · 7 comments
fibo 15/11/2011 - 01:41

Maybe it is OT, but I started this repo https://github.com/fibo/vvvv-patches

mrboni 16/11/2011 - 00:44

only partially on topic, but I still don't think the contributions folder makes sense until there's a good (automatic) way to include all the plugins/effects etc. used from it locally in the project, for when you want to make the project portable (which, at least for me, is most of the time)

elliotwoods 18/11/2011 - 11:03

my general idea:

Resources

We define a resource as a folder. The resource's name (e.g. 'OpenCV') is the name of the folder. That folder contains:

  • Plugins
  • Templates
  • Help
  • Modules
  • Effects
  • Resource.xml (contains author, description, changelog)

By putting the resource's folder into the NodeList search path, a user can use this resource.
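The Resource.xml described above could look something like this (a hypothetical sketch; the element names are invented for illustration, not an existing schema):

```xml
<!-- hypothetical Resource.xml sketch; element names are illustrative only -->
<Resource name="OpenCV" version="3">
  <Author>elliotwoods</Author>
  <Description>Computer vision nodes and modules</Description>
  <Changelog>
    <Entry version="3">example changelog entry</Entry>
  </Changelog>
</Resource>
```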

For a developer

They've been developing their resource inside a folder locally.

They get to the point they're happy to upload it, and do so using the VVVV interface.

During upload:

  • Developer logs in
  • If first time resource uploaded, then the resource name is registered on the server
  • Each resource's name must be unique (that's its identifier)
  • The version number for this upload is incremented (previous versions are automatically kept on the server)
  • Each upload has a changelog comment
  • Developer can edit some other aspects of Resource.xml in GUI (e.g. credits, usage notes)
  • Upload (vvvv zips up the resources folder, stores as a version somewhere on vvvv.org)
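The registration and versioning rules in the list above could be sketched like this (hypothetical Python, purely illustrative; no actual vvvv.org API or storage scheme is implied):

```python
# Hypothetical sketch of the server-side bookkeeping for resource uploads.
# Class and field names are invented for illustration only.

class ResourceRegistry:
    def __init__(self):
        # resource name -> list of (version, changelog) tuples; all versions kept
        self.resources = {}

    def upload(self, name, changelog):
        """Register a resource name on first upload, then auto-increment its version."""
        versions = self.resources.setdefault(name, [])  # first upload registers the unique name
        version = len(versions) + 1                      # version number is incremented per upload
        versions.append((version, changelog))            # previous versions stay on the server
        return version

registry = ResourceRegistry()
registry.upload("OpenCV", "initial release")      # -> 1
registry.upload("OpenCV", "added more nodes")     # -> 2
```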

For a user

NodeBrowser enumerates local and remote (i.e. in NodeContributions) nodes whenever you create a new node.

If you select a remote node, VVVV automagically downloads the dll or v4p, puts it into a folder (e.g. 'vvvv\downloaded resources\EmguCV\')
and puts it into your graph (as if you'd selected a local node)

VVVV stores the version number and resource name against the DLL/v4p node in the patch

The NodeBrowser would also provide options for

  • Download all missing resources for the loaded graph (i.e. all red nodes)
  • Highlighting nodes in the list where updated versions are available / performing those updates automatically

Trying this out

i suggest creating this in its most simple sense as a demo to start with (especially since NodeBrowser is already open source / user editable)

On another note: Github works great for source code,
but for users (e.g. dll's) i think contributions works better

Further possibilities

  • Thumbnails
  • Comments / Tips (issues should generally go into github against the source)
  • Alternative repositories (i.e. private). This may be useful for bigger companies who have lots of in-house nodes.
elliotwoods 18/11/2011 - 14:49

Reread joreg's post (the first time I read it I didn't realise it was joreg).

I realise now it was more of a statement than a discussion :)

I am suggesting to scrap the addonpack and replace it with something else. But I'm not going to stomp feet. I totally appreciate people who love the addonpack. And will give it more time than I do at the moment.

I would like a response to the above suggestion though (including standardising a resource to a folder with the outline as above).

vux 18/11/2011 - 18:46

The resource idea is quite nice actually, since a lot of distributions include more than just plugins (often you need a plugin + a few shaders, for example). In some ways that would not be incompatible with the addonpack; they could work well together.

I still like the idea of a "static" pack: auto download sounds good at first, but if i need an addon in a place with no internet (that still exists ;) then there's no way to download it.

Doing an internet check is fun for developing, but on a permanent install i got a stable release and I want to leave it as it is, so having internet check for version would just slow down startup time a lot (having it as a global option is cool tho).

The biggest problem i see with the addonpack is not really knowing what's inside; i'm sure some people (including me) coded a plugin for something only to realize afterwards it was already done. Eventually having some form of installer with a description for each plug would be useful (like standard installers do, where you can select features).

joreg 21/11/2011 - 04:14

i didn't mention the "resource" idea in my outline. i'd still call them addons though. and addons should of course be able to consist of more than a .dll and therefore be able to have their own (but standardized) subfolder structure. yes, this is missing and needed.

but as vux points out it will work well together with the pack. even such substructured addons will best be inside the oneclick download pack for the users convenience.

the idea of an upload/download package-manager i think is more of a developer fantasy. sure, this is how we'd love to do it right. only, making it more than a proof of concept (checking dependencies...) will take a lot of effort which we users wouldn't care for. for us users (as vux pointed out) it would be much more important to get a better overview of what's there. this is where the nodebrowser and the node reference are lacking, not a package-manager.

so yes, we're basically on the same track with the resource/structured-addons idea..just not implemented yet.

