credits Emilio Cordero Checa, Andrea Moreno Orts
Software: vvvv
We used vvvv to manipulate and spatialise the sound in an 8.1 sound system according to the movement of the tattoo machine, to create real-time visuals, and to control the lights with audio over DMX.
THE USE OF HEADPHONES IS RECOMMENDED TO HEAR THE BINAURAL RECORDING PROPERLY!
Originally, the purpose of the tattoo went beyond aesthetics: each culture that practiced it embedded its own narratives in it and developed unique symbols around it. As tattooing has gained great popularity in recent years, the rituality of the act has transformed, progressively pulling the tattoo away from the world of the senses; it has become merely a commercial good. Through this project we intend to detach from the common archetype of the tattoo and its current practice. We present a whole new series of still-unknown formulations in order to experience tattooing through different sensory routes and enhance the act of tattooing.
Transcripción 3.0 is a series of live experimental acts where, through the practice of tattooing with the use of new technologies, our aim is to transcribe into the bodies of both performers and audience a series of sound and luminous phenomena. A new tattoo practice is proposed, using technology to reinforce the sensory act.
In the performance there are three figures: the tattoo artist, the Tattoo, and the interface (software and hardware), functioning independently but connected by the same space and time. Through feedback, the body and the machine communicate with each other via data, fluids, movement and sensations. The new tattoo is revealed not only in the wound: by attaching sensors to the tattoo machine and extracting data, we open the possibility of transcribing this information through the displacement of sound waves and the movement of light, affecting the senses of vision, hearing and touch.
Get an insight into the new Mercedes EQC with two kinetic shift screens. This exhibit is the first of its kind, combining digital augmentation, vector-field particle simulation, custom fractal lightning and high-resolution 3D geometry, displayed on stunning screens.
All of it, just a touch away. The touchscreen lets you control the movement and select the physical and digital animation.
Concept & Lead // Atelier Markgraph
Software & Art Direction // wirmachenbunt
Hardware // Expotec
credits wirmachenbunt / Atelier Markgraph
This is an attempt to show a little bit more of a project than just a video. There are some interesting bits and pieces, relying on brilliant contributions and sometimes-overlooked cookies.
But first, let's have the video anyway.
The EQC-Scanner is sort of an augmented, kinetic installation. One could argue with the term "augmented" here, but it certainly adds information to the "real" layer. The whole thing is controlled by a touch screen, allowing you to pick topics or move the screens with your fingertip. All in all it is car-technology communication, but in a playful package.
This is based on some articles like this LINK. A pretty neat recursive routine to learn what recursion is. I used C#, but I bet this is easy in VL. Anyone?
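The post links out for the actual routine, so as a rough illustration only, here is a minimal midpoint-displacement lightning sketch in Python (the original was C#; the function name and parameters are my own invention, not from the project):

```python
import random

def lightning(p0, p1, displace, detail=1.0, segments=None):
    """Recursive midpoint displacement: split the segment, jitter the
    midpoint by up to +/-displace, recurse on both halves with half
    the displacement until it drops below the detail threshold."""
    if segments is None:
        segments = []
    if displace < detail:
        segments.append((p0, p1))
        return segments
    mx = (p0[0] + p1[0]) / 2 + random.uniform(-displace, displace)
    my = (p0[1] + p1[1]) / 2 + random.uniform(-displace, displace)
    mid = (mx, my)
    lightning(p0, mid, displace / 2, detail, segments)
    lightning(mid, p1, displace / 2, detail, segments)
    return segments

bolt = lightning((0, 0), (0, 100), displace=16)
print(len(bolt))  # → 32 (displacement halves 5 times: 2**5 leaf segments)
```

Each halving doubles the segment count, so the jaggedness is controlled entirely by the starting displacement and the detail cutoff.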
The plugin was used for the battery scene. Thank you captain obvious :)
Usually, when you are in RS232 or some fancy-protocol land, you have to decode and encode values efficiently, e.g. encode high values with as few bytes as possible. I always come back to jens.a.e's BitWiseOps; this is one of the overlooked contributions.
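BitWiseOps itself is a vvvv plugin pack, so as a hedged illustration of the kind of shifting and masking involved, here is a common scheme from serial-protocol land: carrying a 14-bit value in two 7-bit data bytes (the function names are mine, not the plugin's):

```python
def encode14(value):
    """Split a 0..16383 value into two 7-bit bytes, MSB first.
    Many serial protocols reserve the high bit of each byte for framing."""
    assert 0 <= value < 1 << 14
    return [(value >> 7) & 0x7F, value & 0x7F]

def decode14(msb, lsb):
    """Reassemble the original value from the two 7-bit bytes."""
    return (msb << 7) | lsb

msg = encode14(1000)
print(msg)                 # → [7, 104]
print(decode14(*msg))      # → 1000
```

The same shift-and-mask pattern scales to any width, which is why a reusable bit-ops toolbox keeps paying off across projects.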
For this project, the plugins were used to encode the engine controller messages.
The image above shows some sensor recordings of the screen movement. Creating this data viz in vvvv was quick and revealed a physical feedback loop. The violet curve shows the frame difference of the screen position. While the real screen movement looked kind of smooth, the sensor data showed heavy, regular spikes: the engines were doing some regular overshooting. Not a big problem, and solvable on the hardware side. Interesting how data visualization can help to track down problems.
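The frame-difference trick behind that violet curve is easy to reproduce; this small sketch (with made-up sample data, not the real sensor recording) shows how differencing a position stream exposes overshoot spikes that the eye misses in the raw motion:

```python
def frame_diff(samples):
    """Velocity proxy: difference between consecutive position samples."""
    return [b - a for a, b in zip(samples, samples[1:])]

def spikes(diffs, threshold):
    """Indices where the motion jumps beyond the threshold."""
    return [i for i, d in enumerate(diffs) if abs(d) > threshold]

# smooth-looking travel with a hidden regular overshoot every 4th sample
positions = [0, 1, 2, 3, 9, 5, 6, 7, 13, 9, 10, 11]
d = frame_diff(positions)
print(spikes(d, threshold=3))  # → [3, 4, 7, 8]
```

The overshoot shows up as paired positive/negative spikes in the difference signal, exactly the regular pattern described above.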
Controlling particles in a meaningful way can be painful. Using vector fields can bring some structure into the chaos. The tool VectorayGen helps generate the vectors the easy way. It even has a node-based environment to drop some organic variation into your fields. And by the way, the devs are very friendly: https://jangafx.com/
The tool was used to create the key visual, 500k floating particles along the car exterior.
Sure, this is not a big secret; it's one of vvvv's selling points. But I have to say, it just works. Bringing together 3 machines was actually fun. Hopefully VL preserves this killer feature.
This was pretty much the first time I really used my own plugin, and it was surprisingly helpful. :) It consists of an initialization tree for the engines & sensors and an abstract transition model of the visual software part. This is an attempt to leave content out of the state machine and instead use the state TOPIC for every topic in use. It might be harder to read and it doesn't allow you to jump to a specific content, but it makes your software expandable (without touching Automata again).
Not sure if this is a best practice for everything; let's see how this works out in the future.
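To make the TOPIC idea concrete, here is a toy sketch (state and event names are invented; this is not the Automata plugin's API): the machine only knows abstract states, and the selected content lives in a lookup outside it, so adding a topic never touches the machine itself.

```python
# The machine knows only abstract states; which topic is shown is data.
TRANSITIONS = {
    ("IDLE", "select"): "TRANSITION",
    ("TRANSITION", "done"): "TOPIC",
    ("TOPIC", "select"): "TRANSITION",
    ("TOPIC", "home"): "IDLE",
}

class Exhibit:
    def __init__(self, topics):
        self.state = "IDLE"
        self.topics = topics    # content list lives outside the machine
        self.current = None

    def fire(self, event, payload=None):
        nxt = TRANSITIONS.get((self.state, event))
        if nxt is None:
            return self.state   # ignore events invalid in this state
        if event == "select":
            self.current = payload  # remember which topic to show
        self.state = nxt
        return self.state

e = Exhibit(["battery", "engine", "charging"])
e.fire("select", "battery")
e.fire("done")
print(e.state, e.current)  # → TOPIC battery
```

The trade-off is exactly the one described above: you cannot jump straight to a named content state, but "charging" can be added by extending the topics list alone.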
There is always the dream of a fully, totally dynamic vvvv project. Our Ruby-on-Rails-based web tool helps manage all texts and images. It even renders the texts as images, freeing vvvv from rendering crappy text and letting it display smooth pre-rendered images instead. Most of the vvvv project depends on the CMS, but of course there are some bits which are always hard-wired, like the 3D stuff.
I hope you find this "article" informative. Any questions or comments?
a collective spatial installation in complementary colours.
By changing red, green and blue ambient light the visitor is constrained to focus on the respective complementary-coloured design elements. This simple effect induces the illusion of motion in actually static pictures and objects. Various techniques from the beginnings of animation are used to tell tiny stories, altogether creating an immersive visual experience in a contemporary moving-picture language.
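The complementary-colour effect can be sketched numerically; assuming a simple multiplicative reflection model (my simplification, purely illustrative, not the artists' method), an element the same colour as the ambient light stays bright and blends into white paper, while its complement reflects nothing and goes black:

```python
def complement(rgb):
    """RGB complement: the colour that vanishes-or-pops under a light."""
    r, g, b = rgb
    return (255 - r, 255 - g, 255 - b)

def apparent_brightness(element, light):
    """Multiplicative reflection: an element only reflects wavelengths
    that both its pigment and the light contain (normalized 0..85)."""
    return sum(e * l for e, l in zip(element, light)) / (3 * 255)

red_light = (255, 0, 0)
red_ink = (255, 0, 0)
cyan_ink = complement(red_ink)

print(apparent_brightness(red_ink, red_light))   # → 85.0, blends into paper
print(apparent_brightness(cyan_ink, red_light))  # → 0.0, appears black
```

Cycling the light through red, green and blue therefore reveals each set of complementary-coloured elements in turn, which is the whole animation mechanism of the installation.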
The Cascais edition was part of Lumina Light Festival from 08.09. to 11.09.2016, commissioned by lumina.pt.
Another little demo/POC of my live visuals.
Music by Antanas Jacinevicius
I used a pre-recorded sample of "The Bridges between the Universes" by Bodysoulspirit and some random Gentoo weekly-update video from YouTube, plus some sweet analogue and digital feedback effects (the digital ones in vvvv).
A fragment of our sweet show with Antanas Jacinevicius and Kira Weinstein.
The show took place at Electromuseum, Moscow, on 01.06.2017.
I used some digital-analogue video transformations and a bit of feedback effect.
I used a pre-recorded sample of "The Bridges between the Universes" by Bodysoulspirit and some random Gentoo weekly-update video from YouTube. It all relates to my previous video "Merge the Void": https://vimeo.com/219812428
Disturban is a study of a modern city and its enormous social, technological and media pressures by native digital citizen Ergo Efremov (also seen and heard in Καταπυγον and HAUP me). It is an exercise in the total deconstruction of electronic dance music which reaches a high level of abstraction and yet preserves conceptual unity.
The sound of the project, described by listeners as "something between dub techno, Muslimgauze and noisegrind", consists of drones, autistic samples, overused drum machines and polyrhythmic breaks. Everything melts together into a grotesque and corporeal sound through a series of alchemic permutations, both digital and analogue: granular sampling and re-synthesis, signal leaks, buffer underruns and overflows, power starvation, bizarre feedback loops and the artifacts of noise-cancelling processes, compression and distortion.
Dissolve in the harsh but tender hug of a soundwave.
All hail the beauty of a bug!