Blog posts are sorted by the tags you see below. You can filter the listing by checking/unchecking individual tags. Double-click or Shift-click a tag to see only its entries. For more information see: About the Blog.
Hey everybody! Check out my blog about this whole thing; I spent a lot of time on it! It covers:
(How I got to know) Milan Adamčiak and his inspiring/playful work
Grafofon, a six-meter-long instrument inspired by analogue experiments but driven by computer vision
Technical details of the project, especially its last iteration — the most technically complex, but also the most successful
What I learned along the way and what I plan for the future
Design: Waltz Binaire
Artistic Direction: Christian Mio Loclair
Generative Design: Marta Soto Morras
Typography: Johannes Poell
Soundtrack: kling klang klong
Concept and Backend: IDEO
Exhibition: Design Museum London
A dark, clear fountain reveals insights to colorful sediments of data.
Continuously drifting, weaving and organizing, each stone carries
memories of encounters filled with joy, sadness and fear.
Talk to me is an abstract interactive installation for experiencing a conversation with an artificial entity. Each statement made by the user shapes a unique pebble falling into a well of data. Once the user leaves the installation, their stones transform into data sculptures illustrating a collected set of critical thoughts and hopes.
credits Jia-Rey Chang & Peter Gate
This is the first time I used vvvv to create a real-time interactive visualization with live performance.
“The Deep Sound of Maramures” took place on May 10th at Control Club, a famous hotspot in Bucharest.
For “The Deep Sound of Maramures” we created several visual effects, in a distinctive geometric graphic style and with a spatially oriented artistic philosophy, to accompany Peter Gate’s (Petru Pap) composition. A sequence of spatial visual elements was inspired by Peter’s piece. Neither music nor visuals are fixed forms; they are alive. To give the audience a unique journey, the approach was to create visually living elements that follow the emotion of the music: not the usual background visual running through a concert, but a set of real-time images that interact with the live performance. In the “sea waves” scene, for example, the speed and curvature of the waves are generated live in accordance with the music. The piece illustrates a bird’s fantasy journey through different spatial environments, from the nature of earth to outer space, from concrete landscape to abstract imagination. With the added time dimension of the music, it becomes a 4D immersive space in which the audience can fully engage with the show. This video is the teaser/short version of the whole interactive sound/visual performance.
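The audio-driven mapping described above (wave speed and curvature following the live music) could be sketched roughly like this. This is not the authors' actual vvvv patch; the RMS envelope follower and the parameter ranges are illustrative assumptions.

```python
import math

def rms(samples):
    """Root-mean-square level of one audio block (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

class WaveParams:
    """Maps a smoothed audio level to wave speed and curvature.

    The output ranges are made-up placeholders; a real patch would
    tune them to the music.
    """
    def __init__(self, smoothing=0.8):
        self.smoothing = smoothing  # 0 = jumpy, 1 = frozen
        self.level = 0.0

    def update(self, samples):
        # Exponential envelope follower over successive audio blocks.
        self.level = (self.smoothing * self.level
                      + (1 - self.smoothing) * rms(samples))
        speed = 0.2 + 2.0 * self.level      # waves move faster on loud passages
        curvature = 0.1 + 0.9 * self.level  # and curl more strongly
        return speed, curvature
```

In vvvv this would correspond to an audio-analysis node feeding a damper/filter node whose output drives the shader parameters of the wave geometry.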
Hi folks. I'd like to share the video my buddy Kevin Hughes and I have been working on over the last few weeks.
It's for the song "El triunfo del amor" by the Patagonian artist "Shaman y los pilares de la creacion".
It was made by capturing raw Kinect (v1) data with MS Kinect Studio and then processing it with the PointcloudBuffer (DX11.Pointcloud Kinect).
It was all captured with a Blackmagic and then edited in Adobe Premiere.
The background footage was filmed in Patagonia Argentina.
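The pointcloud step above boils down to back-projecting each Kinect v1 depth pixel into 3D through a pinhole camera model; in vvvv the DX11.Pointcloud nodes do this on the GPU. A minimal CPU sketch, with rough (uncalibrated) Kinect v1 intrinsics as assumed defaults:

```python
def depth_to_points(depth_mm, width=640, height=480,
                    fx=585.0, fy=585.0, cx=319.5, cy=239.5):
    """Back-project a Kinect v1 depth frame (millimetres, row-major list)
    into 3D points in metres using a pinhole camera model.

    fx/fy/cx/cy are rough defaults for the Kinect v1 depth camera,
    not calibrated values.
    """
    points = []
    for v in range(height):
        for u in range(width):
            z = depth_mm[v * width + u] / 1000.0  # mm -> m
            if z == 0.0:
                continue  # Kinect reports 0 where depth is unknown
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points
```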
credits Roberto Vitalini (@Bashiba) .:. Sebastiano Barbieri (@Noir) .:. www.bashiba.com
Interactive Art // Live Recording
Everything we see is atomized. Our boundaries, the skin that separates objects and people is gone.
Notice to Vimeo users: to see the particles you might need to watch this at its original resolution (4K), or at least 2K.
3D data: Microsoft Kinect
Spatial Sound + sonic events + Real-Time Rendering: Bashiba + vvvv
Programming: vvvv.org #bashiba #noir
credits didi bruckmayr aka sinus and florian berger aka flockaroo. raum.null
Rendered in real time in vvvv and a custom OpenGL engine with Spout, by didi bruckmayr aka sinus and florian berger aka flockaroo (check him out on Shadertoy!). credits: dx11, instancenoodles, particles unplugged, raw raymarching by sinus
Hello Devvvvs, this is the first time I've posted my work here. I'm from Indonesia (is anybody else here from Indonesia too? hehe).
The visuals were created using vvvv and react to the audio and MIDI inputs from Ableton.
The musician plays some MIDI notes in Ableton while using A.I. Duet, and then
A.I. Duet sends a MIDI response back to Ableton; all the MIDI from Ableton is sent to vvvv using OSC (thanks to oscdevices-ableton-m4l-vvvv-tools).
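The OSC leg of that pipeline can be sketched with nothing but the standard library: an OSC message is a null-padded address string, a type-tag string, and big-endian arguments, sent over UDP. The `/midi/note` address and port below are assumptions for illustration; they would have to match whatever the receiving vvvv patch listens for.

```python
import socket
import struct

def _osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address, *ints):
    """Build a binary OSC message carrying int32 arguments."""
    msg = _osc_string(address)
    msg += _osc_string("," + "i" * len(ints))  # type tags, e.g. ",ii"
    for i in ints:
        msg += struct.pack(">i", i)  # big-endian int32
    return msg

def send_note(note, velocity, host="127.0.0.1", port=4444):
    # Address pattern and port are illustrative placeholders.
    packet = osc_message("/midi/note", note, velocity)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, (host, port))
    sock.close()
```

In practice the Max for Live device in the toolset does this forwarding for you; the sketch just shows what travels over the wire.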
I'm still learning vvvv and am really open to any suggestions and critiques.
I've learned so much from this forum.
credits Andrej Boleslavský aka @id144 ― digital artist | Mária Júdová ― digital artist | Patricia Okenwa ― choreographer | Soňa Ferienčíková ― dancer | Roman Zotov ― dancer | Carmen Salas ― creative producer | Miles Whittaker aka Demdike Stare ― musician
Dust is a virtual reality piece which invites the audience to experience a dance performance from the perspective of an eternal particle travelling through space.
“As an improvisational tool, VR can inspire creative movements; as an educational tool, it can record choreography and encourage public engagement, and, for us, it is a tool for endless artistic expression.”
Mária Júdová, Andrej Boleslavský.