
Blog


Blog posts are sorted by the tags you see below. You can filter the listing by checking or unchecking individual tags. Double-click or Shift-click a tag to see only its entries. For more information see: About the Blog.


Tags: addon-release, core-release, date, devvvv, gallery, news, screenshot, stuff, delicious, flickr, vimeo

credits Joshua von Hofen, Nils Nahrwold, Muthesius Kunsthochschule Kiel

Responsive Fabric

Joshua von Hofen
Nils Nahrwold
2015/16
Supervision by Prof. Tom Duscher and Chris Engler

A cloth that gives haptic feedback. It translates touch into sound and vision.
Made with vvvv and a Kinect 2 sensor.
The installation "Responsive Fabric" turns a seemingly ordinary cloth into a tangible interface. Interacting with the surface of the fabric creates a multi-sensory experience that expands perception through a virtual layer. Touching the cloth generates acoustic and visual stimuli. By altering the physical form through pressing and stretching the fabric, the object also transforms visually and produces sounds. With this translation of haptics into sound and image, the object becomes an audiovisual instrument and an interface between reality and virtuality.

Done at the Muthesius University of Fine Arts and Design Kiel.
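The piece itself is a vvvv patch; purely as an illustration of the idea described above, here is a minimal HLSL sketch of how a Kinect 2 depth image could both deform a cloth mesh and produce a per-vertex "touch" value that sound parameters could be driven from. The texture names, the DisplaceAmount parameter and the semantics are assumptions for this sketch, not taken from the project.

Texture2D DepthTex;      // live Kinect 2 depth, normalized to 0..1
Texture2D RestDepthTex;  // depth of the untouched cloth, captured once
SamplerState LinearSampler;

cbuffer cbPerDraw
{
    float4x4 tWVP;        // world-view-projection transform
    float DisplaceAmount; // how far a touch pushes the virtual surface
};

struct VS_IN  { float3 pos : POSITION; float2 uv : TEXCOORD0; };
struct VS_OUT { float4 pos : SV_Position; float touch : TEXCOORD0; };

VS_OUT VS(VS_IN input)
{
    VS_OUT o;
    // how much the cloth is pressed in at this vertex,
    // measured against the captured rest pose
    float rest  = RestDepthTex.SampleLevel(LinearSampler, input.uv, 0).r;
    float live  = DepthTex.SampleLevel(LinearSampler, input.uv, 0).r;
    float touch = saturate(rest - live);

    // displace the vertex by the touch amount; the same value,
    // reduced over the whole mesh, could drive the sound synthesis
    float3 displaced = input.pos + float3(0, 0, -touch * DisplaceAmount);
    o.pos   = mul(float4(displaced, 1), tWVP);
    o.touch = touch;
    return o;
}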

Currently shown at the Visible Sound Exhibition at Kunstkraftwerk Leipzig.

MultiNIL, Monday, May 21st 2018 | 0 comments

credits Moritz Aznan, Nils Nahrwold (@multiNIL), Muthesius Kunsthochschule Kiel

Aircade is the result of a three-month group project at the Muthesius University of Fine Arts and Design. It was developed by Moritz Aznan and Nils Nahrwold and supervised by Prof. Frank Jacob as part of the Interface Design master's program.
The idea was to upgrade the classic air hockey table with digital technologies and change the game entirely. Besides developing the necessary soft- and hardware, the focus lay on designing games that require dexterity, cleverness and precision, so that players can come up with their own new tactics.

Special thanks to Chris (u7angel) and Jens (jens.a.e) for their support!

MultiNIL, Friday, May 18th 2018 | 0 comments

credits wirmachenbunt, Expotec, Copyright Communication

Porsche Under Control is a kinetic, interactive exhibit. As the car encounters rough terrain, your goal is to keep it balanced. That is a hard task in itself, but the car's algorithms are the ones to beat.

The exhibit concept was developed by Copyright Communication (awesome, Mert)
Engineering and construction by Expotec Mainz
Software and production by wirmachenbunt

Thanks to Heinrich for the algorithm support!

u7angel, Tuesday, May 15th 2018 | 2 comments

Watch on Vimeo

For "The Deep Sound of Maramures", a series of visual effects with a distinctive geometric graphic style and a spatially oriented artistic approach was created for Peter Gate's (Petru Pap) composition. The sequence of spatial visual elements was inspired by Peter's "The Deep Sound of Maramures". Neither the music nor the visuals are fixed forms; they are alive. To give the audience a unique journey, the approach was to create living visual elements that follow the emotion of the music.

These are not common background visuals running behind the concert, but a set of real-time images that react to the live music performance. In the "sea waves" scene, for example, the speed and the curvature of the waves are generated live in accordance with the music. The show illustrates a bird's fantastical journey through different spatial environments, from the nature of earth to outer space, from concrete landscape to abstract imagination. With the time dimension of the music added, it becomes a 4D immersive space in which the audience can fully engage with the show.
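The show itself was patched in vvvv; as a rough sketch of the audio-reactive idea only, the following minimal HLSL vertex shader shows how hypothetical WaveSpeed and WaveCurvature parameters, updated every frame from a live audio analysis (for example loudness and spectral brightness), could drive a "sea waves" plane. All names and mappings here are assumptions for illustration, not the production setup.

cbuffer cbPerFrame
{
    float4x4 tWVP;
    float Time;           // seconds since the start of the scene
    float WaveSpeed;      // e.g. driven by the music's loudness
    float WaveCurvature;  // e.g. driven by the music's brightness
};

struct VS_IN  { float3 pos : POSITION; };
struct VS_OUT { float4 pos : SV_Position; float height : TEXCOORD0; };

VS_OUT VS(VS_IN input)
{
    VS_OUT o;
    // a simple travelling wave: the phase advances with the audio-driven
    // speed, and the audio-driven curvature bends the wavefronts
    float phase  = input.pos.x + WaveCurvature * input.pos.z * input.pos.z;
    float height = sin(phase * 2.0 + Time * WaveSpeed);

    float3 p = input.pos + float3(0, height * 0.3, 0);
    o.pos    = mul(float4(p, 1), tWVP);
    o.height = height;
    return o;
}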

Tool: vvvv

More info & images:
http://www.archgary.com/works/tdsom/
http://www.archgary.com/2017/05/07/upcoming-events-the-deep-sound-of-maramures-lectureperformance/

P&A LAB, Saturday, Apr 21st 2018 | comments on Vimeo

Watch on Vimeo

Live AV show of generative real-time visuals and live electronic music.
An abstract feast of raymarching, Kinect point clouds, particle systems and geometry shaders.
Entirely made with code: marching tamed noise functions and mixing procedural geometries with more mundane polygons.
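As a rough sketch of the raymarching idea only (this is not evvvvil's shader), here is a minimal HLSL pixel shader that sphere-traces a signed distance field: a sphere displaced by a small, clamped ("tamed") value-noise term. The entry point, the fullscreen-quad texture coordinates and all constants are assumptions for illustration.

float hash(float3 p)   // cheap pseudo-random value per lattice point
{
    return frac(sin(dot(p, float3(12.9898, 78.233, 37.719))) * 43758.5453);
}

float noise(float3 p)  // trilinearly interpolated value noise
{
    float3 i = floor(p), f = frac(p);
    f = f * f * (3.0 - 2.0 * f);
    float n000 = hash(i),                   n100 = hash(i + float3(1, 0, 0));
    float n010 = hash(i + float3(0, 1, 0)), n110 = hash(i + float3(1, 1, 0));
    float n001 = hash(i + float3(0, 0, 1)), n101 = hash(i + float3(1, 0, 1));
    float n011 = hash(i + float3(0, 1, 1)), n111 = hash(i + float3(1, 1, 1));
    return lerp(lerp(lerp(n000, n100, f.x), lerp(n010, n110, f.x), f.y),
                lerp(lerp(n001, n101, f.x), lerp(n011, n111, f.x), f.y), f.z);
}

float map(float3 p)    // scene distance field: a sphere with tamed noise bumps
{
    float sphere = length(p) - 1.0;
    float bumps  = (noise(p * 4.0) - 0.5) * 0.2;
    return sphere + bumps;
}

float4 PS(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float3 ro = float3(0, 0, -3);                        // camera position
    float3 rd = normalize(float3(uv * 2.0 - 1.0, 1.5));  // ray through the pixel

    float t = 0.0;
    for (int i = 0; i < 96; i++)   // sphere tracing: step by the distance field
    {
        float d = map(ro + rd * t);
        if (d < 0.001 || t > 20.0) break;
        t += d;
    }
    float shade = exp(-t * 0.35);  // simple depth-based shading
    return float4(shade, shade, shade, 1.0);
}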

Visuals made by evvvvil in HLSL and vvvv, played live with a Novation Launch Control XL.
Music made by OddJohn and played live in Ableton with a MiniBrute and a Push 2.

evvvvil: https://twitter.com/evvvvil
oddjohn: https://oddjohnfr.bandcamp.com/

evvvvil, Saturday, Apr 21st 2018 | comments on Vimeo

Watch on Vimeo

Commissioned to work with SALT Research collections, artist Refik Anadol employed machine learning algorithms to search and sort relations among 1,700,000 documents. Interactions of the multidimensional data found in the archives are, in turn, translated into an immersive media installation. Archive Dreaming, which is presented as part of The Uses of Art: Final Exhibition with the support of the Culture Programme of the European Union, is user-driven; however, when idle, the installation "dreams" of unexpected correlations among documents. The resulting high-dimensional data and interactions are translated into an architectural immersive space.

Shortly after receiving the commission, Anadol was a resident artist for Google's Artists and Machine Intelligence Program where he closely collaborated with Mike Tyka and explored cutting-edge developments in the field of machine intelligence in an environment that brings together artists and engineers. Developed during this residency, his intervention Archive Dreaming transforms the gallery space on floor -1 at SALT Galata into an all-encompassing environment that intertwines history with the contemporary, and challenges immutable concepts of the archive, while destabilizing archive-related questions with machine learning algorithms.

In this project, a temporary immersive architectural space is created as a canvas with light and data applied as materials. This radical effort to deconstruct the framework of an illusory space will transgress the normal boundaries of the viewing experience of a library and the conventional flat cinema projection screen, into a three dimensional kinetic and architectonic space of an archive visualized with machine learning algorithms. By training a neural network with images of 1,700,000 documents at SALT Research the main idea is to create an immersive installation with architectural intelligence to reframe memory, history and culture in museum perception for 21st century through the lens of machine intelligence.

SALT is grateful to Google's Artists and Machine Intelligence program, and Doğuş Technology, ŠKODA, Volkswagen Doğuş Finansman for supporting Archive Dreaming.

Location: SALT Galata, Istanbul, Turkey
Exhibition Dates: April 20 - June 11
6 Meters Wide Circular Architectural Installation
4 Channel Video, 8 Channel Audio
Custom Software, Media Server, Table for UI Interaction

For more information:
http://www.refikanadol.com/works/archive-dreaming/
_____
Credits:

SALT Research
Vasıf Kortun
Meriç Öner
Cem Yıldız
Adem Ayaz
Merve Elveren
Sani Karamustafa
Ari Algosyan
Dilge Eraslan
_
Google AMI
Mike Tyka
Kenric McDowell
Andrea Held
Jac de Haan
_
Refik Anadol Studio Members & Collaborators
Raman K. Mustafa
Toby Heinemann
Nick Boss
Kian Khiaban
Ho Man Leung
Sebastian Neitsch
David Gann
Kerim Karaoglu
Sebastian Huber

Refik Anadol, Saturday, Apr 21st 2018 | comments on Vimeo

Watch on Vimeo

Sometimes you just know how you will spend the evening.
All visuals generated in VVVV, music is based on a track generated in Jukedeck.
All hail the beauty of a glitch!

Pliskin, Saturday, Apr 21st 2018 | comments on Vimeo

Watch on Vimeo

With kinetic media elements for an autonomous car concept

The “XiM17” showcase by Yanfeng Automotive Interiors helps answer the question, “What will people do in their cars once they don’t need to steer them anymore?” It presents several modes for varying situations and passenger counts. Our distributed motion and media control system provides unprecedented flexibility for the design of these scenes.

https://meso.design/en/projects/yanfeng-kinetic-seat-and-media-elements-for-autonomous-car-concept

Watch on Vimeo

With high-resolution photographs, a database, tables, and a quadrocopter simulation

Modesty and trust are becoming more important. We worked with the Senckenberg Natural History Museum to enhance their exhibition "Fascination Variety" and its focal point, a gigantic wall display cabinet presenting 1,138 geological and biological objects. In close cooperation with the curator, we focused on developing a streamlined content management system to handle the wealth of extensive information and high-resolution photographs featured in three interactive media stations. While a clean and modest design was key for the information stations, we also created an exciting educational game for younger age groups.

https://meso.design/en/projects/senckenberg-naturkundemuseum-frankfurt-modest-multi-narrative-media-stations-for-natural-history-museum

Watch on Vimeo

With ultra-high-resolution photographic animations

For the exhibition "Cranach in Weimar" at the Schiller Museum, we produced a digitally edited replica of the Cranach Altarpiece. A digital image of the altarpiece was presented on an ultra-high-resolution screen, inviting visitors to explore the work of art in unprecedented richness of detail. A control panel displayed background information on details selected in the image.

https://meso.design/en/projects/klassik-stiftung-weimar-hyperrealistic-face-to-face-experience-for-cultural-history-exhibition


Shoutbox

~13h ago    sagishi: any vvvv wizardry going on at ADAF in athens this week?

~13h ago    vux: Freshly released, DirectX11 v1.3 directx11-1.3-update

~1d ago    weareallclowns: has anyone used avr-gcc with vvvv? thinking about uploading a vvvv sketch to arduino

~3d ago    motzi: @udo2013: you can't change the resolution of the standard DX11 renderer (res=window size). use temptarget renderer + preview instead

~3d ago    Tamoeba Kale: Is it possible that i am missing the whole "animation" category nodes? how?

~3d ago    udo2013: hello. is there a way to change the fullscreen resolution of renderer dx11? found no possibility + can not be opened.

~4d ago    tekcor: @joreg vl.glTF loader looks intense inside, but is red in b36.

~4d ago    MultiNIL: @joreg sure!

~4d ago    joreg: @MultiNIL supa! can we have this as a gallery entry?

~4d ago    MultiNIL: forgot to share: our very first vvvv project, digital-hybrid-airhockey-table: https://vimeo.com/210617286 making of: https://vimeo.com/269974764