Blog posts are sorted by the tags you see below. You can filter the listing by checking/unchecking individual tags. Double-click or Shift-click a tag to see only its entries. For more information, see: About the Blog.
credits thanks to robert willner & marco ritter for their >> Weekend Workshop 'DX11.particles' @ NODE17 Forum for Digital Arts << ..and thanks for the bad english, so that I too understood everything for once (which is not always the case at workshops). DX11.Particles :: great stuff! two years ago I started with dottore's dx9 gpu particles shader library. very complicated to handle. then my gpu partially burned out because of a piece of advice I did not quite understand, and that was it for dx9 particles. <<>> music :: stavroz - merci éclair
credits David Sherpa
This augmented reality mirror superimposes a virtual layer on reality based on the viewer's point of view. It explores an alternative medium for augmented/mixed reality. A similar setup was already built by Microsoft Research in 2012: https://www.microsoft.com/en-us/research/video/holoflector/ .
Made possible thanks to equipment lent by Stereolux, Nantes, and the help of Charles Loisy (Stereolux multimedia manager).
credits Joshua von Hofen, Nils Nahrwold, Muthesius Kunsthochschule Kiel
Joshua von Hofen
Supervision by Prof. Tom Duscher and Chris Engler
A cloth that gives haptic feedback. It translates touch into sound and vision.
Made with vvvv and a Kinect 2 sensor.
The installation "Responsive Fabric" turns a seemingly ordinary cloth into a tangible interface. Interaction with the surface of the fabric creates a multi-sensory experience that expands perception through a virtual layer. Touching the cloth generates acoustic and visual stimuli. By altering the physical form through pressing and stretching the fabric, the object also transforms visually and produces sounds. With this translation of haptics into sound and image, the object becomes an audiovisual instrument and an interface between reality and virtuality.
Done at the Muthesius University of Fine Arts and Design, Kiel.
Currently shown at the Visible Sound Exhibition at Kunstkraftwerk Leipzig.
credits Moritz Aznan, Nils Nahrwold (@multiNIL), Muthesius Kunsthochschule Kiel
Aircade is the result of a three-month group project at the Muthesius University of Fine Arts and Design. It was developed by Moritz Aznan and Nils Nahrwold and supervised by Prof. Frank Jacob as part of the Interface Design master's program.
The idea was to upgrade the classic air hockey table with digital technologies and change the game entirely. Besides developing the necessary soft- and hardware, the focus lay on designing games that require dexterity, cleverness and precision, so that players can come up with their own new tactics.
special thanks to Chris (u7angel) and Jens (jens.a.e) for support!
credits wirmachenbunt, Expotec, copyright communication
Porsche Under Control is a kinetic, interactive exhibit. As the car encounters rough terrain, your goal is to keep it balanced. This is a hard task, and the car's algorithms are the ones to beat.
The exhibit concept is by Copyright Communication (awesome, Mert)
Engineering and construction by Expotec Mainz
Software and production by wirmachenbunt
Thanks to Heinrich for the algorithm support!
For the project "Deep Sound of Maramures", several visual effects were created alongside Peter Gate's (Petru Pap) composition, in a unique geometric graphic style and with a spatially oriented artistic philosophy. A sequence of spatial visual elements was inspired by Peter's "The Deep Sound of Maramures". Music and visuals are not fixed forms; they are alive. To give the audience a unique journey, the approach was to create visually alive elements that follow the emotion of the music. This is not a common background visual running through the concert, but a set of real-time interactive images driven by the live music performance. For example, in the "sea waves" scene, the speed and the curvature of the waves are generated live in accordance with the music. The show illustrates a bird's fantastical journey through different spatial environments, from the nature of earth to outer space, from concrete landscape to abstract imagination, and more. Adding the time dimension of the music, it generates a 4D immersive space for people to fully engage in the show.
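The audio-reactive mapping described above, where wave speed and curvature follow the live music, can be sketched roughly as below. This is a simplified illustration, not the project's actual code: the function names, scaling factors and the smoothing constant are all assumptions.

```python
import math

def rms_amplitude(samples):
    """Root-mean-square loudness of one audio buffer (normalized samples)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def wave_params(samples, prev_speed, smoothing=0.9):
    """Map buffer loudness to wave speed and curvature, smoothed
    across frames so the visuals don't jitter with every buffer."""
    loudness = rms_amplitude(samples)
    target_speed = 0.2 + 2.0 * loudness        # louder music -> faster waves
    speed = smoothing * prev_speed + (1 - smoothing) * target_speed
    curvature = 0.5 + 1.5 * loudness           # louder music -> tighter curls
    return speed, curvature

# A quiet buffer barely moves the waves; a loud one speeds them up.
quiet = [0.01 * math.sin(i / 10) for i in range(512)]
loud = [0.9 * math.sin(i / 10) for i in range(512)]
s_quiet, c_quiet = wave_params(quiet, prev_speed=0.2)
s_loud, c_loud = wave_params(loud, prev_speed=0.2)
```

In a real-time patch the same idea would run per audio buffer, feeding the smoothed parameters into the wave shader each frame.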
Live AV show of generative real-time visuals and live electronic music.
An abstract feast of raymarching, Kinect point clouds, particle systems and a geometry-shader extravaganza.
Entirely made with code: marching tamed noise functions and mixing procedural geometries with more mundane polygons.
Visuals made by evvvvil in HLSL and vvvv, played live with Novation LaunchCONTROL XL.
Music made by OddJohn and played live in Ableton with a Minibrute and Push 2.
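Raymarching, mentioned above, steps a ray through space using a signed distance function (SDF): at each step the SDF tells you how far you can safely advance. A minimal sphere-tracing loop, in Python purely for illustration (the actual show runs as HLSL shaders on the GPU):

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 3.0), radius=1.0):
    """Signed distance from point p to a sphere: negative inside, positive outside."""
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def raymarch(origin, direction, max_steps=64, eps=1e-4, max_dist=20.0):
    """Sphere tracing: advance the ray by the SDF value, which is always
    a safe step. Returns the distance travelled on a hit, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sphere_sdf(p)
        if d < eps:        # close enough: the ray hit the surface
            return t
        t += d             # step forward by the guaranteed-safe distance
        if t > max_dist:   # ray escaped the scene
            break
    return None

hit = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))   # aimed straight at the sphere
miss = raymarch((0.0, 0.0, 0.0), (0.0, 1.0, 0.0))  # aimed straight up, past it
```

The "marching tamed noise functions" line above corresponds to replacing `sphere_sdf` with an SDF displaced by noise, which is where the organic shapes come from.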
Commissioned to work with SALT Research collections, artist Refik Anadol employed machine learning algorithms to search and sort relations among 1,700,000 documents. Interactions of the multidimensional data found in the archives are, in turn, translated into an immersive media installation. Archive Dreaming, which is presented as part of The Uses of Art: Final Exhibition with the support of the Culture Programme of the European Union, is user-driven; however, when idle, the installation "dreams" of unexpected correlations among documents. The resulting high-dimensional data and interactions are translated into an architectural immersive space.
Shortly after receiving the commission, Anadol was a resident artist for Google's Artists and Machine Intelligence Program where he closely collaborated with Mike Tyka and explored cutting-edge developments in the field of machine intelligence in an environment that brings together artists and engineers. Developed during this residency, his intervention Archive Dreaming transforms the gallery space on floor -1 at SALT Galata into an all-encompassing environment that intertwines history with the contemporary, and challenges immutable concepts of the archive, while destabilizing archive-related questions with machine learning algorithms.
In this project, a temporary immersive architectural space is created as a canvas, with light and data applied as materials. This radical effort to deconstruct the framework of an illusory space transgresses the normal boundaries of the viewing experience of a library, and of the conventional flat cinema projection screen, into a three-dimensional kinetic and architectonic space: an archive visualized with machine learning algorithms. By training a neural network with images of 1,700,000 documents from SALT Research, the main idea is to create an immersive installation with architectural intelligence that reframes memory, history and culture in museum perception for the 21st century through the lens of machine intelligence.
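Searching and sorting relations among documents, as described above, comes down to embedding each document as a vector and comparing vectors. The installation uses neural-network embeddings; as a toy stand-in, the same idea can be shown with plain term-frequency vectors and cosine similarity (the document texts here are invented examples):

```python
import math
from collections import Counter

def tf_vector(text):
    """Bag-of-words term frequencies for one document."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = {
    "d1": "ottoman archive maps of istanbul",
    "d2": "istanbul maps and city plans",
    "d3": "modern painting exhibition catalogue",
}
vecs = {k: tf_vector(v) for k, v in docs.items()}

# Rank the rest of the archive against one query document by similarity.
query = vecs["d1"]
ranked = sorted((k for k in vecs if k != "d1"),
                key=lambda k: cosine_similarity(query, vecs[k]),
                reverse=True)
```

At archive scale (1.7 million documents), neural embeddings plus approximate nearest-neighbor search replace this brute-force comparison, but the notion of "relation" as vector proximity is the same.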
SALT is grateful to Google's Artists and Machine Intelligence program, and Doğuş Technology, ŠKODA, Volkswagen Doğuş Finansman for supporting Archive Dreaming.
Location : SALT Galata, Istanbul, Turkey
Exhibition Dates : April 20 - June 11
6 Meters Wide Circular Architectural Installation
4 Channel Video, 8 Channel Audio
Custom Software, Media Server, Table for UI Interaction
For more information:
Jac de Haan
Refik Anadol Studio Members & Collaborators
Raman K. Mustafa
Ho Man Leung