
Credits: Shaul Tzemach. Big thanks to contributors: Elliotwoods, Dominik Koller, Luper, dottore, woei, unc, m4d, ArKano22, ivan.raster.

Generative projection mapping experiment

Festival Light Night Tel Aviv, 24-27.12.2015

An attempt to decompose and rethink the architectural structure, based on random ways of blending different elements to create the contradiction of an ever-changing monument.

Sound made with Pure Data; thanks to experts Martin Brinkman and Coloscope.

shual, Monday, Jan 4th 2016

Watch on Vimeo

Build your own magical light installation in just a few clicks.
Try it now at www.keybright.net ;)
Full details for setup at home: http://imgur.com/a/tQk54
Full story behind the keybright concept: http://keybright.net/story.html

In November 2012, I created a simple installation called Magic Keyboard (https://vimeo.com/45154003). At the time, the project ran only on my laptop and wasn't ready to be opened on just any device. It was part of a prototyping series blending digital projected elements bouncing with physical behaviors on real-world objects.
All these prototypes were quickly made in my personal time with the vvvv toolkit and the Box2D plugin by @mrvux.
I also had some great support from Moment Factory for the hardware and the LAB space.
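The real prototypes used vvvv with the Box2D plugin to make projected elements bounce off real-world objects; as a rough illustration of the same idea, here is a toy gravity-and-bounce loop in plain Python (this is my own sketch, not the project's code; function and parameter names are hypothetical):

```python
# Toy version of the physics idea: a projected "key" falling under
# gravity and bouncing on a physical surface at y = 0.
# Box2D does this properly (collision shapes, solver iterations);
# this sketch only shows the basic simulation loop.
def simulate(y0=2.0, vy=0.0, g=-9.81, restitution=0.6, dt=1/60, steps=240):
    y = y0
    for _ in range(steps):
        vy += g * dt          # integrate gravity into velocity
        y += vy * dt          # integrate velocity into position
        if y < 0:             # hit the surface: bounce, losing energy
            y = 0.0
            vy = -vy * restitution
    return y, vy
```

After a few simulated seconds the element has visibly settled onto the surface, which is the behavior that made the projected keys feel physical.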

In November 2015, the Magic Keyboard gained a lot of interest for a week, and many people seemed eager to try it at home.

I had long wanted to make an open-source project that would let people create something together at home. So I took some time and gathered the pieces to make this happen in the web browser. The installation had to be as simple as possible, with just a quick drag-and-drop calibration.
It's now available for anyone to try and improve, in the form of a simple full-screen website :)
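The keybright site's calibration code isn't shown here, but a drag-the-corners calibration like the one described typically boils down to solving for a homography between screen space and the projected surface. A minimal sketch with NumPy (function names are my own, not keybright's):

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: 3x3 perspective matrix mapping
    four source points onto four destination points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the null vector of A, i.e. the last row of Vt.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp(H, pt):
    """Apply the homography to a 2D point."""
    x, y, w = H @ [pt[0], pt[1], 1.0]
    return (x / w, y / w)
```

Dragging the four corners of the projected image onto the physical keyboard gives the four point pairs; everything drawn afterwards is pushed through `warp`.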

Follow keybright on Facebook for updates

Watch on Vimeo

Installation on three MultiTaction Cell 55 touch screens combined into one wall. We produced informational content and a presentation about the company's plants and products.

art director: Roma Erohnovich
producers: Sergey Kostenevich, Vera Malysheva
motion designer: Maxim Malakhov
programmer: Alexey Rudenko

Radugadesign, Friday, Dec 4th 2015

Watch on Vimeo

Public art commission for the 350 Mission building in San Francisco, California.

‘Virtual Depictions: San Francisco’ is a public art project consisting of a series of parametric data sculptures that tell the story of the city and the people around us with a unique artistic approach, created for 350 Mission’s media wall in collaboration with Kilroy Realty Corporation / John Kilroy and Skidmore, Owings & Merrill LLP Architects.

The main idea of ‘Virtual Depictions: San Francisco’ is to bring a 21st-century approach to public art, to define new poetics of space through media arts and architecture, and to create unique parametric data sculptures that have intelligence, memory, and culture. Through architectural transformations of the media wall in 350 Mission’s lobby, the main motivation behind this seminal media-architecture approach is to frame the experience with a meticulously abstract and cinematic, site-specific, data-driven narration. As a result, the media wall turns into a spectacular public event, making direct and phantasmagorical connections to its surroundings through simultaneous juxtapositions. The project also intends to contribute to the contemporary discourse of public art by proposing a hybrid blend of media arts and architecture in the 21st century.

When creating this public artwork my goal was to make the invisible visible by embedding media arts into architecture to create a new way to experience a living urban space.

6mm LED media wall.
Custom software; a 90-minute dynamic visual experience in 12 chapters.
8-channel sound.

Music for documentation: Goldmund - Threnody (Unseen-Music)
Sound Design: Kerim Karaoglu

Virtual Depictions: San Francisco By Refik Anadol
Commissioned by: Kilroy Realty Corporation
John B. Kilroy, Jr.
In cooperation with:
The City of San Francisco
Mayor Ed Lee
John Rahaim, Director of Planning
Project Partners:
Sensory Interactive
Sansi North America
Skidmore, Owings, and Merrill LLP
Webcor Builders
WSP | Parsons Brinckerhoff
Art Consultant:
DPA Fine Art Consulting
Deanna Postil Krawczyk – Michelle Isenberg
Refik Anadol Studios: Refik Anadol / Efsun Erkilic / Raman K. Mustafa
Kian Khiaban / Toby Heinemann / Rob Tom Browning / Bahadir Dagdelen / Yusuf Emre Kucur
Daghan Cam / Sebastian Neitsch / Johannes Timpernagel / Sebastian Huber / Kerim Karaoglu
Studio Management by DG Hunt & Associates, LLC
Special Thanks: Nicole Stromsness / Sarah MacIntyre / Douglas Giesey / Craig Hartman
Michael Fukutome / Eric Cole / Josh Cushner / Eric Covrig / Jason Cox / Pat Green
Julie Goodwin / Amanda Brownlee / Shannon Knuth
Copyright 2015 Refik Anadol Studios

Refik Anadol, Wednesday, Dec 2nd 2015

Watch on Vimeo

"google his story" is an interactive data visualization that engages with a significant contemporary topic: the collection of data.
How much data do we actually give away? What can others read from my own data? What gets stored?
These and other questions formed the framework of the project.
To find answers, I used Google's "Takeout" service, which is freely accessible to every Google user.
There you can obtain information about your search queries, location data, cloud storage history, and much more.
In "google his story" I went looking for my own traces on the internet, focusing on my search queries together with their timestamps.
From these one-dimensional numbers I developed a walk-in data visualization that can be controlled with gestures.
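The first step of such a visualization is turning the raw timestamps into something plottable. As a sketch only: Takeout's export format has changed over the years, so the record shape below (a `timestamp_usec` string, as in older "Searches" JSON exports) is an assumption, and the function name is my own:

```python
from collections import Counter
from datetime import datetime, timezone

def hourly_histogram(events):
    """Count search events per hour of day (UTC).

    Assumes each event dict carries a 'timestamp_usec' string,
    a Unix timestamp in microseconds."""
    hours = Counter()
    for e in events:
        seconds = int(e["timestamp_usec"]) / 1_000_000
        dt = datetime.fromtimestamp(seconds, tz=timezone.utc)
        hours[dt.hour] += 1
    return hours
```

Binning by hour, weekday, or query length each yields one axis of a navigable data landscape.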

sound: “Seeing The Future” (by Dexter Britain)
soundsource : http://freemusicarchive.org/music/Dexter_Britain/Creative_Commons_Volume_2/Seeing_The_Future
soundlicenses: http://creativecommons.org/licenses/by-nc-sa/3.0/us/

Alexander Dalbert, Wednesday, Dec 2nd 2015

Credits to everyoneishappy and intothelight for their noodles and pointcloud contributions.

Installation made for the Nuit Blanche event in Bratislava, Slovakia.

It is very basic in terms of visuals; this was essentially my first work with the noodles system and particles. I had to take out a lot of detail and simplify it down to one big figure, because I got totally different projectors than I had requested (instead of full-HD ST projectors I got two 1024x720 office oldies).

The aim was to work around the style of Zuzana Sabova, her raw, sketchy drawing, and the Black Hole sculpture in the middle. The sculpture did interact with the particles, but this was barely visible because of the projectors' quality.

I have more 3D scans, which I plan to use in the future to follow her style even closer. But I really like the sound; it was done in Reaktor with the help of a contribution I made, Reaktor vobject OSC.

StiX, Tuesday, Dec 1st 2015

Watch on Vimeo


Jeff Zhu, Monday, Nov 30th 2015

Watch on Vimeo

Interactive video installation by Robert Pörschke, displaying the Swedish national costume, controlled by light sensors and a microphone.

A part of the group exhibition (Verklighetens) Folkdräkt with Art Mobile at Galleri K in Västerås in August 2015.

This was the first time I used both the MIDI CPU from Highlyliquid and the programming environment vvvv. Both worked better than I expected. More to come, I bet.


Watch on Vimeo

Flatpack Festival asked me to become artist-in-residence in the Photonics Department of Aston University, to make a response to the work they do there. My piece would appear in the run-up to Birmingham Library's Light Fest, as part of an all-day event highlighting the research work of the Department, which featured lectures and demonstrations.
Initially I planned to make a laser-based piece; however, considering the health and safety implications of this (not to mention the hundreds of school children likely to be in attendance), I decided that using LEDs would be a much safer option.

All the demonstrations were based on the physics of light (colour mixing, refraction, etc.), which inspired me to create a more visceral piece: something to make you feel light, to perceive it and its effects on the brain.
To do this I made some 1.5m light probes from translucent tubes, each featuring 170 individually controllable pixels: 85 to the front and 85 to the back. This gave me the ability to control both the direct light and the indirect light that bounced off the surface to the rear of the tubes, which meant I could colour the background and the detail separately.
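The front/back split described above can be sketched as a simple pixel-mapping function. This is my own illustration of the assumed layout (one 170-pixel strip per probe, indices 0-84 facing front, 85-169 facing back), not the installation's actual code:

```python
# One frame for a single probe: the front half carries the "detail"
# colour, the back half the indirect "background" wash that bounces
# off the rear surface of the tube.
def frame(front_rgb, back_rgb, n=170):
    """Return a list of n RGB tuples: front half, then back half."""
    half = n // 2
    return [front_rgb] * half + [back_rgb] * (n - half)
```

A controller would compute one such frame per probe per tick and push it out over whatever LED protocol the hardware speaks.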

My initial plan was to create an interactive piece, but as I added audio it revealed a further layer which I decided to experiment with. I took the frequencies of red, green, and blue light and dropped them down many octaves so that they became audible, then used the activation of the red, green, and blue pixels to control the volumes of those frequencies. Could it perhaps be possible to perceive colours through sound? I discovered that this audible droning of the frequencies, combined with a slow colour change, produced a very powerful emotional effect, akin to someone flipping a "calmness switch", if you will. In this state one became very "present", and indeed during the event many people commented that they felt like they had stepped into a different world.
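The octave-dropping trick is a tiny calculation: halving a frequency once drops it one octave, so dividing visible-light frequencies (hundreds of THz) by 2^40 lands them in the audible range. A sketch, with approximate peak frequencies for red, green, and blue that are my assumption rather than values from the piece:

```python
# Approximate light frequencies in THz (assumed values, not the
# artist's exact choices): red ~430, green ~560, blue ~640.
LIGHT_THZ = {"red": 430, "green": 560, "blue": 640}

def audible(freq_thz, octaves=40):
    """Drop a light frequency down `octaves` octaves; returns Hz."""
    return freq_thz * 1e12 / (2 ** octaves)
```

With these numbers, red comes out near 390 Hz, green near 510 Hz, and blue near 580 Hz, all comfortably inside the audible band, so the pixel activations can directly drive three oscillators.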

I further refined the audio by adding some sub-octaves to the frequencies, fine-tuning the effect, and experimented with the interplay between the fore- and background illuminations until I was happy with the result.

I feel like I have come full circle to the beginning of my interest in light, in that I am again hand-constructing light experiments, now with the kind of technology I could only have dreamed about 20 years ago. I made these probes to be multifunctional, and I intend to continue my investigation in other kinds of spaces through the exploration of minimal colour fields and more kinetic installations.


catweasel, Saturday, Nov 28th 2015
