An experiment using openFrameworks and openCV for Spatial Media, spring 2013.
A party is starting and you're the only one invited. The camera detects when someone enters the frame and throws a private party for one: you get a hat and some confetti, but when you leave, the party shuts down.
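The actual project is openFrameworks + openCV, but the detection idea boils down to background subtraction: compare each camera frame against a snapshot of the empty scene and count how many pixels changed. Here's a minimal Processing stand-in for that idea; the threshold and pixel-count values are assumptions, not numbers from the actual project:

```java
import processing.video.*;

Capture cam;
PImage bgFrame;             // snapshot of the empty scene
int threshold = 40;         // per-pixel brightness difference that counts as a change (assumed)
int presencePixels = 2000;  // how many changed pixels mean someone is there (assumed)

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);
  if (bgFrame == null) return;  // press a key with the frame empty first

  cam.loadPixels();
  bgFrame.loadPixels();

  // Count pixels that differ significantly from the empty-scene snapshot
  int changed = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    if (abs(brightness(cam.pixels[i]) - brightness(bgFrame.pixels[i])) > threshold) {
      changed++;
    }
  }

  // Enough change means someone entered the frame: party on.
  // This is where the hat and confetti would be drawn.
  text(changed > presencePixels ? "PARTY!" : "nobody home", 20, 40);
}

// Capture the background while nobody is in the frame
void keyPressed() {
  bgFrame = cam.get();
}
```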
For Spatial Media we were asked to find a temporary interactive exhibit that was created for an event and critique it. I had a tough time finding examples of temporary exhibits, but eventually I came across “8 ties” for Hermès. It was created for an event in Milan and has now traveled to a few other locations. I haven’t seen this installation in person, but I found some videos online.
The theme of the collection is very digital, referencing many aspects of computer culture and data. The ties' patterns use iconography like the power button and the recycle symbol, as well as binary code, among others. The site is much nicer than the video above.
The visuals, drawn from the patterns of the tie collection, are stunning. They are the highlight of the main attraction: the large video wall, which I can only assume was a projection. What's unclear is whether the visuals respond to the user. Watching the video, you can't really tell if the graphics are moving on their own or reacting to the user waving around Kinect-style. The audience, who seem to be fashion-industry professionals, don't appear to understand either.
I definitely recommend the website. It's visually very attractive and has some awesome glitchy audio and video. But from the videos online, it looks like the translation to a large-screen exhibit may have fallen a bit flat. The user interaction could have been made much stronger, for example by showing sparks where the user's hand is detected, or by drawing a clearer correlation between the user's movement and the movement of the graphics.
For our second assignment, we needed to port an existing project over to Spacebrew. Last semester in ICM, I made a hand-tracking app that you could use to draw constellations. It was a bit hacked together, using oscP5 to send the Kinect data to the rendering sketch, so I hooked the two together using Spacebrew instead. It worked well and was fairly quick to set up and get running… except for my errors in publishing the data. It was good practice in debugging, though, for when documentation is incomplete and Google doesn't help. Found the bug, squashed it, and the result was beautiful.
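The Spacebrew side of a port like this is tiny. Something like this minimal Processing sketch covers the publishing end, assuming the Spacebrew Processing library; the server address and publisher names are made up, and mouse position stands in for the Kinect hand data:

```java
import spacebrew.*;

Spacebrew sb;

void setup() {
  size(640, 480);
  sb = new Spacebrew(this);
  // Publish the tracked hand position as two Spacebrew "range" (0-1023) values.
  // The publisher names here are made up.
  sb.addPublish("handX", "range", 0);
  sb.addPublish("handY", "range", 0);
  sb.connect("ws://localhost:9000", "handTracker", "publishes hand position");
}

void draw() {
  // The real sketch gets these from the Kinect; mouse position stands in here
  sb.send("handX", (int) map(mouseX, 0, width, 0, 1023));
  sb.send("handY", (int) map(mouseY, 0, height, 0, 1023));
}
```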
Last night I got started using Spacebrew. It was actually really nice to use once I got everything set up and configured. I ran my own Node server and used the example Processing sketches to change the greyscale background of one sketch with a slider in another.
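The receiving sketch is about as small as it gets: subscribe to a range value and use it as the background grey. A rough sketch of that subscriber, assuming the Spacebrew Processing library's range callback; the routing from the slider sketch to this one happens in the Spacebrew admin page, not in code:

```java
import spacebrew.*;

Spacebrew sb;
int grey = 0;  // last value received from the slider sketch

void setup() {
  size(400, 400);
  sb = new Spacebrew(this);
  // Subscribe to a range (0-1023) value; the name just has to match
  // whatever you route to it in the Spacebrew admin
  sb.addSubscribe("background", "range");
  sb.connect("ws://localhost:9000", "bgSketch", "greyscale background listener");
}

// Called by the library whenever a routed range message arrives
void onRangeMessage(String name, int value) {
  grey = (int) map(value, 0, 1023, 0, 255);
}

void draw() {
  background(grey);
}
```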
I then spent a good bit of time trying to get the Arduino sketches working with a Leonardo + WiFi shield. I haven't managed it yet, unfortunately; I think I'll have to dig into the source of the WebSocket or Arduino Spacebrew libraries (or both). I'm looking forward to trying this out with other projects!