No more mister nice thumb
A few assignments ago in physical computing, I had the idea to make a thumb wrestler with a homemade force detector. I’ll talk about that more below.
Since that lab, I've built on that project with what I've learned, adding LEDs, a servo, and a piezo buzzer. Here's a short video demonstrating the final prototype:
I haven’t posted any of my ICM homework to my blog yet, so this post will take care of it in one shot! Here are a few of the projects I’ve worked on recently.
Move the mouse inside the window to create sparks!
The inspiration for this one was an equalizer. I ended up spending way too much time thinking about the math of the line heights.
Click inside the box to start drawing.
I read an excerpt of Visual Intelligence: How We Create What We See by Donald D. Hoffman that argues there is evidence our brains construct what we see and touch. The chapter discussed medical studies of patients who had limbs amputated but could still feel that the appendage was there (phantom limb syndrome). Hoffman used this as an argument that even the sensation of touch is something created in our brains, not necessarily imposed by outside influences.
It’s something I’ve never thought about with regard to touch before. I’ve wondered whether other people see things the same way I do (physically, not metaphorically), and how my colorblind father and sister see the world, but I always just assumed sensations of touch were relatively universal. Is it possible to “spoof” a feeling of touch? I’m not sure there is yet a scientific way to make materials change their physical feel, but that would be an interesting development. Or maybe a sensor in our brains that corresponds with a material. Wouldn’t a trackpad that changes texture depending on the application or game be really interesting? Or an e-book that could simulate the texture and weight of pages while still using a display? This really makes me rethink what could be possible for the future of human/computer interaction.
I loved looking through this list of different types of physical computing projects. Some were very inspiring (mechanical pixels, fields of grass), and others I’ve seen many, many times. What I really like about both the mechanical pixels and the fields of grass is the scale and the response given to the user. Seeing so many objects working together in harmony is mesmerizing and hypnotic. I think these ideas are great for art pieces or interactive displays, but their usefulness degrades when I try to think of “practical” uses (whatever that means).
One trend I’ve been seeing a lot recently is the use of the Kinect in projects. I think it’s fascinating to use our bodies as controllers and interact with digital objects on a screen (or with other objects). There seems to be a market for it as well, since other competitors are making similar products (Asus and Leap Motion are both interesting examples). I don’t think these are the epitome of human/computer interaction, however. I think the problem is in the feedback. I’ve played some games with the Kinect on the Xbox. In one game, for example, you block dodgeballs with your arms, legs, and torso. But where’s the feedback? How do I know I blocked one? I don’t feel anything on my arm or leg, or anywhere else. That’s an opportunity for improvement: there should be resistance and feedback built into the system to achieve a more lifelike feeling.