Matthew Kaney at ITP

Glove Runner

Physical Computing

For my Physical Computing midterm project, I worked with Lirong Liu on a glove-based game controller.

Our initial discussions revolved around a variety of gestural control schemes, particularly the different ways that hand gestures in space could be tracked, using gloves, computer vision, and other approaches. Topics included pointing and moving in 3D space, as well as various gestures for instrument control. After much discussion, we settled on a controller for a Mario-style side-scrolling game, where the player makes their character run by physically “running” their index and middle fingers along a tabletop. I think this gesture is attractive for a number of reasons. It has a very clear correspondence with the action performed on screen, and although the controller gives little physical feedback, the assumption that users would run in place on a tabletop helps ground their actions. Also, it seemed like a lot of fun.

Glove Mockup

From there, I began working on a physical prototype. To start, I took a pair of flex sensors (shown at left with a plug we added for better interfacing) and attached them to my fingers using strips of elastic. From this prototype, it was clear that the best sensor response came when the tip of the flex sensor was firmly attached to the fingertip while the rest of the sensor could slide forward and back along the finger as it bent.

Test Processing Output

Reading this sensor data into Processing, I was able to quickly map the movement to a pair of “legs” and get a sense of the motion in the context of a running character. For a standing character, we found that just two changing bend values (one for each sensor) could produce some very sophisticated, lifelike motion. Meanwhile, as I worked on the physical prototype, Lirong set up a basic game engine in JavaFX with a scrolling world and three different obstacle types.
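The mapping itself is simple: each raw sensor reading is scaled into an angle range for the corresponding leg. As a rough sketch of that idea (the class and method names here are hypothetical, and the 0–1023 range assumes a typical 10-bit analog reading, not necessarily what our hardware reported):

```java
// Hypothetical sketch: map a raw flex-sensor reading to a leg angle.
public class LegMapper {
    /**
     * Scale a raw reading (assumed 0-1023, as from a 10-bit ADC)
     * linearly into a joint angle between minDeg and maxDeg,
     * clamping out-of-range readings.
     */
    static double bendToAngle(int raw, double minDeg, double maxDeg) {
        double t = Math.max(0.0, Math.min(1.0, raw / 1023.0));
        return minDeg + t * (maxDeg - minDeg);
    }

    public static void main(String[] args) {
        // One call per sensor, one sensor per leg.
        System.out.println(bendToAngle(0, -30, 60));    // finger straight
        System.out.println(bendToAngle(1023, -30, 60)); // finger fully bent
    }
}
```

Because each leg gets its own continuously varying angle, even noise and small involuntary finger movements read as natural weight-shifting in the standing character.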

At this point, we both worked on the software for a while, with Lirong setting up a system for converting various finger movements into discrete events (steps, jumps, etc.) and me working on various issues related to graphics and animation. In the end, we used our sensor input at two different levels of abstraction: the high level (the specific running and jumping events) controls the actual game logic, while the low level (the raw bend values of the sensors) controls the character’s animation.
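One way to turn a continuous bend value into a discrete step event is a threshold with hysteresis: fire once when the bend crosses an upper threshold, and only re-arm after it drops below a lower one, so sensor jitter near the threshold doesn't double-count steps. This is a minimal sketch of that general technique, not our exact implementation; the class name and threshold values are made up for illustration:

```java
// Hypothetical sketch: detect discrete "step" events from a
// normalized (0.0-1.0) bend value using hysteresis thresholds.
public class StepDetector {
    private final double pressThreshold;   // bend level that counts as a step
    private final double releaseThreshold; // must drop below this to re-arm
    private boolean down = false;

    StepDetector(double press, double release) {
        pressThreshold = press;
        releaseThreshold = release;
    }

    /** Returns true exactly once per step, on the upward crossing. */
    boolean update(double bend) {
        if (!down && bend > pressThreshold) {
            down = true;
            return true;
        }
        if (down && bend < releaseThreshold) {
            down = false; // re-arm for the next step
        }
        return false;
    }

    public static void main(String[] args) {
        StepDetector d = new StepDetector(0.7, 0.3);
        double[] samples = {0.1, 0.5, 0.8, 0.9, 0.4, 0.2, 0.75};
        int steps = 0;
        for (double s : samples) {
            if (d.update(s)) steps++;
        }
        System.out.println(steps); // two distinct presses in this sequence
    }
}
```

With one detector per finger, alternating events from the two sensors read as running, while a simultaneous bend on both could be treated as a jump.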

After that, I sewed up the two pairs of gloves shown in the video above, allowing the flex sensors to slide back and forth along the fingers. As we worked on the glove design, we tested with various users to identify potential sizing issues. From there, we built a simple, self-contained system for basic user control and wired everything up.

Code can be found here:

A few challenges we faced:

A few things I think we did well: