Music and Machines: Highlights from the MIT Media Lab

I recently attended “Music | Machines: 50 Years of Music and Technology @ MIT,” part of MIT’s ongoing Festival of Art + Science + Technology (FAST).

One of the most interesting demonstrations was the iPhone Guitar by Rob Morris of the MIT Media Lab. In short, he uses the iPhone’s accelerometer as an input for audio effects.

The iPhone is attached to the guitar so that certain gestural movements of the guitar in space, especially those that happen during an emotional performance, are detected and used to modulate the sound. The iPhone’s touch screen also serves as a convenient on-guitar interface for selecting effects and entering other parameters.
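Morris didn’t walk through his implementation, but the general recipe is easy to imagine: read the accelerometer, smooth out the jitter, and map some gesture feature onto an effect parameter. Here is a minimal Python sketch of that mapping; the sample stream and the set_effect_depth hook are hypothetical stand-ins for whatever the audio engine actually exposes.

```python
import math

SMOOTHING = 0.9  # hypothetical low-pass factor to tame sensor jitter


def gesture_to_depth(samples, set_effect_depth):
    """Map accelerometer readings to an effect-depth value in [0, 1].

    `samples` is any iterable of (x, y, z) readings in g's, and
    `set_effect_depth` is whatever hook the audio engine exposes;
    both are stand-ins, not details of Morris's actual rig.
    """
    smoothed = 0.0
    for x, y, z in samples:
        # Deviation from 1 g is a crude proxy for how hard the guitar is moving.
        magnitude = abs(math.sqrt(x * x + y * y + z * z) - 1.0)
        smoothed = SMOOTHING * smoothed + (1.0 - SMOOTHING) * magnitude
        set_effect_depth(min(1.0, smoothed * 4.0))  # scale up and clamp


# Example with canned readings and print() standing in for the effects engine.
gesture_to_depth([(0.0, 0.0, 1.0), (0.3, 0.1, 1.4), (0.8, 0.2, 1.9)], print)
```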

The Muse

Digital music is no longer a new phenomenon; in fact, it’s downright ancient when you consider that one of the first digital music contraptions was made in 1972. The Triadex Muse, an algorithmic music generator built from digital logic, was designed by Edward Fredkin and Marvin Minsky at MIT Lincoln Laboratory.
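The Muse’s actual circuitry is more involved, but the flavor of “melody from digital logic” is easy to sketch: clock a feedback shift register and let a few of its bits pick notes from a scale. The following is a loose illustration of that idea, not an emulation of the Muse.

```python
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale as MIDI note numbers


def shift_register_melody(seed=0b101011, steps=16):
    """Clock a 6-bit feedback shift register and read notes off its low bits."""
    state = seed
    notes = []
    for _ in range(steps):
        # XOR two taps and shift the result back in, as in a simple LFSR.
        feedback = ((state >> 5) ^ (state >> 2)) & 1
        state = ((state << 1) | feedback) & 0b111111
        notes.append(SCALE[state & 0b111])  # low three bits pick the scale degree
    return notes


print(shift_register_melody())
```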

Music, Mind and Meaning

Speaking of Minsky, he discussed “Music, Mind and Meaning” with Teresa Marrin, Mary Farbood and Mike Hawley. Amid the anecdotes, Minsky brought up an old idea of his about goals.

One way a mind might achieve a goal is to keep reducing the difference between what it has and what it wants. Music may draw on some of the same mental machinery: most music chops time into equal intervals with equal substructures. Those chopped windows of experience can be compared, possibly in the same way that you compare what you have with what you want.
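That difference-reduction idea is essentially the old means-ends notion from early AI: look at the gap between the current state and the goal, and take whatever step shrinks it. A toy sketch, with a single number standing in for whatever the mind is actually comparing:

```python
def reduce_difference(current, goal, step=1.0, max_steps=100):
    """Greedily shrink the gap between what we have and what we want."""
    for _ in range(max_steps):
        gap = goal - current
        if abs(gap) < step:
            return goal  # close enough: snap to the goal
        current += step if gap > 0 else -step
    return current


print(reduce_difference(current=2.0, goal=9.0))  # -> 9.0
```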

Excerpts from the Concert

Computer-based production is normal nowadays, so how could a computer- and electronics-oriented concert be special? Tod Machover of the Media Lab managed it by assembling musicians who make some very unusual sounds and unconventional compositions, all involving computers and/or electronics, but in innovative ways…and through live performances.

The concert began with a 1976 composition by Barry Vercoe called Synapse for Viola and Computer, an early work from MIT’s Experimental Music Studio. Since this was a restaging of the 1970s performance, the digital accompaniment was inflexible, so it was up to the human soloist, Marcus Thompson, to maintain sync and “express himself within the confines.”

Vercoe – Synapse

Synapse was followed by Synaptogenesis, in which Richard Boulanger performed by triggering sound clips and transformations with a Nintendo Wiimote and a Novation Launchpad.

Boulanger – Synaptogenesis

Programmable drum machines have been around since 1972, but what is still rare is seeing the machine actuate physical percussion hardware. One such robotic instrument is the Heliphon, originally made by Leila Hasan and Giles Hall, and later redesigned by Bill Tremblay and Andy Cavatorta.


[Photo of the Heliphon. Photo credit: MIT]

The sound from this double helix metallophone is produced via solenoids hammering the metal keys. It also has lights hooked in to give a visual indication of which keys are active.
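I don’t know how the Heliphon is sequenced internally, but robotic instruments like this are commonly driven over MIDI, with each note number wired to a particular solenoid (and, here, its light). A rough sketch of that kind of control using the mido library; the default output port and the note pattern are assumptions, not details of the actual instrument:

```python
import time

import mido

# Hypothetical mapping: each MIDI note number corresponds to one solenoid/key.
PATTERN = [60, 64, 67, 72, 67, 64]  # a little arpeggio up and back down

with mido.open_output() as port:  # assumes a MIDI port routed to the instrument
    for note in PATTERN:
        port.send(mido.Message('note_on', note=note, velocity=100))  # strike + light on
        time.sleep(0.2)
        port.send(mido.Message('note_off', note=note))               # release / light off
```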

Heliphon and humans Todd Reynolds (violin) and Evan Ziporyn (clarinet) performed Ziporyn’s Belle Labs – Parts 1 & 3.

Ziporyn, Reynolds, Heliphon – Belle Labs Parts 1 and 3

Heliphon is one of various robotic instruments commissioned by Ensemble Robot, a nonprofit corporation based in Boston, MA. Ensemble Robot also made WhirlyBot, which looks like a turnstile but sounds like a chorus of human-like voices, and Bot(i)Cello, which appears to be a cross between a construction tool and a stringed instrument.

The Future of the Underground

If you’re interested in hearing more electronic music, there is always new stuff (or remixes of old stuff) being made, far below the radar of the mainstream. You can hear some of it on the web, but being at a live performance or DJ set is a different experience, especially when the DJ modifies the music on the fly. There are some new tools to enable this; for example, here is DJ/producer Encati demonstrating a Kinect wobble controller for dubstep mutations:
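For context, the “wobble” in dubstep is usually an LFO sweeping a low-pass filter over the bass, so a Kinect controller really just maps a body coordinate onto that LFO. A bare-bones sketch of the mapping step, assuming hand height already arrives as a normalized value between 0 and 1 (the Kinect tracking and the synth itself are left out):

```python
def wobble_rate(hand_height, min_hz=0.5, max_hz=8.0):
    """Map a normalized hand height (0..1) to an LFO rate for the filter wobble."""
    hand_height = max(0.0, min(1.0, hand_height))
    return min_hz + hand_height * (max_hz - min_hz)


# Hand halfway up -> a 4.25 Hz wobble; hand at the bottom -> a slow 0.5 Hz sweep.
print(wobble_rate(0.5))
print(wobble_rate(0.0))
```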

What I would like to see more of are environmental actuations triggered by music, beyond just flashing lights. We have autogenerated visualizers, and we can use MIDI to control lights (and fire cannons), but what about having a room really transform automatically based on the music? I’m talking about dynamic 2D and 3D displays everywhere, autonomous mobile furniture, materials changing shape and color, and so on.
