
Dancing Robots: Speed on the Dance Floor

by Paul Curzon, Queen Mary University of London

A robot doing a handstand: copyright www.istockphoto.com 38881856

England football star Peter Crouch became famous for his robotic dances to celebrate goals. He promised to only repeat them if England won the World Cup, so it probably won't happen again! What about robots dancing like humans instead, then? Could a robot ever understand the music it was dancing to in the way a human does? To start with, it would need to perceive the music the way we do. It turns out that even that isn't so simple. We don't always hear sounds as they really are.

One experiment that showed this explored what people believe they hear when a percussionist plays an instrument. Justin London of Carleton College in the US showed that it depends not just on the sound itself but also on how the musician moves as they strike the instrument. He recorded musicians playing notes on marimbas: instruments a bit like a xylophone, where the percussionist hits wooden bars with a mallet. Justin created stick figure animations of the musicians' arm movements as they made the sounds, to use as the basis of the experiment.

Volunteers listened to these sound recordings, sometimes with the animations and sometimes without. They then rated how long the notes lasted. When people only heard the sounds, they could judge their length fairly accurately. Watching the animations changed their perceptions, though. When the stick figure's arm movement stopped abruptly after a sound was played, the volunteers thought the sound stopped quickly too. When they heard the same sound while watching an animation with a longer arm movement, they tended to think the sound lasted longer. What they saw affected what they thought they heard!

Jerkiness of movement makes it hard to judge a sound's tempo.

Justin wondered if a similar thing might happen with the way we perceive how fast a track is being played: its tempo. Perhaps on a dance floor the speed of others dancing changes the tempo a person thinks they hear. He set up a new experiment with a team from Finland to find out. First they needed some music people could dance to. They chose a series of really danceable Motown songs from Wilson Pickett, The Temptations and The Supremes. They created a faster and a slower version of each song to go with the normal one. They then got people to dance to the tracks, asking them to dance vigorously to each track and then slowly. These dances were filmed and turned into moving stick figure animations, just as with the marimba playing. A new set of people then listened to the tracks, both with the sound alone and while watching the stick figures dancing to the track. In each case they were asked to judge how fast the tempo of the music was.

The prediction was right. What people saw did affect the tempo of the music they thought they were listening to. When they just heard the music, they could tell which versions were faster and which slower. But when they were watching the stick figures dancing, their brains started to get confused. When they watched a dancer dancing vigorously, for example, they thought the tempo was faster, even though the dancers were still keeping time to the music. Their movements were just fast and jerky rather than slow and smooth. It seems to be the jerkiness of movement that made it hard for the volunteers to judge the tempo.

So on the dance floor we don't just perceive the music as it is. It depends on the way people are dancing around us too. If a robot were really to dance like a human, and if the dancing were to feel the same to the robot as it does to us, then the robot would need our quirks in the way it perceived the world.

Understanding our multimodal brain

Justin London's aim is really to uncover fundamental things about how our brains work. The experiments show that the way we hear sound is about more than just the sound waves that hit our ears. Our perception of sound is different to the actual sound. That's important for people who design multimodal computer systems: systems that present information for us to sense in more than one way. If they are to work well, it matters how we perceive the things we see, hear and feel. In Justin's experiments, sight interferes with sound. For an interface that uses sound and sight together to work well, the information we perceive in the different ways has to work together, not interfere like that. Experiments like Justin's can help designers understand how to get interfaces like that right.

This article draws on a talk given by Justin London at Queen Mary University of London in 2014.