Polyrhythmia: Turning movement into sound with GPS, iPhones and Arduino

Small machines perch in the structure of the building, tapping out rhythms on pipes, beams, floors and walls. These machines are controlled by an iPhone app and as people travel around the city the speed of their movement brings the tapping machines to life, playing the infrastructure of the building. As they navigate the city they pay attention to their speed of travel and imagine their movement calling out a tapped rhythm in the gallery, its sonic voice contributing to this temporal symphony of the city.

Polyrhythmia is an installation by Jen Southern, created through a Sound and Music Embedded residency at the Pervasive Media Studio. Visitors see little machines tapping out sounds. Each machine is tied to the movement of a participant travelling around the city, carrying a mobile phone that turns their motion into sound.

Overview

The Polyrhythmia system is split into two main parts: a mobile app and a set of physical tapping machines. The mobile app is built using the AppFurnace framework and runs on an iPhone. It turns GPS data into sound, played over headphones for the person carrying the phone, and also sends the GPS data back to the gallery.

In the gallery, the physical machines that make up the second part of the installation receive the GPS data and turn it into tapping sounds by striking objects. Each tapping machine uses an Ethernet Arduino to receive and process the data. A small Sinatra web app sits between the mobile app and the physical tapping machines to handle message passing.

Mobile App

To turn GPS data into sound we used a mobile phone rather than building our own hardware. The first version was built with PhoneGap for Android handsets, which let us iterate and test more quickly than building a native application.

It soon became apparent that this was not a good solution. The GPS on our Samsung handsets was inaccurate and slow to update its location, and we could not find reliable reviews of GPS accuracy across Android handsets, or listings of which handsets used which GPS chipsets.

In addition, the audio playback capabilities were problematic. We needed to play multiple audio tracks at the same time, but PhoneGap on Android had a low limit on how many tracks would play simultaneously; it was not possible to know what that limit was without exceeding it, and the limit itself fluctuated. On top of this, when the OS was told to play an audio clip there could be a delay of up to a second before the clip played, and that latency was not consistent.

To fix this we ported the app from PhoneGap on Android to AppFurnace on iPhone. AppFurnace originated as a framework for geolocated audio, and as a result its audio and geolocation libraries are very reliable. The porting process took half a day, as both frameworks use JavaScript and have similar library calls. We had no further problems with audio or geolocation accuracy following the port.
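
Whether it is done on the phone or in the gallery, the underlying calculation that turns movement into a speed is straightforward: take two successive GPS fixes and divide the distance between them by the time elapsed. The function below is purely illustrative rather than the project's code (the app itself is JavaScript, and the phone's GPS can also report speed directly); it uses an equirectangular approximation, which is accurate enough over the short distances between fixes.

    #include <math.h>

    // Illustrative only: estimate speed in metres per second from two GPS fixes.
    float speedBetweenFixes(float lat1, float lon1, unsigned long t1Millis,
                            float lat2, float lon2, unsigned long t2Millis) {
      const float EARTH_RADIUS = 6371000.0;                // metres
      const float TO_RAD = M_PI / 180.0;
      float x = (lon2 - lon1) * TO_RAD * cos(((lat1 + lat2) / 2.0) * TO_RAD);
      float y = (lat2 - lat1) * TO_RAD;
      float metres = sqrt(x * x + y * y) * EARTH_RADIUS;
      float seconds = (t2Millis - t1Millis) / 1000.0;
      return seconds > 0 ? metres / seconds : 0;
    }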

Developing the app was slower than writing a web app due to the testing process. With each iteration of the geolocation code I would have to go outside to walk and run to see how it was performing. I do not recommend developing GPS software during the British winter.

Physical Machines

The physical tapping units are quite simple machines, each comprising an Ethernet Arduino and a servo. The servo percusses a stick against an object to make sound. The Arduino receives GPS information from the mobile app and triggers the servo based on the movement speed of the participant.
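
As a rough sketch of the triggering logic, an Arduino loop can map the participant's speed onto a tapping interval, so that faster movement means faster tapping. This is not the installation's actual firmware; the pin number, arm angles and speed thresholds below are guesses for illustration.

    #include <Servo.h>

    const int SERVO_PIN = 9;          // assumption: servo signal wired to pin 9
    const int REST_ANGLE = 60;        // arm lifted away from the object
    const int STRIKE_ANGLE = 90;      // arm striking the object

    Servo arm;
    float participantSpeed = 1.5;     // m/s; in the installation this comes from the GPS data
    unsigned long lastTap = 0;

    void setup() {
      arm.attach(SERVO_PIN);
      arm.write(REST_ANGLE);
    }

    void loop() {
      if (participantSpeed < 0.3) return;   // standing still: stay silent

      // Map walking-to-running speeds (roughly 0.5 to 4 m/s) onto a tap
      // every 1000 ms down to a tap every 150 ms.
      long speedCm = constrain((long)(participantSpeed * 100), 50, 400);
      unsigned long interval = map(speedCm, 50, 400, 1000, 150);

      if (millis() - lastTap >= interval) {
        lastTap = millis();
        arm.write(STRIKE_ANGLE);      // swing the stick against the pipe or beam
        delay(80);                    // brief contact
        arm.write(REST_ANGLE);        // lift off again so the object can ring
      }
    }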

The most time-consuming part of constructing the machines is making the casing, particularly hand-filing a square hole for each Ethernet socket. As Tom Armitage points out, things aren’t finished until they have a box.

Communications

The physical machines and the mobile app are bridged by a small Ruby web server written with Sinatra. The mobile devices pass GPS information to it, and it stores that information until the physical machines poll for it. An admin interface allows us to choose which phone is paired with which machine.
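
From the machine side, the polling could look something like the standalone sketch below. This is an illustration rather than the installation's actual protocol: the server address, the /speed/1 path and the plain-text response are all assumptions, and in the real firmware the fetched value would drive the tapping loop shown earlier.

    #include <SPI.h>
    #include <Ethernet.h>

    byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
    IPAddress server(192, 168, 1, 10);     // assumption: the gallery machine running the Sinatra app
    EthernetClient client;

    float participantSpeed = 0;            // latest speed for this machine's participant, in m/s

    // Assumes the web app answers GET /speed/1 with a bare number such as "1.8".
    float fetchSpeed() {
      float speed = 0;
      if (client.connect(server, 80)) {
        client.println("GET /speed/1 HTTP/1.0");
        client.println("Connection: close");
        client.println();
        if (client.find((char *)"\r\n\r\n")) {   // skip past the HTTP headers
          speed = client.parseFloat();           // read the body as a number
        }
        client.stop();
      }
      return speed;
    }

    void setup() {
      Ethernet.begin(mac);                 // assumption: DHCP on the gallery network
    }

    void loop() {
      participantSpeed = fetchSpeed();     // in practice this would feed the servo code above
      delay(2000);                         // poll every couple of seconds
    }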

Polyrhythmia has since been shown as part of the Differential Mobilities conference in Montreal and at Lancaster’s Peter Scott Gallery. To find out more, see Jen Southern’s final report on her residency.