Small machines perch in the structure of the building, tapping out rhythms on pipes, beams, floors and walls. These machines are controlled by an iPhone app: as people travel around the city, the speed of their movement brings the tapping machines to life, playing the infrastructure of the building. As they navigate the city, participants pay attention to their speed of travel and imagine their movement calling out a tapped rhythm in the gallery, its sonic voice contributing to this temporal symphony of the city.
Polyrhythmia is an installation by Jen Southern, created through a Sound and Music Embedded residency at the Pervasive Media Studio. Visitors see little machines tapping out sounds. Each machine is tied to the movement of a participant travelling around the city, carrying a mobile phone that turns their motion into sound.
The Polyrhythmia system is split into two main parts: a mobile app and a physical machine. The mobile app is built using the AppFurnace framework and runs on an iPhone. It turns GPS data into sound played on headphones for the person carrying it. It also sends the GPS data back to the gallery.
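At the heart of that conversion is knowing how fast the participant is moving. A minimal sketch of deriving speed from two successive GPS fixes using the haversine great-circle distance, written here in Ruby (the app itself ran under AppFurnace); the fix format and function names are my own, not the app's actual code:

```ruby
# Sketch: participant speed from two timestamped GPS fixes.
# The haversine formula gives the great-circle distance between
# two latitude/longitude points on a sphere.
EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2)
  to_rad = Math::PI / 180
  dlat = (lat2 - lat1) * to_rad
  dlon = (lon2 - lon1) * to_rad
  a = Math.sin(dlat / 2)**2 +
      Math.cos(lat1 * to_rad) * Math.cos(lat2 * to_rad) * Math.sin(dlon / 2)**2
  2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a))
end

# Speed in m/s between two fixes, e.g. { lat: 54.0, lon: -2.8, t: 0 }
# where :t is a timestamp in seconds.
def speed_m_s(a, b)
  dt = b[:t] - a[:t]
  return 0.0 if dt <= 0
  haversine_m(a[:lat], a[:lon], b[:lat], b[:lon]) / dt
end
```

Smoothing over several fixes would be needed in practice, given how noisy handset GPS proved to be.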
In the gallery the second part of the installation, the physical machines, receive the GPS data and turn it into tapping sounds by striking objects. These tapping machines use an ethernet Arduino to receive and process the data. A small Sinatra web app sits between the mobile app and the physical tapping machines to handle message passing.
To turn GPS data into sound we used a mobile phone instead of building our own hardware. The first version was built using PhoneGap for Android handsets, which let us iterate and test more quickly than building a native application would have.
It became apparent that this was not a good solution. The GPS on our Samsung handsets was inaccurate and slow to update location, and we could not find reliable reviews of GPS accuracy across Android handsets, or listings of which handsets used which GPS chipsets.
The audio playback capabilities were also problematic. We needed to play multiple audio tracks at the same time, but PhoneGap on Android imposed a low limit on how many tracks would play simultaneously. It is not possible to know how many tracks can be played at once until you try to exceed the limit, and the limit itself would fluctuate. When the OS was told to play an audio clip there could be a delay of up to a second before it played, and this latency was not consistent.
Developing the app was slower than writing a web app due to the testing process. With each iteration of the geolocation code I would have to go outside to walk and run to see how it was performing. I do not recommend developing GPS software during the British winter.
The physical tapping units are quite simple machines, comprising an ethernet Arduino and a servo. The servo percusses a stick against an object to make sound. The Arduino receives GPS information from the mobile app and triggers the servo based on the movement speed of the participant.
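The mapping from speed to tap rate can be sketched as: faster movement means a shorter interval between servo strikes, clamped so the servo has time to return after each tap. All constants here are hypothetical, and the sketch is in Ruby even though the real logic runs in the Arduino firmware:

```ruby
# Hypothetical speed-to-tap-interval mapping; the actual firmware
# constants and curve are not published.
MIN_INTERVAL_MS = 150   # fastest the servo can strike and recover
MAX_INTERVAL_MS = 2000  # slowest tap, for someone barely moving
MAX_SPEED_M_S   = 8.0   # roughly a sprint

# Linearly interpolate between the slowest and fastest tap,
# clamping speed to the [0, MAX_SPEED_M_S] range.
def tap_interval_ms(speed_m_s)
  t = [[speed_m_s / MAX_SPEED_M_S, 0.0].max, 1.0].min
  (MAX_INTERVAL_MS - t * (MAX_INTERVAL_MS - MIN_INTERVAL_MS)).round
end
```

A standing participant taps roughly every two seconds; a sprinting one drives the servo as fast as it can recover.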
The most time consuming part of constructing the machines is making the casing, particularly the hand filing of square holes for the ethernet socket. As Tom Armitage points out, things aren’t finished until they have a box.
The physical machines and the mobile app are bridged by a small Ruby web server written with Sinatra. Mobile devices pass GPS information to it, which it stores until the physical machines poll for it. An admin interface allows choosing which phone is paired with which machine.
Polyrhythmia has since been shown as part of the Differential Mobilities conference in Montreal and in Lancaster’s Peter Scott Gallery. To find out more see Jen Southern’s final report on her residency.