Aperture: Connecting USB sensors to Wifi

Aperture is a prototype box for bridging USB sensors to the internet over wifi. Its main focus is how to configure a wifi network on an Internet of Things object that has no keyboard or screen.


You plug power and a USB sensor into Aperture. Aperture reads information from the sensor and posts it to a server. In this case we plug in an RFID reader, and Aperture posts to the server each time an RFID tag is read. Little LEDs indicate whether Aperture is connected to a wifi network. When it is not connected, you can flip Aperture over to enter the name and password of the local wifi network.

There is no keyboard or screen for entering the wifi settings. Instead, Aperture contains a camera. The camera doesn’t take photos; it scans for a QR code containing wifi settings. This QR code can be generated by an app on a phone or computer that is already connected to the wifi network.

When Aperture sees a QR code with wifi settings, it connects to the network specified in the code. All user input and feedback happens on the phone or computer instead of on Aperture.
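The post doesn’t specify what payload format Aperture reads, but the de facto convention for wifi QR codes is ZXing’s `WIFI:T:<auth>;S:<ssid>;P:<password>;;` string, with backslash escapes for reserved characters. A minimal sketch of the device-side parsing, assuming that format (the function names are mine):

```python
def _split_unescaped(text, sep):
    """Split text on sep, honouring backslash escapes as it goes."""
    parts, current, escaped = [], [], False
    for ch in text:
        if escaped:
            current.append(ch)       # keep the escaped character literally
            escaped = False
        elif ch == "\\":
            escaped = True           # next character is escaped
        elif ch == sep:
            parts.append("".join(current))
            current = []
        else:
            current.append(ch)
    parts.append("".join(current))
    return parts

def parse_wifi_payload(payload):
    """Parse a ZXing-style WIFI: payload into a dict of its fields
    (T = auth type, S = SSID, P = password)."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a WIFI: payload")
    settings = {}
    for segment in _split_unescaped(payload[5:], ";"):
        if not segment:
            continue                 # the payload ends with ";;"
        key, _, value = segment.partition(":")
        settings[key] = value
    return settings

parse_wifi_payload("WIFI:T:WPA;S:home;P:hunter2;;")
# → {'T': 'WPA', 'S': 'home', 'P': 'hunter2'}
```

The escaping matters because SSIDs and passwords can legitimately contain `;`, `:` or `,`, which would otherwise break the field separators.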

Money No Object

For a previous project, Money No Object, we used a Raspberry Pi to connect a USB RFID reader to a server over wifi. Entering new wifi details into the Raspberry Pi was awkward: either the Pi had to be connected to a monitor and keyboard, or we had to SSH into it to modify config files.

Connecting to the Internet

This is a difficulty all internet-connected objects face: how to wirelessly connect to the internet. There is no single convention for this yet. The Kindle does it by having its own 3G connection. Peel and Little Printer bring their own ZigBee or similar network, where a second ‘bridge’ device plugs into an Ethernet socket on the home router. We chose to reuse the existing wifi network, but needed a way of entering wifi credentials.

A typical approach is to create a temporary ad hoc wifi network, as the Nabaztag does. The device is booted into a setup mode where, instead of connecting to wifi, it creates its own wifi network. The user connects to this network and enters the wifi credentials. The device is then rebooted, and if it fails to connect to the wifi the whole process is repeated. This is a slow and frustrating process.

For Aperture we instead copy the wifi settings from a phone or computer. The user’s phone or computer is already connected to the wifi: the details have already been entered, and the device has verified it can connect. All we need to do is copy the settings across somehow.

Computational Cameras

We transfer the wifi settings to Aperture as a QR code. The QR code can be generated by an application on a screen. It can also be provided as a sticker on the back of a consumer router or MiFi, where the settings are already printed as text. But why use a camera and QR code for this?
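On the app side, generating the code is straightforward. One common convention for encoding wifi settings in a QR code is ZXing’s `WIFI:T:<auth>;S:<ssid>;P:<password>;;` format; the post doesn’t say which format Aperture uses, but a settings app could build such a payload like this (function names are my own):

```python
def escape(value):
    """Backslash-escape the characters reserved by the WIFI: format.
    The backslash itself must be escaped first, or the later passes
    would double-escape it."""
    for ch in "\\;,:\"":
        value = value.replace(ch, "\\" + ch)
    return value

def wifi_payload(ssid, password, auth="WPA"):
    """Build a ZXing-convention WIFI: payload, ready to hand to any
    QR encoder."""
    return "WIFI:T:%s;S:%s;P:%s;;" % (auth, escape(ssid), escape(password))

wifi_payload("home", "hunter2")
# → 'WIFI:T:WPA;S:home;P:hunter2;;'
```

The resulting string can be rendered with any QR library, e.g. `qrcode.make(payload)` using the Python `qrcode` package.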

Cameras are becoming cheap thanks to being integrated into a few billion mobile phones. Cheap enough that we can start putting cameras into all sorts of products. This is similar to when LCD displays became low cost in the 80s and suddenly everything had a clock in it. Microwaves, ovens, VCRs and the like. Not because they needed a clock, but because the component was low cost.

My phone already has a camera on each side, for convenience. I keep finding cameras mounted in the control surfaces of vending machines; they’re usually not enabled in software and have no purpose. Cameras are affordable enough to add to products just in case they find a future use.

Not all integrated cameras will be used for taking photos. Some will be used for computer vision. Detecting the presence of a person. Recognising gestures. Analysing emotion. Or in the case of Aperture, decoding a QR code. Aperture is an experiment in using low cost cameras as a user input device.

I expect to see more devices using cameras as input devices. But not necessarily because they will make for a good user interface. Currently my oven and hob have a touchpanel to control them. It makes for a much clumsier interface than an analog knob. But it has fewer moving parts. Fewer things to break. An easier assembly process. It’s beneficial to the manufacturer rather than to the user.

A camera module is similar to a touchpanel or an LCD display: a single discrete module that’s easier to assemble and has fewer ways to break. The only thing holding back ovens controlled by hand gestures is the lack of low cost integrated computer vision chipsets, for now. As Dan Catt says, computer vision is in its 8-bit stage.

Machine Shall Speak Unto Machine

Aperture’s settings are presented as a 2D barcode. QR codes aren’t for people; they’re for machines to convey information to other machines. This is why they’re not well suited to use in advertising. Or, as a document of uncertain origin puts it:

“Perhaps what is most disturbing and exciting about the 2D barcode is that it is legible at an aesthetic level while remaining illegible at the level of “content.” It is a form of computerized writing that appears not have the layer-cake of computer “languages” (compiler, programming, graphic, etc.) attached, which do the work of translating the nanoscopic voltages of the circuitboard into image and gesture. The 2D barcode is a surface, seemingly analogous to the surface of the screen or printout, produced by the computer for the computer. It is language that passes seen but not understood.”

For Aperture, the QR code is used for machine-to-machine communication. We control both ends of the experience: the device reading the code and the way the code is presented in the app. We don’t depend on the user having a QR decoding app and knowing how to use it. All they do is hold their phone up to Aperture.

QR 2D Barcode Decoding on Raspberry Pi

The Aperture prototype uses a Raspberry Pi and its new camera board. We capture images on the camera and run them through the ZBar library for QR decoding. The camera on the Pi does not appear as a USB camera and has no V4L support; instead it’s accessed over the MMAL API. I found a very helpful tutorial on using the Raspberry Pi camera with OpenCV, which applied equally well to using the camera with ZBar.
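As a rough sketch (not Aperture’s actual code) of how one capture-and-decode step could look, using the `raspistill` tool, which drives the camera over MMAL, and the `pyzbar` Python binding to ZBar:

```python
import subprocess
import tempfile

def pick_wifi_payload(payloads):
    """Return the first WIFI: payload from a list of decoded strings,
    or None if no wifi settings code was in view."""
    for text in payloads:
        if text.startswith("WIFI:"):
            return text
    return None

def scan_once():
    """Capture one frame and look for a wifi settings QR code."""
    # Camera- and library-dependent imports kept local, so the pure
    # helper above stays importable on machines without a Pi camera.
    from PIL import Image
    from pyzbar.pyzbar import decode

    with tempfile.NamedTemporaryFile(suffix=".jpg") as frame:
        subprocess.check_call([
            "raspistill", "-n",        # no preview window
            "-t", "1",                 # minimal capture delay (ms)
            "-w", "640", "-h", "480",  # small frames decode faster
            "-o", frame.name,
        ])
        payloads = [sym.data.decode("utf-8")
                    for sym in decode(Image.open(frame.name))]
    return pick_wifi_payload(payloads)
```

A setup loop would call `scan_once()` repeatedly until it returns a payload, then hand the credentials to the wifi stack.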

Currently the Pi decodes one frame every 5 seconds or so. This is enough to prove the concept works but insufficient for a good user experience. It is largely down to my inefficient implementation rather than a problem with the Pi or the libraries. I’m pretty sure I could get it up to ~5fps with a few days’ work.

Trusting The Camera

The camera in Aperture does not store photos and is not remotely accessible. But how can the user trust that this is the case? On Aperture we have a ring of glowing LEDs (NeoPixels) that illuminate when the camera is active. This shows the user the camera is on, and has the extra benefit of illuminating the QR code.

I considered adding a mechanical lens cap to cover the camera that the user would have to open, similar to the yellow lens cover on the wearable OMG Autographer camera. But this would add moving parts. Removing mechanical parts is the reason for using a camera in the first place.

Instead the camera is placed on the underside of Aperture. This means it can’t see anything until the user flips Aperture over to configure it.

Field of View

It’s hard to know where to hold the QR code to successfully scan it. The user does not know where the field of view of the camera is. With a 1D barcode scanner in a shop there is a red laser beam line projected to show where the scanner is aimed. We do not have this with a camera.

Tom Taylor pointed out this problem on the InPost parcel collection machines:

“This is my eye”

Error Messages

A flaw in this wifi setup approach appears when something goes wrong: Aperture has no way of conveying error messages. How does the user know what is wrong? Is it offline because it has the wrong wifi password, or because the network is down?

We can convey error messages using patterns of LEDs that the user can decode in the setup app, but this isn’t an ideal approach.
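One hypothetical encoding, to illustrate the idea: each error condition maps to a short on/off blink pattern, and the setup app documents the legend so the user can look the pattern up. All the names here are my own.

```python
# Each condition maps to a repeating on/off frame sequence
# played back on the LED ring.
ERROR_PATTERNS = {
    "wrong_password": [1, 0, 1, 0, 0, 0],  # two short blinks, pause
    "network_down":   [1, 1, 1, 0, 0, 0],  # one long blink, pause
    "no_dhcp_lease":  [1, 0, 1, 0, 1, 0],  # three short blinks
}

def frames(error, repeats=3):
    """Expand an error code into the LED on/off frames to play back."""
    return ERROR_PATTERNS[error] * repeats
```

Even with a legend in the app, this pushes the decoding work onto the user, which is exactly the weakness of one-way communication discussed next.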

One Way Communication

The communication between the wifi setup app and Aperture over QR code is one-way. Aperture has no way of sending errors back to the app, and instead has to communicate them to the user directly.

A better approach might be Bluetooth Low Energy instead of a camera: it offers two-way communication, is low power, and is in most new phone handsets. The wifi settings would be passed over Bluetooth and any error messages returned the same way. The user could then be presented with more helpful error messages than a series of blinking LEDs. Once set up, Aperture would continue working over wifi with no need for the Bluetooth component.

A Bluetooth module would also eliminate the need for a lens hole in the Aperture casing.


Aperture explored setting up wifi on an Internet of Things product, and using a camera for user input. Computer vision was not the correct solution to our problem: while the camera module is cheap, the computation needed for computer vision is not yet low cost enough to use in a product like this. If I were to build a version two of Aperture, I would look at Bluetooth LE instead.