Creating movable tangible objects on interactive tables
Tangible objects placed on interactive surfaces allow users to employ a physical object to manipulate digital content. However, creating the reverse effect—having digital content manipulate a tangible object placed on the surface—is a more challenging task. We present a new approach to this problem, using ultrasound-based air pressure waves to move multiple tangible objects, independently, around an interactive surface. We describe the technical background, design, implementation, and test cases for such a system.
Mark T. Marshall, Tom Carter, Jason Alexander, Sriram Subramanian, Ultra-Tangibles: Creating Movable Tangible Objects on Interactive Tables. Proceedings of the 30th International Conference on Human Factors in Computing Systems (CHI 2012). May 2012. [PDF, 514 kB][ACM Digital Library]
- Be Moved By Ultrasound. WIRED Magazine UK. August 2012.
- Ultra-tangible technology manipulated with ultrasound levitation. WIRED.co.uk 13 July 2012.
There are three main components to the implementation of Ultra-Tangibles: the ultrasound transmitter array, pulse generation, and the control loop.
Ultrasound Transmitter Array
The Ultra-Tangibles system consists of a 7-inch display surrounded by 144 ultrasonic transducers arranged three high in a rectangle, as shown in Figure 1. Each long side is a 15×3 array of transducers; each short end is a 9×3 array.
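As a sanity check on the layout described above, the following sketch enumerates the four sub-arrays and confirms the transducer count. The spacing and coordinate scheme are assumptions for illustration; the text does not give physical dimensions.

```python
# Hypothetical sketch of the transducer layout: two 15x3 arrays on the
# long sides and two 9x3 arrays on the short ends, 144 transducers total.

def side_positions(cols, rows):
    """Grid of (column, row) indices for one side of the rectangle."""
    return [(c, r) for c in range(cols) for r in range(rows)]

layout = {
    "left":   side_positions(15, 3),   # long side
    "right":  side_positions(15, 3),   # long side
    "top":    side_positions(9, 3),    # short end
    "bottom": side_positions(9, 3),    # short end
}

total = sum(len(positions) for positions in layout.values())
print(total)  # 144
```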
Activating and deactivating the ultrasonic transducers, as well as calculating the phase differences between them, is carried out by four XMOS XS1-G4 processors. These are mounted on XC-1A development kits to provide easy access to their outputs. Each processor controls 36 transducers, and all four are connected via XMOS links for inter-processor communication.
The first processor is connected to a PC via USB, through which it receives requests to create a pulse at a particular location. Upon receiving such a request, the first processor determines which transducers need to be activated and calculates the phase differences between them. This information is then transmitted to the other processors, which trigger the required transducers and thus generate the desired pressure waves.
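The phase calculation can be sketched as standard phased-array focusing: each transducer's 40 kHz signal is offset so that all wavefronts arrive at the focal point in phase. The transducer positions, speed of sound, and phase convention below are assumptions for illustration, not the authors' actual computation.

```python
# Sketch of phased-array focusing: delay the nearer transducers so that
# every 40 kHz wavefront arrives at the focal point simultaneously.
import math

F = 40_000          # transducer drive frequency, Hz
C = 343.0           # assumed speed of sound in air, m/s
WAVELENGTH = C / F  # ~8.6 mm

def phase_offsets(transducers, focus):
    """Phase offset (radians) per transducer to focus at `focus`.

    transducers: list of (x, y, z) positions in metres
    focus: (x, y, z) focal point in metres
    """
    dists = [math.dist(t, focus) for t in transducers]
    d_max = max(dists)
    # Offset each transducer by the path-length difference to the focus,
    # expressed as a fraction of the wavelength.
    return [(2 * math.pi * (d_max - d) / WAVELENGTH) % (2 * math.pi)
            for d in dists]

# Example: three transducers in a line, focusing 5 cm in front of the
# centre one. The symmetric outer pair get equal offsets; the nearer
# centre transducer is delayed the most.
ts = [(-0.02, 0.0, 0.0), (0.0, 0.0, 0.0), (0.02, 0.0, 0.0)]
phases = phase_offsets(ts, (0.0, 0.05, 0.0))
```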
Each ultrasonic transducer is driven by an 8-bit microcontroller (an Atmel ATtiny26), which generates a 40 kHz, 5 V square wave signal when triggered by an XMOS XC-1A board. This signal is amplified to a 36 Vp-p square wave using a MOSFET driver amplifier and then output as an ultrasonic sound wave by the transducer. An overview of the system is illustrated in Figure 2.
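For a sense of the timing involved, a 40 kHz square wave means toggling the output pin every half period. The microcontroller clock speed below is an assumption (8 MHz, a common ATtiny26 internal-oscillator setting); the actual firmware configuration is not given here.

```python
# Back-of-the-envelope timing for the 40 kHz square wave drive signal.
CLOCK_HZ = 8_000_000   # assumed MCU clock, not stated in the text
SIGNAL_HZ = 40_000     # transducer drive frequency

half_period_us = 1_000_000 / (2 * SIGNAL_HZ)   # pin toggles every half period
ticks_per_toggle = CLOCK_HZ // (2 * SIGNAL_HZ)  # timer ticks between toggles

print(half_period_us, ticks_per_toggle)  # prints 12.5 100
```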
A PlayStation Eye camera operating at 100 FPS is mounted above the ultrasound hardware assembly. This is used to track the tangible objects. The video stream is fed into Community Core Vision, which extracts the positions and velocities of the objects and sends them to control software on the PC.
This control software calculates the direction and location of the pulses required to move each object to its designated target, or to slow it down if it is about to overshoot. Requests for these pulses are then sent via USB to the XMOS processors.
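The decision the control software makes each frame can be sketched as a simple controller: push the object towards its target, but push against its velocity when it is closing fast enough to overshoot. The gains, tolerance, and pulse representation below are assumptions for illustration, not the authors' actual controller.

```python
# Minimal sketch of per-frame control logic: given a tracked position and
# velocity, return the unit direction in which to push the object, or
# None if it has reached its target.
import math

def next_pulse(pos, vel, target, brake_gain=0.1, tolerance=0.005):
    """pos, vel, target: (x, y) in metres / metres per second.

    brake_gain acts as a lookahead time (s): if the closing speed times
    this lookahead exceeds the remaining distance, brake instead.
    """
    to_target = (target[0] - pos[0], target[1] - pos[1])
    dist = math.hypot(*to_target)
    if dist < tolerance:
        return None  # close enough; fire no pulse

    speed = math.hypot(*vel)
    closing = (vel[0] * to_target[0] + vel[1] * to_target[1]) / dist
    if speed > 0 and closing * brake_gain > dist:
        # Moving towards the target too fast: push against the velocity.
        return (-vel[0] / speed, -vel[1] / speed)
    # Otherwise push straight towards the target.
    return (to_target[0] / dist, to_target[1] / dist)

# A stationary object 10 cm from its target is pushed towards it;
# one closing quickly on a nearby target is braked instead.
push = next_pulse((0.0, 0.0), (0.0, 0.0), (0.1, 0.0))
brake = next_pulse((0.0, 0.0), (1.0, 0.0), (0.05, 0.0))
```

In the real system each returned direction would be translated into a pulse location on the opposite side of the object, so the pressure wave pushes it the intended way.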