Bristol Interaction Group
Bristol Interaction and Graphics is united by a common interest in creative interdisciplinarity. We act as a hub for collaboration between social scientists, artists, scientists and engineers to combine efficient and aesthetic design. We are particularly interested in areas which couple the design of devices with deployment and evaluation in public settings. Members of the group have expertise in research areas spanning human-computer interaction, visual and tactile perception, imaging, visualisation and computer-supported collaboration.

Recent News

Sonic tractor beam goes to Hollywood


Bristol Interaction Group member Asier Marzo demonstrated the world’s first sonic tractor beam to Hollywood actors Ben Stiller, Owen Wilson and Will Ferrell on the Spanish TV programme El Hormiguero.

Six papers accepted to CHI 2016


The Bristol Interaction Group will present six papers at ACM CHI 2016.

Flexible On-Body Coils presented at IEEE


Themis Omirou and Paul Worgan are presenting ‘Flexible On-Body Coils for Inductive Power Transfer to IoT Garments and Wearables’ at the IEEE World Forum on the Internet of Things in Milan.

Their paper demonstrates that designers of on-body inductive power transfer systems have the flexibility to customise their coils into aesthetic shapes, with performance in accordance with Faraday’s Law of Induction.
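The relationship the paper relies on can be sketched numerically. Assuming a sinusoidally varying magnetic flux through a coil of N turns (all numerical values below are illustrative, not taken from the paper), Faraday’s law gives the induced EMF:

```python
import math

def induced_emf(n_turns, flux_peak_wb, freq_hz, t):
    """Faraday's law of induction: emf = -N * dPhi/dt.

    For a sinusoidal flux Phi(t) = Phi_peak * sin(2*pi*f*t), the
    derivative gives emf(t) = -N * Phi_peak * 2*pi*f * cos(2*pi*f*t),
    so the peak voltage scales with turns, linked flux and frequency.
    """
    omega = 2 * math.pi * freq_hz  # angular frequency, rad/s
    return -n_turns * flux_peak_wb * omega * math.cos(omega * t)
```

Reshaping a coil changes the flux it links (Phi_peak here), which is why coil geometry can be customised for aesthetics while the delivered power still follows this law.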

First place as the People’s Choice in the Art of Science Competition


BIG member Asier Marzo won first place as the People’s Choice in the Art of Science Competition.

The Mandelbrot set contains the points C that satisfy the purely mathematical condition of not escaping to infinity when iterated as Zn+1 = Zn^2 + C.

In the picture, we present a modification of the set in which the orbits of each escaping point are drawn (the Buddhabrot). Different colours are assigned depending on the number of iterations applied before the orbit escapes to infinity.
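The iteration behind the image can be sketched in a few lines; the escape radius of 2 and the iteration cap are the standard choices (a Buddhabrot additionally records every point each escaping orbit visits):

```python
def escape_time(c, max_iter=100):
    """Iterate z_{n+1} = z_n^2 + c from z_0 = 0 and return the number of
    iterations before |z| exceeds 2 (a proven escape bound).

    Returns max_iter if the orbit stays bounded, i.e. c is (probably)
    in the Mandelbrot set."""
    z = 0
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return n  # escaped: this count picks the colour
    return max_iter
```

Colouring each point by its escape count is what produces the familiar banded halo around the set; c can be any Python complex number, e.g. `escape_time(-0.75 + 0.1j)`.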

Engineers and physicists use mathematics as the language to describe reality, yet its foundations (the ZFC axioms) are thought to be independent of our existence.

“God used beautiful mathematics in creating the world.” Paul Dirac


Congratulations to Tom Carter from Ultrahaptics for winning Rising Star New Engineer of the Year at the 2015 Elektra Awards.

Acoustic Holograms that Levitate Particles


Researcher Asier Marzo, affiliated with BIG, is the lead author of the recent paper published in Nature Communications, “Holographic Acoustic Elements for Manipulation of Levitated Particles”. The research is a collaboration between Bristol University (BIG and uNDT groups), Sussex University (Interact Lab), Ultrahaptics and the Public University of Navarre (TAIPECO group).

In the paper, a method to create acoustic holograms with a phased array of ultrasonic transducers is presented. These holograms are three-dimensional acoustic fields that can be emitted even from a flat surface. Unlike conventional light holograms, acoustic holograms cannot be seen, but they exert considerable forces on physical objects and can pass through water and human tissue. This enables the creation of tractor beams, tangible displays of levitated pixels, or the manipulation of particles inside the human body.

Three holograms were found to be optimum for levitation. The first is an acoustic field that resembles a pair of fingers that pinch the particle. The second is an acoustic tornado that drags the objects to its eye. And the third could be described as a high-intensity cage that surrounds the objects from all directions.

An ultrasonic phased array is composed of several small loudspeakers called transducers. Each transducer plays a sinusoidal wave of the same frequency and amplitude but with a slightly different offset (phase delay). The waves are emitted from a two-dimensional surface, yet their interference pattern creates a three-dimensional shape above it.

A canon is a musical composition in which the same melody is played by several instruments but starting at different times. The composition is carefully engineered to create beautiful harmonies at every instant that result from the combination of the same melody played at different points. Similarly, our computer algorithm calculates the phase-delays for each transducer so that the listener, the particle in our case, gets surrounded by the desired acoustic levels.
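The phase-delay calculation described above can be sketched as follows. This simple in-phase focusing is only an illustration, not the authors’ actual optimisation algorithm (which also shapes the twin traps, tornadoes and cages), and the 40 kHz frequency is an assumption typical of such arrays:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQUENCY = 40_000.0    # Hz; a common choice for ultrasonic arrays

def focusing_phases(transducer_positions, focal_point):
    """Phase delay (radians) per transducer so that every emitted wave
    arrives at focal_point in phase: phi_i = -k * d_i (mod 2*pi),
    where k is the wavenumber and d_i the transducer-to-focus distance."""
    k = 2 * math.pi * FREQUENCY / SPEED_OF_SOUND  # wavenumber, rad/m
    return [(-k * math.dist(p, focal_point)) % (2 * math.pi)
            for p in transducer_positions]
```

Transducers equidistant from the focus get identical phases, and moving the focal point only requires recomputing the delays, which is what lets a flat array steer a levitated particle in three dimensions.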


The authors of the paper are Asier Marzo, Sue Ann Seah, Bruce W. Drinkwater, Deepak Ranjan Sahoo, Benjamin Long and Sriram Subramanian.

GHOST project at the European commission ICT 2015 conference

GHOST member and BIG researcher Themis Omirou was part of the team presenting the GHOST project at ICT 2015. The team also included four other members: Faisal Taher (University of Lancaster), Josje Wijnen (Eindhoven University of Technology), Brandon Yeup Hur (Eindhoven University of Technology) and John Tiab (University of Copenhagen).

The project was listed as one of the top 10 must-see projects of the 150 at the exhibition.


First prize for “Yo” in the SPHERE Dress-Sense competition


BIG researcher Themis Omirou was recently part of a team that won first prize in the SPHERE Dress-Sense competition. The winning project was called “Yo – a system to support and enable users to self-manage symptoms of mental illness and modify behaviours, during and beyond a course of Cognitive Behavioural Therapy”. Yo devices promote user awareness to break negative thought cycles and behaviour patterns, resulting in a positive change of mood. The concept storyboard illustrated how the Yo devices encourage self-reflection, human interaction and incremental changes to the user’s daily activities.

Yo comprises the Yo-band and the Yo-bot. The Yo-band continuously collects data about a person’s daily activity, recording when they are still and when they are moving. The wearer can also tap the band when they experience a negative thought. The number of taps and the activity levels are relayed via Bluetooth to the Yo-bot when in range. The data is then combined and reflected in the well-being of the Yo-bot via a graphic image of a sunrise. In effect, the Yo-bot mirrors the well-being of the individual and, depending on their state, will then suggest activities for them.

The team was composed of seven members: Themis Omirou (BIG), Annie Lywood (UWE), Antonis Vafeas (UOB), Egho Ireo (Bath), Michal Kozlowski (UOB), Kimberly Higgins and Olivia Tiley (Red Maids’ School). The £5,000 award was presented by the Mayor of Bristol, George Ferguson, at a ceremony in Watershed.



3D Haptic Shapes at SIGGRAPH Asia

Dr Benjamin Long today presented a paper at SIGGRAPH Asia on creating 3D haptic shapes that can be felt in mid-air. The paper will be published in ACM Transactions on Graphics.
The method uses ultrasound focussed onto the hands above the device, where it can be felt. By focussing complex patterns of ultrasound, the air disturbances can be perceived as floating 3D shapes. The ultrasound patterns have also been demonstrated visually by directing the device at a thin layer of oil, so that the depressions in the surface appear as spots when lit by a lamp.
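One detail worth noting: an ultrasound carrier of around 40 kHz is far too fast for skin mechanoreceptors to register on its own, so mid-air haptic systems modulate the focal point’s amplitude at a low frequency that the skin can feel. A minimal sketch of this amplitude modulation (the 40 kHz and 200 Hz values are typical for this class of device, not taken from the paper):

```python
import math

def modulated_signal(t, carrier_hz=40_000.0, envelope_hz=200.0):
    """Amplitude-modulated ultrasound sample at time t (seconds).

    The carrier does the focusing in space; the low-frequency envelope
    is what the skin actually perceives as a vibration at the focus."""
    envelope = 0.5 * (1.0 + math.sin(2 * math.pi * envelope_hz * t))  # 0..1
    return envelope * math.sin(2 * math.pi * carrier_hz * t)
```

Sweeping the focal point along a surface while modulating it in this way is one route to making a whole 3D shape traceable by the hand.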
Dr Ben Long, Research Assistant from the Bristol Interaction and Graphics (BIG) group in the Department of Computer Science, said: “Touchable holograms, immersive virtual reality that you can feel, and complex touchable controls in free space are all possible ways of using this system.
“In the future, people could feel holograms of objects that would not otherwise be touchable, such as feeling the differences between materials in a CT scan or understanding the shapes of artefacts in a museum.”


VideoHandles receives honorable mention at SUI 2014


Jarrod Knibbe, Sue Ann Seah and Mike Fraser presented their work on VideoHandles at SUI 2014 and received an honorable mention for best short paper.

VideoHandles is a novel interaction technique for searching through action-camera (e.g. GoPro) video collections. Action-cameras are designed to be mounted, switched on and then ignored as they record the entirety of the wearer’s chosen activity. This results in a large amount of footage that may only include a small number of interesting moments. When reviewing the footage later, these interesting moments are hard to locate.

VideoHandles presents a novel solution to this problem, enabling the wearer to search through the footage by replaying actions they performed during the initial capture. For example, a diver goes for a long dive and communicates with their buddy throughout using a series of hand gestures. On one occasion, the diver sees a puffer fish and gestures ‘puffer fish’ to their buddy (a fish swimming motion, followed by a mimicked inflation) so that they can see it too. Later, using VideoHandles, the diver repeats the puffer fish gesture in front of the camera in order to locate that exact moment in their footage. Alongside returning the puffer fish footage, VideoHandles also returns all other moments including the ‘fish swimming’ gesture – a key component of all fish gestures in scuba diving.

The VideoHandles work explored action-camera use across a large collection of footage and presented a number of interaction styles and usage scenarios, both enthusiast and professional, including biking, windsurfing and archaeological excavation.