Bristol Interaction Group



Floating Charts


Free Form Display



Bristol Interaction and Graphics is united by a common interest in creative interdisciplinarity. We act as a hub for collaboration between social scientists, artists, scientists and engineers to combine efficient and aesthetic design. We are particularly interested in areas which couple the design of devices with deployment and evaluation in public settings. Members of the group have expertise in research areas spanning human-computer interaction, visual and tactile perception, imaging, visualisation and computer-supported collaboration.

Recent News


Congratulations to Tom Carter from Ultrahaptics for winning the Rising Star New Engineer of the Year award at the 2015 Elektra Awards.

Acoustic Holograms that Levitate Particles


Researcher Asier Marzo, affiliated with BIG, is the lead author of the recent paper published in Nature Communications, “Holographic Acoustic Elements for Manipulation of Levitated Particles”. The research is a collaboration between the University of Bristol (BIG and uNDT groups), the University of Sussex (Interact Lab), Ultrahaptics and the Public University of Navarre (TAIPECO group).

The paper presents a method for creating acoustic holograms with a phased array of ultrasonic transducers. These holograms are three-dimensional acoustic fields that can be emitted even from a flat surface. Unlike conventional optical holograms, acoustic holograms cannot be seen, but they exert considerable forces on physical objects and can pass through water and human tissue. This enables the creation of tractor beams, tangible displays of levitated pixels, and the manipulation of particles inside the human body.

Three holograms were found to be optimal for levitation. The first is an acoustic field that resembles a pair of fingers pinching the particle. The second is an acoustic tornado that drags objects to its eye. The third could be described as a high-intensity cage that surrounds objects from all directions.

An ultrasonic phased array is composed of several small loudspeakers called transducers. Each transducer emits a sinusoidal wave of the same frequency and amplitude but with a slightly different offset (phase delay). The waves are emitted from a two-dimensional surface, yet their interference pattern creates a three-dimensional shape above it.

A canon is a musical composition in which the same melody is played by several instruments starting at different times. The composition is carefully engineered so that the combination of the staggered melodies creates beautiful harmonies at every instant. Similarly, our computer algorithm calculates the phase delays for each transducer so that the listener (the particle, in our case) is surrounded by the desired acoustic field.
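The simplest instance of this phase-delay idea can be sketched in a few lines of Python. This is an illustrative toy, not the authors' optimisation algorithm: it only computes the delays that focus all waves at a single point, and the array geometry, frequency and focal point below are made-up example values.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # Hz, a common frequency for ultrasonic transducers
WAVELENGTH = SPEED_OF_SOUND / FREQ
K = 2 * math.pi / WAVELENGTH  # wavenumber

def focus_phases(transducers, focal_point):
    """Phase delay per transducer so that every wave arrives at the
    focal point in phase (like instruments entering a canon on cue)."""
    phases = []
    for pos in transducers:
        d = math.dist(pos, focal_point)  # path length to the focus
        # A wave travelling distance d accumulates phase K*d; emitting with
        # the negative of that phase makes it arrive at phase zero.
        phases.append((-K * d) % (2 * math.pi))
    return phases

# Example: an 8x8 grid of transducers spaced 1 cm apart on the z=0 plane,
# focusing 10 cm above the centre of the array.
grid = [(i * 0.01, j * 0.01, 0.0) for i in range(8) for j in range(8)]
phases = focus_phases(grid, (0.035, 0.035, 0.10))
```

Transducers equidistant from the focus get identical delays; the levitation holograms in the paper are built from richer phase patterns than this single focus.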

Video 1
Video 2
Video 3

The authors of the paper are Asier Marzo, Sue Ann Seah, Bruce W. Drinkwater, Deepak Ranjan Sahoo, Benjamin Long and Sriram Subramanian.

GHOST project at the European commission ICT 2015 conference

GHOST member and BIG researcher Themis Omirou was part of the team presenting the GHOST project at ICT 2015. The team included four other members: Faisal Taher (Lancaster University), Josje Wijnen (Eindhoven University of Technology), Brandon Yeup Hur (Eindhoven University of Technology) and John Tiab (University of Copenhagen).

The project was listed as one of the top 10 must-see projects at the exhibition, out of 150 projects.

ICT 2015




1st Prize award for “Yo” in the SPHERE Dress-Sense competition

Team Members

BIG researcher Themis Omirou recently participated in a team that won first prize in the SPHERE Dress-Sense competition. The winning project was “Yo – a system to support and enable users to self-manage symptoms of mental illness and modify behaviours, during and beyond a course of Cognitive Behavioural Therapy”. Yo devices promote user awareness to help break negative thought cycles and behaviour patterns, resulting in a positive change of mood. The concept storyboard illustrated how the Yo devices encourage self-reflection, human interaction and incremental changes to the user’s daily activities.

Yo comprises the Yo-band and the Yo-bot. The Yo-band continuously collects data about a person’s daily activity, recording when they are still and when they are moving. The wearer can also tap the band when they experience a negative thought. The number of taps and the activity levels are relayed via Bluetooth to the Yo-bot when in range. The data is then combined and reflected in the well-being of the Yo-bot via a graphic image of a sunrise. In effect, the Yo-bot mirrors the well-being of the individual and, depending on their state, suggests activities for them.
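As a rough illustration of how the two data streams might be combined into a single “sunrise” level (a toy sketch only; the weighting, scaling and function name below are guesses, not the real Yo system):

```python
def wellbeing_score(activity_minutes, negative_taps, max_taps=20):
    """Combine Yo-band activity and negative-thought taps into a 0..1 level.
    The 60/40 weighting and the caps are illustrative assumptions."""
    activity = min(activity_minutes / 60.0, 1.0)      # cap at one active hour
    negativity = min(negative_taps / max_taps, 1.0)   # cap the tap count
    return round(0.6 * activity + 0.4 * (1.0 - negativity), 2)

print(wellbeing_score(45, 2))   # 45 active minutes, 2 negative taps -> 0.81
```

A higher score would render a brighter sunrise on the Yo-bot; a run of taps with little movement would dim it and trigger an activity suggestion.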

The team comprised seven members: Themis Omirou (BIG), Annie Lywood (UWE), Antonis Vafeas (UOB), Egho Ireo (Bath), Michal Kozlowski (UOB), Kimberly Higgins and Olivia Tiley (Red Maids’ School). The £5,000 award was presented by the Mayor of Bristol, George Ferguson, in a ceremony at the Watershed.



3D Haptic Shapes at SIGGRAPH Asia

Dr Benjamin Long today presented a paper at SIGGRAPH Asia on creating 3D haptic shapes that can be felt in mid-air. The paper will be published in ACM Transactions on Graphics.
The method uses ultrasound focussed onto the hands above the device, where it can be felt. By focussing complex patterns of ultrasound, the air disturbances can be felt as floating 3D shapes. Visually, the ultrasound patterns have been demonstrated by directing the device at a thin layer of oil, so that the depressions in the surface can be seen as spots when lit by a lamp.
Dr Ben Long, Research Assistant from the Bristol Interaction and Graphics (BIG) group in the Department of Computer Science, said: “Touchable holograms, immersive virtual reality that you can feel and complex touchable controls in free space, are all possible ways of using this system.
“In the future, people could feel holograms of objects that would not otherwise be touchable, such as feeling the differences between materials in a CT scan or understanding the shapes of artefacts in a museum.”

YouTube video

Link to paper

VideoHandles receives honorable mention at SUI 2014

YouTube: VideoHandles

Jarrod Knibbe, Sue Ann Seah and Mike Fraser presented their work on VideoHandles at SUI 2014 and received an honorable mention for best short paper.

VideoHandles is a novel interaction technique for searching through action-camera (e.g. GoPro) video collections. Action-cameras are designed to be mounted, switched on and then ignored as they record the entirety of the wearer’s chosen activity. This results in a large amount of footage that may only include a small number of interesting moments. When reviewing the footage later, these interesting moments are hard to locate.

VideoHandles presents a novel solution to this problem, enabling the wearer to search through the footage by replaying actions they performed during the initial capture. For example, a diver goes for a long dive and communicates with their buddy throughout using a series of hand gestures. On one occasion, the diver sees a puffer fish and gestures ‘puffer fish’ to their buddy (a fish swimming motion, followed by a mimicked inflation) so that they can see it too. Later, using VideoHandles, the diver repeats the puffer fish gesture in front of the camera in order to locate that exact moment in their footage. Alongside returning the puffer fish footage, VideoHandles also returns all other moments including the ‘fish swimming’ gesture – a key component of all fish gestures in scuba diving.
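The paper does not spell out its matching pipeline here, but the core idea of query-by-repetition can be illustrated with a toy: represent both the footage and the re-performed gesture as one-dimensional motion signals, then slide the query over the footage and keep the best-matching window. The signal values, function name and sum-of-squared-differences cost below are all illustrative assumptions.

```python
def best_match(footage, query):
    """Return the start index in `footage` whose window best matches `query`
    (smallest sum of squared differences). Toy stand-in for gesture matching."""
    n, m = len(footage), len(query)
    best_i, best_cost = 0, float("inf")
    for i in range(n - m + 1):
        cost = sum((footage[i + j] - query[j]) ** 2 for j in range(m))
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

# Motion-energy signal of the footage (hypothetical values); the "gesture"
# appears as the bump starting at index 5.
footage = [0, 0, 1, 0, 0, 3, 7, 9, 7, 3, 0, 0, 1, 0]
query = [3, 7, 9, 7, 3]  # the wearer re-performs the gesture for the camera
print(best_match(footage, query))  # → 5
```

A real system would need a richer feature representation and time-warping tolerance (the diver will not repeat the gesture at exactly the original speed), which is why returning every window containing the shared ‘fish swimming’ component, as VideoHandles does, is useful.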

The VideoHandles work explored action-camera use across a large collection of footage and presented a number of interaction styles and usage scenarios, both enthusiast and professional, including biking, windsurfing and archaeological excavation.

Through the combining glass

YouTube: Through the combining glass

Diego Martinez Plasencia, Florent Berthaut and Sriram Subramanian will present their paper on how semi-transparent mirrors blend together the spaces in front of and behind them. The paper investigates this effect and highlights a whole new range of interactive experiences it enables.

In a museum, visitors in front of a cabinet would see the reflection of their fingers inside the cabinet, overlapping the exact same point behind the glass. By pointing directly at an exhibit with their reflection, instead of pointing at it through the glass, people could easily discuss its features with other visitors. Pop-up windows can also show additional information about the pieces being touched.
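The geometry behind this overlap is plain mirror optics: a point appears reflected as far behind the glass as it is in front of it. A minimal sketch (assuming a flat mirror lying in a plane of constant z; the coordinates below are made up, and a real installation would need calibration for the viewer's eye position):

```python
def reflect(point, mirror_z=0.0):
    """Mirror image of a point across a semi-transparent mirror in the
    plane z = mirror_z: equal distance behind the glass as in front."""
    x, y, z = point
    return (x, y, 2 * mirror_z - z)

fingertip = (0.10, 0.25, 0.05)  # 5 cm in front of the glass
print(reflect(fingertip))       # appears 5 cm behind the glass
```

Because the reflected fingertip and the exhibit occupy the same 3D position for every viewer, pointing with the reflection is unambiguous in a way that pointing through the glass is not.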

Combining this approach with different display technologies offers interesting possibilities for interaction systems. By placing a projector on top of the cabinet, fingertips could work as little lamps to illuminate and explore dark and sensitive objects. When a hand’s reflection cuts through an object, projections on visitors’ hands could be used to reveal the inside of the object, visible to any user.

We also demonstrated artistic installations that combine this approach with volumetric displays. Musicians record loops in their digital mixers, and these loops appear to float above the mixer. Musicians can then grab these representations to play them or tweak them with different musical effects.


Green Hackathon Win at ICT4S: Hacking for Sustainability

Daniel Schien and Christopher Weeks were part of a group awarded joint first place at the recent Green Hackathon held as part of the ICT4S (ICT for Sustainability) conference in Stockholm.  Spending a day underground at a dismantled nuclear reactor at KTH, the teams competed to develop a project around the theme of “food”.  Britons throw away the equivalent of 6 meals a week, leading to over 7.2 million tonnes of household food waste a year.

The winning project was “Eat Exchange”, an app to allow people to share their not-quite-past-date food with others.  Just about to go on holiday but have a nearly full container of milk in the fridge?  Or maybe you stocked up on a 2-for-1 offer last week, but it’s now about to go out of date?  The app allows you to offer the item to a network of trusted friends, family and neighbours, and to get text notifications in return when something is being offered.

Although currently in the design phase, watch this space – perhaps a fully functioning prototype will make its way to ICT4S 2015!

Paper accepted for Frontiers of Human Neuroscience

Congratulations to Hannah Limerick whose first paper as a PhD student has been accepted for publication in the journal Frontiers of Human Neuroscience:

  • Limerick, H., Coyle, D. & Moore, J.W. (2014). The Experience of Agency in Human-Computer Interactions: A Review. Front. Hum. Neurosci. 8:643. doi: 10.3389/fnhum.2014.00643.

Hannah has also given presentations on agency in speech interfaces at two international conferences on cognitive science: ASSC 18 and ICON 2014.

University of Bristol Wins The Best Artwork Award in ISMB 2014

ISMB 2014 has just announced that this year’s Best Artwork Award goes to ‘supraHex’ by Dr Hai Fang and Prof. Julian Gough from the Department of Computer Science, University of Bristol. Full details of the winner are available here.

Intelligent Systems for Molecular Biology (ISMB) is the world’s largest bioinformatics/computational biology conference. ISMB 2014 was held in Boston, attracting top computational biology researchers from around the world. As part of this annual conference, the Art & Science Exhibition displays images and videos (called ‘artworks’) that result from creative efforts involving scientific concepts or tools. The exhibition aims to open our eyes and minds, both scientifically and aesthetically.

ISMB 2014

Based on real-world genome-wide expression data, the artwork ‘supraHex’ was produced automatically by an open-source R/Bioconductor package of the same name. The artwork is inspired by the prevalence of hexagonal patterns in natural objects such as honeycombs or the Giant’s Causeway, and also captures the mechanistic nature of these objects: their formation, probably in a self-organising manner.

Apart from the artwork itself, the package can do more, outlined as follows: i) training of the supra-hexagonal map via a self-organising learning algorithm; ii) visualisations at and across nodes of the map; iii) partitioning of the map into gene meta-clusters; iv) sample correlation on a 2D sample landscape; and v) overlaying additional data onto the trained map for exploring relationships between input and additional data. It is freely available at