Bristol Interaction and Graphics



Bristol Interaction and Graphics is united by a common interest in creative interdisciplinarity. We act as a hub for collaboration between social scientists, artists, scientists and engineers to combine efficient and aesthetic design. We are particularly interested in areas which couple the design of devices with deployment and evaluation in public settings. Members of the group have expertise in research areas spanning human-computer interaction, visual and tactile perception, imaging, visualisation and computer-supported collaboration.

Recent News

Peter Bennett gives a talk at the Royal College of Art

Peter Bennett from the Bristol Interaction and Graphics Lab gives a talk as part of the Royal College of Art School of Design’s contribution to the RCA’s Research Methods Course: a symposium dedicated to Time and Design.



Máire Geoghegan-Quinn visited and experienced Ultrahaptics


Máire Geoghegan-Quinn, European Commissioner for Research, Innovation and Science, visited today and experienced our Ultrahaptics demo.

She said “It’s an incredibly exciting technology, I think it has enormous potential and I’m so pleased that it was developed by a person who looks about 14!”

7 Papers accepted to CHI 2014


The Bristol Interaction and Graphics group will present seven papers and notes at ACM CHI 2014. The papers range in topic from collaborative brain-computer interfaces to shape-changing interfaces, covering our senses of vision, touch, taste and smell.

The following are our seven papers and notes:

Different ways people relate to leaderboards

Our CSCW 2014 work on the different ways people relate to leaderboards is featured as today’s Follow the Crowd blog article. Some people relate to leaderboards competitively, and can find them motivating or demotivating depending on their performance. Interestingly, though, some use a leaderboard not as a means of competition but as a way of judging whether they are making a ‘typical’ contribution: they use it in a social-normalising way.

BIG researcher in mock Mars mission


BIG researcher Sue Ann Seah recently participated in a two-week Mars simulation expedition as part of MarsCrew 134 at the Mars Desert Research Station in the Utah desert. Working with Anne Roudaut, Sriram Subramanian and Marianna Obrist (University of Sussex), the project aims to study the limitations that astronauts currently face when interacting through the spacesuit due to a lack of sensory feedback.

Magicians in residence on BBC Click

Following on from the BIG Lab projects MisTable and Ultrahaptics in last month’s BBC Click programme, our collaboration with the Pervasive Media Studio has featured in the BBC Click Christmas episode. See Jarrod and Pete helping Kieron and Stuart create some magical performances on the BBC News channel, or at any time on iPlayer if you are in the UK.


Christmas in the BIG lab


Ho ho ho! It’s Christmas in the BIG lab, and everyone was here to take part in the 2013 edition of the Secret Santa! “Sri Claus” came down the chimney to bring each of us a hamper bag full of delicious surprises, and we all finished up around an improvised buffet.

SensaBubble brings Christmas to the Watershed

The BIG Lab took SensaBubble to the Watershed on Friday evening, 6th December 2013, for a bit of festive fun. SensaBubble is an ambient notification system that creates scented, smoke-filled bubbles, tracks them and projects onto them. During the evening, SensaBubble produced bubbles filled with the scent of Christmas pudding and projected festive colours onto them, to the delight of the guests at the Watershed.

Honourable mention at ACE 2013 for a paper on dynamic photo management


Chi Thanh Vi presented and demoed our work at ACE 2013 in the Netherlands and received an Honourable Mention Paper Award. The work was carried out at Tohoku University in Japan, where Chi spent six months as an intern, in collaboration with the University of Bristol, OLM Digital Inc., and Osaka University.

Abstract: D-FLIP is a novel algorithm that dynamically displays a set of digital photos using different principles for organizing them. A variety of requirements for photo arrangements can be flexibly replaced or added through interaction, and the results are continuously and dynamically displayed. D-FLIP uses an approach based on combinatorial optimization and emergent computation, where geometric parameters such as location, size and photo angle are treated as functions of time, dynamically determined by local relationships among adjacent photos at every time instance. As a consequence, the global layout of all the photos varies automatically. We first present examples of photograph behaviors that demonstrate the algorithm, and then investigate users’ task engagement using EEG in the context of story preparation and telling. The results show that D-FLIP requires less task engagement and mental effort to support storytelling.

Authors: Chi Thanh Vi, Kazuki Takashima, Hitomi Yokoyama, Gengdai Liu, Yuichi Itoh, Sriram Subramanian, Yoshifumi Kitamura
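The core idea in the abstract — treating each photo’s geometric parameters as functions of time, updated purely from local relationships with adjacent photos — can be illustrated with a minimal sketch. This is not the actual D-FLIP implementation: the `update_layout` function, the force constants, and the choice of centre attraction plus overlap repulsion are all illustrative assumptions standing in for the paper’s principles.

```python
import math
import random

def update_layout(positions, sizes, dt=0.1, min_gap=1.0):
    """One timestep of a local, force-based photo layout.

    Each photo's position is a function of time, driven only by
    relationships with nearby photos: overlapping photos push each
    other apart, and every photo drifts gently toward the origin so
    the global layout stays compact — the global arrangement emerges
    from these local rules rather than being computed directly.
    """
    forces = [[0.0, 0.0] for _ in positions]
    for i, (xi, yi) in enumerate(positions):
        # Weak attraction toward the centre of the canvas.
        forces[i][0] -= 0.1 * xi
        forces[i][1] -= 0.1 * yi
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            dx, dy = xi - xj, yi - yj
            dist = math.hypot(dx, dy) or 1e-6
            # Photos closer than the sum of their radii (plus a
            # margin) repel, with force growing as the overlap grows.
            target = sizes[i] + sizes[j] + min_gap
            if dist < target:
                push = (target - dist) / dist
                forces[i][0] += push * dx
                forces[i][1] += push * dy
    # First-order update: positions follow the net local forces.
    return [(x + dt * fx, y + dt * fy)
            for (x, y), (fx, fy) in zip(positions, forces)]

# Five unit-radius "photos" dropped at random, then evolved over time.
random.seed(0)
positions = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(5)]
sizes = [1.0] * 5
for _ in range(200):
    positions = update_layout(positions, sizes)
```

Because each step depends only on adjacent photos, a requirement can be swapped mid-interaction (e.g. changing the repulsion rule) and the layout continuously re-converges rather than being recomputed from scratch.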

Shidoheddo wins social robotics competition

Shidoheddo, “the sociable robot with a plant for a brain”, has won first place in the International Conference on Social Robotics design competition. The conceptual design for the robot, a cross between a bonsai and a Tamagotchi, was presented as a magazine advert (shown above) and created by illustrator Louise Cunningham and BIG researcher Peter Bennett.
