BIG

Bristol Interaction Group

Recent Seminars


Designing Interactive Molecular Simulation Platforms

Molecular simulation is amongst the fastest growing areas in the chemical sciences, a trend highlighted by the award of the 2013 Nobel Prize in Chemistry to a group of scientists whom the Nobel committee described as “taking the chemical experiment into cyberspace.” The dream is that one day we may be able to use computational design to engineer molecular materials in much the same way that we now use computational frameworks to design buildings and bridges. As attractive as this goal might seem, the path to get there is fraught with difficulties. One of the biggest hurdles arises from the fact that molecular systems are inherently hyper-dimensional: even a relatively small biomolecule of 1000 atoms has 3000 degrees of freedom! A great deal of work has been devoted to designing algorithms that efficiently search and characterise the most promising regions of molecular design space. One particularly interesting strategy involves the development of interfaces that allow humans to use their molecular ‘design intuition’ to guide automated search algorithms. Two recent examples - FOLDIT [1] and Eterna [2] - have highlighted the ability of such approaches to significantly accelerate hyper-dimensional molecular search tasks compared with algorithms that do not allow for human interaction.


In this talk, I will describe recent work in my group aimed at designing interactive molecular simulation platforms for accelerating molecular research, and for scientific education and artistic applications [3,4]. I will highlight a range of approaches that we have been developing over the past few years: (1) algorithms and hardware we designed to carry out interactive molecular dynamics using an array of consumer depth sensors; (2) more recent work constructing cross-platform interactive molecular simulation frameworks that run on tablets; and (3) recent experiments with the Oculus Rift and HTC Vive aimed at building an interactive molecular simulation environment in virtual reality. The interfaces that I will describe work by interpreting the human form (or parts of it) as an energy ‘avatar’, which can then be rigorously incorporated into the physical equations of motion. GPU acceleration has been key to achieving a relatively fluid interactive experience. Preliminary tests run in a chemistry/physics education context show that these tools allow non-expert users to accelerate simple molecular search tasks by 3–4 orders of magnitude compared to brute-force ‘blind-search’ algorithms. Time allowing, I hope to demo some of the things we have been working on at the end of the talk.
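To give a flavour of how an energy ‘avatar’ can be folded into the equations of motion, the sketch below is a simplified illustration (not the group's actual implementation; all function names and parameter values are assumptions): each point tracked by a depth sensor is treated as the centre of a Gaussian potential, the resulting forces are added to the ordinary molecular mechanics forces, and the system is advanced with one velocity Verlet step. In practice, terms of this kind would be evaluated per atom on the GPU.

    import numpy as np

    def avatar_forces(atom_positions, avatar_points, amplitude=1.0, sigma=0.5):
        """Force on each atom from Gaussian 'avatar' potentials centred on
        tracked points (e.g. hands seen by a depth sensor).

        Each centre c contributes V(r) = amplitude * exp(-|r - c|^2 / (2 sigma^2)),
        so a positive amplitude acts as a soft, repulsive probe.
        """
        forces = np.zeros_like(atom_positions)
        for c in avatar_points:
            diff = atom_positions - c                    # shape (N, 3)
            r2 = np.sum(diff**2, axis=1, keepdims=True)  # shape (N, 1)
            gauss = amplitude * np.exp(-r2 / (2.0 * sigma**2))
            forces += diff * gauss / sigma**2            # -dV/dr for each centre
        return forces

    def md_step(pos, vel, mass, physics_forces, avatar_points, dt=1e-3):
        """One velocity Verlet step with the avatar term added to the usual forces."""
        f = physics_forces(pos) + avatar_forces(pos, avatar_points)
        vel_half = vel + 0.5 * dt * f / mass
        new_pos = pos + dt * vel_half
        f_new = physics_forces(new_pos) + avatar_forces(new_pos, avatar_points)
        new_vel = vel_half + 0.5 * dt * f_new / mass
        return new_pos, new_vel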


[1] Cooper et al. "Predicting protein structures with a multiplayer online game." Nature 466.7307 (2010): 756-760.

[2] Lee et al. "RNA design rules from a massive open laboratory." Proceedings of the National Academy of Sciences 111.6 (2014): 2122-2127.

[3] Glowacki et al. "A GPU-accelerated immersive audiovisual framework for interactive molecular dynamics using consumer depth sensors." Faraday Discussions 169 (2014): 63-89. Open access.

[4] Mitchell et al. "danceroom Spectroscopy: at the frontiers of physics, performance, interactive art, and technology." Leonardo 49.2 (2016): 138-147. Cover article.


The Touch Diaries

I have been developing a body of work called The Touch Diaries, the premise for which was initially driven by my observations of people around me who seemed isolated from physical contact and at points ‘taken over’ by technology. I fostered a growing interest in this shift taking place around the experience of human connection and interaction. I wanted to investigate what ‘being in touch’ means, what value physical and non-physical touch holds for people now, and how a shift in ‘relating’ may have an effect on both our internal and external connectivity. The Touch Diaries focused on personal stories around touch; it was qualitative and semi-longitudinal in its style of analysis, and participant-led, with intergenerational communities coming together to explore ideas around touch. The project looked at human-to-human touch in the everyday lives of participants and those around them, and the research was driven by a diary study and workshops. ‘Findings’ (data, ideas, movement, choreographic ideas, etc.) were developed and re-told through the production of live and screen-based dance.


Fitter, Happier, More Productive?: What to ask of a data-driven life?

Our lives are increasingly suffused with data, and a ‘data-driven life’ is frequently presented as an aspiration and a panacea. In this talk, I will present three recent projects which aim to expand how we think and talk about the role of data in our lives. In particular, I'll argue that designing for ‘lived informatics’ (Rooksby et al., 2014) should not only recognize that self-tracking takes place over a range of lived activities; it should also question what aspects of lived experience personal informatics can really address, and the implications of a data-driven life for how we experience the world.

I’ll talk first about my fieldwork on a ‘quantified past’, speaking to long-term users of personal informatics tools and journaling apps. I’ll then introduce the Metadating project, where we invited participants to ‘date with data’ as a means of understanding the talk and social life of data.


Shape-Change for Zoomable TUIs: Opportunities and Limits of a Resizable Slider

Tangible sliders are used successfully because they do not require visual attention. However, users must balance two competing concerns: the size of the slider and its precision. We propose a resizable tangible slider to balance these concerns: users can resize the on-screen representation of the slider by resizing the tangible slider itself. Our aim is to benefit from both tangibility and flexible control, balancing precision against minimum size. We measured the pointing performance of our prototype, and we assessed the potential drawback of the additional articulatory task required for deformation by evaluating its impact on precision. In a target-pursuit task, we show that our resizable prototype supports better precision than its smaller counterpart as long as users do not need to resize it more often than roughly every 9 seconds.
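As a rough illustration of the size/precision trade-off behind the resizable design (a sketch under assumed numbers, not the paper's implementation; the 0.5 mm cursor step and the 0-1 value range are illustrative), the mapping below shows how stretching the tangible slider refines the smallest value change a user can make, while shrinking it saves space at the cost of precision.

    def slider_value(cursor_mm, length_mm, v_min=0.0, v_max=1.0):
        """Map the cursor position on a tangible slider of the current
        physical length (in mm) to a value in [v_min, v_max]."""
        t = min(max(cursor_mm / length_mm, 0.0), 1.0)
        return v_min + t * (v_max - v_min)

    def value_resolution(length_mm, v_min=0.0, v_max=1.0, cursor_step_mm=0.5):
        """Smallest value change achievable: a longer slider gives finer
        control, a shorter one saves space."""
        return (v_max - v_min) * cursor_step_mm / length_mm

    # Stretching the device from 40 mm to 120 mm makes control three times finer.
    print(value_resolution(40.0))   # 0.0125
    print(value_resolution(120.0))  # ~0.00417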


Integrated Design Environments for Creating Interactive Media Content

This talk introduces Integrated Development Environments (IDEs) for creating interactive media content. These IDEs aim to improve the programming experience by using graphical representations such as photos and videos. Moreover, recent work is designed to help not only programmers but also designers (hence the name Integrated Design Environments), opening up the programmer's way of authoring content to more people.
