Molecular simulation is among the fastest-growing areas in the chemical sciences, an achievement highlighted by the award of the 2013 Nobel Prize in Chemistry to a group of scientists whom the Nobel committee described as “taking the chemical experiment into cyberspace.” The dream is that one day we may be able to use computational design to engineer molecular materials, much as we now use computational frameworks to design buildings and bridges. As attractive as this goal might seem, the path to it is fraught with difficulties. One of the biggest hurdles is that molecular systems are inherently high-dimensional: even a relatively small biomolecule has on the order of 1000 atoms, corresponding to 3000 degrees of freedom! A great deal of work has been devoted to designing algorithms that can efficiently search and characterise the most promising regions of molecular design space. One particularly interesting strategy involves developing interfaces that allow humans to use their molecular ‘design intuition’ to guide automated search algorithms. Two recent examples, Foldit [1] and Eterna [2], have highlighted the ability of such approaches to significantly accelerate high-dimensional molecular search tasks compared to algorithms that do not allow for human interaction.
In this talk, I will describe recent work in my group aimed at designing interactive molecular simulation platforms that can be used to accelerate molecular research, scientific education, and artistic applications [3,4]. I will highlight a range of approaches we have been developing over the past few years: (1) algorithms and hardware we designed to carry out interactive molecular dynamics using an array of consumer depth sensors; (2) more recent work constructing cross-platform interactive molecular simulation frameworks that run on tablets; and (3) recent experiments with the Oculus Rift and HTC Vive, aimed at furnishing an interactive molecular simulation virtual reality environment. The interfaces I will describe work by interpreting the human form (or parts of the human form) as an energy ‘avatar’, which can then be rigorously incorporated into the physical equations of motion. GPU acceleration has been key to achieving a relatively fluid interactive experience. Preliminary tests run in a chemistry/physics education context show that these tools allow non-expert users to accelerate simple molecular search tasks by 3–4 orders of magnitude compared to brute-force ‘blind-search’ algorithms. Should time allow, I hope to demo some of the things we have been working on at the end of the talk.
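To make the ‘energy avatar’ idea concrete, one way such an interface could couple a tracked body point into the equations of motion is to treat it as an extra potential-energy term whose gradient contributes a force on every atom. The sketch below is a minimal illustration under assumed details (a single tracked point exerting a Gaussian repulsive potential; the function name `avatar_forces` and all parameters are hypothetical), not the actual implementation described in the talk.

```python
import numpy as np

def avatar_forces(positions, avatar_pos, strength=1.0, sigma=0.5):
    """Illustrative 'energy avatar' force: a tracked body point at
    avatar_pos repels atoms via V = strength * exp(-r^2 / (2 sigma^2)),
    so each atom feels F_i = -grad_i V (a push away from the avatar)."""
    diff = positions - avatar_pos                 # (N, 3) displacements
    r2 = np.sum(diff**2, axis=1, keepdims=True)   # squared distances
    v = strength * np.exp(-r2 / (2 * sigma**2))   # potential at each atom
    # -dV/dr points radially outward; as a vector this is diff * V / sigma^2
    return diff * v / sigma**2

# A nearby atom and a distant atom, with the avatar at the origin:
positions = np.array([[0.1, 0.0, 0.0], [2.0, 0.0, 0.0]])
forces = avatar_forces(positions, avatar_pos=np.zeros(3))
```

In a molecular dynamics loop, this term would simply be added to the usual force-field forces before integrating, so the avatar perturbs the dynamics without breaking the underlying physics.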
[1] Cooper et al., “Predicting protein structures with a multiplayer online game,” Nature 466(7307), 2010, 756–760.
[2] Lee et al., “RNA design rules from a massive open laboratory,” Proceedings of the National Academy of Sciences 111(6), 2014, 2122–2127.
[3] Glowacki et al., “A GPU-accelerated immersive audiovisual framework for interactive molecular dynamics using consumer depth sensors,” Faraday Discussions 169, 2014, 63–89, open access.
[4] Mitchell et al., “danceroom Spectroscopy: at the frontiers of physics, performance, interactive art, and technology,” Leonardo 49(2), April 2016, 138–147, cover article.
I have been developing a body of work called The Touch Diaries, the premise for which was initially driven by my observations of people around me who seemed isolated from physical contact and at points ‘taken over’ by technology. I developed a growing interest in the shift taking place in how we experience human connection and interaction. I wanted to investigate what ‘being in touch’ means, what value physical and non-physical touch holds for people now, and how a shift in ‘relating’ may have an effect on both our internal and external connectivity. The Touch Diaries focused on personal stories around touch; it was qualitative and semi-longitudinal in its style of analysis, and participant-led, with intergenerational communities coming together to explore ideas around touch. The project looked at human-to-human touch in the everyday lives of participants and those around them, with research driven by a diary study and workshops. ‘Findings’ (data, ideas, movement, choreographic ideas, etc.) were developed and re-told through the production of live and screen-based dance.
Our lives are increasingly suffused with data, and a ‘data-driven life’ is frequently presented as an aspiration and panacea. In this talk, I will present three recent projects which aim to expand how we think and talk about the role of data in our lives. In particular, I'll argue that designing for ‘lived informatics’ (Rooksby et al., 2014) should not only recognize that self-tracking takes place over a range of lived activities; it should also question what aspects of lived experience personal informatics can really address, and the implications of a data-driven life for how we experience the world.
I’ll talk first about my fieldwork on a ‘quantified past’, speaking to long-term users of personal informatics tools and journaling apps. I’ll then introduce the Metadating project, where we invited participants to ‘date with data’ as a means to understand the talk and social life of data.
Tangible sliders are used successfully because they do not require visual attention. However, users must balance two competing concerns: the size of the slider and its precision. We propose a resizable tangible slider to balance these concerns: users can resize the on-screen representation of the slider by resizing the tangible slider itself. Our aim is to benefit from both tangibility and flexible control, balancing precision against minimum size. We measured the pointing performance of our prototype. We also assessed its potential drawback, the additional articulatory task of deformation, by evaluating that task's impact on precision: in a target-pursuit task, we show that our resizable prototype supports better precision than its small counterpart as long as users do not need to resize it more often than roughly every 9 seconds.
This talk introduces Integrated Development Environments (IDEs) for creating interactive media content. The IDEs aim to improve the programming experience by using graphical representations such as photos and videos. Moreover, recent work is designed to help not only programmers but also designers (hence the name Integrated Design Environment), opening up the programmer's way of content authoring to more people.
Even though they are considered rapid prototyping tools, 3D printers are very slow. Many objects require several hours of printing time or even have to be printed overnight. One could argue that the way 3D printers are currently operated is very similar to the batch processing of punched cards in the early days of computing: all input parameters are pre-defined in the 3D modeling stage, and the 3D printer then simply executes the instructions without human intervention.
If we look at the history of computing, increasingly faster processing times allowed us to move away from batch processing and enabled completely new interaction paradigms: while slow batch processing required carefully thinking ahead, command line input allowed for tighter feedback loops, and direct manipulation finally enabled even novice users to quickly iterate towards a solution.
In this talk, I argue that by speeding up personal fabrication technology, we will be able to interactively shape physical matter in real-time in the same way as today’s fast computers allow us to interactively manipulate information. As a first step towards this goal, I will present my CHI/UIST publications on faster fabrication (e.g. WirePrint [UIST’14], Platener [CHI’15]) and new interaction paradigms for interactive manipulation of matter (e.g. constructable [UIST’12], LaserOrigami [CHI’13]).
All welcome, no registration required.
Organised by the Bristol Interactions and Graphics group (http://big.cs.bris.ac.uk)