Invited Talk - 9th November 2016 - 11am - Arundel Building, Room: 223
Talk by Prof. Philippe Palanque on "Engineering Automation and Interaction: how to reconcile User Experience, Usability and Dependability to build Resilient Interactive Systems".


Abstract

Innovation and creativity are the research drivers of the Human-Computer Interaction (HCI) community, which is currently investing a vast amount of resources in the design and evaluation of “new” user interfaces and interaction techniques, leaving the correct functioning of these interfaces at the discretion of the helpless developers. In the area of formal methods and dependable systems, the emphasis is usually put on the correct functioning of the system, leaving its usability as a secondary concern (if it is addressed at all). However, designing interactive systems requires blending knowledge from both domains in order to provide operators with systems that are both usable and reliable. The talk will present possible research directions, and their benefits, for combining several complementary approaches to engineer interactive critical systems. Due to their specificities, addressing this problem requires the definition of methods, notations, processes and tools to go from early informal requirements to deployed and maintained operational interactive systems. The presentation will highlight the benefits of (and the need for) an integrated framework for the iterative design of operators' procedures and tasks, training material and the interactive system itself. The emphasis will be on the specification and validation of interaction techniques, as their design is usually the main concern of HCI conferences. A specific focus will be on automation, which is widely integrated in interactive systems both at the interaction-technique level and at the application level. Examples will be taken from interactive cockpits on large civil commercial aircraft (such as the A380), satellite ground segment applications and Air Traffic Control workstations.

Short Bio

Philippe Palanque is Professor of Computer Science at the University of Toulouse 3 and head of the ICS (Interactive Critical Systems) research group at IRIT (Toulouse Research Lab in Computing Sciences). Starting in 1995, he spent two years at CENA (the French civil aviation research centre) developing and applying formal specification and interactive system design techniques to the field of air traffic control. For four years he has been involved in several research projects funded by the French Department of Defense dealing with notations and tools for the specification of real-time interactive systems (including command and control systems for drones, multimodal interfaces for military cockpits and ground segment systems in satellite control rooms …). On the civil aviation side, he is now involved in the specification and certification issues of new interactive cockpits (which have to comply with the ARINC 661 specification standard) of aircraft including the A380, A400M and Boeing 787. He is the secretary of IFIP Working Group 13.5 on Resilience, Reliability, Safety and Human Error in System Development and steering committee chair of the CHI conference series at ACM SIGCHI (the Special Interest Group on Human-Computer Interaction). He has edited or co-edited more than twenty books or conference proceedings and co-authored more than 200 refereed publications in international conferences and journals.

Invited Talk - 26th October 2016 - 2-3pm - Arundel Building, Room: 223
The Synaptic Theatre Project: Multi-sensory strategies for creating immersive experiences


Abstract

Polymorf is a Dutch design studio. In their work, they engage audiences by creating fully embodied immersive experiences. As an interdisciplinary cross-media collective, Polymorf designs by any media necessary. This is reflected in their body of work, which consists of VR experiences, theatrical performances, and installation pieces. In their current projects they explore the possibilities of scent and touch to create multi-sensory strategies for story and experience design. By directly influencing the affective response, Polymorf incorporates the viscerality of meaning in their designs. In their latest project, the Synaptic Theatre, they investigate the possibilities for artists and designers to create meaningful and immersive experiences by stimulating the limbic and neurological systems of the audience. By creating work that directly manipulates the brain, they address fundamental questions about the authenticity and autonomy of the human embodied experience.

Short Bios

Polymorf designs immersive experiences and media projects. The projects are time-based and often engage all the senses, ranging from interactive art installations and advertising to educational projects, film, theatre and opera. Polymorf's projects reflect on the now, the (post)human condition and the relationships between humans and technology.

Marcel van Brakel is the founder of the cross-media group Polymorf. He works as an independent media designer, playwright, librettist, and film and theatre director. Marcel was co-founder of the film collective FlimFilm and worked as CEO and theatre director at Het Witte Vuur. Van Brakel is also a lecturer in Multimedia Design, 3D Design and Performativity in the Department of Communication Media Design at Avans University of Applied Sciences in Breda.

Frederik Duerinck is a filmmaker and producer based in Breda. Over many years, Frederik has produced and directed a large number of documentaries, corporate films, e-learning projects, and online healthcare applications. Since 2004, Frederik has taught part time at CMD Breda as a lecturer in film and multimedia design and as a project supervisor. He is also co-founder of the CMD Netlab.

Wander Eikelboom is a lecturer, critic and publicist in the fields of philosophy and participatory media cultures. He is editor-in-chief of the publications ‘Sense of Smell’ and Void() magazine. He is an internationally experienced speaker on topics such as scent & history and scent & data.

Invited Talk - 5th August 2016 - 11am - Jubilee Building, Room: 118
Talk by Luca Rinaldi on "Sensorimotor experience constrains serial and temporal order processing".


Abstract

Space and time are tightly coupled to each other in both the physical world and in the human mind. For instance, we subjectively experience that the passage of time goes along with the passage of space when we move from one place to another. But does this experience of time in a physical space affect our cognition of temporal concepts? In this talk, I will present evidence from both developmental and adult studies supporting the idea that temporal concepts are grounded in spatial coordinates through the sensorimotor system. In particular, I will show that prior sensorimotor experience in space (i.e., reading and writing, finger counting, locomotion) affects the way humans process and represent temporal information.

Short Bio

I received my PhD in Experimental Psychology from the University of Milano-Bicocca in January 2016, under the supervision of Luisa Girelli, working on the sensorimotor mechanisms subserving the control of visuospatial attention and the processing of temporal information. During this time, I also had the opportunity to gain experience with several techniques for measuring the kinematics of human movements (e.g., motion tracking systems, eye tracking, graphics tablets) under the supervision of Avishai Henik at Ben-Gurion University (Beer-Sheva, Israel) and of Peter Brugger at the Zurich Center for Integrative Human Physiology (Zurich, Switzerland). My postdoctoral research career began in January 2016 under the supervision of Tomaso Vecchi at the University of Pavia, where I am pursuing my interests in time processing. In particular, I am currently investigating how our brains manage to adjust prior directional experience to current information coming from the different senses - such as sight and hearing - to orient our bodies in space and in time. In addition to exploring the basic mechanisms of space-time processing, I am interested in how these are affected by emotions and how they develop over the lifespan. For these reasons, I am now joining the Centre for Brain and Cognitive Development at Birkbeck, University of London, for some months, under the supervision of Annette Karmiloff-Smith.

Invited Talk - 1st July 2016 - 11am - Fulton Building, Room: FUL-114
Massimiliano Di Luca, Lecturer in the School of Psychology at the University of Birmingham.


Presentation

Massimiliano Di Luca is a Lecturer in the School of Psychology at the University of Birmingham, working in the research centre for Computational Neuroscience and Cognitive Robotics. He received the Laurea in Psychology from the University of Trieste in 2000 and his PhD in Cognitive Science from Brown University in 2006. Afterward, he worked as a postdoctoral researcher at the Max Planck Institute for Biological Cybernetics. His research goal is to understand human sensory processing. Using psychophysical methods and computational modeling, he investigates how the brain combines multisensory information for perception and action. For the past year, he has been on sabbatical at Oculus VR.

Talk Abstract

How do we know if an avocado is ripe, raw, or rotten? When we press our fingers against the avocado, sensory signals provide information related to its material. Our brain combines this information into a representation of the material properties and compares it to expectations about how an avocado is supposed to feel: too hard and we should wait for it to ripen; too soft and it is already too late. In his talk, Max will give an overview of his work on how humans interact with soft objects and how multiple sensory signals are used to perceive material properties, with an emphasis on object deformability. The empirical results will be compared to the predictions of a computational model of softness perception, in which sensory signals are combined to obtain perceptual estimates. The use of probability distributions as a description of the signals and the expectations involved in the computations makes the model suitable for describing the cognitive mechanisms underlying the use of multiple sources of information in softness perception according to the rules of Bayesian inference.
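As a rough illustration of the kind of computation referred to here (the textbook two-cue formulation of Bayesian cue combination, not necessarily the specific model presented in the talk), two independent Gaussian softness estimates $\hat{s}_1$ and $\hat{s}_2$ with variances $\sigma_1^2$ and $\sigma_2^2$ are combined by weighting each cue by its relative reliability:

\[
\hat{s} = w_1 \hat{s}_1 + w_2 \hat{s}_2, \qquad
w_i = \frac{1/\sigma_i^2}{1/\sigma_1^2 + 1/\sigma_2^2}, \qquad
\sigma_{\text{combined}}^2 = \frac{\sigma_1^2 \, \sigma_2^2}{\sigma_1^2 + \sigma_2^2}.
\]

The combined estimate has lower variance than either cue alone, which is the standard account of why multisensory information sharpens perceptual judgements such as softness.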

Invited Talk - 8th April 2016 - 11am - Fulton Building, Room: 101
Nadia Berthouze, Professor in Affective Computing and Interaction at the Interaction Centre of University College London (UCL).
Bringing affect into technology: the case of chronic pain physical rehabilitation.


Abstract

Emotions and affective states more generally play an important role in people’s lives, including when they interact with increasingly pervasive technology. Yet their integration into technology for real-life applications is sparse. Our research aims to design technology that is capable of taking into account how we feel, so as to provide us with relevant support. This talk will focus on technology for chronic pain physical rehabilitation. Chronic pain brings with it many affective states in addition to the frustration or boredom of engaging in repetitive exercises and functional activity. These include low self-esteem about the new body one has to accept, fear and anxiety about injuring oneself, and low perceived self-efficacy modulated by attention to pain. Whilst gamification has been found to mitigate the more boring aspects of physical rehabilitation, the other affective states are still mostly overlooked, resulting in low adherence to the therapy program and low transfer to everyday functional capabilities. In this talk, I will present our investigations into the affective barriers to physical rehabilitation in chronic pain and how technology could help break them. Finally, I will briefly present related work on automatically detecting affective states from body expressions and touch behaviour.


About the SCHI Lab

The SCHI Lab's research lies in the area of Human-Computer Interaction (HCI), an area in which research on multisensory experiences makes a difference to how we will design and interact with technology in the future. The interdisciplinary team explores tactile, gustatory, and olfactory experiences as novel interaction modalities.

Contact

Sussex Computer Human Interaction Lab

Creative Technology Research Group

School of Engineering and Informatics

University of Sussex

Chichester 1

Brighton BN1 9QJ, UK

Phone: +44 (0)1273 877837

Mail: m.obrist [at] sussex.ac.uk
