COGAIN Symposium:
Communication by Gaze Interaction

Wuppertal, Germany • August 19th and 21st, 2017

Important dates

Submission due: April 29th (extended from April 22nd)
Acceptance notice: May 15th (extended from May 13th)
Camera-ready: May 29th (extended from May 20th)
Registration: June 15th
Workshop: August 19th
Symposium: August 21st

Chairs

General co-chairs
Andreas Bulling
Carlos H. Morimoto

Program co-chairs
John Paulin Hansen
Roman Bednarik

Keynote: August 21st, 10:30 to 11:10 am.

Eye movement as material for interaction design

Hans Gellersen, Lancaster University, UK

Eye movements are central to most of our interactions. We use our eyes to see and to guide our actions, and they form a natural interface that reflects our goals and interests. At the same time, our eyes afford fast and accurate control for directing our attention, selecting targets for interaction, and expressing intent. Even though our eyes play such a central part in interaction, we rarely think about the movement of our eyes and have limited awareness of the diverse ways in which we use them: to examine visual scenes, follow movement, guide our hands, communicate non-verbally, and establish shared attention.

This talk will reflect on the use of eye movement as input in human-computer interaction. Jacob's seminal work showed over 25 years ago that eye gaze is natural for pointing, albeit marred by the Midas Touch problem and limited accuracy. I will discuss new work on eye gaze as input that looks beyond conventional gaze pointing. This includes work on: gaze and touch, where we use gaze to naturally modulate manual input; gaze and motion, where we introduce a new form of gaze input based on the smooth pursuit movements our eyes perform when they follow a moving object; and gaze and games, where we explore social gaze in interaction with avatars.
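For readers unfamiliar with the idea, here is a minimal sketch of correlation-based smooth pursuit selection in the spirit of the Pursuits technique (Vidal, Bulling and Gellersen, UbiComp 2013): the recorded gaze trajectory is correlated against each moving target's trajectory over a short window, and the target that matches best above a threshold is selected. The function name, data layout, and threshold below are illustrative assumptions, not code from the talk.

    import numpy as np

    def pursuit_select(gaze_xy, targets_xy, threshold=0.8):
        """Select the moving target whose trajectory best correlates
        with the gaze trajectory (illustrative sketch, not the
        speaker's implementation).

        gaze_xy:    (N, 2) array of gaze samples over a time window
        targets_xy: dict mapping target id -> (N, 2) array of target
                    positions over the same window
        threshold:  minimum correlation required to accept a selection
        """
        best_id, best_score = None, threshold
        for tid, traj in targets_xy.items():
            # Correlate gaze and target motion on each axis separately;
            # a constant signal has no defined correlation, so score it 0.
            scores = []
            for axis in (0, 1):
                g, t = gaze_xy[:, axis], traj[:, axis]
                if g.std() == 0 or t.std() == 0:
                    scores.append(0.0)
                else:
                    scores.append(np.corrcoef(g, t)[0, 1])
            # Require agreement on both axes to avoid accidental matches.
            score = min(scores)
            if score > best_score:
                best_id, best_score = tid, score
        return best_id  # None if no target exceeds the threshold

Because selection depends on how the eyes move rather than where they point, this style of input needs no per-user calibration and sidesteps the Midas Touch problem for static targets.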

Hans Gellersen

Hans Gellersen is Professor of Interactive Systems at Lancaster University. His research interests lie in sensors and devices for ubiquitous computing and human-computer interaction. He has worked on systems that blend physical and digital interaction, methods that infer context and human activity, and techniques that facilitate spontaneous interaction across devices. His recent work focuses on eye movement as a source of context information and a modality for interaction.