Gaze-based Interaction

Visual touch – your look turned into action

Human communication with computers and other automated machinery can be vastly enhanced and accelerated by eye tracking. The human eye is constantly responding to stimuli, and this information can be used to guide various systems much faster than conventional input devices such as a computer mouse or joystick.

Since visual interaction devices are extremely small, lightweight and mobile, the possibilities for their use and the areas of gaze-based interaction research are manifold and continuously expanding. Applications thus far have included:

  • Education (especially when combined with voice inputs)
  • Research (such as microscopes with eye control)
  • Gaming (gaze control strategy and simulator PC games)
  • Kiosks (keyboard replacement)
  • Assistive technology (gaze interaction for the physically challenged)

Such systems are usually head-mounted, often incorporated into glasses or headsets, so they have to be extremely comfortable and must allow users to move their head and eyes freely. Moreover, since such a system should ideally be compatible with voice and other input devices, it needs to be easy to integrate.



Case Studies

Interface Design Using a Remote Eye Tracking Device

The Cognitive Task Modelling Group at the University of Sheffield, UK, has been working on a project modelling the perception and comprehension of dynamic displays in computer interface design. The project compares interfaces that follow design principles identified by the ICS model of film watching with interfaces in which these principles are flouted.

Eye movement data is collected with the iView X™ RED remote eye tracking device, which is non-invasive and unobtrusive, to retain a naturalistic testing situation. Because the raw eye tracking data is available in plain-text form, the experimenters can apply their own customized algorithms to identify contingencies between dynamic screen events and changes in gaze location. In this way, fixation data can be synchronized with interface actions and user responses to establish whether the interfaces minimize users' cognitive effort in the ways proposed by the ICS principles.
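As an illustration of that workflow, the following Python sketch matches fixations from a plain-text gaze log against timestamped interface events and reports the latency of the first fixation after each event. The file layouts (tab-separated fixation and event files), the file names, and the 1000 ms latency window are assumptions made for this example; they are not the actual iView X™ output format or the Sheffield group's algorithms.

# Illustrative sketch only: file formats and thresholds below are assumptions,
# not the actual iView X(TM) output format or the project's analysis code.
import csv
from bisect import bisect_right

def load_fixations(path):
    """Read fixations from a tab-separated file with hypothetical columns:
    start_ms, end_ms, x, y."""
    fixations = []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            start, end, x, y = map(float, row[:4])
            fixations.append({"start": start, "end": end, "x": x, "y": y})
    return fixations

def load_events(path):
    """Read interface events from a tab-separated file with hypothetical
    columns: time_ms, event_label."""
    events = []
    with open(path, newline="") as f:
        for row in csv.reader(f, delimiter="\t"):
            events.append({"time": float(row[0]), "label": row[1]})
    return sorted(events, key=lambda e: e["time"])

def first_fixation_after(events, fixations, max_latency_ms=1000.0):
    """For each screen event, find the first fixation starting within
    max_latency_ms after it: a simple event-gaze contingency measure."""
    by_start = sorted(fixations, key=lambda f: f["start"])
    starts = [f["start"] for f in by_start]
    matches = []
    for ev in events:
        i = bisect_right(starts, ev["time"])
        if i < len(by_start) and by_start[i]["start"] - ev["time"] <= max_latency_ms:
            matches.append((ev, by_start[i]))
        else:
            matches.append((ev, None))
    return matches

if __name__ == "__main__":
    fixations = load_fixations("fixations.txt")     # hypothetical file names
    events = load_events("screen_events.txt")
    for ev, fix in first_fixation_after(events, fixations):
        if fix is not None:
            latency = fix["start"] - ev["time"]
            print(f"{ev['label']}: fixation at ({fix['x']:.0f}, {fix['y']:.0f}) "
                  f"after {latency:.0f} ms")
        else:
            print(f"{ev['label']}: no fixation within window")

Keeping the matching step separate from the file parsing makes it easy to swap in whatever custom contingency measure a given experiment requires.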
