Gaze-based Interaction

Visual touch – your look turned into action

Human communication with computers and other automated machinery can be vastly enhanced and accelerated by eye tracking. The human eye constantly responds to stimuli, and this information can be used to guide various systems much faster than conventional input devices such as a computer mouse or joystick.

SMI Eye Tracking can also connect with leading partner solutions, e.g. VR engines or biometric systems, for a broad range of applications.

Since visual interaction devices are extremely small, lightweight and mobile, the possibilities for their use and the areas of gaze-based interaction research are manifold and continuously expanding. Applications thus far have included:

  • Education (especially when combined with voice inputs)
  • Research (such as microscopes with eye control)
  • Gaming (gaze control strategy and simulator PC games)
  • Kiosk (keyboard replacement)
  • Assistive (gaze interaction for the physically challenged)

Such systems are usually head-mounted, often incorporated into glasses or headsets, and therefore have to be extremely comfortable. Likewise, users must be able to move their head and eyes freely. Moreover, since the system should ideally be compatible with voice and other input devices, it needs to be easy to integrate.


Case Studies


What are the possibilities for using eye tracking technology in cognitive computing, specifically in research on applications for persons with dementia? This is one of the central questions that Dr. Daniel Sonntag and his team at the German Research Center for Artificial Intelligence pursue in their KOGNIT research project. To do so, they use the SMI Eye Tracking Upgrade for the Oculus Rift DK2 HMD, complemented with the SMI Unity plug-in, to integrate a person's gaze with real and virtual environments.


Interface Design Using a Remote Eye Tracking Device

The Cognitive Task Modelling Group at the University of Sheffield, UK, has been working on a project modeling the perception and comprehension of dynamic displays in computer interface design. The project compares interfaces that follow design principles identified by the ICS model of film watching with interfaces where these principles are flouted.

Eye movement data is collected by the non-invasive, unobtrusive iView X™ RED remote eye tracking device to retain a naturalistic testing situation. The availability of the raw eye tracking data in plain text form allows experimenters to apply their own customized algorithms to identify contingencies between dynamic screen events and changes in gaze location. In this way, fixation data can be synchronized with interface actions and user responses to establish whether interfaces are minimizing users' cognitive effort in the ways proposed by the ICS principles.
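As a rough illustration of the kind of custom synchronization described above, the sketch below matches logged interface events to the nearest-in-time gaze sample from a plain-text export. The three-column record layout (`timestamp_ms x y`) and the event log format are hypothetical assumptions for this example; real iView X™ exports have their own column schemes.

```python
# Hedged sketch: aligning interface events with gaze samples in time.
# The 'timestamp_ms x y' layout is an assumption, not the real iView X format.

from bisect import bisect_left

def parse_gaze(lines):
    """Parse plain-text gaze records of the form 'timestamp_ms x y'."""
    samples = []
    for line in lines:
        parts = line.split()
        if len(parts) == 3:
            t, x, y = (float(p) for p in parts)
            samples.append((t, x, y))
    return samples

def gaze_at_event(samples, event_time_ms):
    """Return the gaze sample closest in time to an interface event."""
    times = [s[0] for s in samples]          # samples assumed time-sorted
    i = bisect_left(times, event_time_ms)    # first sample at/after the event
    candidates = samples[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - event_time_ms))

# Toy data: three gaze samples and one screen event (hypothetical values).
gaze_log = ["1000 512 384", "1016 520 380", "1033 640 200"]
events = [(1020, "button_highlight")]

samples = parse_gaze(gaze_log)
for t, label in events:
    print(label, gaze_at_event(samples, t))
```

In a real analysis the same nearest-sample lookup would feed a fixation-detection step, so that each interface action can be paired with the fixation in progress when it occurred.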
