Designing human–machine interactions to be natural and intuitive.
Research focus areas
In the Human–Computer Interaction (HCI) research field, we investigate how people interact with digital systems — today and in the future. Our goal is to design interactions and interfaces that are natural, user-friendly, and intuitive. To achieve this, we work with a wide range of modern technologies and employ diverse modalities and forms of interaction:
Eye tracking to analyse visual attention
Voice control to operate systems through spoken language
Conversational interfaces with virtual assistants and companions
Gesture-based interaction for touchless interfaces
Projection-based interactive displays
Brain–computer interfaces to capture cognitive states and intentions
Adaptive displays
Human–AI interaction to design comprehensible and trustworthy AI systems
Human–robot interaction, including social robots and workplace robotics
Ubiquitous computing, embedding digital technologies seamlessly into everyday life
Information and data visualisation: developing visual representations of complex content or datasets to make information understandable, accessible, and interpretable — including both static visualisations and interactive dashboards
Our approach is interdisciplinary. We combine computer science, cognitive science, design, and ethics to develop technologies aligned with users’ needs.
In our projects, we design, develop, and evaluate innovative, human-centred interaction solutions — for example in health, education, work, leisure, industry, and public spaces.
Services
We offer a comprehensive range of high-precision eye-tracking systems and software solutions suitable for cognitive experiments, human–computer interaction studies, and usability testing — including portable options for on-site data collection.
Contact us
For further information or to discuss potential collaboration, please contact Thekla Müller.
Infrastructure
Remote Eye Tracker: EyeLink 1000 Plus
High-resolution eye tracker with a sampling rate of up to 2,000 Hz, suitable for cognitive experiments, human–computer interaction studies (e.g. gaze-adaptive displays), and usability studies. A mobile carrying case enables on-site data collection.
Eye Tracker: EyeLink Portable Duo
It supports a sampling rate of up to 2,000 Hz with head stabilisation and 1,000 Hz in remote mode (allowing free head movement). The high-resolution eye tracker is suitable for cognitive experiments, human–computer interaction studies (e.g. gaze-adaptive displays), and usability studies. As its name suggests, the EyeLink Portable Duo is portable: it is installed in our usability lab but can easily be dismantled and transported in a mobile carrying case for on-site data collection.
Included hardware:
Soft and hard carrying cases
Lightweight host laptop
Portable Duo camera (core component of the eye tracker)
Laptop and tripod mounts
Lightweight head support
Response device (e.g. for measuring reaction time)
Selected specifications:
- Sampling rate: up to 2,000 Hz
- Average accuracy: up to 0.15°
- Saccade event resolution: 0.05° (microsaccades)
- Spatial resolution: 0.01°
- Pupil size resolution: 0.1% of diameter
- Compatibility with glasses: excellent
- Online event detection: fixations / saccades / blinks / fixation updates
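The online event detection listed above is performed by SR Research's proprietary parser on the host machine. For readers who want to post-process raw gaze samples themselves, the general idea behind fixation detection can be illustrated with a simple dispersion-threshold (I-DT) sketch; the sample format and thresholds below are illustrative assumptions, not the EyeLink parser's actual parameters.

```python
def detect_fixations(samples, max_dispersion=1.0, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: list of (t, x, y) gaze samples (seconds, degrees).
    max_dispersion: max horizontal + vertical spread (degrees).
    min_duration: minimum fixation duration (seconds).
    Returns a list of (start_t, end_t, mean_x, mean_y) tuples.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow a window until it spans at least min_duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        window = samples[i:j + 1]
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n:
                xs.append(samples[j + 1][1])
                ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop()
                    ys.pop()
                    break
                j += 1
            fixations.append((samples[i][0], samples[j][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            # No fixation starting here; slide the window forward.
            i += 1
    return fixations
```

Samples within a fixation cluster tightly in space, so a window whose spread stays under a degree or so for long enough is labelled a fixation; everything between fixations is treated as saccadic movement.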
Software
Our software solutions
- SR Research WebLink (1x lifetime licence): A screen recording software solution that enables EyeLink users to record eye movements while participants view and interact with websites, computer software, live events (e.g. puppet tasks), gaming consoles, tablets, mobile phones, and other dynamic media such as videos and PDF documents. It is ideal for usability testing, HCI research, and many other scenarios.
- SR Research Experiment Builder (1x lifetime licence): Allows users to create anything from simple experiments — where each trial presents static text or an image and waits for a participant response — to highly advanced experiments featuring complex gaze-contingent event sequences with excellent temporal precision. Frequently used by psychologists and neuroscientists.
- EyeLink Data Viewer (1x lifetime licence): A tool that enables users to view, filter, and generate output reports from EDF data files recorded with EyeLink eye trackers. It is used to analyse data collected with the EyeLink Portable Duo.
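Besides Data Viewer, EDF recordings can be converted to a plain-text ASC file with SR Research's edf2asc utility and parsed in any language. A minimal Python sketch, assuming the common ASC layout for fixation-end (EFIX) event lines — verify the column order against your own edf2asc output:

```python
def parse_efix_lines(asc_text):
    """Extract fixation events from an EyeLink ASC export.

    EFIX lines are assumed to have the form:
      EFIX <eye> <start> <end> <duration> <x> <y> <pupil>
    with times in ms and positions in screen pixels.
    Returns a list of dicts, one per fixation.
    """
    fixations = []
    for line in asc_text.splitlines():
        parts = line.split()
        if len(parts) >= 8 and parts[0] == "EFIX":
            eye, start, end, dur, x, y, pupil = parts[1:8]
            fixations.append({
                "eye": eye,
                "start_ms": int(start),
                "end_ms": int(end),
                "duration_ms": int(dur),
                "x": float(x),
                "y": float(y),
                "pupil": float(pupil),
            })
    return fixations
```

From such a list, per-trial measures like mean fixation duration or total dwell time on a region of interest follow with a few lines of aggregation.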
Remote Eye Tracker: Tobii X1 Light Eye Tracker
Lightweight eye tracker with a 30 Hz sampling rate. Fully functional and suitable, for example, for web usability studies, as well as for cognitive experiments and human–computer interaction studies (e.g. gaze-adaptive displays) where ultra-high precision is not required.
Software: Tobii Studio
Extended (Virtual, Augmented, Mixed) Reality (XR) devices with integrated eye trackers (available in the Media Lab or on the FHNW campus)
Apple Vision Pro
Raw eye-tracking data are currently not accessible on the Apple Vision Pro. Objects can be highlighted when the user looks at them; however, this functionality is managed by an Apple framework, so we have no control over the underlying process.
Hololens 2
The HoloLens 2 features integrated eye tracking, enabling the visualisation of gaze and interaction with objects. It is also possible to collect eye-tracking data, for example which object was viewed and for how long. Additional eye-tracking data require custom programming or the use of ARETT (Augmented Reality Eye Tracking Toolkit). Four devices are available for student projects.
Eye-tracking specifications:
- Real-time tracking with two IR cameras
HTC Vive Pro Eye
The HTC Vive Pro Eye is an enhanced version of the standard HTC Vive Pro, featuring precise eye tracking by Tobii.
Eye-tracking specifications:
- Gaze data output (binocular): 120 Hz
- Accuracy: 0.5°–1.1°
- Calibration: 5-point
- Trackable field of view: 110°
Possible data output:
- Timestamp
- Gaze direction
- Pupil position
- Pupil size
- Eye openness
Programming interface: HTC SRanipal SDK
SDK engine compatibility: Unity, Unreal
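The SRanipal SDK itself is consumed from C# (Unity) or C++ (Unreal); the per-frame samples it delivers (timestamp, gaze direction, pupil size, eye openness) are typically logged and analysed offline. As an illustration of such offline analysis, here is a sketch that derives blink events from the eye-openness channel; the (timestamp, openness) tuple format and the thresholds are our assumptions, not part of the SDK.

```python
def detect_blinks(samples, closed_threshold=0.1, min_gap_s=0.05):
    """Detect blinks from logged eye-openness samples.

    samples: list of (timestamp_s, openness) pairs, with openness
    in [0, 1] where 1 means fully open, as delivered per frame.
    A blink is a run of samples below `closed_threshold` lasting
    at least `min_gap_s`. Returns (start, end) timestamp pairs.
    """
    blinks = []
    run_start = None
    prev_t = None
    for t, openness in samples:
        if openness < closed_threshold:
            if run_start is None:
                run_start = t  # eye just closed
        else:
            # Eye reopened: keep the run only if it was long enough.
            if run_start is not None and prev_t - run_start >= min_gap_s:
                blinks.append((run_start, prev_t))
            run_start = None
        prev_t = t
    # Handle a recording that ends mid-blink.
    if run_start is not None and prev_t - run_start >= min_gap_s:
        blinks.append((run_start, prev_t))
    return blinks
```

The minimum-duration check filters out single-frame dropouts, which trackers occasionally report as momentary eye closure.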
Meta Quest Pro
The sampling rate is approximately 72 Hz. Eye tracking is used to make your avatar's eye contact and facial expressions appear more natural during virtual interactions with other users, and to enhance image quality in the area you are looking at in VR. Eye tracking can also serve as an input modality: you can interact with virtual content based on where you are looking. Eye tracking is not used to identify you; it estimates the direction of your gaze, and images of your eyes never leave the headset and are deleted after processing.
Eye-Tracking Studies Using Webcam and Smartphone
Eye tracking is also possible using standard cameras, albeit with lower stability. We have experience conducting eye-tracking studies using commercially available devices and webcams. If you have any questions, please feel free to contact us.


Contact us
For further information about the FHNW School of Computer Science or to discuss potential collaboration opportunities, please contact us:

Prof. Dr. Arzu Çöltekin
- Phone: +41 56 202 84 73 (direct)
- Email: arzu.coltekin@fhnw.ch