Eye-LCOS-Tracker

The Eye-LCOS-Tracker (Eye-Low Cost Open Source-Tracker) repository is an initiative to create a low-cost open source eye tracker from consumer hardware. As many eye trackers on the market are either closed source or very expensive (starting at €1.500,-), the goal of this initiative is to provide an alternative.

Requirements

The eye tracker should fulfill the following requirements:

  • Open Source: All elements of the tracking device (hardware, firmware, neural models, software, ...) should be open source to ensure community commitment and prevent vendor-lock-in.
  • Low Cost: The overall cost of the tracker should be below €300,-
  • Off-the-shelf availability: The eye tracker should be a plug-and-play device and be purchasable by end users globally. If not available as one device, there should only be a few mainstream components and a simple DIY tutorial.
  • Use Cases: The main use case is computer control for people with disabilities. It is not intended to be used for user experience evaluation, which requires higher frame rates and support for eye saccade detection.
  • Tracking quality:
    • A tracking accuracy of 3° or better is aimed for; the lower the better.
    • Good tracking results should be achieved both with and without eye glasses.
    • Good stability against head movements and head orientation (-> high FOV, probably 2 cameras).
    • FPS: not critical, 60-100 Hz desired.
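To get a feel for what the 3° accuracy target means on screen, the angular error can be converted into an on-screen pixel error. The viewing distance and pixel density below are illustrative assumptions, not project requirements:

```python
import math

def angular_error_to_pixels(error_deg, distance_mm, pixels_per_mm):
    """Convert an angular gaze error into on-screen pixel error.

    Assumes a flat screen viewed head-on at the given distance.
    """
    error_mm = distance_mm * math.tan(math.radians(error_deg))
    return error_mm * pixels_per_mm

# Illustrative setup: a 24" 1080p monitor (~3.6 px/mm) viewed at 600 mm.
px = angular_error_to_pixels(3.0, 600, 3.6)
print(f"3 deg error ~ {px:.0f} px on screen")  # roughly 113 px
```

At a typical desktop viewing distance, 3° thus corresponds to an error of roughly a hundred pixels, which is workable for dwell-based computer control with reasonably sized targets.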

Eye-Tracking Essentials

Some important terms like bright/dark pupil, pupil center corneal reflection techniques, 3D eye-model calibration, and timing/accuracy/precision are explained here:
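The pupil center corneal reflection (PCCR) idea can be sketched in a few lines: gaze is estimated from the vector between the pupil center and the corneal glint, mapped to screen coordinates via a calibration. The affine model and the calibration values below are hypothetical simplifications; real trackers typically fit a higher-order polynomial per eye:

```python
def pccr_gaze(pupil_center, glint_center, calib):
    """Estimate a 2D on-screen gaze point from the pupil-glint vector.

    calib holds per-axis gain and offset obtained from a calibration
    procedure (here a simple affine model for illustration).
    """
    vx = pupil_center[0] - glint_center[0]
    vy = pupil_center[1] - glint_center[1]
    gx = calib["ax"] * vx + calib["bx"]
    gy = calib["ay"] * vy + calib["by"]
    return gx, gy

# Hypothetical calibration gains/offsets for a 1920x1080 screen:
calib = {"ax": 40.0, "bx": 960.0, "ay": 40.0, "by": 540.0}
gaze_point = pccr_gaze((310, 242), (300, 240), calib)
```

Because the glint stays nearly fixed for small head movements while the pupil moves with the eye, the pupil-glint vector is more robust against head motion than the raw pupil position alone.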

Hardware

The OAK-1 and OAK-D devices are open source devices supporting spatial AI with on-device (edge) processing. The Intel® Movidius™ Myriad X™ vision processing unit (VPU) is used for embedded neural inference. Additionally, there are many open hardware design variants that make it easy to start with a baseboard and add another camera module or interfacing component.

Luxonis Hardware

Camera Options

Other Embedded Vision HW

Approaches

The VPU is primarily intended for neural inference processing of machine learning models. The OpenVINO toolkit is an Intel® abstraction toolkit to convert from different source models and to optimize the model for the VPU processing.

Machine Learning

The VPU is able to process several machine learning models serially or in parallel.

The OAK gaze estimation example demonstrates the processing of face detection, landmark detection, head pose estimation and gaze estimation models.

Gaze estimation pipeline using several machine learning models
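Conceptually, the example chains four models, each feeding a crop or vector into the next. A host-side sketch of that data flow, with the inference stages stubbed out as plain callables (on the device these would be NeuralNetwork nodes linked in a depthai pipeline), might look like:

```python
def run_gaze_pipeline(frame, detect_face, detect_landmarks,
                      estimate_pose, estimate_gaze):
    """Chain the four inference stages of a gaze estimation pipeline.

    Each stage is passed in as a callable so the data flow is explicit.
    """
    face_roi = detect_face(frame)          # 1. face bounding box
    eyes = detect_landmarks(face_roi)      # 2. left/right eye crops
    head_pose = estimate_pose(face_roi)    # 3. yaw/pitch/roll angles
    return estimate_gaze(eyes, head_pose)  # 4. 3D gaze vector

# Trivial stand-in stages just to show the flow:
gaze = run_gaze_pipeline(
    "frame",
    detect_face=lambda f: "face",
    detect_landmarks=lambda f: ("eye_l", "eye_r"),
    estimate_pose=lambda f: (0.0, 0.0, 0.0),
    estimate_gaze=lambda eyes, pose: (0.1, -0.2, 0.9),
)
```

The key point is that intermediate results (face crop, eye crops, head pose) never have to leave the device; only the final gaze vector needs to reach the host.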

There is also a fork with integrated relative mouse control by gaze movement. The projects are based on the OpenVINO-compatible version of LCTyrell.
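The relative mouse control idea maps frame-to-frame changes of the gaze vector to cursor deltas. A minimal sketch of that mapping follows; the gain and dead-zone values are illustrative assumptions, not taken from the fork:

```python
def gaze_to_cursor_delta(gaze_prev, gaze_now, gain=500.0, dead_zone=0.01):
    """Map a change in the (x, y) gaze vector to a relative cursor move.

    Small changes inside the dead zone are ignored to suppress jitter;
    gain and dead_zone are hypothetical tuning parameters.
    """
    dx = gaze_now[0] - gaze_prev[0]
    dy = gaze_now[1] - gaze_prev[1]
    if abs(dx) < dead_zone:
        dx = 0.0
    if abs(dy) < dead_zone:
        dy = 0.0
    return round(dx * gain), round(dy * gain)

# The unchanged y axis is suppressed by the dead zone:
delta = gaze_to_cursor_delta((0.10, 0.05), (0.14, 0.05))
```

Relative control trades absolute pointing for robustness: calibration drift only slows the cursor down instead of offsetting it.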

Other interesting ML models

Embedded CV processing

In cases where the CV algorithm is not based on machine learning, it would be beneficial to execute traditional OpenCV operations on device in order to keep the host CPU free for user operations. OpenCL support for the Myriad X VPU is currently in preview mode and is also planned to be integrated into the depthai Python API for the OAK devices. However, depthai already supports, or will support, some basic hardware-accelerated pre-processing and filtering operations (e.g. edge detection, object tracking, motion estimation, ...).
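To illustrate the kind of pre-processing meant here, the following is a pure-Python sketch of a horizontal Sobel edge filter. On an OAK device this would run as a hardware-accelerated node rather than host code; the implementation below is only for showing the operation:

```python
def sobel_x(img):
    """Apply a 3x3 horizontal Sobel kernel to a grayscale image.

    img is a list of rows of pixel values; the one-pixel border is
    skipped for brevity and left at zero.
    """
    kernel = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(
                kernel[j][i] * img[y + j - 1][x + i - 1]
                for j in range(3) for i in range(3)
            )
    return out

# A vertical step edge produces a strong horizontal gradient response:
img = [[0, 0, 9, 9]] * 4
edges = sobel_x(img)
```

For eye tracking, such edge maps are a typical first step toward pupil contour extraction, so offloading them keeps the host free for the actual user-facing application.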

Further information can be found in the FAQs:

ML Frameworks for Vectorized Math

Another approach to execute traditional computer vision operations is to embed computer vision operations into a machine learning model and add it to the inference pipeline of the VPU. See the example of rahulrav to convert an RGB video into a gray-scale video using pytorch.
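The grayscale conversion embedded in that model is just a fixed weighted sum over the RGB channels, which is why it can be expressed as a tiny convolution and shipped to the VPU like any other model. The same operation in plain Python, using the standard ITU-R BT.601 luma weights:

```python
def rgb_to_gray(pixel):
    """ITU-R BT.601 luma: a fixed per-pixel weighted sum over the
    RGB channels (expressible as a 1x1 convolution in an ML model)."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

# White stays white, black stays black:
white = rgb_to_gray((255, 255, 255))
black = rgb_to_gray((0, 0, 0))
```

Any CV operation that reduces to convolutions and element-wise math can be packaged this way and executed in the VPU inference pipeline.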

CPython Embedded Scripting

The OAK devices have CPython support (currently in alpha stage) for communicating with the HW components and for adding embedded processing of application logic.

Active Appearance Model

TODO: Add eye tracking algorithms using Active Appearance Models

Interesting Papers

Human Computer Interaction

Quantitative evaluation and comparison of the efficiency of input devices can be done using Fitts' law, conforming to ISO 9241-9. Throughput is a good metric for comparing input devices. See the publication FittsFace: Exploring Navigation and Selection Methods for Facial Tracking as an example.
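Throughput combines task difficulty and movement time into a single bits-per-second figure. A minimal sketch using the Shannon formulation of the index of difficulty, as used in ISO 9241-9 style evaluations (the task values at the end are an illustrative example, not measured data):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), for movement distance D and target width W."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time_s):
    """Throughput in bits per second for one pointing task."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example task: a 512 px movement to a 32 px target completed in 1.5 s.
tp = throughput(512, 32, 1.5)
```

Because throughput normalizes for task difficulty, it allows a gaze tracker to be compared fairly against a mouse or head tracker across different target layouts.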

Further resources:

Interesting papers
