
Development Plans over the next few days:

A project log for Low Cost Open Source Eye Tracking

A system designed to be easy to build and set up. Uses two $20 webcams and open source software to provide accurate eye tracking.

John Evans • 10/18/2018 at 22:07

TCP/IP Sockets: Added a TCP socket project for communicating with other applications over the network. This gives us great flexibility in talking to different systems, including wireless ones.
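
As a minimal sketch of what the receiving end might look like, here's a Python listener. The port number and the newline-delimited "x,y" message format are illustrative assumptions, not the project's actual protocol:

```python
import socket

HOST, PORT = "0.0.0.0", 5005  # hypothetical port, not the project's actual value

# Listen for a single client (e.g., the eye-tracking app) and print gaze packets.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen(1)
    conn, addr = server.accept()
    with conn:
        buffer = b""
        while True:
            data = conn.recv(1024)
            if not data:
                break
            buffer += data
            # Assume newline-delimited "x,y" messages (an illustrative format).
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                if line:
                    x, y = map(float, line.decode().split(","))
                    print(f"gaze at ({x}, {y}) from {addr}")
```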

Greater Separation of Projects: A lot of work has been done to separate the eye-tracking logic from the operating-system-specific machine vision. Camera and machine vision logic lives in one project (.exe), while eye tracking, TCP sockets, and logging live in separate DLLs. This should make it much easier to eventually expand the project to other systems.

Perspective Transformation: A big goal has been to move away from a monitor-based system so the tracker can control various machines, such as robotic arms driven by the TinyG stepper motor controller. Other platforms may be used as long as they can attach to a TCP socket on the network and interpret the commands sent; a hypothetical client is sketched below.
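
Here's what a machine-side client could look like in Python. The address, port, and command format are placeholders for illustration only:

```python
import socket

# Hypothetical client for a machine controller: connect to the tracker's
# socket and react to the gaze commands it sends (format is illustrative).
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect(("192.168.1.50", 5005))  # placeholder address/port
    for line in client.makefile():
        command = line.strip()
        print("would execute:", command)  # e.g., translate into motor moves
```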

By introducing perspective transformation, known fixed points can be identified with the front-facing camera, and their apparent shift will be used to generate a transformation matrix. That matrix will then be applied to the raw eye-tracking location to correct for head rotation (yaw/pitch/roll) as well as X/Y/Z Cartesian movement relative to the known points. IR will be used on the front-facing camera to make these points easier to identify.
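
As a rough illustration of the idea, the sketch below uses OpenCV in Python to build a perspective transform from four detected beacon positions and apply it to a raw gaze point. All pixel coordinates are made up for illustration:

```python
import cv2
import numpy as np

# Reference positions of the four IR beacons as seen during calibration.
calibrated = np.float32([[100, 100], [540, 100], [540, 380], [100, 380]])

# The same beacons as detected in the current front-facing camera frame,
# shifted because the user's head has moved.
detected = np.float32([[120, 130], [548, 95], [560, 400], [110, 410]])

# Homography mapping current beacon positions back to the calibrated frame.
H = cv2.getPerspectiveTransform(detected, calibrated)

# Correct a raw gaze estimate for the head movement.
gaze = np.float32([[[300.0, 250.0]]])  # shape (1, 1, 2) as perspectiveTransform expects
corrected = cv2.perspectiveTransform(gaze, H)
print("corrected gaze:", corrected[0, 0])
```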

Decawave Time of Flight Location System: Matt will be integrating a Decawave-based location system for a better understanding of head location in Cartesian space. We're hoping for roughly ±50 mm of resolution.

BNO055 9-Axis IMU: Matt will also be integrating a BNO055 for a better understanding of the yaw/pitch/roll of the user's head. This may feed a rudimentary data-fusion model to correct the fixed-point beacon system we are attempting with the front-facing camera and IR light sources. The team has fairly limited understanding of homography and more advanced machine vision techniques, so this may end up being essential for properly calibrating the user's gaze relative to the machine being interfaced with.
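
For reference, reading orientation from a BNO055 is fairly simple because the chip does its own on-board sensor fusion. The sketch below assumes Adafruit's adafruit_bno055 CircuitPython library and I2C wiring; it is just one possible way to read the part, not necessarily how Matt will integrate it:

```python
import time
import board
import busio
import adafruit_bno055

# I2C connection to the BNO055; wiring and address depend on the hardware.
i2c = busio.I2C(board.SCL, board.SDA)
sensor = adafruit_bno055.BNO055_I2C(i2c)

while True:
    # The BNO055 runs sensor fusion on-chip, so absolute orientation can be
    # read directly. Values may be None until the sensor settles.
    euler = sensor.euler  # (heading, roll, pitch) in degrees
    if euler is not None and None not in euler:
        heading, roll, pitch = euler
        print(f"yaw={heading:.1f} pitch={pitch:.1f} roll={roll:.1f}")
    time.sleep(0.05)
```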

TinyG Based Machine Control: Another big goal is to interface with a TinyG-based CNC system, the end goal being to draw a picture using the user's gaze.
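
The TinyG accepts standard G-code over a USB serial connection, so a first pass could be as simple as the Python sketch below. The port path, baud rate, and the specific moves are placeholders, not values verified against this build:

```python
import time
import serial

# TinyG typically enumerates as a USB serial port.
tinyg = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
time.sleep(2)  # let the board reset after the port opens

def send(line: str) -> None:
    """Send one G-code line and echo TinyG's response."""
    tinyg.write((line + "\n").encode())
    print(tinyg.readline().decode().strip())

send("G21")          # millimeter units
send("G90")          # absolute positioning
send("G0 X10 Y10")   # rapid move, as if following a gaze target
tinyg.close()
```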

More Comfortable Head Gear: My 3D printer is back up and running, so an attempt will be made to print a superior head-mounted unit to carry the cameras and the additional hardware used in this project.

MVP: Allow the user to interface with a computer screen. The minimum we can hope for is to let the user type a few words using only their eyes. This would be a massive feat if accomplished in a way that is easy for the individual to use. It's unclear what this interface will eventually look like, but it should be: 1) accurate, 2) low in eye strain, and 3) easy to learn. One candidate selection scheme is sketched below.
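
One common approach to gaze typing is dwell selection: a key is "pressed" when the gaze rests on it long enough. The sketch below is one possible version of that logic, not a committed design; the thresholds are illustrative and would need tuning against the accuracy and eye-strain goals above:

```python
import time

DWELL_SECONDS = 0.8   # illustrative threshold; the real value needs user testing
RADIUS_PX = 40        # how far the gaze may wander and still count as dwelling

class DwellSelector:
    """Trigger a selection when the gaze stays near one spot long enough."""

    def __init__(self):
        self.anchor = None
        self.start = None

    def update(self, x, y):
        now = time.monotonic()
        dx, dy = (0, 0) if self.anchor is None else (x - self.anchor[0], y - self.anchor[1])
        if self.anchor is None or dx * dx + dy * dy > RADIUS_PX ** 2:
            # Gaze moved to a new spot: restart the dwell timer there.
            self.anchor, self.start = (x, y), now
            return None
        if now - self.start >= DWELL_SECONDS:
            selected, self.anchor, self.start = self.anchor, None, None
            return selected  # caller maps this point to a key on the screen
        return None
```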
