Improvement of Driving Simulator Eye Tracking Software

Principal Investigator:

Brian Davis, Research Fellow, Mechanical Engineering

Co-Investigator:

Project Summary:

This project focuses on improving the eye tracking tools used in the HumanFIRST driving simulator. Eye tracking is an important capability for simulation-based projects because it allows researchers to understand where participants are directing their visual attention while driving the simulator. Currently, the eye tracking system provides a nearly continuous record of the direction in which the driver is looking with respect to real-world coordinates. By itself, however, this does not indicate which objects the driver was looking at. For example, it may be necessary to identify when and for how long a driver was focused on a real-world object (e.g., the gauge cluster, center stack, or a mirror) or an object in the simulated world (e.g., a car, road sign, or potential hazard). Extracting this type of information requires additional processing, and current methods are time intensive: a person must step through the eye tracking data and system video by hand in order to extract useful measures.

This project is examining how the eye tracking data is processed in order to identify and implement software tools that make the process more efficient. One option being examined is the use of the system's existing eye tracking video output, which superimposes a dot representing the driver's gaze location on video from a forward-facing scene camera. It may be possible to process this video with computer vision algorithms to detect which object (e.g., a car, road sign, etc.) is under the dot and use that to calculate gaze locations and fixation times by object (see the first sketch below). A second method being examined applies the eye tracking system's 3D gaze vector to known information about the simulator's state (i.e., the positions and speeds of different elements relative to the driver/vehicle). That information could then be combined to determine which virtual and real-world objects the driver's line of sight intersects and for how long (see the second sketch below).

A successful project outcome will be software tools and associated documentation that reduce or eliminate the human intervention required to process the eye tracking data.
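As a rough illustration of the first option, the Python/OpenCV sketch below locates the superimposed gaze dot in each video frame by color thresholding and tallies how long it falls inside a few predefined scene-camera regions. The region coordinates, dot color range, and function names are illustrative assumptions rather than the project's actual implementation; handling moving objects such as cars or signs would additionally require an object detector or tracker.

    # Hypothetical sketch: find the superimposed gaze dot by color thresholding
    # and accumulate how long it sits inside fixed regions of interest.
    import cv2

    # Static regions (x, y, width, height) in scene-camera pixels -- assumed values.
    REGIONS = {
        "gauge_cluster": (100, 600, 300, 120),
        "center_stack": (450, 550, 200, 200),
        "rear_view_mirror": (800, 50, 250, 100),
    }

    def gaze_seconds_per_region(video_path, fps=30.0):
        """Return seconds of gaze per region, assuming a saturated red overlay dot."""
        counts = {name: 0 for name in REGIONS}
        cap = cv2.VideoCapture(video_path)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Threshold in HSV for the red gaze marker (color range is an assumption).
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, (0, 150, 150), (10, 255, 255))
            m = cv2.moments(mask)
            if m["m00"] == 0:
                continue  # dot not visible this frame (e.g., blink or tracking loss)
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # dot centroid
            for name, (x, y, w, h) in REGIONS.items():
                if x <= cx <= x + w and y <= cy <= y + h:
                    counts[name] += 1
        cap.release()
        return {name: n / fps for name, n in counts.items()}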
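A comparable sketch of the second option intersects each 3D gaze-vector sample with simple bounding spheres built from the simulator's logged object positions and accumulates dwell time per object. The data structures and sample format are assumptions for illustration only; the real eye tracker output and simulator log would dictate the details, and occlusion between objects along the line of sight is not handled here.

    # Hypothetical sketch: ray-sphere tests between gaze samples and logged
    # object positions, accumulating how long the line of sight dwells on each.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class SceneObject:
        name: str
        center: np.ndarray  # position relative to the driver's head, metres
        radius: float       # bounding-sphere radius, metres

    def ray_hits_sphere(origin, direction, center, radius):
        """True if a ray (unit direction) passes within `radius` of `center`."""
        to_center = center - origin
        t = np.dot(to_center, direction)      # distance along ray to closest approach
        if t < 0:
            return False                      # object is behind the driver
        closest = origin + t * direction
        return np.linalg.norm(center - closest) <= radius

    def dwell_times(gaze_samples, objects, sample_dt):
        """gaze_samples: iterable of (eye_position, unit_gaze_vector) per time step."""
        totals = {obj.name: 0.0 for obj in objects}
        for eye_pos, gaze_vec in gaze_samples:
            for obj in objects:
                if ray_hits_sphere(eye_pos, gaze_vec, obj.center, obj.radius):
                    totals[obj.name] += sample_dt
        return totals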

Sponsor:

Project Details: