Scientists Create Improved Eye Tracking Technology For Virtual Reality Systems


Researchers have developed a mathematical model that helps accurately predict the next gaze fixation point and reduces the imprecision caused by blinking.

The study, published in the SID Symposium Digest of Technical Papers, indicates that the model would make VR/AR systems more realistic and responsive to user actions.

“We effectively solved the problem with the foveated rendering technology used in mass-produced virtual reality systems,” said researcher Viktor Belyaev, a professor at RUDN University in Russia.

Foveated rendering is a core technology in VR systems. When a person looks at something, their gaze is focused on the so-called foveal region, and everything else is covered by peripheral vision.

A computer therefore needs to render the image in the foveal region with the highest degree of detail, while the rest of the scene requires less computing power.

This approach improves computational performance and mitigates the problems caused by the gap between limited GPU capabilities and ever-increasing display resolutions.
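To make the idea concrete, here is a minimal sketch of how a renderer might choose a detail level per pixel from its angular distance to the current gaze point. The function name, region radii, and detail levels are illustrative assumptions, not values from the study.

```python
# Minimal sketch of foveated rendering region selection (illustrative only;
# the thresholds below are assumptions, not taken from the paper).
import math

def detail_level(pixel_deg_x, pixel_deg_y, gaze_deg_x, gaze_deg_y,
                 fovea_radius_deg=5.0, mid_radius_deg=15.0):
    """Return a render-detail level for a pixel given the current gaze point.

    All coordinates are angular positions in degrees of visual field.
    0 = full resolution (foveal region), 1 = medium, 2 = coarse periphery.
    """
    eccentricity = math.hypot(pixel_deg_x - gaze_deg_x, pixel_deg_y - gaze_deg_y)
    if eccentricity <= fovea_radius_deg:
        return 0          # render at native resolution
    if eccentricity <= mid_radius_deg:
        return 1          # e.g. half resolution
    return 2              # e.g. quarter resolution

# Example: gaze fixed at the display centre (0, 0)
print(detail_level(2.0, 1.0, 0.0, 0.0))   # -> 0, inside the foveal region
print(detail_level(20.0, 5.0, 0.0, 0.0))  # -> 2, far periphery
```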

However, foveated rendering is limited by the speed and accuracy with which the next gaze fixation point can be predicted, since human eye movement is a complex and largely random process.

To solve this problem, the researchers developed a mathematical modeling method that makes it possible to calculate the next gaze fixation points in advance.

The model’s predictions are based on the study of so-called saccadic movements (rapid, jerky movements of the eye). These movements accompany shifts of the gaze from one object to another and can hint at the next point of fixation.

However, existing saccade models are not precise enough for eye trackers to use them to predict eye movements, the team said.

The researchers therefore focused on a mathematical model that yields the parameters of these jerky motions; this data is then used to calculate the foveal region of the image.
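As an illustration only, and not the authors' model, the sketch below predicts a saccade's landing point from its direction and speed using the well-known "main sequence" relationship between saccade amplitude and peak velocity. The velocity threshold and the main-sequence constants are assumed fits, not values from the paper.

```python
# Illustrative sketch (not the authors' model): predict a saccade's landing
# point from its early velocity via the main-sequence relationship between
# saccade amplitude and peak velocity. All constants are assumptions.
import math

VELOCITY_THRESHOLD = 30.0   # deg/s; above this the motion is treated as a saccade
V_MAX = 500.0               # deg/s; asymptotic peak velocity (assumed fit)
C = 14.0                    # deg; main-sequence curvature constant (assumed fit)

def predict_landing_point(x, y, vx, vy):
    """Estimate the next fixation point (degrees) from the current gaze
    position (x, y) and angular velocity (vx, vy) measured mid-saccade."""
    speed = math.hypot(vx, vy)
    if speed < VELOCITY_THRESHOLD:
        return (x, y)                      # no saccade in progress: gaze stays put
    # Invert the main-sequence fit  v_peak = V_MAX * (1 - exp(-A / C))
    # to estimate the full saccade amplitude A from the observed speed.
    amplitude = -C * math.log(max(1e-6, 1.0 - min(speed, V_MAX - 1.0) / V_MAX))
    ux, uy = vx / speed, vy / speed        # unit vector along the saccade direction
    return (x + amplitude * ux, y + amplitude * uy)

# Example: gaze at (0, 0) deg, sweeping right at 300 deg/s
print(predict_landing_point(0.0, 0.0, 300.0, 0.0))
```

In practice the onset position and velocity would be estimated from several consecutive samples, and the constants would be fitted per user; the sketch only shows the basic idea of extrapolating a fixation point before the saccade ends.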

The new method was tested experimentally using a VR headset and AR glasses. The eye tracker based on the mathematical model was able to detect minor eye movements of 3.4 arc minutes (about 0.06 degrees), with an imprecision of 6.7 arc minutes (about 0.11 degrees).

In addition, the team managed to eliminate the calculation error caused by blinking: a filter included in the model reduced this inaccuracy tenfold.
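For illustration, a blink filter could work roughly as follows: mark frames where the pupil is lost, then interpolate or hold the gaze across the gap so that spurious coordinates do not corrupt the prediction. This is a hedged sketch under those assumptions, not the filter described in the paper.

```python
# Hedged sketch of a simple blink filter (not the paper's actual filter):
# when the tracker loses the pupil during a blink it can output spurious
# coordinates, so we bridge the gap with a linear interpolation between the
# last valid sample before the blink and the first valid sample after it.
def filter_blinks(samples, valid):
    """samples: list of (x, y) gaze points; valid: list of bools marking
    frames where the pupil was reliably detected. Returns a cleaned list."""
    cleaned = list(samples)
    i = 0
    while i < len(cleaned):
        if valid[i]:
            i += 1
            continue
        start = i
        while i < len(cleaned) and not valid[i]:
            i += 1                          # extent of the invalid (blink) gap
        before = cleaned[start - 1] if start > 0 else None
        after = cleaned[i] if i < len(cleaned) else None
        for j in range(start, i):
            if before and after:
                t = (j - start + 1) / (i - start + 1)
                cleaned[j] = (before[0] + t * (after[0] - before[0]),
                              before[1] + t * (after[1] - before[1]))
            else:
                cleaned[j] = before or after  # hold the nearest valid sample
    return cleaned

# Example: frames 2-3 are a blink with garbage coordinates
gaze = [(0, 0), (1, 0), (99, 99), (99, 99), (3, 0)]
ok   = [True,  True,   False,     False,    True]
print(filter_blinks(gaze, ok))
```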

