According to foreign media reports, for people with disabilities who cannot speak or move their arms, hands, or even head, computer-connected eye-tracking systems allow communication through eye movement alone. However, these systems have some drawbacks, and a new experimental eye-tracking headset prototype is said to address them.

Christopher McMurrough, a lecturer in computer science and engineering at the University of Texas at Arlington, said that calibrating conventional eye-tracking systems requires the help of trained experts. In addition, the processing required to track eye movement typically lags, preventing real-time communication.

However, his new head-worn device should not have these problems. The prototype developed by McMurrough is equipped with a forward-facing 3D depth-mapping camera on top and a binocular eye tracker pointed at the wearer's eyes. His software builds a 3D map of the environment in front of the user from the camera's output. By combining that map with the output of the eye tracker, the direction of the user's gaze can be inferred, allowing the program to determine what the user is looking at in the 3D map.
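The core step described above, intersecting an estimated gaze ray with the depth camera's 3D map to find the fixated object, can be sketched roughly as below. This is a minimal illustration, not McMurrough's actual implementation; the function name, the angular threshold, and the assumption that the depth map is available as a point cloud in the same coordinate frame as the eye tracker are all hypothetical.

```python
import numpy as np

def gaze_target(points, eye_origin, gaze_dir, max_angle_deg=2.0):
    """Pick the 3D point the user is most likely looking at.

    points:     (N, 3) point cloud from the depth camera (camera frame)
    eye_origin: (3,)   estimated eye position in the same frame
    gaze_dir:   (3,)   gaze direction from the eye tracker
    Returns the nearest point whose angular offset from the gaze ray
    is within max_angle_deg, or None if no point qualifies.
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    vecs = points - eye_origin                 # eye -> point vectors
    dists = np.linalg.norm(vecs, axis=1)
    # cosine of angle between each point and the gaze ray
    cos_angles = (vecs @ gaze_dir) / np.maximum(dists, 1e-9)
    in_cone = cos_angles >= np.cos(np.radians(max_angle_deg))
    if not np.any(in_cone):
        return None
    # among candidates near the ray, take the closest one,
    # i.e. the first surface the gaze ray would hit
    idx = np.flatnonzero(in_cone)
    return points[idx[np.argmin(dists[idx])]]
```

Picking the nearest in-cone point, rather than an exact ray intersection, is a common way to tolerate the small angular error inherent in consumer eye trackers.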

Eye-tracking headgear will help people with disabilities

Therefore, the system should be able to perform tasks such as activating the controls of an electric wheelchair or directing a robotic arm to grasp an object such as a bottle of water – all the user has to do is look at it.

“My interest in this technology stems from my mother-in-law. As an ALS patient, she has difficulty using eye-tracking equipment,” McMurrough said. “The latest version of our device can be worn like a pair of ski goggles, with a camera on top and an eye tracker embedded in the lens, so patients can use it for extended periods while carrying it with them.”

The patented technology could also be used in gaming, augmented reality, and monitoring medical conditions that affect eye movement.
