According to its creators, the system based on reading the brain's magnetic fields may facilitate the use of computers by disabled persons. The study's results were published in the journal Frontiers in Neuroscience.
Gaze-based computer control is increasingly being used by paralysed people, and an "eye control" function is even included as a standard feature in some operating systems, MSUPE researchers said. This relies on an eye tracker, a device that detects the position of the user's pupil with a video camera and calculates what spot on the screen the user is looking at. "Mouse clicks" are replaced by dwelling one's gaze on certain screen areas.
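The dwell-to-click mechanism described above can be sketched in a few lines. This is a minimal, illustrative Python loop, not the software used in the study; the radius and dwell-time thresholds are assumed values chosen for the example.

```python
import math

DWELL_RADIUS_PX = 40   # assumed tolerance: gaze jitter within this radius is one dwell
DWELL_TIME_S = 0.8     # assumed dwell duration that triggers a "click"

def detect_dwell_clicks(samples):
    """Turn a stream of (timestamp, x, y) gaze samples into dwell 'clicks'.

    A click fires when the gaze stays within DWELL_RADIUS_PX of the
    dwell's first sample for at least DWELL_TIME_S seconds.
    """
    clicks = []
    anchor = None   # (t0, x0, y0) of the current dwell candidate
    fired = False   # prevents repeated clicks during one long dwell
    for t, x, y in samples:
        if anchor is None:
            anchor, fired = (t, x, y), False
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > DWELL_RADIUS_PX:
            anchor, fired = (t, x, y), False  # gaze moved: start a new candidate
        elif not fired and t - t0 >= DWELL_TIME_S:
            clicks.append((x0, y0))           # dwell long enough: emit one click
            fired = True
    return clicks
```

A loop like this cannot tell why the gaze lingered, which is exactly the flaw the researchers describe next.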
According to the scientists, this technology currently has a major flaw: it cannot discriminate between intentional and unintentional gaze dwells, which occur, for example, when a user simply wants to look at something without issuing a command. Involuntary dwelling is impossible to avoid, as a person's gaze easily slips out of conscious control, resulting in false positives, the scientists explained.
Researchers at MSUPE are creating a control system that will be able to detect whether a gaze delay is intentional or involuntary. According to the creators, it will be a hybrid of eye-tracking and another technology used to help people with disabilities, a brain-computer interface (BCI).
The BCI's ability to differentiate between brain signals generated by certain mental acts makes it possible to give computer commands literally "with the power of thought".
"Many have tried to combine BCIs with 'eye control'. BCI offers an easy way to make a 'click', for example, by imagining a movement with your hand. But this combination is still proving to be extremely inconvenient, as the BCI is slow, forcing the user, by imagining the desired action, to gaze on one spot for a long time. In addition, it is rather difficult to combine the mental actions required for BCIs with intentional gaze dwells", Sergei Shishkin, project leader and Leading Research Scientist at MSUPE's MEG Centre, explained.
Specialists from the Moscow State University of Psychology & Education took a different approach in their development: a user of their system only needs to intentionally hold their gaze, and this very action, according to the scientists, already changes the pattern of brain signals in a way that makes it possible to detect the intention to "click".
To accurately distinguish brain signals corresponding to this intention, magnetoencephalography (MEG) was used, which enables a contact-free recording of the brain's weak magnetic fields. Participants in the experiment played a specially modified version of the computer game "Lines", where control is performed by using gaze dwells.
The obtained MEG data was processed by artificial neural networks that determined whether the gaze dwells were intentional or involuntary.
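As an illustration of this classification step, here is a minimal sketch rather than the researchers' actual pipeline: synthetic "epoch" feature vectors (e.g. band power over a few sensors) stand in for real MEG recordings, and a single logistic-regression unit, the simplest neural-network building block, is trained to separate intentional from involuntary dwells. The feature dimensions, class separation, and learning rate are all assumptions made for the example.

```python
import math
import random

random.seed(0)

# Hypothetical epoch features: intentional dwells are assumed to shift
# the mean of each feature; values here are purely synthetic.
def make_epoch(intentional):
    base = 1.0 if intentional else 0.0
    return [base + random.gauss(0, 0.4) for _ in range(4)]

train = [(make_epoch(True), 1) for _ in range(100)] + \
        [(make_epoch(False), 0) for _ in range(100)]

# A single sigmoid unit trained by stochastic gradient descent.
w = [0.0] * 4
b = 0.0
lr = 0.1
for _ in range(200):
    for x, y in train:
        z = sum(wi * xi for wi, xi in zip(w, x)) + b
        p = 1 / (1 + math.exp(-z))           # predicted P(intentional)
        err = p - y                          # gradient of cross-entropy loss
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

def classify(epoch):
    z = sum(wi * xi for wi, xi in zip(w, epoch)) + b
    return "intentional" if z > 0 else "involuntary"
```

Real MEG epochs are high-dimensional time series, which is why the researchers use deep networks and why, as noted below, those networks need far more training data than this toy separable example.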
"We cannot yet accurately determine the 'intention of the gaze dwell' to implement the technology immediately. The reason is that we have not yet been able to record enough data during the experiment to train the neural networks. This is a standard problem in the classification of brain signals by deep artificial neural networks. We are now working on expanding our sample of MEG data", Anastasia Ovchinnikova, Senior Researcher at at MSUPE's MEG Centre, said.
The research was carried out within the framework of cooperation between MSUPE and the National Research Centre "Kurchatov Institute" with the support of the Russian Science Foundation. In the future, the scientists plan to create a high-precision hybrid interface combining eye-tracking, MEG, and electroencephalography data in one system.