AI Can Now Determine a Person's Traits by Eye Movements

50 students from Flinders University took part in the study: 42 women and 8 men
31 July 2018

A group of researchers has taught an artificial intelligence system to infer a person's character traits from their eye movements. Accuracy varies by trait: the neural network recognized extraversion in 48.6% of cases, conscientiousness in 45.9%, and neuroticism in 40.3%. With traits such as openness and curiosity, the AI coped somewhat worse. This was reported by VentureBeat.

The relationship between eye movements and character traits has been known for a long time. Previously, however, the data were collected exclusively in the laboratory; in this study, the scientists set out to make their recordings in the field.

In addition, such data had traditionally been processed manually. This study was the first to use artificial intelligence for the task.

To obtain the necessary information, 50 students from Flinders University were selected: 42 women and 8 men. Their personal characteristics were determined with the help of three profiling tests: the NEO Five-Factor Inventory, the Perceptual Curiosity scale, and the Curiosity and Exploration Inventory. The subjects were then asked to walk around the university campus for 10 minutes and buy an item of their choice. Their eye movements were recorded using SensoMotoric Instruments equipment and mobile phones.

All of the collected data were processed by a neural network. Although prediction accuracy never exceeded 50%, the researchers consider this a good result, and they believe that training on more data would improve it.
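The article does not describe the network's features or architecture, but the overall pipeline can be sketched as follows: each recording is summarized as a vector of gaze statistics, and a classifier is trained to predict a trait label. Everything below (the feature names, the synthetic data, and the toy nearest-centroid classifier standing in for the study's neural network) is an assumption for illustration only:

```python
import random

random.seed(0)

def gaze_features(high_trait):
    """Simulate summary statistics of one 10-minute gaze recording.

    The three features (fixation duration, saccade rate, pupil diameter)
    are plausible gaze statistics, not the ones used in the study.
    """
    base = 1.0 if high_trait else 0.0
    return [
        base + random.gauss(0, 0.3),  # mean fixation duration (arbitrary units)
        base + random.gauss(0, 0.3),  # saccade rate
        base + random.gauss(0, 0.3),  # mean pupil diameter
    ]

# Synthetic "participants": half labeled high extraversion, half low.
data = [(gaze_features(True), 1) for _ in range(25)] + \
       [(gaze_features(False), 0) for _ in range(25)]

def centroid(rows):
    """Column-wise mean of a list of feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

c1 = centroid([x for x, y in data if y == 1])
c0 = centroid([x for x, y in data if y == 0])

def predict(x):
    """Nearest-centroid rule: pick the class whose centroid is closer."""
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    return 1 if d1 < d0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

On this cleanly separated synthetic data the toy classifier does well; the study's roughly 40-50% accuracy on real recordings, against many trait classes, is a much harder setting.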

The work expands what is known about the relationship between eye movements and personality traits. It was found, for example, that pupil diameter is closely related to neuroticism.

Software developers have already tried to make users' eye movements part of the interface. The FacePause extension for Google Chrome, for instance, pauses YouTube videos when the viewer turns away from the computer.

Voice Assistant to Recognize Voiceless Commands

The neural-network-based technology can be used in public places without the risk of disturbing others
22 October 2018

Researchers at Tsinghua University have developed a voice assistant for smartphones that recognizes commands from the movement of the user's lips. The technology can be used in public places without the risk of disturbing others.

Yuanchun Shi and colleagues presented a paper at the UIST 2018 conference describing a technology that recognizes lip movements and translates them into text. The assistant uses the phone's front camera and a convolutional neural network. One algorithm tracks 20 control points that accurately describe the shape of the lips and also measures how open the user's mouth is, which allows the system to detect the beginning and end of a command. A second algorithm then decodes the data into text. For now, all of the computation runs separately on a powerful PC.
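The start/end detection step can be sketched as follows, assuming the lip landmarks are reduced to a single per-frame "mouth openness" number (the threshold, hold time, and signal values below are invented for illustration, not taken from the paper):

```python
# Openness above this value counts as "mouth moving" (assumed threshold).
OPEN_THRESHOLD = 0.2
# This many quiet frames in a row marks the end of a command (assumed).
MIN_CLOSED_FRAMES = 3

def segment_command(openness):
    """Return (start, end) frame indices of the first command, or None.

    `openness` is a per-frame measure such as the distance between
    upper- and lower-lip landmarks, normalized to the face size.
    """
    start = None
    quiet = 0
    for i, value in enumerate(openness):
        if start is None:
            if value > OPEN_THRESHOLD:
                start = i  # mouth opened: command begins
        else:
            if value <= OPEN_THRESHOLD:
                quiet += 1
                if quiet >= MIN_CLOSED_FRAMES:
                    # Mouth has stayed closed long enough: command ended
                    # at the first of the quiet frames.
                    return start, i - MIN_CLOSED_FRAMES + 1
            else:
                quiet = 0  # mouth moved again: reset the quiet counter
    return (start, len(openness)) if start is not None else None

# Simulated openness trace: closed, speaking (frames 2-4), closed again.
signal = [0.0, 0.05, 0.3, 0.5, 0.4, 0.1, 0.05, 0.0, 0.0]
print(segment_command(signal))  # → (2, 5)
```

The hold-time requirement keeps brief pauses within a command from splitting it in two; only a sustained closed mouth ends the segment.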

Recognition covers a limited set of 44 commands that apply both to individual applications and to specific functions, such as turning Wi-Fi on and off. System-wide tasks are also supported, such as replying to a message or selecting text.

The developers claim an average recognition accuracy of 95.5%, based on training with the speech of 21 people. Tests conducted on the Beijing subway showed that users consider this method the more comfortable option.

So far, the developers have not said when the application will be released. If a powerful computer is still needed for recognition, however, that will not happen soon; alternatively, the system will require a permanent network connection.