AI Can Now Determine a Person's Traits by Eye Movements

50 students from Flinders University took part in the study: 42 women and 8 men
31 July 2018

A group of researchers has taught artificial intelligence to infer a person's traits from eye movements. The accuracy varies from trait to trait: the neural network recognized extraversion in 48.6% of cases, conscientiousness in 45.9%, and neuroticism in 40.3%. With traits such as openness and curiosity, the AI coped somewhat worse. This is reported by VentureBeat.

The relationship between eye movements and character traits has been known for a long time. Previously, however, such data were collected exclusively in the laboratory; in this study, the scientists set out to make recordings in the field.

In addition, data collected this way were traditionally processed manually. This was the first time artificial intelligence was used for the task.

To obtain the necessary data, 50 students from Flinders University were selected: 42 women and 8 men. Their personal characteristics were determined with the help of three profiling tests: the NEO Five-Factor Inventory, Perceptual Curiosity, and the Curiosity and Exploration Inventory. The subjects were then asked to walk around the university campus for 10 minutes and purchase an item of their choice. Eye movements were recorded using SensoMotoric Instruments equipment and mobile phones.

All collected data were processed by a neural network. The accuracy of the predictions did not exceed 50%, but the researchers consider this a good result. According to the specialists who conducted the study, training on more data should improve it further.
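The article does not describe the model's architecture or input features. As a rough illustration of the idea of predicting a trait label from gaze statistics, the sketch below uses a nearest-centroid rule over hypothetical per-walk features (fixation rate, mean fixation duration, mean pupil diameter); all names and numbers here are invented for the example, not data from the study.

```python
import math

# Hypothetical per-walk gaze features: (fixations per second,
# mean fixation duration in ms, mean pupil diameter in mm).
# Purely illustrative values, not from the study.
TRAINING_DATA = {
    "high_neuroticism": [(3.1, 210.0, 4.9), (2.9, 230.0, 5.1)],
    "low_neuroticism":  [(2.2, 310.0, 3.6), (2.4, 290.0, 3.4)],
}

def centroid(rows):
    """Mean of each feature column across the rows."""
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(len(rows[0])))

def classify(features):
    """Assign the label whose centroid is closest in Euclidean distance."""
    centroids = {label: centroid(rows) for label, rows in TRAINING_DATA.items()}
    return min(centroids, key=lambda label: math.dist(features, centroids[label]))

print(classify((3.0, 220.0, 5.0)))  # prints "high_neuroticism"
```

A real system would train a neural network on many such feature vectors; the centroid rule simply makes the mapping from gaze statistics to a trait label concrete.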

The work expanded what is known about the relationship between eye movements and personality traits. It was found, for example, that pupil diameter is closely related to neuroticism.
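"Closely related" in such studies is typically quantified with a correlation coefficient. Here is a minimal Pearson correlation sketch; the pupil-diameter and neuroticism-score numbers are invented for the example and are not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative numbers only: mean pupil diameter (mm) per subject
# versus a hypothetical neuroticism questionnaire score.
pupil_mm    = [3.2, 3.5, 4.1, 4.4, 4.9, 5.2]
neuroticism = [20, 24, 31, 35, 41, 44]
print(pearson_r(pupil_mm, neuroticism))  # close to 1: strong positive relation
```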

Software developers have already tried to make the user's eye movements part of the interface. The FacePause extension for Google Chrome pauses YouTube videos when the viewer turns away from the computer.
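FacePause's actual code is not shown in the article; the sketch below captures the general pause-on-look-away idea with a hypothetical frame-by-frame controller that debounces brief detection dropouts so a single missed frame does not pause the video.

```python
from dataclasses import dataclass

@dataclass
class PauseController:
    """Pause playback after the viewer's face has been absent for
    several consecutive frames; resume once a face is seen again.
    A sketch of FacePause-style logic, not its actual implementation."""
    threshold: int = 5       # frames of absence required before pausing
    absent_frames: int = 0
    paused: bool = False

    def on_frame(self, face_visible: bool) -> bool:
        if face_visible:
            self.absent_frames = 0
            self.paused = False
        else:
            self.absent_frames += 1
            if self.absent_frames >= self.threshold:
                self.paused = True
        return self.paused

ctrl = PauseController(threshold=3)
states = [ctrl.on_frame(v) for v in [True, False, False, False, True]]
print(states)  # [False, False, False, True, False]
```

The debounce threshold is the design choice that matters: too low and every blink of the detector pauses the video, too high and the pause lags noticeably behind the viewer.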

MelNet Algorithm Simulates a Person's Voice

It analyzes spectrograms of the audio tracks of ordinary TED Talks, captures the speaker's speech characteristics, and reproduces short utterances
11 June 2019

The Facebook AI Research team has developed MelNet, an algorithm that synthesizes speech with the characteristics of a particular person's voice. For example, it learned to imitate the voice of Bill Gates.

MelNet analyzes spectrograms of the audio tracks of ordinary TED Talks, captures the speaker's speech characteristics, and reproduces short utterances.
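MelNet actually works on mel-scaled spectrograms computed with an FFT and a filter bank; as a minimal stdlib-only illustration of what a magnitude spectrogram is, the sketch below splits a signal into overlapping frames and applies a naive DFT to each one. The frame and hop sizes are arbitrary example values.

```python
import cmath
import math

def magnitude_spectrogram(signal, frame_len=64, hop=32):
    """Frame the signal with overlap and take the magnitude of a naive
    DFT of each frame, keeping only the non-negative frequency bins.
    Real systems use an FFT plus a mel filter bank; this is a sketch."""
    frames = [
        signal[start:start + frame_len]
        for start in range(0, len(signal) - frame_len + 1, hop)
    ]
    spectrogram = []
    for frame in frames:
        spectrum = []
        for k in range(frame_len // 2 + 1):
            coeff = sum(
                x * cmath.exp(-2j * math.pi * k * n / frame_len)
                for n, x in enumerate(frame)
            )
            spectrum.append(abs(coeff))
        spectrogram.append(spectrum)
    return spectrogram

# A pure tone concentrates its energy in a single frequency bin.
tone = [math.sin(2 * math.pi * 8 * n / 64) for n in range(256)]
spec = magnitude_spectrogram(tone)
peak_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])
print(peak_bin)  # 8: the tone completes 8 cycles per 64-sample frame
```

A speech spectrogram is just this time-frequency grid at scale; models like MelNet learn to generate such grids and then convert them back into audio.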

The algorithm's capabilities are limited mainly by the length of those utterances. It reproduces short phrases very close to the original. However, a person's intonation changes when speaking about different topics, in different moods, and at different pitches; the algorithm cannot yet imitate this, so long sentences sound artificial.

MIT Technology Review notes that even such an algorithm could greatly affect services like voice bots, where all communication is reduced to an exchange of short remarks.

A similar approach - analysis of speech spectrograms - was used by scientists at Google AI when working on the Translatotron algorithm, an AI able to translate phrases from one language to another while preserving the peculiarities of the speaker's speech.