
This is how emotion analysis by computer works



Feelings are hard to fake. We humans perceive even the subtlest movements in the facial expressions and gestures of the person opposite us, and we pick out many nuances by ear. Computers cannot do this as well as perceptive people, but they can process huge amounts of data. "This opens up completely new opportunities for emotion analysis," says Olga Perepelkina, head of research at the young company Neurodata Lab.

The company, based in Moscow with a research department in Florida, recently opened a branch in Root, Switzerland. For baz.ch/Newsnet, Neurodata Lab analyzed the video and sound recordings of 805 speeches delivered by the federal councillors to the National Council and the Council of States during the current year, between 72 and 274 appearances per person.

Example image, main emotion: disgust. Image: PD

In almost real time, the software analyzes emotions using self-learning algorithms and neural networks. "We rely on a multichannel approach," says Perepelkina: the voice, facial expressions, gestures and body movements of the people on film are all analyzed. "The accuracy is significantly higher than with the simple analyses that are common today." The system is also meant to learn quickly: "The more training data we have, the better the results." With a new algorithm that evaluates the shading of individual pixels, Neurodata Lab even wants to measure the pulse without any additional measuring device.
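The pulse measurement mentioned above relies on the fact that skin color changes minimally with every heartbeat. The sketch below illustrates the basic idea behind such camera-based pulse estimation (often called remote photoplethysmography); it is an illustrative toy example, not Neurodata Lab's actual algorithm, and the frame rate, heart-rate band and use of the green channel are assumptions.

```python
import numpy as np

def estimate_pulse_bpm(green_means, fps=30.0):
    """Estimate pulse (beats per minute) from the mean green-channel
    brightness of a face region across video frames (toy rPPG sketch)."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()              # remove the constant offset

    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)

    # Keep only frequencies in a plausible heart-rate band (42-240 bpm).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    if not band.any():
        raise ValueError("clip too short for pulse estimation")

    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                        # Hz -> beats per minute

# Synthetic example: a 72 bpm "pulse" (1.2 Hz) buried in camera noise.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
fake_signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + np.random.normal(0, 0.2, t.size)
print(round(estimate_pulse_bpm(fake_signal, fps)))   # prints roughly 72
```

In practice the signal would come from averaging the pixels of a detected face region frame by frame; the noisier the video, the longer the clip has to be, which is why the quality requirements mentioned below matter.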

The system speaks only English

Perepelkina does not hide the fact that such analyses have many sources of error. For pulse analysis, for example, the video material has to be of very good quality. In emotion analysis there are sociocultural pitfalls: so far, the system has only been trained on English. Even subtle, culturally induced differences in communication and individual behaviour can lead to inaccuracies. In addition, facial expressions and gestures are not interpreted in the same way by all people.


Example image, main emotion: sadness. Image: PD

Computer-aided emotion analysis is far more than a gimmick, Olga Perepelkina points out. "There are already many possible applications today, and many more will be added in the future." Robots, for example, could better understand how their counterparts are feeling and learn to communicate more naturally and "humanly". Emotion translators could emerge for people with deficits in this area. A car could in future analyze whether the driver is feeling well and, if necessary, apply the emergency brake. It is conceivable that employers will use the analysis technology in job interviews. The advertising industry is already experimenting with it: the new possibilities make it easier to document people's reactions to a product.


Example image, main emotion: joy. Image: PD (Editors Tamedia)

Created: 05.12.2018, 18:32

