Microsoft wants to take its video-calling services a step further. To that end, Microsoft Research has launched a new project: an AI that can interpret the emotions of attendees in a virtual meeting.
The system, aimed at Microsoft Teams, uses a neural network that analyzes a series of signals to interpret the emotions reflected in users' expressions. In this way, the AI can monitor audience reactions while a presentation unfolds.
So you can tell whether your presentation is boring your audience
Microsoft does not want speakers to present blind in a virtual meeting, unable to tell whether the audience is interested in the topic, seems confused, or is engaged with the presentation. To address this, the AI analyzes the audience and, every 15 seconds, gives the speaker a brief report highlighting the attendee who seems most engaged, based on the most relevant expressions.
The AI, called AffectiveSpotlight, focuses on attendees' faces to analyze their movements and expressions: for example, whether their eyebrows are furrowed, whether they nod or shake their heads, and what emotions their facial expressions convey.
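The core idea described above can be sketched in a few lines: periodically combine each attendee's non-verbal cues into a single responsiveness score, then spotlight whoever scores highest. The minimal sketch below is purely illustrative; the signal names, weights, and the `score_attendee`/`pick_spotlight` helpers are assumptions for this example, not Microsoft's actual model, which uses a neural network rather than fixed weights.

```python
# Illustrative sketch of the spotlight idea: score hypothetical
# non-verbal cues per attendee and highlight the highest scorer.
# The weights and signal names are invented for this example.

SPOTLIGHT_INTERVAL_S = 15  # the article mentions a report every 15 seconds


def score_attendee(signals):
    """Combine hypothetical cue intensities (0..1) into one score."""
    return (0.5 * signals["expression"]        # e.g. smile/confusion intensity
            + 0.3 * signals["head_movement"]   # nods, head shakes
            + 0.2 * signals["brow_furrow"])    # furrowed eyebrows


def pick_spotlight(frame):
    """Return the attendee with the highest responsiveness score."""
    return max(frame, key=lambda name: score_attendee(frame[name]))


frame = {
    "alice": {"expression": 0.9, "head_movement": 0.4, "brow_furrow": 0.1},
    "bob":   {"expression": 0.2, "head_movement": 0.1, "brow_furrow": 0.8},
}
print(pick_spotlight(frame))  # alice (0.59) beats bob (0.29)
```

In a real system the per-attendee scores would come from a trained expression-recognition network running on each video feed, and the loop would re-evaluate every `SPOTLIGHT_INTERVAL_S` seconds.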
While the AI performs this analysis in the background, the speaker can focus on the presentation without having to guess whether the audience's non-verbal cues are positive; instead, thanks to AffectiveSpotlight, those signals work in the speaker's favor.
To measure the AI's effectiveness, a series of tests was carried out; as the study notes, speakers spoke for longer and were more aware of their audience. However, it is still far from a finished solution: many factors remain to be examined regarding how accurately it interprets emotions, not to mention concerns about user privacy.