Can computers recognise human emotions?

Our MeetMe video interviewing platform has been helping employers to get to know candidates from the start of the application process. And now, by analysing sentiment expressed in these videos using Microsoft Media and Cognitive Services, we can provide further insight into job applicants.

Microsoft’s Media Analytics tool recognises facial expressions associated with eight emotions: neutral, happiness, surprise, sadness, anger, disgust, fear and contempt. It can give an indication of each of these throughout a video. These particular emotions are understood to be communicated cross-culturally and universally through particular facial expressions.

At Cognisess, we’re making use of the very latest machine learning and data analysis techniques, so we’re pleased to be incorporating the data that Microsoft Media and Cognitive Services generates from our video interviews.

James Pain, a Creative Computing student at Bath Spa University, has converted this data into colourful visuals that clearly show the sentiment expressed at different points in time.
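To give a flavour of what that conversion involves, here is a minimal sketch of how per-window emotion scores might be reduced to a timeline of dominant emotions ready for plotting. The input shape below is a hypothetical simplification, not the real Media Analytics output schema, and the function names are illustrative only.

```python
# Toy reduction of per-window emotion scores to a plottable timeline.
# The 'windows' structure is an assumed simplification of the kind of
# per-time-window scores an emotion-analysis service might return.

EMOTIONS = ("neutral", "happiness", "surprise", "sadness",
            "anger", "disgust", "fear", "contempt")

def dominant_emotion(scores):
    """Return the highest-scoring emotion label for one time window."""
    return max(scores, key=scores.get)

def build_timeline(windows):
    """Collapse {'start': seconds, 'scores': {...}} windows into
    (start_time, dominant_emotion) pairs for visualisation."""
    return [(w["start"], dominant_emotion(w["scores"])) for w in windows]

# Example: three one-second windows from a hypothetical interview clip.
windows = [
    {"start": 0.0, "scores": {"neutral": 0.8, "happiness": 0.1, "surprise": 0.1}},
    {"start": 1.0, "scores": {"neutral": 0.2, "happiness": 0.7, "surprise": 0.1}},
    {"start": 2.0, "scores": {"neutral": 0.4, "happiness": 0.3, "sadness": 0.3}},
]
timeline = build_timeline(windows)
# timeline is [(0.0, 'neutral'), (1.0, 'happiness'), (2.0, 'neutral')]
```

A real visualisation would map each label to a colour and draw the pairs along a time axis, which is essentially what the visuals below show.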

We asked Cognisess Research Associate Kayleigh to share her MeetMe video with us so we could analyse the sentiments expressed throughout. Here’s the result:

And, just for fun, we thought we’d take a look at some well-known political figures to see which emotions they express while speaking. Here Obama and Trump are speaking in very different tones on very different topics – so we’re definitely not comparing like for like. But it’s interesting to see the subtleties in human communication and note how the software categorises emotions.

Take a look at this analysis of cool, collected Obama during his last press conference as President of the United States:

And here’s a much more emotionally charged speech from Trump expressing anger about journalists at the White House:

While moderate levels of surprise and anger are present throughout Trump’s speech, Obama remains very neutral, punctuated by a few strong peaks of happiness and sadness. Here Obama is controlling his emotions, while Trump is fully expressing his. It could be argued that Obama is a more nuanced emotional communicator who “switches on” emotions to emphasise a point, whereas Trump’s underlying emotions are always engaged.

We’ll be making much more use of this technology to gather data from video interviews. Already we’ve been able to predict the perceived positivity of candidates through analysing video content, and this could be highly valuable in selecting individuals for customer-facing roles.
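As a purely illustrative example of how such a positivity signal could be derived from emotion data, the toy function below averages happiness against the mean of the negative emotions across time windows. This is not Cognisess’s actual model; the scoring rule and data shape are assumptions made for intuition only.

```python
# Hypothetical "perceived positivity" aggregate over per-window emotion
# scores. The formula (happiness minus mean negative-emotion score,
# averaged over windows) is an illustrative assumption, not a real model.

NEGATIVE = ("sadness", "anger", "disgust", "fear", "contempt")

def positivity(windows):
    """Average (happiness - mean negative score) across all windows.
    Each window is a dict of emotion -> score; missing emotions count as 0."""
    total = 0.0
    for scores in windows:
        neg = sum(scores.get(e, 0.0) for e in NEGATIVE) / len(NEGATIVE)
        total += scores.get("happiness", 0.0) - neg
    return total / len(windows)

windows = [
    {"happiness": 0.6, "sadness": 0.1},   # 0.6 - 0.02 = 0.58
    {"happiness": 0.2, "anger": 0.5},     # 0.2 - 0.10 = 0.10
]
score = positivity(windows)
# score is (0.58 + 0.10) / 2 = 0.34
```

A production system would, of course, combine far richer signals, but the principle of rolling per-window emotion scores up into a single candidate-level indicator is the same.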

For more information on Cognisess Deep Learn – the machine learning engine behind everything we do – and to find out more about how Cognisess could help your business, get in touch at