
Bits With Brains
Curated AI News for Decision-Makers
What Every Senior Decision-Maker Needs to Know About AI and its Impact
The Rise of Empathetic AI: How Hume's EVI Just Might Revolutionize Human-Machine Interaction
3/31/24
Editorial team at Bits with Brains
While artificial intelligence (AI) has made remarkable strides in recent years, understanding and responding to human emotions has remained a stubborn challenge.

Hume AI has now taken a notable step with EVI, its Empathic Voice Interface, which the company bills as the world's first voice AI with emotional intelligence.
EVI combines several specialized models to analyze facial expressions, vocal intonation, and language, enabling it to infer emotions with notable accuracy. This multimodal approach to emotional intelligence sets EVI apart from traditional AI systems and opens new possibilities for more natural, empathetic human-machine interaction.
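For readers curious what this looks like in practice, EVI is exposed as a streaming interface. The sketch below is illustrative only: the WebSocket endpoint path, the query-parameter authentication, and the message schema are assumptions modeled on Hume's public EVI documentation (see the Sources), not verified code.

```python
# A minimal sketch of streaming audio to EVI over a WebSocket. The endpoint
# path, auth style, and message schema are ASSUMPTIONS based on Hume's public
# EVI docs; consult dev.hume.ai before relying on any of them.
import asyncio
import base64
import json

import websockets  # pip install websockets

EVI_URL = "wss://api.hume.ai/v0/evi/chat"  # assumed endpoint path
API_KEY = "YOUR_HUME_API_KEY"              # placeholder credential

async def stream_audio(wav_bytes: bytes) -> None:
    # Query-parameter auth is an assumption; Hume also documents other schemes.
    async with websockets.connect(f"{EVI_URL}?api_key={API_KEY}") as ws:
        # Send one chunk of audio, base64-encoded (assumed message shape).
        await ws.send(json.dumps({
            "type": "audio_input",
            "data": base64.b64encode(wav_bytes).decode("ascii"),
        }))
        # Read responses until the server closes; transcripts and emotion
        # scores are expected to arrive in these payloads.
        async for raw in ws:
            message = json.loads(raw)
            print(message.get("type"), flush=True)

if __name__ == "__main__":
    with open("hello.wav", "rb") as f:
        asyncio.run(stream_audio(f.read()))
```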
At the core of EVI's emotional intelligence is its facial expression model, which analyzes 48 dimensions of emotional meaning, including happiness, sadness, anger, fear, surprise, and disgust. Built on scientifically validated models of facial movement and vocal modulation, it generates real-time outputs describing a person's emotional state, providing insight into their current feelings and reactions.
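To make that concrete, a model like this typically emits a score for each emotion dimension on every video frame. The snippet below is a hypothetical illustration of reducing such a 48-dimension output to its strongest signals; the emotion names and scores are invented, not Hume's actual output format.

```python
# Illustrative only: a hypothetical 48-dimension emotion output reduced to
# its strongest signals. Names and scores are made up; real formats differ.
from typing import Dict, List, Tuple

def top_emotions(scores: Dict[str, float], k: int = 3) -> List[Tuple[str, float]]:
    """Return the k highest-scoring emotion dimensions."""
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)[:k]

frame_scores = {  # one frame of hypothetical per-emotion scores
    "happiness": 0.71, "surprise": 0.42, "sadness": 0.08,
    "anger": 0.03, "fear": 0.02, "disgust": 0.01,
    # ...remaining dimensions omitted for brevity
}
print(top_emotions(frame_scores))  # [('happiness', 0.71), ('surprise', 0.42), ...]
```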
However, EVI's emotional intelligence extends well beyond facial expressions to the nuances of speech itself. Prosody, sometimes called the "music" of language, refers to the patterns of rhythm, stress, intonation, and tempo within speech. EVI's speech prosody model analyzes intonation, volume, and rhythm, capturing the emotional meaning conveyed through how words are spoken rather than what is said. Its vocal burst model goes further, analyzing non-linguistic utterances, such as laughter, sighs, and groans, which often convey emotional states that words leave unexpressed.
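Hume has not published EVI's internals, but the low-level acoustic features that prosody analysis builds on are well known. As a rough illustration, the open-source librosa library can extract a pitch contour (intonation), frame energy (volume), and an onset-density proxy for speaking rate (rhythm) from a short speech clip:

```python
# Not Hume's model: a sketch of the low-level acoustic features that prosody
# analysis builds on, using the open-source librosa library.
import librosa
import numpy as np

y, sr = librosa.load("speech.wav", sr=None)  # any short speech clip

# Intonation: fundamental frequency (pitch) contour over voiced frames.
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
)
pitch_mean = float(np.nanmean(f0))
pitch_range = float(np.nanmax(f0) - np.nanmin(f0))

# Volume: root-mean-square energy per frame.
rms = librosa.feature.rms(y=y)[0]

# Rhythm: onset density as a rough proxy for speaking rate.
onsets = librosa.onset.onset_detect(y=y, sr=sr)
rate = len(onsets) / (len(y) / sr)

print(f"pitch mean {pitch_mean:.1f} Hz, range {pitch_range:.1f} Hz, "
      f"loudness {rms.mean():.4f}, ~{rate:.1f} onsets/sec")
```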
Complementing its analysis of facial expressions and vocal patterns, EVI's emotional language model analyzes written or spoken language to detect emotions expressed through word choice and tone. It identifies topics or entities mentioned in speech or text and the emotional tone associated with them, providing a deeper understanding of the emotional context.
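As a stand-in for this kind of emotion-from-text analysis (not Hume's model), an open emotion classifier from the Hugging Face hub can score a sentence across several emotion labels. The checkpoint named below is assumed to be publicly available:

```python
# A stand-in for emotion-from-text analysis, not Hume's model: an open
# emotion classifier from the Hugging Face hub. The checkpoint name assumes
# the model is still publicly available.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return scores for every emotion label, not just the top one
)

# With top_k set, recent transformers versions return a flat list of
# {label, score} dicts for a single input string.
results = classifier("I can't believe you remembered my birthday!")
for item in sorted(results, key=lambda r: r["score"], reverse=True):
    print(f"{item['label']:>10}: {item['score']:.3f}")  # e.g. joy, surprise...
```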
What really sets EVI apart is its ability to combine facial expression, speech prosody, vocal burst, and emotional language analysis into a comprehensive multimodal understanding of emotional states. By integrating multiple modalities, EVI can infer emotions with greater accuracy and depth, enabling more natural and empathetic interactions.
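Hume has not disclosed how EVI fuses its modalities, but one common approach is weighted late fusion: average each modality's emotion scores under per-modality weights. The weights and scores in this sketch are illustrative assumptions:

```python
# One common way to combine per-modality emotion scores: weighted late
# fusion. Hume has not published EVI's actual fusion method; the weights
# and scores below are illustrative assumptions.
from typing import Dict

def fuse(modalities: Dict[str, Dict[str, float]],
         weights: Dict[str, float]) -> Dict[str, float]:
    """Weighted average of emotion scores across modalities."""
    total = sum(weights.values())
    fused: Dict[str, float] = {}
    for name, scores in modalities.items():
        w = weights[name] / total
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    return fused

scores = fuse(
    {
        "face":     {"joy": 0.6, "anger": 0.1},
        "prosody":  {"joy": 0.4, "anger": 0.3},
        "language": {"joy": 0.8, "anger": 0.0},
    },
    weights={"face": 1.0, "prosody": 1.0, "language": 2.0},
)
print(max(scores, key=scores.get), scores)  # 'joy' dominates here
```

Richer systems learn these weights or fuse earlier at the feature level, but late fusion is a robust baseline when modalities arrive at different rates.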
EVI's emotional intelligence has far-reaching potential applications across industries. In mental health settings, it could provide emotional support and insights to individuals seeking counseling, enhancing the quality of care. In customer service, it could analyze interactions to gauge customers' emotional states and tailor responses that improve satisfaction.
Further afield, EVI could help educators understand student emotions and personalize learning experiences, let entertainment platforms adapt content to audience reactions in real time, and support clinicians in healthcare settings by monitoring patient emotions during treatment.
Hume's EVI represents a significant step toward a future where AI systems can truly understand and respond to human emotions.
See a demo here: https://www.youtube.com/watch?v=SEqVfjBP9Mc
Sources:
[1] https://digialps.com/evi-by-hume-ai-now-ai-can-detect-your-emotions-just-with-your-voice/?amp=1
[2] https://www.mdpi.com/1424-8220/19/8/1863
[3] https://venturebeat.com/ai/is-ais-next-big-leap-understanding-emotion-50m-for-hume-says-yes/
[4] https://www.youtube.com/watch?v=SEqVfjBP9Mc
[5] https://www.youtube.com/watch?v=TglkizXUOsc
[6] https://www.tomsguide.com/ai/i-had-a-conversation-with-evi-the-new-empathic-ai-voicebot-from-hume
[7] https://dev.hume.ai/docs/empathic-voice-interface-evi/overview
[8] https://www.biometrie-online.net/images/stories/techno/voix/Voix-Prosodie/Prosodie-Article.pdf
[9] https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2021.759485/full
[10] https://www.hume.ai/blog/hume-ai-publication-in-nature-human-behavior-deep-learning-and-vocal-bursts
[11] https://www.hume.ai/research
[12] https://www.davidcrystal.com/Files/BooksAndArticles/-3887.pdf
[13] https://kids.frontiersin.org/articles/10.3389/frym.2021.698575