
How Emotional AI Helps Humans Beat Depression

We explore real-world use cases of emotional AI helping people find balance in their lives by recognizing and addressing their emotions.

Rationality, logic, coldness: these are the words commonly associated with artificial intelligence. However, AI is increasingly being used to tackle human problems, those involving complex emotional responses previously thought to be exclusively the domain of human intelligence. It’s hard to imagine electronic psychologists, psychiatrists, and other mental health professionals, but thanks to AI developers, they do exist.

Known as emotional AI, such solutions are widely presented both as workplace-enhancing tools and as medical applications. In this article, we’ll explore some of them to find out how effective a computer can be at dealing with purely “human” emotional problems.

From Cold Tech to Emotion

Dreams of emotionally intelligent machines have long been around, dating back to Baron Wolfgang von Kempelen’s chess-playing Mechanical Turk of 1769. But even the wildest dreamers rarely imagined that such a machine could detect human emotions and react to them with any level of success.

Almost two and a half centuries later, in January 2018, Gartner stated:

“By 2022, your personal device will know more about your emotional state than your own family.”
Annette Zimmermann, Research Vice President, Gartner

Despite fears of Big Brother-style monitoring, Zimmermann, like many of her peers, doesn’t necessarily see these advancements as threatening. On the contrary, she argues they can benefit people by improving quality of life and enabling more personalized experiences.

How Does Emotional AI Work?

Emotional AI software has to detect what a human is feeling and then classify those emotions. Detection and classification are two typical machine learning tasks, especially for deep learning (think artificial neural networks). The only remaining question is which kind of data to analyze. Setting aside the potential of monitoring the brain's electrical activity, there are three main data sources for emotional AI: facial expressions, text, and speech.

Facial Expression

By analyzing a collection of key landmarks on the face, deep learning algorithms can determine which emotion a person is feeling.

Facial Expression Analysis

At the moment, the technology can detect seven basic emotions—anger, contempt, disgust, fear, joy, sadness, and surprise—with a reasonable amount of consistency.

However, it hasn’t yet reached the micro-expression detection level of Dr. Cal Lightman in TV’s popular Lie to Me series, where fleeting expressions indicate hidden emotions.
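To make the idea concrete, here is a minimal sketch of how landmark-derived features might be mapped to an emotion label with a nearest-centroid rule. The feature names and centroid values are invented for illustration; real systems learn far richer representations from large labeled datasets.

```python
import numpy as np

# Hypothetical features a landmark detector might produce:
# [mouth-corner lift, brow raise, eye openness] — values are made up.
CENTROIDS = {
    "joy":      np.array([0.8, 0.2, 0.5]),
    "sadness":  np.array([-0.6, -0.3, 0.3]),
    "surprise": np.array([0.1, 0.9, 0.9]),
}

def classify(features: np.ndarray) -> str:
    """Pick the emotion whose centroid lies closest to the feature vector."""
    return min(CENTROIDS, key=lambda e: np.linalg.norm(features - CENTROIDS[e]))

print(classify(np.array([0.75, 0.25, 0.45])))  # closest to the "joy" centroid
```

In practice the features would come from a face-landmark detector and the decision rule from a trained neural network, but the detect-then-classify structure is the same.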


Speech

Speech recognition works by detecting a number of features in a person’s voice, such as tone, intonation, and tempo, and then evaluating them to determine which emotion is being expressed.

This mirrors the human way of reading an emotional state through voice alone. Even on the phone, we can sense when a person is upset (the rhythm of their breathing changes) or angry (a sharpness enters their voice).

Speech recognition software works comparably, detecting these variables and even subtle changes in a voice.
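As a rough illustration of the kind of acoustic features involved, the sketch below computes two crude prosodic measures, loudness (RMS energy) and a pitch proxy (zero-crossing rate), from synthetic waveforms. This is a toy example; production systems extract far richer feature sets from real recordings.

```python
import numpy as np

def voice_features(signal: np.ndarray) -> dict:
    """Crude prosodic features: RMS energy (loudness) and
    zero-crossing rate (a rough proxy for pitch/sharpness)."""
    rms = float(np.sqrt(np.mean(signal ** 2)))
    # Each sign flip between consecutive samples counts as one crossing.
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    return {"rms_energy": rms, "zero_crossing_rate": zcr}

sr = 16000  # samples per second
t = np.linspace(0, 1, sr, endpoint=False)
calm = 0.2 * np.sin(2 * np.pi * 120 * t)    # quiet, low-pitched tone
tense = 0.9 * np.sin(2 * np.pi * 300 * t)   # louder, higher-pitched tone

print(voice_features(calm))
print(voice_features(tense))
```

The "tense" signal scores higher on both measures, which is the kind of contrast an emotion classifier would feed on.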


Text

Text was once thought impersonal, but as people increasingly use messages and other digital methods to communicate, technologies have become adept at detecting emotion from just a few characters.

Of course, we all know the difference between :) and :( or “I’m so happy to see you” and “I’M SO HAPPY TO SEE YOU,” but emojis and capital letters aside, how can we truly know the meaning behind the other person’s words when texting?

The most straightforward way to understand this approach is that we, as humans, use certain words more often when experiencing a particular emotional state. For example, those suffering from depression are less likely to be overtly expressive. Emotional AI uses this knowledge to analyze the hidden emotional meaning of a text conversation.
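A toy lexicon-based scorer shows the basic idea of rating a message by the emotionally loaded words it contains. The word lists here are invented for illustration; real sentiment models use large lexicons or learned embeddings.

```python
# Tiny, made-up sentiment lexicons for demonstration only.
POSITIVE = {"happy", "great", "love", "glad"}
NEGATIVE = {"sad", "tired", "alone", "hate"}

def sentiment_score(message: str) -> float:
    """Return a score in [-1, 1]: +1 all positive, -1 all negative,
    0 when no lexicon words are present."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I'm so happy to see you!"))  # 1.0
print(sentiment_score("I feel tired and alone."))   # -1.0
```

Scores aggregated over a conversation, rather than a single message, are what give this kind of analysis its diagnostic value.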


Emotional AI in Focus—Use Cases

In his book Homo Deus (“human god”), historian and philosopher Yuval Noah Harari discusses the advancements in artificial neural networks and their potential to transform human nature itself.

Harari likens human evolution to that of AI, describing artificial neural networks as nothing more than inorganic forms of our organic thinking processes. He highlights the weakness of such systems, as they function based on particular data sets, which are as inherently flawed as the humans that created them.

Take Microsoft’s Tay, for example. Tay was a bot designed to learn from those it interacted with online and to generate social media messages in a language its audience would understand.

Released in 2016, Tay unfortunately did its job a little too well. Within 24 hours, the AI-based bot did indeed learn from its audience, more specifically from the internet trolls who took the friendly bot from “humans are super cool” to “I just hate everybody” at breakneck speed.

Tay, a Microsoft's Twitter Bot

Fortunately, lessons were learned. Since then, the technology has progressed, making it less prone to such stark errors.

The latest advancements have seen emotional AI used not only for entertainment and marketing purposes, but also in mental health. Let’s take a look at some of the most recent developments on the market today.

Two of the most prominent mental health technology products are TalkSpace and Woebot. Each uses emotional AI to help users improve their emotional state.


TalkSpace

This application links users with licensed therapists to provide support essentially 24/7. Originally, the TalkSpace software was designed to function exactly as its name suggests: as a space to talk online. However, the system’s analysts soon discovered that over 80% of users were only making use of its text feature.

Able to text a therapist, the app’s clients become more comfortable discussing things they would normally never speak about in a face-to-face conversation. The platform also constantly uses feedback data to improve the quality of care. Yet, despite claims of “affordable therapy,” its $49-a-week price tag could prove too much for those in need.


Woebot

Based on the concept of Cognitive Behavioral Therapy (CBT), Woebot seeks to treat depression by combining the clinical experience of mental health professionals with the experience of the user, backed by a dose of artificial intelligence.

With a focus on accessibility, the free to download app connects its users with a bot that employs a series of CBT techniques designed to improve emotional health. The machine learning algorithms behind the bot help it generate more appropriate responses.

In the Workplace

Emotional health isn’t just a personal problem. Emotional AI can aid companies and employees in resolving a number of work-based issues that can impact mental health and cause depression.


Spot

Many people feel uncomfortable reporting workplace discrimination; like most victims, they feel somewhat at fault. In such emotionally charged situations, memory recall suffers, which can cause the sufferer to falsely recall events or become too flustered to recount them clearly.

Spot records and timestamps a report of abuse using an AI-based question-asking bot. It gives the reporter a clear record of what happened and, most importantly, time to decide whether to report it, without the risk of forgetting any details.


Ixy

Ever feel you’re coming across the wrong way? Or that you just can’t land that job? Ixy is an AI-based app that analyzes text and email conversations to tell its users exactly how they are presenting themselves to another person.

The technology provides helpful real-time feedback to help users adjust their communication and avoid another rejection letter, which can lower self-esteem.

Beyond Verbal

Through the use of speech detection technology, Beyond Verbal analyzes the tone of voice and intonation—the so-called vocal biomarkers—to assess emotional state.

While many of the technology’s uses center on making sales more productive, it can also be employed with both general and mental health patients to determine their emotional state over the phone.

This could prove invaluable for first responders seeking to assess the seriousness of a situation, as well as for those caring for close relatives.

Rise of the Terminators or a New Level of Development?

As machine learning patents are on the rise, many are discussing the risks and potential benefits of the technology for the human race. Sci-fi geeks may worry about the rise of the terminators, but this is far from the minds of scientists driving such developments.

The technology’s pros and cons must be weighed together to define limits and structures for its use. It is, after all, only as good as the data set it learned from.

The key takeaway here is the definition of what AI is and what it is not. Artificial intelligence is just that: artificial. It lacks the consciousness required to make genuinely independent, emotional decisions like a human.

However, as the tech progresses, we are likely to see rapid advancements that work towards human growth and development for a more progressive future. Whether in streamlining work processes, in medical technology, or in an area as yet unthought of, only time will tell.
