
Should You Feel Good About Emotion AI?

As artificial intelligence learns to interpret and respond to human emotions, IT leaders should begin contemplating how emotion AI could play a critical role in their enterprises and markets.

Despite its name, emotion AI doesn’t refer to an angsty computer. It’s actually the tag applied to a form of AI that, through training and machine learning, is able to analyze, understand, and perhaps even replicate human emotions, says Michael Armstrong, CTO at conversational intelligence healthcare developer Authenticx.

Emotion AI is a branch of artificial intelligence that aims to recognize, interpret, and respond to human emotions, adds José Moya, outreach manager at custom software developer Capicua Full Stack Creative Hub.

Potential Applications

Emotion AI has multiple possible applications, particularly in customer service. “It can, for example, recognize the difference between an angry or an excited customer call, and route calls to the agent best equipped to address a caller’s concerns,” Armstrong says. The technology can also analyze comparable speech patterns to offer real-time insights, including recommendations on how to handle a particular customer call. “It’s already used to detect insurance fraud, with AI using voice analysis to help detect whether someone calling to file a claim is lying.”
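
To make routing by emotion concrete, here is a minimal sketch in Python. It assumes the Hugging Face transformers library and an off-the-shelf emotion classifier (the model name and the queue mapping are illustrative, not something Armstrong describes), and it classifies transcript text rather than live audio.

```python
# Minimal sketch: classify a call transcript's emotion and pick an agent queue.
# Assumes the Hugging Face `transformers` library; the model name is illustrative.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

# Hypothetical mapping from detected emotion to an agent queue.
ROUTES = {
    "anger": "escalation_team",
    "fear": "retention_team",
    "joy": "standard_queue",
    "neutral": "standard_queue",
}

def route_call(transcript: str) -> str:
    """Return the queue best suited to the caller's apparent emotional state."""
    result = classifier(transcript)[0]  # e.g. {"label": "anger", "score": 0.93}
    return ROUTES.get(result["label"], "standard_queue")

print(route_call("I've been on hold for an hour and nobody can fix my claim!"))
```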

Emotion AI can also use speech analysis to evaluate a call’s success or failure. For example, was a customer happy and satisfied by the call’s end, or did their voice indicate frustration or anger? Collected results can be used to identify ongoing pain points agents are frequently unable to resolve, Armstrong explains.
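
As a rough illustration of that kind of aggregation, the sketch below tallies calls that end on a negative note, grouped by topic, to surface recurring pain points. The call records and emotion labels are hypothetical stand-ins for the output of a speech-analysis pipeline.

```python
# Sketch: surface recurring pain points from per-call emotion analysis.
# Each record is a hypothetical output of a speech-analysis pipeline:
# the call's topic plus the emotion detected in its closing segment.
from collections import Counter

calls = [
    {"topic": "billing", "closing_emotion": "anger"},
    {"topic": "billing", "closing_emotion": "neutral"},
    {"topic": "password_reset", "closing_emotion": "joy"},
    {"topic": "billing", "closing_emotion": "frustration"},
    {"topic": "claims", "closing_emotion": "anger"},
]

NEGATIVE = {"anger", "frustration", "sadness"}

# Count how often each topic leaves callers unhappy at hang-up.
pain_points = Counter(
    c["topic"] for c in calls if c["closing_emotion"] in NEGATIVE
)

for topic, count in pain_points.most_common():
    print(f"{topic}: {count} negative-ending calls")
```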

Wei “Coco” Xu, a Georgia Tech computer science professor and a researcher with the National Science Foundation’s AI-CARING Institute, anticipates applications extending beyond business. “Imagine having a robot pet dog that reacts like a real dog based on people’s emotions,” she says. “A robot would be easier to care for than a real dog while providing companionship for its elderly owner.” Emotion AI also has the potential to improve psychotherapy, making it more easily and widely available. “Caution is clearly needed, however, for this and other high-stakes applications,” Xu advises.

As emotion AI evolves, Armstrong expects that it will be increasingly applied to marketing, helping businesses evaluate customer reactions to new products and services. On the training side, customer service reps are already using sophisticated AI simulators, he notes. “These AI-powered simulators change and adapt responses based on the reps’ input, assessing their levels of empathy and ability to help customers.”

Dangers Ahead

IT teams will need to continually educate themselves on emotion AI’s technical and ethical aspects to limit the possibility of misunderstandings and misuse, Xu says.

Emotion AI is a fluid technology. Programmers must continually review and train data models to ensure ongoing accuracy and reliability, Armstrong advises. “Part of that training must include domain-specific industry data,” he notes. “Otherwise, organizations risk getting inaccurate interpretations and skewed results, negatively impacting any insights used to make business decisions.”
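
One lightweight way to act on that advice is a scheduled accuracy check against freshly labeled, domain-specific calls, with retraining triggered when performance drifts. The sketch below assumes scikit-learn; the model interface, `load_labeled_domain_sample`, and `trigger_retraining` are hypothetical helpers standing in for an organization’s own data and MLOps plumbing.

```python
# Sketch: periodic drift check against fresh, domain-specific labels.
# The model interface and both helper functions are hypothetical.
from sklearn.metrics import accuracy_score

ACCURACY_FLOOR = 0.85  # illustrative threshold, tuned per use case

def drift_check(model, load_labeled_domain_sample, trigger_retraining):
    texts, true_labels = load_labeled_domain_sample()  # recent, human-labeled calls
    predicted = [model.predict_label(t) for t in texts]
    acc = accuracy_score(true_labels, predicted)
    if acc < ACCURACY_FLOOR:
        # Accuracy has drifted below tolerance; retrain on current domain data.
        trigger_retraining(texts, true_labels)
    return acc
```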

Oversight and Regulation

While emotion AI holds significant promise, and its applications in mental health, customer service, and marketing are multiple and intriguing, the technology also raises numerous ethical concerns. “Privacy and consent, the accuracy of interpretation, and the potential misuse of emotional data are all valid considerations,” Moya says.

AI with the ability to understand human emotions could be used in ways that compromise user privacy by monitoring and analyzing emotional reactions to content or information, observes digital privacy expert Ray Walsh, with privacy education firm ProPrivacy. “This has the potential to infringe on consumer privacy rights.”

Walsh adds that businesses already possess large amounts of consumer data, and incorporating emotion tracking into their profiling strategies could amplify their capacity to manipulate users and exploit personal desires.

Regulation should focus on who can use the technology, when it can be used, and perhaps most importantly, on whom it can be used, Xu says. “In some cases, it should be regulated by legal measures; in others, it could be regulated by mainstream societal expectations, such as those that might cause a company to adjust its practices following a bad PR experience.”

“Emotion AI has the potential to become the critical differentiator in mining the customer’s voice in marketing, healthcare, and other industries to gain much deeper insights,” Armstrong says. “Yet IT organizations and enterprises should strive for full transparency while committing to understand emotion AI’s tools, abilities, and limitations.”

Final Thought

AI algorithms are complex, with billions of parameters, Armstrong observes. “Attempting to understand all of those parameters is unlikely,” he admits, adding that the technology’s output and impact should be closely monitored. “Right now, the most beneficial way to accomplish this task is with a human in the loop.”
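
A common way to keep that human in the loop is confidence gating: emotion predictions the model is unsure about are queued for human review instead of being acted on automatically. The sketch below is illustrative only; the threshold and the review queue are hypothetical stand-ins for a real review workflow.

```python
# Sketch: confidence gating for human-in-the-loop oversight.
# Predictions the model is unsure about go to a person, not an automation.
from queue import Queue

CONFIDENCE_GATE = 0.80  # illustrative; set from observed error rates

review_queue: Queue = Queue()  # hypothetical stand-in for a real review workflow

def handle_prediction(call_id: str, label: str, score: float) -> str:
    """Act automatically only when the model is confident; else defer to a human."""
    if score >= CONFIDENCE_GATE:
        return f"auto:{label}"  # safe to route/report automatically
    review_queue.put((call_id, label, score))  # a human makes the final call
    return "pending_human_review"

print(handle_prediction("call-123", "anger", 0.62))  # -> pending_human_review
```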
