Emotional AI will not replace empathy

In 2023, emotional AI, a technology that can perceive and interact with human emotions, will become one of the dominant applications of machine learning. For example, Hume AI, founded by former Google researcher Alan Cowen, is developing tools to measure emotions from verbal, facial, and vocal expressions. The Swedish company Smart Eyes recently acquired Affectiva, an MIT Media Lab spin-off that developed SoundNet, a neural network that classifies emotions such as anger from audio samples in less than 1.2 seconds. Even the video platform Zoom is introducing Zoom IQ, a feature that will soon give users the ability to analyze emotions and engagement in real time during a virtual meeting.

In 2023, tech companies will release advanced chatbots that can closely mimic human emotions to create more empathetic connections with users in banking, education, and healthcare. Microsoft’s Xiaoice chatbot is already a success in China, with users interacting with “her” more than 60 times a month on average. It has also passed the Turing test: users failed to recognize it as a bot for 10 minutes. An analysis by the consultancy Juniper Research projects that chatbot interactions in healthcare will grow by almost 167 percent from 2018 to reach 2.8 billion interactions per year in 2023. This will free up medical staff time and potentially save about $3.7 billion for healthcare systems around the world.

In 2023, emotional AI will also become commonplace in schools. In Hong Kong, some high schools are already using an artificial intelligence program developed by Find Solution AI that measures micro-muscle movements on students’ faces and detects a range of negative and positive emotions. Teachers use the system to track students’ emotional changes as well as their motivation and attention, allowing them to intervene promptly if a student loses interest.

The problem is that much of emotional AI is built on faulty science. Emotional AI algorithms, even when trained on large and diverse datasets, reduce facial expressions and vocal intonations to emotions without considering the social and cultural context of the person and the situation. An algorithm can, for instance, recognize and report that a person is crying, but it cannot always accurately determine the cause or meaning of those tears. Similarly, a frown does not necessarily indicate anger, yet that is the conclusion an algorithm is most likely to reach. Why? We all adapt our emotional expressions to our social and cultural norms, so our expressions are not always a true reflection of our inner states. Many people do “emotion work” to disguise their real feelings, and how they express emotion is often a learned response rather than a spontaneous one. For example, women often modify their emotional expressions more than men do, especially expressions that carry negative connotations, such as anger, because they are expected to.
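To make the critique concrete, here is a minimal, hypothetical sketch of the kind of context-blind mapping described above: facial measurements go in, a single emotion label comes out, and nothing in the pipeline represents the social or cultural situation. The labels, features, and weights are illustrative only and are not taken from any of the products mentioned in this article.

```python
import numpy as np

# Hypothetical emotion labels for a context-blind classifier.
EMOTIONS = ["anger", "sadness", "joy", "fear", "surprise", "neutral"]

# Hand-written illustrative weights: rows are emotions, columns are facial
# features [brow_lowering, lip_tightening, lip_corner_pull, eye_widening].
WEIGHTS = np.array([
    [ 1.0,  1.0, -0.5,  0.2],   # anger
    [ 0.6,  0.2, -0.8, -0.2],   # sadness
    [-0.5, -0.3,  1.0,  0.3],   # joy
    [ 0.3,  0.2, -0.2,  1.0],   # fear
    [-0.2, -0.1,  0.1,  1.0],   # surprise
    [ 0.0,  0.0,  0.0,  0.0],   # neutral
])

def classify_expression(facial_features: np.ndarray) -> str:
    """Map facial-feature measurements directly to a single emotion label.

    Note what is missing: no input describes who the person is, where they
    are, what just happened, or what their expression means in their culture.
    The function can only return whichever label scores highest for the face.
    """
    scores = WEIGHTS @ facial_features
    return EMOTIONS[int(np.argmax(scores))]

# A lowered brow and tightened lips score highest on "anger", whether the
# person is angry, concentrating, or squinting into the sun.
frowning_face = np.array([0.9, 0.8, 0.1, 0.0])
print(classify_expression(frowning_face))  # -> "anger"
```

The point of the sketch is structural: however sophisticated the model behind the scores, the output is still a forced choice among predefined labels made without any representation of context.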

Thus, artificial intelligence technologies that make assumptions about emotional states are likely to exacerbate gender and racial inequalities in our society. For example, a 2019 UNESCO report showed the harmful impact of gendered AI technologies, with “female” voice assistants designed around stereotypes of emotional passivity and servility.

Facial recognition AI can also perpetuate racial inequality. An analysis of 400 NBA games using two popular emotion-recognition programs, Face++ and Microsoft’s Face API, found that black players were, on average, assigned more negative emotions, even when they were smiling. These results echo other studies showing that black men must project more positive emotions in the workplace because they are stereotypically perceived as aggressive and threatening.
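The disparity the NBA analysis reports can be checked with a simple audit: group a model’s per-image emotion scores by a demographic attribute and compare the averages for comparable expressions. The sketch below is a hypothetical illustration of that comparison, with an invented file name and column names; it is not the study’s actual code or data.

```python
import pandas as pd

# Hypothetical audit of an emotion-recognition model's outputs. Assumes a CSV
# (emotion_scores.csv) with one row per scored face image, for example:
#   player_group,anger_score,contempt_score,smiling
#   A,0.12,0.08,1
#   B,0.31,0.19,1
scores = pd.read_csv("emotion_scores.csv")

# Restrict the comparison to images the model itself labeled as smiling, then
# compare average negative-emotion scores across groups.
smiling = scores[scores["smiling"] == 1]
audit = smiling.groupby("player_group")[["anger_score", "contempt_score"]].mean()
print(audit)

# A systematic gap between groups on otherwise similar (smiling) images is the
# pattern the NBA analysis describes: the same expression is scored as more
# negative for one group than for another.
```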

Emotional AI technologies will become more prevalent in 2023, but unless they are challenged and scrutinized, they will reinforce systemic racial and gender biases, reproduce and deepen global inequalities, and further disadvantage those who are already marginalized.
