Are science fiction movies becoming reality? What does the development of robots capable of understanding human emotions mean?
Until recently, robots existed only in the realm of science fiction. Today, however, they have become an integral part of our daily lives.
Robotic systems first entered our lives through mechanization, used predominantly on factory production lines and in industries such as automotive manufacturing. With the emergence of artificial intelligence, however, these systems began to acquire human-like qualities, chief among them the ability to reason and establish cause-and-effect relationships.
In recent years, research has taken this technology to an even more advanced level. Technological advances have accelerated progress in robotics, particularly in humanoid robots, which are engineered to look and behave like human beings. These robots serve various purposes in human-robot interaction (HRI), ranging from education to entertainment.
The development of emotional intelligence for robots is particularly significant: it allows them to sense, reason about, and react to human emotions.
Robots can now detect and analyze human emotions in real time through integrated sensors such as microphones and cameras. This ability makes intuitive communication possible and expands their use in social environments.
Yahyaoui et al. (2025) introduce Tiago++, a mobile humanoid robot that incorporates a facial emotion recognition (FER) system whose interface displays the emotions of multiple people simultaneously. The study surveys the current state of emotion recognition technologies in human-robot interaction while acknowledging the significant contributions of past research to the field.
In the related research:
- Zhao et al. (2020) built a robot that responds to identified emotions using particular datasets, but did not describe how the robot itself was constructed.
- Dwijayanti et al. (2022) designed a robot that could both recognize emotions and calculate its distance from a person, but the robot could not move around.
- Spezialetti et al. (2020) surveyed research on robot emotion recognition, but their review did not cover state-of-the-art deep learning methods.
Researchers have additionally reviewed advances in facial emotion recognition, analyzing newer techniques and tools, from more accurate recognition methods to simplified models that run with less computational power. Noteworthy as these developments are, many of the resulting models remain large and memory-hungry, which complicates their use in smaller robots.
Researchers have also delved into how these systems function in practice, shedding light on their operational mechanisms.
How Does the Emotion Interface Operate?
An effective emotion detection system requires three main components.
- Face detection: Haar cascades, a machine learning method for object detection, scan pixel-intensity patterns in the image against a trained model. Non-face regions are discarded by an organized "cascade of classifiers," which improves both accuracy and efficiency.
- Emotion recognition: After a face is detected, deep learning algorithms trained specifically on the FER2013 dataset analyze its expression. Using features of the facial image, these algorithms classify emotions into seven categories: neutrality, anger, disgust, fear, happiness, sadness, and surprise.
- User-friendly graphical interface: To make the system convenient to use, a graphical interface was built for user interaction. It captures live video from a camera, detects faces with the Haar cascade classifier, and then analyzes the frames to identify emotions. The interface displays the identified emotions to give the user instant feedback, and it supports real-time detection and classification of emotions on multiple faces at once (a minimal sketch of this pipeline follows the list).
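To make this concrete, here is a minimal sketch of such a pipeline built on OpenCV's bundled Haar cascade. The classifier stub, window name, and preprocessing choices are illustrative assumptions, not the authors' implementation:

```python
# Minimal face-detection-plus-emotion-labelling loop (illustrative sketch,
# not the authors' code). Requires: pip install opencv-python numpy
import cv2
import numpy as np

EMOTIONS = ["neutrality", "anger", "disgust", "fear",
            "happiness", "sadness", "surprise"]

def classify_emotion(face_48x48):
    # Placeholder for a CNN trained on FER2013 (assumed interface):
    # returns one probability per emotion class. Swap in a real model here.
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))

# Haar cascade shipped with OpenCV; its staged classifiers discard
# non-face regions early, which keeps detection fast.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)                      # live camera feed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns one (x, y, w, h) box per face,
    # so several faces in the frame are handled in the same pass.
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # FER2013-size crop
        probs = classify_emotion(face / 255.0)
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    cv2.imshow("emotion interface", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):      # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```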
The Humanoid Robot: Tiago++
The emotion detection system was integrated into the Tiago++ humanoid robot, enabling it to recognize and respond to emotions in real time. The robot's camera was connected to the software through the Robot Operating System (ROS) middleware. As the camera streams live footage, a node named MediaPipeRos locates the faces within it, and the robot can turn its head left, right, up, or down to keep those faces in view.
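A head-tracking node along these lines might look as follows. The topic names, the face-centre message, and the proportional gain are assumptions for illustration (the joint names follow TIAGo's usual pan/tilt convention), not details taken from the paper:

```python
#!/usr/bin/env python
# Sketch of a ROS node that pans/tilts the head toward a detected face.
# Topic names and gain are illustrative assumptions, not the paper's values.
import rospy
from geometry_msgs.msg import Point                 # assumed face-centre message
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

GAIN = 0.002                 # radians per pixel of offset (tuning assumption)
IMG_W, IMG_H = 640, 480      # assumed camera resolution

def on_face(msg, pub):
    # Map the face's pixel offset from the image centre to joint targets.
    # (A real tracker would add these corrections to the current joint state.)
    pan = -GAIN * (msg.x - IMG_W / 2.0)
    tilt = -GAIN * (msg.y - IMG_H / 2.0)
    traj = JointTrajectory()
    traj.joint_names = ["head_1_joint", "head_2_joint"]   # pan, tilt
    traj.points.append(JointTrajectoryPoint(
        positions=[pan, tilt], time_from_start=rospy.Duration(0.5)))
    pub.publish(traj)

if __name__ == "__main__":
    rospy.init_node("face_tracker")
    pub = rospy.Publisher("/head_controller/command", JointTrajectory, queue_size=1)
    rospy.Subscriber("/face_detector/centre", Point, on_face, callback_args=pub)
    rospy.spin()
```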
After detecting faces, the robot determines each individual's emotion, such as happiness or sadness, by analyzing the expressions it observes. A simple interface presents those emotions on a screen; progress bars, for instance, display how confident the robot is in the recognized emotion. The robot can also move its body gently in response to the emotions it perceives, which gives the interaction a more natural and lifelike feel. By identifying emotions and reacting accordingly, Tiago++ becomes more engaging and easier to interact with.
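One way such confidence bars can be rendered is sketched below with OpenCV drawing primitives; the layout, colours, and example probabilities are purely illustrative, since the paper does not specify the interface at this level of detail:

```python
# Illustrative rendering of per-emotion confidence bars (assumed layout,
# not the paper's actual interface).
import cv2
import numpy as np

def draw_confidence_bars(canvas, labels, probs, origin=(10, 30)):
    """Draw one horizontal bar per emotion, scaled by the model's confidence."""
    x0, y0 = origin
    for i, (name, p) in enumerate(zip(labels, probs)):
        y = y0 + i * 24
        cv2.putText(canvas, f"{name:>10s}", (x0, y + 12),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.45, (255, 255, 255), 1)
        # Filled rectangle whose length is proportional to the confidence.
        cv2.rectangle(canvas, (x0 + 105, y),
                      (x0 + 105 + int(200 * p), y + 14), (0, 200, 0), -1)

canvas = np.zeros((220, 360, 3), dtype=np.uint8)
labels = ["neutrality", "anger", "disgust", "fear",
          "happiness", "sadness", "surprise"]
draw_confidence_bars(canvas, labels, [0.05, 0.02, 0.01, 0.04, 0.80, 0.03, 0.05])
cv2.imshow("confidence", canvas)
cv2.waitKey(0)
```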
To evaluate its performance, Tiago++ was tested extensively. The researchers chose the EfficientNetV2-B0 model for its efficiency on limited hardware. Tests included recognizing the emotions of a single person and of two individuals simultaneously; in both scenarios, Tiago++ performed effectively, demonstrating its capability in human-robot communication.
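As a sketch of what that model choice implies, a seven-class classifier head can be attached to the EfficientNetV2-B0 backbone in Keras roughly as follows; the input size, dropout, and training settings are assumptions, not the paper's configuration:

```python
# Sketch: a 7-class emotion classifier on the EfficientNetV2-B0 backbone.
# Hyperparameters here are illustrative assumptions, not the paper's setup.
import tensorflow as tf

# FER2013 images are 48x48 grayscale; they would be resized and replicated
# to three channels before being fed to this backbone (an assumption).
base = tf.keras.applications.EfficientNetV2B0(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg")

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),                    # assumed regularization
    tf.keras.layers.Dense(7, activation="softmax"),  # seven emotion classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```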
Takeaways
To sum up, Tiago++ represents a significant step in the integration of emotional intelligence into humanoid robots. Its applications hold strong potential in many fields, ranging from healthcare to education. By combining advanced technology with intuitive interaction, Tiago++ paves the way for a future where robots enhance human life through meaningful connections.