Researchers at the Ulsan National Institute of Science and Technology (UNIST) have developed a wearable system for real-time human emotion recognition, a technology with the potential to transform a range of industries. The team addressed the challenge of interpreting abstract emotional data by creating a multi-modal system that combines verbal and non-verbal expression data.
The system is built around a personalized skin-integrated facial interface (PSiFI) powered by friction charging, which incorporates a bidirectional triboelectric strain and vibration sensor for simultaneous sensing and integration of both data types. A fully integrated data-processing circuit enables wireless, real-time emotion recognition, even when the wearer is masked.
By incorporating information from facial muscle deformation and vocal cord vibrations, the system offers personalized services based on users’ emotions. This technology has shown impressive accuracy in identifying human emotions, as demonstrated in a digital concierge application within virtual reality (VR) environments.
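The multi-modal approach described above can be illustrated as late fusion of the two sensing channels: features derived from facial-muscle strain and from vocal-cord vibration are concatenated and classified together. The sketch below is purely illustrative; the feature values, emotion set, and nearest-centroid rule are assumptions for demonstration, not the team's published classifier.

```python
# Illustrative sketch of multi-modal late fusion for emotion classification.
# All values and names here are hypothetical, not from the PSiFI system.

# Toy "centroids" per emotion over fused features:
# [strain_f1, strain_f2, vibration_f1, vibration_f2]
CENTROIDS = {
    "neutral": [0.1, 0.1, 0.2, 0.2],
    "happy":   [0.8, 0.6, 0.5, 0.4],
    "angry":   [0.7, 0.9, 0.9, 0.8],
}

def fuse(strain, vibration):
    """Concatenate facial-strain and vocal-vibration feature vectors."""
    return list(strain) + list(vibration)

def classify(features):
    """Nearest-centroid classification over the fused feature vector."""
    def sq_dist(label):
        return sum((f - c) ** 2 for f, c in zip(features, CENTROIDS[label]))
    return min(CENTROIDS, key=sq_dist)

sample = fuse([0.75, 0.85], [0.95, 0.7])
print(classify(sample))  # prints "angry"
```

In a real wearable pipeline, the fusion and classification would run on the integrated circuit (or a paired device) so that both channels are interpreted jointly rather than in isolation.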
Collaboration with Nanyang Technological University in Singapore, supported by the National Research Foundation of Korea (NRF) and the Korea Institute of Materials Science (KIMS), underscores the significance of this advance in human-machine interface (HMI) devices, paving the way for richer interaction between humans and machines.