Researchers Develop AI Model That Mimics Human Emotion Formation with 75% Agreement to Self-Reports

Researchers at NAIST and Osaka University unveil an AI model that mimics human emotion formation, matching participants' self-reported feelings with about 75% agreement.

Researchers at the Nara Institute of Science and Technology (NAIST) in Japan, in collaboration with Osaka University, have developed a groundbreaking computational model aimed at understanding how humans form emotions. Led by Assistant Professor Chie Hieida and supported by Assistant Professor Kazuki Miyazawa and then-master’s student Kazuki Tsurumaki, the study was published in the journal IEEE Transactions on Affective Computing on December 3, 2025, following its online release on July 3, 2025. This research could pave the way for more emotionally aware artificial intelligence systems.

The study is based on the theory of constructed emotion, which posits that emotions are not instinctual responses but constructs the brain assembles in real time. On this view, emotions arise from the integration of internal bodily signals (interoception), such as heart rate, with external sensory information (exteroception), such as visual and auditory stimuli. As Dr. Hieida noted, existing frameworks describe how emotions are conceptually formed, but the underlying computational processes have not been thoroughly examined.

To investigate this phenomenon, the research team employed multilayered multimodal latent Dirichlet allocation (mMLDA), a probabilistic generative model designed to identify hidden statistical patterns in data. This model analyzes the co-occurrence of diverse data types without being pre-programmed with emotional labels, allowing it to discover emotional categories independently.
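To make the approach concrete, here is a minimal Python sketch of the general idea, not the authors' actual pipeline: each modality is discretized into symbolic "words", pooled per image-viewing trial, and a standard single-layer LDA then discovers latent categories without any emotion labels. The multilayered structure of mMLDA is omitted, and all sizes, features, and variable names are illustrative assumptions.

```python
# Single-layer stand-in for the mMLDA idea: pool discretized multimodal
# observations per trial and let LDA find latent categories unsupervised.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_trials = 29 * 60          # participants x images, per the article

def discretize(values, n_bins=10):
    """Turn a continuous feature vector into histogram 'word' counts."""
    bins = np.clip((values * n_bins).astype(int), 0, n_bins - 1)
    counts = np.zeros(n_bins, dtype=int)
    np.add.at(counts, bins, 1)
    return counts

# Placeholder stand-ins for the three recorded modalities per trial:
# image features (exteroception), verbal report words, and heart-rate
# features (interoception). Real features would come from the sensors.
docs = []
for _ in range(n_trials):
    visual = discretize(rng.random(20))
    verbal = rng.multinomial(15, np.ones(10) / 10)
    intero = discretize(rng.random(20))
    docs.append(np.concatenate([visual, verbal, intero]))
X = np.vstack(docs)  # trials x concatenated modality vocabularies

# Unsupervised discovery of latent "emotion concept" categories.
lda = LatentDirichletAllocation(n_components=8, random_state=0)
theta = lda.fit_transform(X)     # per-trial category mixtures
inferred = theta.argmax(axis=1)  # hard category assignment per trial
print(inferred[:10])
```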

The model was trained on unlabeled data collected from 29 participants who viewed 60 images from the International Affective Picture System (IAPS), a widely used stimulus set in psychological research. While participants viewed the images, the researchers recorded their physiological responses, such as heart rate, with wearable sensors and collected verbal descriptions of their emotional states. This approach captured the interplay of visual interpretation, bodily reactions, and self-reported emotional experience.
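As an illustration, one trial's multimodal record in such a study might be organized as below; the field names and types are assumptions for the sketch, not the study's actual data schema.

```python
# Hedged sketch of one trial's record before training; all fields are
# hypothetical stand-ins for the modalities described in the article.
from dataclasses import dataclass

@dataclass
class TrialRecord:
    participant_id: int            # one of the 29 participants
    image_id: int                  # one of the 60 IAPS images
    heart_rate_bpm: list[float]    # wearable-sensor samples (interoception)
    verbal_report: str             # self-described emotional state
    visual_features: list[float]   # features from the image (exteroception)

# A single hypothetical observation:
trial = TrialRecord(
    participant_id=1,
    image_id=42,
    heart_rate_bpm=[71.2, 73.5, 76.0],
    verbal_report="tense, a little anxious",
    visual_features=[0.12, 0.88, 0.33],
)
```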

When the model’s inferred emotion concepts were compared with the participants’ self-reported emotional evaluations, a substantial agreement rate of about 75% was observed. This rate significantly exceeds what could be anticipated by chance, indicating that the model’s categorizations closely align with human emotional experiences.
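A back-of-envelope comparison shows why roughly 75% agreement is notable: if the model assigned trials at random across K roughly balanced emotion categories, agreement with self-reports would hover near 1/K. The value of K below is an assumption; the paper's actual category count may differ.

```python
# Chance baseline vs. the reported agreement rate.
K = 8                   # assumed number of emotion categories
chance = 1 / K          # expected agreement under random assignment
observed = 0.75         # agreement rate reported in the article
print(f"chance ~ {chance:.1%}, observed = {observed:.0%}, "
      f"lift = {observed / chance:.1f}x")
```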

The implications of this research extend beyond academic interest: the findings suggest a pathway toward more responsive and human-like AI systems. “Integrating visual, linguistic, and physiological information into interactive robots and emotion-aware AI systems could enable more human-like emotion understanding and context-sensitive responses,” Dr. Hieida emphasized. Such advances could make interactions between humans and machines more intuitive and relatable.

Moreover, the model's ability to infer emotional states that individuals find difficult to articulate could prove valuable in mental health support, healthcare monitoring, and assistive technologies for people with developmental disorders or dementia. By connecting the theoretical understanding of emotion with empirical validation in a computational framework, the research addresses a critical question at the intersection of psychology and technology.

In conclusion, the work by Dr. Hieida and her team contributes to the academic understanding of emotion formation and sets the stage for practical applications that could transform how we interact with AI and robotics, pointing toward machines that respond to human needs with greater sensitivity. As the field of AI evolves, emotionally aware systems may become a critical aspect of technological development.

Source: Nara Institute of Science and Technology
Journal reference: IEEE Transactions on Affective Computing, DOI: 10.1109/TAFFC.2025.3585882
