TY - JOUR
T1 - Eliciting Emotions
T2 - Investigating the Use of Generative AI and Facial Muscle Activation in Children’s Emotional Recognition
AU - Solis-Arrazola, Manuel A.
AU - Sanchez-Yanez, Raul E.
AU - Gonzalez-Acosta, Ana M.S.
AU - Garcia-Capulin, Carlos H.
AU - Rostro-Gonzalez, Horacio
N1 - Publisher Copyright:
© 2025 by the authors.
PY - 2025/1
Y1 - 2025/1
AB - This study explores children’s emotions through a novel approach combining Generative Artificial Intelligence (GenAI) and Facial Muscle Activation (FMA). It examines GenAI’s effectiveness in creating facial images that elicit genuine emotional responses in children, alongside FMA analysis of the muscle activations underlying these expressions. The aim is to determine whether AI can realistically generate and recognize emotions in a way comparable to human experience. The study involves generating a database of 280 images (40 per emotion) of children expressing various emotions. For real children’s faces from public databases (DEFSS and NIMH-CHEFS), five emotions were considered: happiness, anger, fear, sadness, and neutral. In contrast, for AI-generated images, seven emotions were analyzed, comprising the previous five plus surprise and disgust. A feature vector is extracted from these images, encoding the distances between facial reference points, which contract or expand depending on the expressed emotion. This vector is then fed into an artificial neural network for emotion recognition and classification, achieving accuracies of up to 99% in some cases. This approach offers new avenues for training and validating AI algorithms, enabling models to be trained with artificial and real-world data interchangeably. The integration of both datasets during the training and validation phases enhances model performance and adaptability.
KW - artificial neural networks
KW - facial emotion recognition
KW - facial muscle activation
KW - generative artificial intelligence
UR - http://www.scopus.com/inward/record.url?scp=85216212699&partnerID=8YFLogxK
UR - https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=pure_univeritat_ramon_llull&SrcAuth=WosAPI&KeyUT=WOS:001404628500001&DestLinkType=FullRecord&DestApp=WOS_CPL
UR - http://hdl.handle.net/20.500.14342/4905
DO - 10.3390/bdcc9010015
M3 - Article
AN - SCOPUS:85216212699
SN - 2504-2289
VL - 9
JO - Big Data and Cognitive Computing
JF - Big Data and Cognitive Computing
IS - 1
M1 - 15
ER -