Publication

Unraveling emotions with pre-trained models

datacite.subject.fos: Engineering and Technology::Electrical Engineering, Electronic Engineering, Information Engineering
datacite.subject.sdg: 09: Industry, Innovation and Infrastructure
dc.contributor.author: Pajón-Sanmartín, Alejandro
dc.contributor.author: Arriba-Pérez, Francisco de
dc.contributor.author: García-Méndez, Silvia
dc.contributor.author: Leal, Fátima
dc.contributor.author: Malheiro, Benedita
dc.contributor.author: Burguillo-Rial, Juan Carlos
dc.contributor.author: BENEDITA CAMPOS NEVES MALHEIRO, MARIA
dc.date.accessioned: 2025-11-10T09:46:47Z
dc.date.available: 2025-11-10T09:46:47Z
dc.date.issued: 2025-11-20
dc.description.abstract: Transformer models have significantly advanced the field of emotion recognition. However, open challenges remain when exploring open-ended queries with Large Language Models (LLMs). Although current models offer good results, automatic emotion analysis in open texts presents significant challenges, such as contextual ambiguity, linguistic variability, and difficulty interpreting complex emotional expressions. These limitations make the direct application of generalist models difficult. Accordingly, this work compares the effectiveness of fine-tuning and prompt engineering for emotion detection in three distinct scenarios: (i) performance of fine-tuned pre-trained models and general-purpose LLMs using simple prompts; (ii) effectiveness of different emotion prompt designs with LLMs; and (iii) impact of emotion grouping techniques on these models. Experimental tests attain metrics above 70% with a fine-tuned pre-trained model for emotion recognition. Moreover, the findings highlight that LLMs require structured prompt engineering and emotion grouping to enhance their performance. These advancements improve sentiment analysis, human-computer interaction, and understanding of user behavior across various domains.
dc.identifier.citation: A. Pajón-Sanmartín, F. de Arriba-Pérez, S. García-Méndez, F. Leal, B. Malheiro and J. C. Burguillo-Rial, "Unraveling Emotions With Pre-Trained Models," in IEEE Access, vol. 13, pp. 182458-182473, 2025, doi: 10.1109/ACCESS.2025.3623877.
dc.identifier.doi: 10.1109/access.2025.3623877
dc.identifier.issn: 2169-3536
dc.identifier.uri: http://hdl.handle.net/10400.22/30773
dc.language.iso: eng
dc.peerreviewed: yes
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation: INESC TEC - Institute for Systems and Computer Engineering, Technology and Science
dc.relation.hasversion: https://doi.org/10.1109/ACCESS.2025.3623877
dc.relation.ispartof: IEEE Access
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Emotion recognition
dc.subject: large language models
dc.subject: natural language processing
dc.subject: open-ended responses
dc.subject: prompt engineering
dc.subject: transformer models
dc.title: Unraveling emotions with pre-trained models
dc.type: journal article
dspace.entity.type: Publication
oaire.awardTitle: INESC TEC - Institute for Systems and Computer Engineering, Technology and Science
oaire.awardURI: info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDP%2F50014%2F2020/PT
oaire.citation.endPage: 182473
oaire.citation.startPage: 182458
oaire.citation.title: IEEE Access
oaire.citation.volume: 13
oaire.fundingStream: 6817 - DCRRNI ID
oaire.version: http://purl.org/coar/version/c_970fb48d4fbd8a85
person.familyName: BENEDITA CAMPOS NEVES MALHEIRO
person.givenName: MARIA
person.identifier.ciencia-id: 7A15-08FC-4430
person.identifier.orcid: 0000-0001-9083-4292
project.funder.identifier: http://doi.org/10.13039/501100001871
project.funder.name: Fundação para a Ciência e a Tecnologia
relation.isAuthorOfPublication: babd4fda-654a-4b59-952d-6113eebbb308
relation.isAuthorOfPublication.latestForDiscovery: babd4fda-654a-4b59-952d-6113eebbb308
relation.isProjectOfPublication: 5efdbedb-4666-4d5b-94f0-0a938b0d5ce4
relation.isProjectOfPublication.latestForDiscovery: 5efdbedb-4666-4d5b-94f0-0a938b0d5ce4

Files

Original bundle
Name: Unraveling_Emotions_With_Pre-Trained_Models.pdf
Size: 2.59 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 4.03 KB
Format: Item-specific license agreed upon to submission