Please use this identifier to cite or link to this item: http://hdl.handle.net/10773/38006
Full metadata record
DC Field | Value | Language
dc.contributor.author | Pereira, Lara | pt_PT
dc.contributor.author | Brás, Susana | pt_PT
dc.contributor.author | Sebastião, Raquel | pt_PT
dc.date.accessioned | 2023-06-13T13:57:21Z | -
dc.date.available | 2023-06-13T13:57:21Z | -
dc.date.issued | 2022 | -
dc.identifier.isbn | 978-3-031-04880-7 | pt_PT
dc.identifier.uri | http://hdl.handle.net/10773/38006 | -
dc.description.abstract | Emotions are a highly interesting subject for the development of areas such as health and education. As a result, methods that allow their understanding, characterization, and classification have received increasing attention in recent years. The main objective of this work is to investigate the feasibility of characterizing emotions from facial electromyogram (EMG) signals. To that end, we rely on EMG signals from the frontal and zygomatic muscles, collected from 37 participants while emotional conditions, namely fear, joy, or neutral, were induced by visual content. Using only the entropy of the EMG signals from the frontal and zygomatic muscles, we can distinguish, respectively, the neutral and joy conditions for 70% and 84% of the participants, the fear and joy conditions for 81% and 92% of the participants, and the neutral and fear conditions for 65% and 70% of the participants. These results show that opposite emotional conditions are easier to distinguish through the information in EMG signals. Moreover, we can also conclude that the information from the zygomatic muscle allowed characterizing more participants with respect to the 3 induced emotional conditions. The characterization of emotions through EMG signals opens the possibility of an emotion classification system relying only on EMG information. This has the advantages of detecting micro-expressions, allowing constant signal collection, and requiring no face images. This work is a first step towards the automatic classification of emotions based solely on facial EMG. | pt_PT
dc.description.sponsorship | This work is also funded by national funds, the European Regional Development Fund, and FSE through COMPETE2020, through FCT, in the scope of the framework contract foreseen in numbers 4, 5, and 6 of article 23 of Decree-Law 57/2016, of August 29, as amended by Law 57/2017, of July 19 | pt_PT
dc.language.iso | eng | pt_PT
dc.publisher | Springer | pt_PT
dc.relation | info:eu-repo/grantAgreement/FCT/CEEC IND 2018/CEECIND%2F03986%2F2018%2FCP1559%2FCT0028/PT | pt_PT
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00127%2F2020/PT | pt_PT
dc.rights | restrictedAccess | pt_PT
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | pt_PT
dc.subject | EMG | pt_PT
dc.subject | Emotion characterization | pt_PT
dc.subject | Entropy | pt_PT
dc.title | Characterization of emotions through facial electromyogram signals | pt_PT
dc.type | bookPart | pt_PT
dc.description.version | published | pt_PT
dc.peerreviewed | yes | pt_PT
degois.publication.firstPage | 230 | pt_PT
degois.publication.lastPage | 241 | pt_PT
degois.publication.title | IbPRIA 2022: Iberian Conference on Pattern Recognition and Image Analysis: Lecture Notes in Computer Science | pt_PT
degois.publication.volume | 13256 | -
dc.identifier.doi | 10.1007/978-3-031-04881-4_19 | pt_PT
dc.identifier.esbn | 978-3-031-04881-4 | pt_PT
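The abstract states that the three emotional conditions were distinguished "using only the entropy of the EMG signals", but the record does not name the entropy estimator used. A minimal, hypothetical sketch of one common choice, Shannon entropy over a binned amplitude histogram, is shown below; the bin count and the toy surrogate signal are assumptions for illustration, not the authors' setup.

```python
# Hypothetical sketch (not the authors' code): Shannon entropy of an EMG
# trace's binned amplitude distribution, in bits.
import math
import random
from collections import Counter

def signal_entropy(samples, bins=32):
    """Shannon entropy (bits) of the amplitude histogram of `samples`."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = Counter(min(int((s - lo) / width), bins - 1) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy surrogate for a facial EMG trace: baseline noise plus one activation burst
random.seed(0)
emg = [random.gauss(0.0, 1.0) for _ in range(5000)]
emg[1000:1200] = [random.gauss(0.0, 5.0) for _ in range(200)]

print(f"entropy: {signal_entropy(emg):.2f} bits")  # bounded above by log2(32) = 5
```

A burstier, less predictable amplitude distribution yields higher entropy, which is the kind of per-muscle feature the abstract compares across the neutral, joy, and fear conditions.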
Appears in Collections: DFis - Book chapter
DETI - Book chapter
IEETA - Book chapter

Files in This Item:
File | Description | Size | Format
2022_ibPRIA_pss_ac.pdf | | 2.37 MB | Adobe PDF (restrictedAccess)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.