Please use this identifier to cite or link to this item:
http://hdl.handle.net/10773/38006
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Pereira, Lara | pt_PT |
dc.contributor.author | Brás, Susana | pt_PT |
dc.contributor.author | Sebastião, Raquel | pt_PT |
dc.date.accessioned | 2023-06-13T13:57:21Z | - |
dc.date.available | 2023-06-13T13:57:21Z | - |
dc.date.issued | 2022 | - |
dc.identifier.isbn | 978-3-031-04880-7 | pt_PT |
dc.identifier.uri | http://hdl.handle.net/10773/38006 | - |
dc.description.abstract | Emotions are a subject of great interest for the development of areas such as health and education. As a result, methods that allow their understanding, characterization, and classification have received considerable attention in recent years. The main objective of this work is to investigate the feasibility of characterizing emotions from facial electromyogram (EMG) signals. To this end, we rely on EMG signals from the frontal and zygomatic muscles, collected from 37 participants while emotional conditions (fear, joy, or neutral) were induced by visual content. Using only the entropy of the EMG signals from the frontal and zygomatic muscles, we can distinguish, respectively, the neutral and joy conditions for 70% and 84% of the participants, the fear and joy conditions for 81% and 92% of the participants, and the neutral and fear conditions for 65% and 70% of the participants. These results show that opposite emotional conditions are easier to distinguish through the information in EMG signals. Moreover, we can also conclude that the information from the zygomatic muscle allowed more participants to be characterized with respect to the 3 induced emotional conditions. The characterization of emotions through EMG signals opens the possibility of an emotion classification system relying only on EMG information. This has the advantages of micro-expression detection, constant signal collection, and no need to acquire face images. This work is a first step towards the automatic classification of emotions based solely on facial EMG. | pt_PT |
dc.description.sponsorship | This work is also funded by national funds, the European Regional Development Fund, and the FSE, through COMPETE2020 and FCT, in the scope of the framework contract foreseen in numbers 4, 5, and 6 of article 23 of Decree-Law 57/2016, of August 29, changed by Law 57/2017, of July 19 | pt_PT |
dc.language.iso | eng | pt_PT |
dc.publisher | Springer | pt_PT |
dc.relation | info:eu-repo/grantAgreement/FCT/CEEC IND 2018/CEECIND%2F03986%2F2018%2FCP1559%2FCT0028/PT | pt_PT |
dc.relation | info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UIDB%2F00127%2F2020/PT | pt_PT |
dc.rights | restrictedAccess | pt_PT |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | pt_PT |
dc.subject | EMG | pt_PT |
dc.subject | Emotion characterization | pt_PT |
dc.subject | Entropy | pt_PT |
dc.title | Characterization of emotions through facial electromyogram signals | pt_PT |
dc.type | bookPart | pt_PT |
dc.description.version | published | pt_PT |
dc.peerreviewed | yes | pt_PT |
degois.publication.firstPage | 230 | pt_PT |
degois.publication.lastPage | 241 | pt_PT |
degois.publication.title | IbPRIA 2022: Iberian Conference on Pattern Recognition and Image Analysis: Lecture Notes in Computer Science | pt_PT |
degois.publication.volume | 13256 | - |
dc.identifier.doi | 10.1007/978-3-031-04881-4_19 | pt_PT |
dc.identifier.eisbn | 978-3-031-04881-4 | pt_PT |
Appears in Collections: | DFis - Capítulo de livro; DETI - Capítulo de livro; IEETA - Capítulo de livro |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
2022_ibPRIA_pss_ac.pdf | | 2.37 MB | Adobe PDF | |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
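The abstract describes distinguishing emotional conditions using only the entropy of facial EMG signals, but does not specify which entropy measure was used. As an illustrative sketch only, sample entropy (Richman & Moorman) is one measure commonly applied to physiological signals such as EMG; the function below and its parameter defaults (`m=2`, tolerance `r = 0.2 * std`) are conventional assumptions, not details taken from the chapter.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy of a 1-D signal.

    Counts pairs of length-m templates that match within tolerance r
    (Chebyshev distance), compares with length-(m+1) matches, and
    returns the negative log of their ratio. Lower values indicate a
    more regular (predictable) signal.
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)  # common heuristic tolerance
    n = len(x)

    def count_matches(length):
        # All overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d < r)
        return count

    b = count_matches(m)      # matches at length m
    a = count_matches(m + 1)  # matches at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```

In an entropy-based characterization like the one described, such a value would be computed per participant and per muscle (frontal, zygomatic) over the EMG recorded under each induced condition; a regular signal yields a lower value than an irregular one.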