Matches in SemOpenAlex for { <https://semopenalex.org/work/W4382135186> ?p ?o ?g. }
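The listing below was produced by the quad pattern above. A minimal Python sketch of how such a query might be issued against the public SemOpenAlex SPARQL endpoint follows; the endpoint URL and the rewriting of the quad pattern as a GRAPH clause are assumptions, not part of the listing itself.

```python
# Minimal sketch: fetch all (?p, ?o, ?g) bindings for the work from SemOpenAlex.
# The endpoint URL and response handling are assumptions for illustration only.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W4382135186> ?p ?o . }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object/graph binding, one per line.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```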
- W4382135186 endingPage "2795" @default.
- W4382135186 startingPage "2795" @default.
- W4382135186 abstract "Emotions expressed by humans can be identified from facial expressions, speech signals, or physiological signals. Among them, the use of physiological signals for emotion classification is a notable emerging area of research. In emotion recognition, a person’s electrocardiogram (ECG) and galvanic skin response (GSR) signals cannot be manipulated, unlike facial and voice signals. Moreover, wearables such as smartwatches and wristbands enable the detection of emotions in people’s naturalistic environment. During the COVID-19 pandemic, it was necessary to detect people’s emotions in order to ensure that appropriate actions were taken according to the prevailing situation and achieve societal balance. Experimentally, the duration of the emotion stimulus period and the social and non-social contexts of participants influence the emotion classification process. Hence, classification of emotions when participants are exposed to the elicitation process for a longer duration and taking into consideration the social context needs to be explored. This work explores the classification of emotions using five pretrained convolutional neural network (CNN) models: MobileNet, NASNetMobile, DenseNet 201, InceptionResnetV2, and EfficientNetB7. The continuous wavelet transform (CWT) coefficients were detected from ECG and GSR recordings from the AMIGOS database with suitable filtering. Scalograms of the sum of frequency coefficients versus time were obtained and converted into images. Emotions were classified using the pre-trained CNN models. The valence and arousal emotion classification accuracy obtained using ECG and GSR data were, respectively, 91.27% and 91.45% using the InceptionResnetV2 CNN classifier and 99.19% and 98.39% using the MobileNet CNN classifier. Other studies have not explored the use of scalograms to represent ECG and GSR CWT features for emotion classification using deep learning models. Additionally, this study provides a novel classification of emotions built on individual and group settings using ECG data. When the participants watched long-duration emotion elicitation videos individually and in groups, the accuracy was around 99.8%. MobileNet had the highest accuracy and shortest execution time. These subject-independent classification methods enable emotion classification independent of varying human behavior." @default.
- W4382135186 created "2023-06-27" @default.
- W4382135186 creator A5067448898 @default.
- W4382135186 creator A5072018008 @default.
- W4382135186 date "2023-06-24" @default.
- W4382135186 modified "2023-10-14" @default.
- W4382135186 title "Emotion Classification Based on CWT of ECG and GSR Signals Using Various CNN Models" @default.
- W4382135186 cites W2002055708 @default.
- W4382135186 cites W2122098299 @default.
- W4382135186 cites W2149628368 @default.
- W4382135186 cites W2547146855 @default.
- W4382135186 cites W2731964405 @default.
- W4382135186 cites W2778978785 @default.
- W4382135186 cites W2782273434 @default.
- W4382135186 cites W2915279277 @default.
- W4382135186 cites W2917094047 @default.
- W4382135186 cites W2919854899 @default.
- W4382135186 cites W2932628637 @default.
- W4382135186 cites W2946526173 @default.
- W4382135186 cites W2964350391 @default.
- W4382135186 cites W2968915073 @default.
- W4382135186 cites W2990834556 @default.
- W4382135186 cites W3005055041 @default.
- W4382135186 cites W3016893795 @default.
- W4382135186 cites W3026748937 @default.
- W4382135186 cites W3035120387 @default.
- W4382135186 cites W3048648033 @default.
- W4382135186 cites W3094942519 @default.
- W4382135186 cites W3123326053 @default.
- W4382135186 cites W3125546619 @default.
- W4382135186 cites W3171630147 @default.
- W4382135186 cites W3211139983 @default.
- W4382135186 cites W3215098849 @default.
- W4382135186 cites W4211211720 @default.
- W4382135186 cites W4211233477 @default.
- W4382135186 cites W4221116071 @default.
- W4382135186 cites W4224003776 @default.
- W4382135186 cites W4280564661 @default.
- W4382135186 cites W4307370703 @default.
- W4382135186 cites W4308409665 @default.
- W4382135186 cites W4311630662 @default.
- W4382135186 cites W4312226636 @default.
- W4382135186 cites W4313259443 @default.
- W4382135186 cites W4319596491 @default.
- W4382135186 cites W4320497620 @default.
- W4382135186 cites W4321599321 @default.
- W4382135186 doi "https://doi.org/10.3390/electronics12132795" @default.
- W4382135186 hasPublicationYear "2023" @default.
- W4382135186 type Work @default.
- W4382135186 citedByCount "1" @default.
- W4382135186 countsByYear W43821351862023 @default.
- W4382135186 crossrefType "journal-article" @default.
- W4382135186 hasAuthorship W4382135186A5067448898 @default.
- W4382135186 hasAuthorship W4382135186A5072018008 @default.
- W4382135186 hasBestOaLocation W43821351861 @default.
- W4382135186 hasConcept C119857082 @default.
- W4382135186 hasConcept C121332964 @default.
- W4382135186 hasConcept C149635348 @default.
- W4382135186 hasConcept C150594956 @default.
- W4382135186 hasConcept C153180895 @default.
- W4382135186 hasConcept C154945302 @default.
- W4382135186 hasConcept C15744967 @default.
- W4382135186 hasConcept C168900304 @default.
- W4382135186 hasConcept C169760540 @default.
- W4382135186 hasConcept C195704467 @default.
- W4382135186 hasConcept C206310091 @default.
- W4382135186 hasConcept C2777438025 @default.
- W4382135186 hasConcept C28490314 @default.
- W4382135186 hasConcept C36951298 @default.
- W4382135186 hasConcept C41008148 @default.
- W4382135186 hasConcept C62520636 @default.
- W4382135186 hasConcept C81363708 @default.
- W4382135186 hasConcept C95623464 @default.
- W4382135186 hasConceptScore W4382135186C119857082 @default.
- W4382135186 hasConceptScore W4382135186C121332964 @default.
- W4382135186 hasConceptScore W4382135186C149635348 @default.
- W4382135186 hasConceptScore W4382135186C150594956 @default.
- W4382135186 hasConceptScore W4382135186C153180895 @default.
- W4382135186 hasConceptScore W4382135186C154945302 @default.
- W4382135186 hasConceptScore W4382135186C15744967 @default.
- W4382135186 hasConceptScore W4382135186C168900304 @default.
- W4382135186 hasConceptScore W4382135186C169760540 @default.
- W4382135186 hasConceptScore W4382135186C195704467 @default.
- W4382135186 hasConceptScore W4382135186C206310091 @default.
- W4382135186 hasConceptScore W4382135186C2777438025 @default.
- W4382135186 hasConceptScore W4382135186C28490314 @default.
- W4382135186 hasConceptScore W4382135186C36951298 @default.
- W4382135186 hasConceptScore W4382135186C41008148 @default.
- W4382135186 hasConceptScore W4382135186C62520636 @default.
- W4382135186 hasConceptScore W4382135186C81363708 @default.
- W4382135186 hasConceptScore W4382135186C95623464 @default.
- W4382135186 hasIssue "13" @default.
- W4382135186 hasLocation W43821351861 @default.
- W4382135186 hasOpenAccess W4382135186 @default.
- W4382135186 hasPrimaryLocation W43821351861 @default.
- W4382135186 hasRelatedWork W2129455854 @default.
- W4382135186 hasRelatedWork W2151942619 @default.
- W4382135186 hasRelatedWork W2154129660 @default.
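The abstract quoted above outlines a pipeline of filtering ECG/GSR recordings, computing continuous wavelet transform (CWT) coefficients, rendering scalogram images, and classifying them with pretrained CNNs such as MobileNet. The sketch below illustrates that general idea only; the wavelet choice ("morl"), scale range, sampling rate, image size, and classifier head are assumptions and not taken from the paper.

```python
# Illustrative sketch only: turn a 1-D physiological signal into a CWT scalogram
# image and classify it with a pretrained MobileNet backbone. Wavelet, scales,
# sampling rate, and head layers are assumed values, not the paper's settings.
import numpy as np
import pywt
import matplotlib.pyplot as plt
import tensorflow as tf

FS = 128          # assumed sampling rate in Hz
IMG_SIZE = 224    # MobileNet's default input resolution

def signal_to_scalogram(signal, path):
    """Compute CWT coefficients and save their magnitude as a scalogram image."""
    scales = np.arange(1, 128)
    coeffs, _ = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / FS)
    plt.figure(figsize=(2.24, 2.24), dpi=100)
    plt.imshow(np.abs(coeffs), aspect="auto", cmap="jet")
    plt.axis("off")
    plt.savefig(path, bbox_inches="tight", pad_inches=0)
    plt.close()

def build_classifier(num_classes=2):
    """Pretrained MobileNet backbone with a small classification head (assumed)."""
    base = tf.keras.applications.MobileNet(
        weights="imagenet", include_top=False, input_shape=(IMG_SIZE, IMG_SIZE, 3)
    )
    base.trainable = False  # the paper's fine-tuning strategy may differ
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: a synthetic stand-in for one filtered ECG/GSR segment.
segment = np.sin(2 * np.pi * 1.2 * np.arange(0, 10, 1.0 / FS))
signal_to_scalogram(segment, "scalogram.png")
model = build_classifier()
```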