Matches in SemOpenAlex for { <https://semopenalex.org/work/W4295872149> ?p ?o ?g. }
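For reference, the quad pattern above can be executed directly against SemOpenAlex's public SPARQL endpoint. The snippet below is a minimal sketch, assuming the endpoint URL https://semopenalex.org/sparql and the standard SPARQL 1.1 Protocol over HTTP; it is illustrative only and not an official client (the `?g` graph term from the listing is dropped for simplicity).

```python
# Minimal sketch: list predicate/object pairs for work W4295872149
# from the SemOpenAlex SPARQL endpoint (endpoint URL is an assumption).
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4295872149> ?p ?o .
}
LIMIT 200
"""

resp = requests.post(
    ENDPOINT,
    data={"query": QUERY},                                  # SPARQL 1.1 Protocol, form-encoded
    headers={"Accept": "application/sparql-results+json"},  # ask for JSON results
    timeout=30,
)
resp.raise_for_status()

# Standard SPARQL JSON results layout: results -> bindings -> variable -> value
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```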
- W4295872149 endingPage "121" @default.
- W4295872149 startingPage "107" @default.
- W4295872149 abstract "Affective computing is a growing interdisciplinary research field with great potential for recognizing, understanding, and expressing human emotions. Recently, multimodal analysis has gained popularity in affective studies, as it can provide a more comprehensive view of emotion dynamics based on the diverse and complementary information from different data modalities. However, the stability and generalizability of current multimodal analysis methods have not yet been thoroughly established. In this paper, we propose a novel multimodal analysis method (EEG-AVE: EEG with audio-visual embedding) for cross-individual affective detection, in which EEG signals are exploited to identify emotion-related individual preferences and audio-visual information is leveraged to estimate the intrinsic emotions conveyed by the multimedia content. EEG-AVE is composed of two main modules. In the EEG-based individual preference prediction module, a multi-scale domain adversarial neural network is developed to learn dynamic, informative, and domain-invariant EEG features shared across individuals. In the video-based intrinsic emotion estimation module, a deep audio-visual feature-based hypergraph clustering method is proposed to examine the latent relationship between semantic audio-visual features and emotions. Through an embedding model, the estimated individual preferences and intrinsic emotions are combined with shared weights and jointly contribute to affective detection across individuals. Experiments on two well-known emotional databases show that the proposed EEG-AVE model achieves better performance under a leave-one-individual-out cross-validation, individual-independent evaluation protocol. The results demonstrate that EEG-AVE is an effective model with good reliability and generalizability, which has practical significance for the development of multimodal analysis in affective computing." @default. (An illustrative sketch of the embedding/fusion step described here follows the match list below.)
- W4295872149 created "2022-09-15" @default.
- W4295872149 creator A5000646590 @default.
- W4295872149 creator A5015677940 @default.
- W4295872149 creator A5028625458 @default.
- W4295872149 creator A5036446937 @default.
- W4295872149 creator A5041151211 @default.
- W4295872149 creator A5066716873 @default.
- W4295872149 creator A5068909220 @default.
- W4295872149 date "2022-10-01" @default.
- W4295872149 modified "2023-10-17" @default.
- W4295872149 title "Cross-individual affective detection using EEG signals with audio-visual embedding" @default.
- W4295872149 cites W1947251450 @default.
- W4295872149 cites W1970727126 @default.
- W4295872149 cites W2002055708 @default.
- W4295872149 cites W2122098299 @default.
- W4295872149 cites W2126552487 @default.
- W4295872149 cites W2135933687 @default.
- W4295872149 cites W2139557692 @default.
- W4295872149 cites W2165611870 @default.
- W4295872149 cites W2267653472 @default.
- W4295872149 cites W2396728763 @default.
- W4295872149 cites W2414603974 @default.
- W4295872149 cites W2526153516 @default.
- W4295872149 cites W2584561145 @default.
- W4295872149 cites W2606226674 @default.
- W4295872149 cites W2624340939 @default.
- W4295872149 cites W2703895418 @default.
- W4295872149 cites W2726279040 @default.
- W4295872149 cites W2790404832 @default.
- W4295872149 cites W2790898510 @default.
- W4295872149 cites W2888735433 @default.
- W4295872149 cites W2941401350 @default.
- W4295872149 cites W2964338223 @default.
- W4295872149 cites W2976001129 @default.
- W4295872149 cites W2982126608 @default.
- W4295872149 cites W3014215018 @default.
- W4295872149 cites W3016153892 @default.
- W4295872149 cites W3027581678 @default.
- W4295872149 cites W3043308633 @default.
- W4295872149 cites W3108087271 @default.
- W4295872149 cites W3164439356 @default.
- W4295872149 cites W4226079592 @default.
- W4295872149 doi "https://doi.org/10.1016/j.neucom.2022.09.078" @default.
- W4295872149 hasPublicationYear "2022" @default.
- W4295872149 type Work @default.
- W4295872149 citedByCount "1" @default.
- W4295872149 countsByYear W42958721492023 @default.
- W4295872149 crossrefType "journal-article" @default.
- W4295872149 hasAuthorship W4295872149A5000646590 @default.
- W4295872149 hasAuthorship W4295872149A5015677940 @default.
- W4295872149 hasAuthorship W4295872149A5028625458 @default.
- W4295872149 hasAuthorship W4295872149A5036446937 @default.
- W4295872149 hasAuthorship W4295872149A5041151211 @default.
- W4295872149 hasAuthorship W4295872149A5066716873 @default.
- W4295872149 hasAuthorship W4295872149A5068909220 @default.
- W4295872149 hasBestOaLocation W42958721492 @default.
- W4295872149 hasConcept C118552586 @default.
- W4295872149 hasConcept C119857082 @default.
- W4295872149 hasConcept C138496976 @default.
- W4295872149 hasConcept C138885662 @default.
- W4295872149 hasConcept C153180895 @default.
- W4295872149 hasConcept C154945302 @default.
- W4295872149 hasConcept C15744967 @default.
- W4295872149 hasConcept C27158222 @default.
- W4295872149 hasConcept C2776401178 @default.
- W4295872149 hasConcept C28490314 @default.
- W4295872149 hasConcept C41008148 @default.
- W4295872149 hasConcept C41608201 @default.
- W4295872149 hasConcept C41895202 @default.
- W4295872149 hasConcept C522805319 @default.
- W4295872149 hasConcept C59404180 @default.
- W4295872149 hasConcept C6438553 @default.
- W4295872149 hasConceptScore W4295872149C118552586 @default.
- W4295872149 hasConceptScore W4295872149C119857082 @default.
- W4295872149 hasConceptScore W4295872149C138496976 @default.
- W4295872149 hasConceptScore W4295872149C138885662 @default.
- W4295872149 hasConceptScore W4295872149C153180895 @default.
- W4295872149 hasConceptScore W4295872149C154945302 @default.
- W4295872149 hasConceptScore W4295872149C15744967 @default.
- W4295872149 hasConceptScore W4295872149C27158222 @default.
- W4295872149 hasConceptScore W4295872149C2776401178 @default.
- W4295872149 hasConceptScore W4295872149C28490314 @default.
- W4295872149 hasConceptScore W4295872149C41008148 @default.
- W4295872149 hasConceptScore W4295872149C41608201 @default.
- W4295872149 hasConceptScore W4295872149C41895202 @default.
- W4295872149 hasConceptScore W4295872149C522805319 @default.
- W4295872149 hasConceptScore W4295872149C59404180 @default.
- W4295872149 hasConceptScore W4295872149C6438553 @default.
- W4295872149 hasLocation W42958721491 @default.
- W4295872149 hasLocation W42958721492 @default.
- W4295872149 hasOpenAccess W4295872149 @default.
- W4295872149 hasPrimaryLocation W42958721491 @default.
- W4295872149 hasRelatedWork W2382607599 @default.
- W4295872149 hasRelatedWork W2546942002 @default.
- W4295872149 hasRelatedWork W2592385986 @default.
- W4295872149 hasRelatedWork W2940256401 @default.
- W4295872149 hasRelatedWork W2964189431 @default.
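As referenced in the abstract entry above, the following is a minimal illustrative sketch of the embedding/fusion step that combines EEG-based individual preference predictions with audio-visual intrinsic emotion estimates. It is not the authors' implementation: the array shapes, the shared weight, and the fusion rule are assumptions based only on the abstract's high-level description.

```python
# Illustrative sketch (not the paper's code): fuse EEG-based individual
# preference predictions with audio-visual intrinsic emotion estimates
# via a shared mixing weight, as described at a high level in the abstract.
import numpy as np

def fuse_predictions(eeg_pref_probs: np.ndarray,
                     av_emotion_probs: np.ndarray,
                     shared_weight: float = 0.5) -> np.ndarray:
    """Weighted combination of two per-class probability vectors.

    eeg_pref_probs   -- class probabilities from the EEG branch (hypothetical)
    av_emotion_probs -- class probabilities from the audio-visual branch (hypothetical)
    shared_weight    -- mixing coefficient shared across individuals (assumed form)
    """
    fused = shared_weight * eeg_pref_probs + (1.0 - shared_weight) * av_emotion_probs
    return fused / fused.sum()  # renormalise to a probability distribution

# Example: three emotion classes with dummy probabilities
eeg = np.array([0.2, 0.5, 0.3])
av = np.array([0.4, 0.4, 0.2])
print(fuse_predictions(eeg, av, shared_weight=0.6))  # fused class probabilities
```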