Matches in SemOpenAlex for { <https://semopenalex.org/work/W3186724373> ?p ?o ?g. }
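A minimal sketch of retrieving these matches programmatically. The endpoint URL is an assumption based on SemOpenAlex's documented public SPARQL service (check https://semopenalex.org for the current address), and the `?g` graph variable from the pattern above is dropped for brevity:

```python
# Query SemOpenAlex for all (predicate, object) pairs of work W3186724373.
# ENDPOINT is an assumed URL; the JSON layout follows the standard
# SPARQL 1.1 results format.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint address

query = """
SELECT ?p ?o
WHERE { <https://semopenalex.org/work/W3186724373> ?p ?o . }
"""

resp = requests.get(
    ENDPOINT,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```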
- W3186724373 endingPage "107752" @default.
- W3186724373 startingPage "107752" @default.
- W3186724373 abstract "Emotion produces complex neural processes and physiological changes under appropriate event stimuli. Physiological signals have the advantage of reflecting a person's actual emotional state better than facial expressions or voice signals. An electroencephalogram (EEG) is a signal obtained by collecting, amplifying, and recording the brain's weak bioelectric signals on the scalp. The eye-tracking (E.T.) signal records the potential difference between the retina and the cornea and the potential generated by the eye-movement muscles. Moreover, different physiological-signal modalities carry complementary representations of human emotion, and exploiting this cross-modal information helps achieve higher recognition accuracy. In this research, the E.T. and EEG signals are synchronized and fused, and an effective deep learning (DL) method is used to combine the two modalities. This article proposes a technique based on a fusion model of the Gaussian mixture model (GMM) with Butterworth and Chebyshev signal filters. Features are then extracted from the filtered signals: self-similarity (SSIM), energy (E), complexity (C), high-order crossing (HOC), and power spectral density (PSD) for EEG, and electrooculography power density estimation (EOG-PDE), center-of-gravity frequency (CGF), frequency variance (F.V.), and root-mean-square frequency (RMSF) for E.T.; the max–min method is applied for vector normalization. Finally, a deep gradient neural network (DGNN) is proposed for classifying the fused EEG and E.T. multimodal signals. In an experiment with eight emotion-eliciting event stimuli, the proposed network predicted emotions with 88.10% accuracy. On the evaluation indices of accuracy (Ac), precision (Pr), recall (Re), F-measure (Fm), the precision–recall (P.R.) curve, the true-positive rate (TPR) of the receiver operating characteristic (ROC) curve, the area under the curve (AUC), the true-accept rate (TAR), and intersection over union (IoU), the proposed method also performs with high efficiency compared with several typical neural networks, including the artificial neural network (ANN), SqueezeNet, GoogLeNet, ResNet-50, DarkNet-53, ResNet-18, Inception-ResNet, Inception-v3, and ResNet-101." @default.
- W3186724373 created "2021-08-02" @default.
- W3186724373 creator A5000975435 @default.
- W3186724373 creator A5001607102 @default.
- W3186724373 creator A5002654622 @default.
- W3186724373 creator A5035130915 @default.
- W3186724373 creator A5071669728 @default.
- W3186724373 date "2021-10-01" @default.
- W3186724373 modified "2023-10-11" @default.
- W3186724373 title "Emotion classification on eye-tracking and electroencephalograph fused signals employing deep gradient neural networks" @default.
- W3186724373 cites W1969138120 @default.
- W3186724373 cites W1972087155 @default.
- W3186724373 cites W1981616671 @default.
- W3186724373 cites W2032254851 @default.
- W3186724373 cites W2043096108 @default.
- W3186724373 cites W2050216055 @default.
- W3186724373 cites W2067460630 @default.
- W3186724373 cites W2097117768 @default.
- W3186724373 cites W2128495200 @default.
- W3186724373 cites W2132920211 @default.
- W3186724373 cites W2140079265 @default.
- W3186724373 cites W2149628368 @default.
- W3186724373 cites W2396728763 @default.
- W3186724373 cites W2409159920 @default.
- W3186724373 cites W2565944610 @default.
- W3186724373 cites W2778520330 @default.
- W3186724373 cites W2801723110 @default.
- W3186724373 cites W2810819381 @default.
- W3186724373 cites W2887148375 @default.
- W3186724373 cites W2898824151 @default.
- W3186724373 cites W2918140510 @default.
- W3186724373 cites W2962958625 @default.
- W3186724373 cites W2964028220 @default.
- W3186724373 cites W2991306121 @default.
- W3186724373 cites W3008580105 @default.
- W3186724373 cites W3012314385 @default.
- W3186724373 cites W3015960903 @default.
- W3186724373 cites W3017363702 @default.
- W3186724373 cites W3021442095 @default.
- W3186724373 cites W3030703790 @default.
- W3186724373 cites W3031127640 @default.
- W3186724373 cites W3035471470 @default.
- W3186724373 cites W3037184110 @default.
- W3186724373 cites W3082415811 @default.
- W3186724373 cites W3090789868 @default.
- W3186724373 cites W3093340428 @default.
- W3186724373 cites W3093580851 @default.
- W3186724373 cites W3094638189 @default.
- W3186724373 cites W3104827393 @default.
- W3186724373 cites W3110091040 @default.
- W3186724373 cites W3114253296 @default.
- W3186724373 cites W3119615685 @default.
- W3186724373 cites W3143842863 @default.
- W3186724373 doi "https://doi.org/10.1016/j.asoc.2021.107752" @default.
- W3186724373 hasPublicationYear "2021" @default.
- W3186724373 type Work @default.
- W3186724373 sameAs 3186724373 @default.
- W3186724373 citedByCount "11" @default.
- W3186724373 countsByYear W31867243732022 @default.
- W3186724373 countsByYear W31867243732023 @default.
- W3186724373 crossrefType "journal-article" @default.
- W3186724373 hasAuthorship W3186724373A5000975435 @default.
- W3186724373 hasAuthorship W3186724373A5001607102 @default.
- W3186724373 hasAuthorship W3186724373A5002654622 @default.
- W3186724373 hasAuthorship W3186724373A5035130915 @default.
- W3186724373 hasAuthorship W3186724373A5071669728 @default.
- W3186724373 hasBestOaLocation W31867243732 @default.
- W3186724373 hasConcept C118552586 @default.
- W3186724373 hasConcept C12267149 @default.
- W3186724373 hasConcept C136886441 @default.
- W3186724373 hasConcept C144024400 @default.
- W3186724373 hasConcept C153050134 @default.
- W3186724373 hasConcept C153180895 @default.
- W3186724373 hasConcept C154945302 @default.
- W3186724373 hasConcept C15744967 @default.
- W3186724373 hasConcept C163507328 @default.
- W3186724373 hasConcept C19165224 @default.
- W3186724373 hasConcept C2778681526 @default.
- W3186724373 hasConcept C28490314 @default.
- W3186724373 hasConcept C31972630 @default.
- W3186724373 hasConcept C41008148 @default.
- W3186724373 hasConcept C50644808 @default.
- W3186724373 hasConcept C522805319 @default.
- W3186724373 hasConceptScore W3186724373C118552586 @default.
- W3186724373 hasConceptScore W3186724373C12267149 @default.
- W3186724373 hasConceptScore W3186724373C136886441 @default.
- W3186724373 hasConceptScore W3186724373C144024400 @default.
- W3186724373 hasConceptScore W3186724373C153050134 @default.
- W3186724373 hasConceptScore W3186724373C153180895 @default.
- W3186724373 hasConceptScore W3186724373C154945302 @default.
- W3186724373 hasConceptScore W3186724373C15744967 @default.
- W3186724373 hasConceptScore W3186724373C163507328 @default.
- W3186724373 hasConceptScore W3186724373C19165224 @default.
- W3186724373 hasConceptScore W3186724373C2778681526 @default.
- W3186724373 hasConceptScore W3186724373C28490314 @default.
- W3186724373 hasConceptScore W3186724373C31972630 @default.
- W3186724373 hasConceptScore W3186724373C41008148 @default.
- W3186724373 hasConceptScore W3186724373C50644808 @default.
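The abstract above names a concrete preprocessing chain: Butterworth/Chebyshev filtering, feature extraction such as power spectral density, and max–min normalization. Below is a minimal sketch of those three steps; the sampling rate, band edges, filter order, and toy data are illustrative assumptions, not the paper's actual settings:

```python
# Sketch of the abstract's preprocessing steps: Butterworth band-pass
# filtering, a Welch PSD feature (one of the EEG features listed), and
# max-min normalization. All numeric settings here are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, welch

FS = 256  # assumed sampling rate in Hz


def bandpass(x, low=1.0, high=45.0, order=4):
    """Zero-phase Butterworth band-pass filter (assumed 1-45 Hz band)."""
    sos = butter(order, [low, high], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, x)


def psd_feature(x):
    """Mean Welch power spectral density per channel."""
    _, pxx = welch(x, fs=FS, nperseg=FS)
    return pxx.mean(axis=-1)


def max_min_normalize(v):
    """Scale a feature vector into [0, 1], as in the abstract's max-min step."""
    lo, hi = v.min(), v.max()
    return (v - lo) / (hi - lo) if hi > lo else np.zeros_like(v)


# Toy usage: 8 EEG channels, 10 seconds of synthetic data.
eeg = np.random.randn(8, FS * 10)
features = max_min_normalize(psd_feature(bandpass(eeg)))
print(features.shape)  # (8,) -- one normalized PSD feature per channel
```

The normalized feature vectors from each modality would then be fused and fed to the classifier (the paper's DGNN); that network's architecture is not specified in this record, so it is not sketched here.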