Matches in SemOpenAlex for { <https://semopenalex.org/work/W3116273309> ?p ?o ?g. }
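The triple pattern above can be reproduced against SemOpenAlex's public SPARQL endpoint. A minimal sketch in Python, assuming the endpoint at https://semopenalex.org/sparql accepts standard SPARQL-over-HTTP GET requests (the graph variable ?g is dropped for simplicity; `requests` is a third-party library):

```python
import requests

# SemOpenAlex's public SPARQL endpoint (assumed URL; see semopenalex.org).
ENDPOINT = "https://semopenalex.org/sparql"

# Same pattern as in the header: all predicate/object pairs for the work.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3116273309> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()

# Print each match, mirroring the "- W3116273309 <p> <o>" lines below.
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```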
- W3116273309 endingPage "52" @default.
- W3116273309 startingPage "52" @default.
- W3116273309 abstract "Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most works either classify a single emotion per video stimulus or are restricted to static, desktop environments. To address this, we propose a correlation-based emotion recognition algorithm (CorrNet) to recognize the valence and arousal (V-A) of each instance (fine-grained segment of signals) using only wearable, physiological signals (e.g., electrodermal activity, heart rate). CorrNet takes advantage of features both inside each instance (intra-modality features) and between different instances for the same video stimulus (correlation-based features). We first test our approach on an indoor-desktop affect dataset (CASE), and thereafter on an outdoor-mobile affect dataset (MERCA), which we collected using a smart wristband and a wearable eye tracker. Results show that for subject-independent binary classification (high-low), CorrNet yields promising recognition accuracies: 76.37% and 74.03% for V-A on CASE, and 70.29% and 68.15% for V-A on MERCA. Our findings show that: (1) instance segment lengths between 1–4 s result in the highest recognition accuracies; (2) accuracies between laboratory-grade and wearable sensors are comparable, even under low sampling rates (≤64 Hz); and (3) large amounts of neutral V-A labels, an artifact of continuous affect annotation, result in varied recognition performance." @default.
- W3116273309 created "2021-01-05" @default.
- W3116273309 creator A5013038112 @default.
- W3116273309 creator A5024676475 @default.
- W3116273309 creator A5030623043 @default.
- W3116273309 creator A5037486611 @default.
- W3116273309 creator A5049516313 @default.
- W3116273309 date "2020-12-24" @default.
- W3116273309 modified "2023-10-01" @default.
- W3116273309 title "CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors" @default.
- W3116273309 cites W1689711448 @default.
- W3116273309 cites W1963552713 @default.
- W3116273309 cites W1966797434 @default.
- W3116273309 cites W1970269181 @default.
- W3116273309 cites W1985948319 @default.
- W3116273309 cites W2002055708 @default.
- W3116273309 cites W2006550435 @default.
- W3116273309 cites W2030385993 @default.
- W3116273309 cites W2038840031 @default.
- W3116273309 cites W2039254933 @default.
- W3116273309 cites W2061272711 @default.
- W3116273309 cites W2064188028 @default.
- W3116273309 cites W2076063813 @default.
- W3116273309 cites W2077697924 @default.
- W3116273309 cites W2098673521 @default.
- W3116273309 cites W2102623553 @default.
- W3116273309 cites W2105464873 @default.
- W3116273309 cites W2113640817 @default.
- W3116273309 cites W2117645142 @default.
- W3116273309 cites W2118283585 @default.
- W3116273309 cites W2122098299 @default.
- W3116273309 cites W2134539427 @default.
- W3116273309 cites W2149628368 @default.
- W3116273309 cites W2164186291 @default.
- W3116273309 cites W2164699598 @default.
- W3116273309 cites W2167557160 @default.
- W3116273309 cites W2173627972 @default.
- W3116273309 cites W2278113816 @default.
- W3116273309 cites W2307909173 @default.
- W3116273309 cites W2344213027 @default.
- W3116273309 cites W2473636365 @default.
- W3116273309 cites W2607348396 @default.
- W3116273309 cites W2620011291 @default.
- W3116273309 cites W2640350988 @default.
- W3116273309 cites W2666784499 @default.
- W3116273309 cites W2726643871 @default.
- W3116273309 cites W2735453686 @default.
- W3116273309 cites W2735917556 @default.
- W3116273309 cites W2738226240 @default.
- W3116273309 cites W2783583489 @default.
- W3116273309 cites W2794867039 @default.
- W3116273309 cites W2795439263 @default.
- W3116273309 cites W2806792888 @default.
- W3116273309 cites W2808649502 @default.
- W3116273309 cites W2809533018 @default.
- W3116273309 cites W2810418809 @default.
- W3116273309 cites W2865118579 @default.
- W3116273309 cites W2889729260 @default.
- W3116273309 cites W2901027086 @default.
- W3116273309 cites W2907610496 @default.
- W3116273309 cites W2910478421 @default.
- W3116273309 cites W2915893085 @default.
- W3116273309 cites W2930120178 @default.
- W3116273309 cites W2941641520 @default.
- W3116273309 cites W2941811706 @default.
- W3116273309 cites W2946322152 @default.
- W3116273309 cites W2946526173 @default.
- W3116273309 cites W2968749391 @default.
- W3116273309 cites W2971526326 @default.
- W3116273309 cites W2971555734 @default.
- W3116273309 cites W2979903110 @default.
- W3116273309 cites W2990834556 @default.
- W3116273309 cites W3030187186 @default.
- W3116273309 cites W3034233088 @default.
- W3116273309 cites W3094594436 @default.
- W3116273309 cites W3103291722 @default.
- W3116273309 cites W4206774691 @default.
- W3116273309 cites W4242470988 @default.
- W3116273309 doi "https://doi.org/10.3390/s21010052" @default.
- W3116273309 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/7795677" @default.
- W3116273309 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33374281" @default.
- W3116273309 hasPublicationYear "2020" @default.
- W3116273309 type Work @default.
- W3116273309 sameAs 3116273309 @default.
- W3116273309 citedByCount "23" @default.
- W3116273309 countsByYear W31162733092021 @default.
- W3116273309 countsByYear W31162733092022 @default.
- W3116273309 countsByYear W31162733092023 @default.
- W3116273309 crossrefType "journal-article" @default.
- W3116273309 hasAuthorship W3116273309A5013038112 @default.
- W3116273309 hasAuthorship W3116273309A5024676475 @default.
- W3116273309 hasAuthorship W3116273309A5030623043 @default.
- W3116273309 hasAuthorship W3116273309A5037486611 @default.
- W3116273309 hasAuthorship W3116273309A5049516313 @default.
- W3116273309 hasBestOaLocation W31162733091 @default.
- W3116273309 hasConcept C117220453 @default.
- W3116273309 hasConcept C121332964 @default.
- W3116273309 hasConcept C136764020 @default.
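The abstract above describes CorrNet's two feature sources: intra-modality features computed inside each instance, and correlation-based features computed between different instances of the same video stimulus. A minimal sketch of the correlation-based part follows; this is not the authors' implementation, and the `instances` layout and Pearson-based pairing are illustrative assumptions:

```python
import numpy as np

def correlation_features(instances: np.ndarray) -> np.ndarray:
    """For each instance (a fine-grained signal segment), compute its
    Pearson correlation with every other instance of the same stimulus.

    instances: shape (n_instances, segment_length), e.g. 1-4 s windows
    of a physiological signal such as electrodermal activity.
    Returns: shape (n_instances, n_instances - 1).
    """
    # Full Pearson correlation matrix across instances.
    corr = np.corrcoef(instances)
    n = corr.shape[0]
    # Drop each instance's self-correlation (always 1.0) so the feature
    # vector only describes its relation to the *other* instances.
    mask = ~np.eye(n, dtype=bool)
    return corr[mask].reshape(n, n - 1)

# Toy usage: 10 instances of a 2-second segment sampled at 64 Hz,
# matching the low wearable sampling rates reported in the abstract.
rng = np.random.default_rng(0)
segments = rng.standard_normal((10, 2 * 64))
feats = correlation_features(segments)
print(feats.shape)  # (10, 9)
```

In the paper these correlation-based features are combined with per-instance (intra-modality) features before classification; the sketch only shows how relating each segment to its peers under the same stimulus yields a fixed-length feature vector per instance.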