Matches in SemOpenAlex for { <https://semopenalex.org/work/W4385804947> ?p ?o ?g. }
- W4385804947 abstract "Emotion recognition is the task of classifying perceived emotions in people. Previous works have utilized various nonverbal cues to extract features from images and correlate them to emotions. Of these cues, situational context is particularly crucial in emotion perception since it can directly influence the emotion of a person. In this paper, we propose an approach for high-level context representation extraction from images. The model relies on a single cue and a single encoding stream to correlate this representation with emotions. Our model competes with the state-of-the-art, achieving an mAP of 0.3002 on the EMOTIC dataset while also being capable of execution on consumer-grade hardware at ≈ 90 frames per second. Overall, our approach is more efficient than previous models and can be easily deployed to address real-world problems related to emotion recognition." @default.
- W4385804947 created "2023-08-15" @default.
- W4385804947 creator A5004401466 @default.
- W4385804947 creator A5016094018 @default.
- W4385804947 creator A5028863879 @default.
- W4385804947 creator A5042918332 @default.
- W4385804947 date "2023-06-01" @default.
- W4385804947 modified "2023-10-14" @default.
- W4385804947 title "High-level context representation for emotion recognition in images" @default.
- W4385804947 cites W1966797434 @default.
- W4385804947 cites W1993790380 @default.
- W4385804947 cites W2028602962 @default.
- W4385804947 cites W2081580037 @default.
- W4385804947 cites W2096086610 @default.
- W4385804947 cites W2103153725 @default.
- W4385804947 cites W2154132835 @default.
- W4385804947 cites W2250539671 @default.
- W4385804947 cites W2507296351 @default.
- W4385804947 cites W2739474071 @default.
- W4385804947 cites W2907891676 @default.
- W4385804947 cites W2946410230 @default.
- W4385804947 cites W2964751875 @default.
- W4385804947 cites W3001529617 @default.
- W4385804947 cites W3101469666 @default.
- W4385804947 cites W3138516171 @default.
- W4385804947 cites W4237730655 @default.
- W4385804947 cites W4281936364 @default.
- W4385804947 cites W4312399047 @default.
- W4385804947 cites W4312609734 @default.
- W4385804947 doi "https://doi.org/10.1109/cvprw59228.2023.00038" @default.
- W4385804947 hasPublicationYear "2023" @default.
- W4385804947 type Work @default.
- W4385804947 citedByCount "0" @default.
- W4385804947 crossrefType "proceedings-article" @default.
- W4385804947 hasAuthorship W4385804947A5004401466 @default.
- W4385804947 hasAuthorship W4385804947A5016094018 @default.
- W4385804947 hasAuthorship W4385804947A5028863879 @default.
- W4385804947 hasAuthorship W4385804947A5042918332 @default.
- W4385804947 hasBestOaLocation W43858049472 @default.
- W4385804947 hasConcept C125411270 @default.
- W4385804947 hasConcept C151730666 @default.
- W4385804947 hasConcept C153180895 @default.
- W4385804947 hasConcept C154945302 @default.
- W4385804947 hasConcept C15744967 @default.
- W4385804947 hasConcept C162324750 @default.
- W4385804947 hasConcept C169760540 @default.
- W4385804947 hasConcept C17744445 @default.
- W4385804947 hasConcept C180747234 @default.
- W4385804947 hasConcept C183322885 @default.
- W4385804947 hasConcept C187736073 @default.
- W4385804947 hasConcept C199539241 @default.
- W4385804947 hasConcept C26760741 @default.
- W4385804947 hasConcept C2776359362 @default.
- W4385804947 hasConcept C2777438025 @default.
- W4385804947 hasConcept C2779343474 @default.
- W4385804947 hasConcept C2780451532 @default.
- W4385804947 hasConcept C2781238097 @default.
- W4385804947 hasConcept C41008148 @default.
- W4385804947 hasConcept C52622490 @default.
- W4385804947 hasConcept C77805123 @default.
- W4385804947 hasConcept C86803240 @default.
- W4385804947 hasConcept C9114305 @default.
- W4385804947 hasConcept C94625758 @default.
- W4385804947 hasConceptScore W4385804947C125411270 @default.
- W4385804947 hasConceptScore W4385804947C151730666 @default.
- W4385804947 hasConceptScore W4385804947C153180895 @default.
- W4385804947 hasConceptScore W4385804947C154945302 @default.
- W4385804947 hasConceptScore W4385804947C15744967 @default.
- W4385804947 hasConceptScore W4385804947C162324750 @default.
- W4385804947 hasConceptScore W4385804947C169760540 @default.
- W4385804947 hasConceptScore W4385804947C17744445 @default.
- W4385804947 hasConceptScore W4385804947C180747234 @default.
- W4385804947 hasConceptScore W4385804947C183322885 @default.
- W4385804947 hasConceptScore W4385804947C187736073 @default.
- W4385804947 hasConceptScore W4385804947C199539241 @default.
- W4385804947 hasConceptScore W4385804947C26760741 @default.
- W4385804947 hasConceptScore W4385804947C2776359362 @default.
- W4385804947 hasConceptScore W4385804947C2777438025 @default.
- W4385804947 hasConceptScore W4385804947C2779343474 @default.
- W4385804947 hasConceptScore W4385804947C2780451532 @default.
- W4385804947 hasConceptScore W4385804947C2781238097 @default.
- W4385804947 hasConceptScore W4385804947C41008148 @default.
- W4385804947 hasConceptScore W4385804947C52622490 @default.
- W4385804947 hasConceptScore W4385804947C77805123 @default.
- W4385804947 hasConceptScore W4385804947C86803240 @default.
- W4385804947 hasConceptScore W4385804947C9114305 @default.
- W4385804947 hasConceptScore W4385804947C94625758 @default.
- W4385804947 hasFunder F4320321091 @default.
- W4385804947 hasLocation W43858049471 @default.
- W4385804947 hasLocation W43858049472 @default.
- W4385804947 hasOpenAccess W4385804947 @default.
- W4385804947 hasPrimaryLocation W43858049471 @default.
- W4385804947 hasRelatedWork W1964120219 @default.
- W4385804947 hasRelatedWork W2000165426 @default.
- W4385804947 hasRelatedWork W2114557664 @default.
- W4385804947 hasRelatedWork W2144059113 @default.
- W4385804947 hasRelatedWork W2146076056 @default.
- W4385804947 hasRelatedWork W2385132419 @default.
- W4385804947 hasRelatedWork W2772780115 @default.
- W4385804947 hasRelatedWork W2811390910 @default.
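
The listing above answers a simple triple-pattern query over the work's URI. As a minimal sketch, this is how such a query could be constructed for SemOpenAlex's public SPARQL endpoint (the endpoint URL and the `format=json` parameter are assumptions about the service's SPARQL 1.1 Protocol support, not confirmed by the listing):

```python
from urllib.parse import urlencode

# Assumed public SPARQL endpoint for SemOpenAlex (standard SPARQL 1.1 GET protocol).
ENDPOINT = "https://semopenalex.org/sparql"

def build_query(work_uri: str) -> str:
    """Build the triple-pattern query shown above for a given work URI."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"

def request_url(work_uri: str) -> str:
    """Full GET URL for the query; fetching it would return the bindings listed above."""
    params = urlencode({"query": build_query(work_uri), "format": "json"})
    return f"{ENDPOINT}?{params}"

url = request_url("https://semopenalex.org/work/W4385804947")
```

Fetching `url` with any HTTP client would return one `?p ?o` binding per line of the listing (predicate and object of each triple whose subject is the work).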