Matches in SemOpenAlex for { <https://semopenalex.org/work/W84800558> ?p ?o ?g. }
- W84800558 abstract "How Do Static and Dynamic Emotional Faces Prime Incremental Semantic Interpretation?: Comparing Older and Younger Adults

Katja Münster (Katja.Muenster@uni-bielefeld.de) 2,3, Maria Nella Carminati (mcarmina@techfak.uni-bielefeld.de) 1,3, Pia Knoeferle (knoeferl@cit-ec.uni-bielefeld.de) 1,2,3
1 SFB 673 “Alignment in Communication”, 2 Cognitive Interaction Technology Excellence Center, 3 Department of Linguistics; CITEC, Inspiration 1, Bielefeld University, 33615 Bielefeld, Germany

Abstract
Using eye-tracking, two studies investigated whether a dynamic vs. static emotional facial expression can influence how a listener interprets a subsequent emotionally valenced utterance in relation to a visual context. Crucially, we assessed whether such facial priming changes with the comprehender’s age (younger vs. older adults). Participants inspected a static (Experiment 1; Carminati & Knoeferle, 2013) or a dynamic (Experiment 2) facial expression that was either happy or sad. After inspecting the face, participants saw two pictures of opposite valence (one positive, one negative, presented at the same time) and heard a positively or negatively valenced sentence describing one of the two pictures. Participants’ task was to look at the display, understand the sentence, and decide whether the facial expression matched the sentence. The emotional face influenced visual attention to the pictures and the processing of the sentence, and these influences were modulated by age: older adults were more strongly influenced by the positive prime face, whereas younger adults were more strongly influenced by the negative facial expression. These results suggest that the negativity bias observed in the visual attention of younger adults and the positivity bias observed in older adults extend to face-sentence priming. However, static and dynamic emotional faces had similar priming effects on sentence processing.

Keywords: eye-tracking; sentence processing; emotional priming; dynamic vs. static facial expressions

Introduction
Monitoring people’s gaze in a visual context provides a unique opportunity for examining the incremental integration of visual and linguistic information (Tanenhaus et al., 1995). Non-linguistic visual information can rapidly guide visual attention during incremental language processing in young adults (e.g., Chambers, Tanenhaus, & Magnuson, 2004; Knoeferle et al., 2005; Sedivy et al., 1999; Spivey et al., 2002). Similar incremental effects of visual context information have emerged in event-related brain potentials (ERPs) for older adults (e.g., Wassenaar & Hagoort, 2007). However, the bulk of this research has focused on how object- and action-related information in the visual context influences spoken language comprehension. By contrast, we know little about how social and visual cues of a speaker in the visual context (e.g., his/her dynamic emotional facial expression) can affect a listener’s utterance comprehension (but see the rather substantial literature on gesture interpretation). In principle, a speaker’s facial expression of emotion could help a listener to rapidly interpret his/her utterances. With a view to investigating sentence processing across the lifespan and in relation to emotional visual cues, we assessed whether older adults exploit static and dynamic emotional facial cues with a similar time course and in a similar fashion as younger adults.

The rapid integration of multiple emotional cues (facial, pictorial, and sentential) during incremental sentence processing seems particularly challenging, yet such integration appears to occur effortlessly in natural language interaction. Here we examine how this integration is achieved in a properly controlled experimental setting. To motivate our studies in more detail, we first review relevant literature on emotion processing, on the recognition of dynamic facial emotion expressions, and on emotion processing in young relative to older adults.

Affective Words and Face-Word Emotion Priming
Humans seem to attend more readily to emotional than to neutral stimuli. For instance, participants in a study by Kissler, Herbert, Peyk, and Junghöfer (2007) read words while their event-related brain potentials were measured. Positive and negative words, compared with neutral words, elicited enhanced negative mean-amplitude ERPs peaking at around 250 ms after word onset. On the assumption that enhanced cortical potentials index increased attention, valenced relative to neutral information seems to catch our attention immediately (see, e.g., Kissler & Keil, 2008, for evidence on endogenous saccades to emotional vs. neutral pictures; Nummenmaa, Hyönä, & Calvo, 2006, for eye-tracking" @default.
- W84800558 created "2016-06-24" @default.
- W84800558 creator A5007173669 @default.
- W84800558 creator A5022177590 @default.
- W84800558 creator A5027562110 @default.
- W84800558 date "2014-01-01" @default.
- W84800558 modified "2023-09-26" @default.
- W84800558 title "How Do Static and Dynamic Emotional Faces Prime Incremental Semantic Interpretation?: Comparing Older and Younger Adults" @default.
- W84800558 cites W1965671839 @default.
- W84800558 cites W1977109735 @default.
- W84800558 cites W1981142040 @default.
- W84800558 cites W2000736129 @default.
- W84800558 cites W2001394283 @default.
- W84800558 cites W2004221457 @default.
- W84800558 cites W2014481042 @default.
- W84800558 cites W2020755048 @default.
- W84800558 cites W2035883433 @default.
- W84800558 cites W2045096378 @default.
- W84800558 cites W2055178271 @default.
- W84800558 cites W2064290905 @default.
- W84800558 cites W2066879037 @default.
- W84800558 cites W2070375420 @default.
- W84800558 cites W2070955680 @default.
- W84800558 cites W20770702 @default.
- W84800558 cites W2077435037 @default.
- W84800558 cites W2079087810 @default.
- W84800558 cites W2080080403 @default.
- W84800558 cites W2084115469 @default.
- W84800558 cites W2085834106 @default.
- W84800558 cites W2089451133 @default.
- W84800558 cites W2103472153 @default.
- W84800558 cites W2103611183 @default.
- W84800558 cites W2117385529 @default.
- W84800558 cites W2124138991 @default.
- W84800558 cites W2130130286 @default.
- W84800558 cites W2130811578 @default.
- W84800558 cites W2150276468 @default.
- W84800558 cites W2156425116 @default.
- W84800558 cites W2168711577 @default.
- W84800558 cites W2974550657 @default.
- W84800558 hasPublicationYear "2014" @default.
- W84800558 type Work @default.
- W84800558 sameAs 84800558 @default.
- W84800558 citedByCount "2" @default.
- W84800558 countsByYear W848005582015 @default.
- W84800558 crossrefType "journal-article" @default.
- W84800558 hasAuthorship W84800558A5007173669 @default.
- W84800558 hasAuthorship W84800558A5022177590 @default.
- W84800558 hasAuthorship W84800558A5027562110 @default.
- W84800558 hasConcept C121332964 @default.
- W84800558 hasConcept C138885662 @default.
- W84800558 hasConcept C15744967 @default.
- W84800558 hasConcept C168900304 @default.
- W84800558 hasConcept C169760540 @default.
- W84800558 hasConcept C169900460 @default.
- W84800558 hasConcept C180747234 @default.
- W84800558 hasConcept C195704467 @default.
- W84800558 hasConcept C2777530160 @default.
- W84800558 hasConcept C2780378701 @default.
- W84800558 hasConcept C41895202 @default.
- W84800558 hasConcept C46312422 @default.
- W84800558 hasConcept C62520636 @default.
- W84800558 hasConceptScore W84800558C121332964 @default.
- W84800558 hasConceptScore W84800558C138885662 @default.
- W84800558 hasConceptScore W84800558C15744967 @default.
- W84800558 hasConceptScore W84800558C168900304 @default.
- W84800558 hasConceptScore W84800558C169760540 @default.
- W84800558 hasConceptScore W84800558C169900460 @default.
- W84800558 hasConceptScore W84800558C180747234 @default.
- W84800558 hasConceptScore W84800558C195704467 @default.
- W84800558 hasConceptScore W84800558C2777530160 @default.
- W84800558 hasConceptScore W84800558C2780378701 @default.
- W84800558 hasConceptScore W84800558C41895202 @default.
- W84800558 hasConceptScore W84800558C46312422 @default.
- W84800558 hasConceptScore W84800558C62520636 @default.
- W84800558 hasIssue "36" @default.
- W84800558 hasLocation W848005581 @default.
- W84800558 hasOpenAccess W84800558 @default.
- W84800558 hasPrimaryLocation W848005581 @default.
- W84800558 hasRelatedWork W1483632504 @default.
- W84800558 hasRelatedWork W1966988515 @default.
- W84800558 hasRelatedWork W1988776539 @default.
- W84800558 hasRelatedWork W2018690609 @default.
- W84800558 hasRelatedWork W2034920246 @default.
- W84800558 hasRelatedWork W2043218625 @default.
- W84800558 hasRelatedWork W2049225497 @default.
- W84800558 hasRelatedWork W2051115339 @default.
- W84800558 hasRelatedWork W2060361098 @default.
- W84800558 hasRelatedWork W2085774679 @default.
- W84800558 hasRelatedWork W2145582964 @default.
- W84800558 hasRelatedWork W2183079735 @default.
- W84800558 hasRelatedWork W2597380873 @default.
- W84800558 hasRelatedWork W2921255963 @default.
- W84800558 hasRelatedWork W2982503072 @default.
- W84800558 hasRelatedWork W3018117925 @default.
- W84800558 hasRelatedWork W3116936693 @default.
- W84800558 hasRelatedWork W3135567139 @default.
- W84800558 hasRelatedWork W3153739810 @default.
- W84800558 hasRelatedWork W3192189439 @default.
- W84800558 hasVolume "36" @default.
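For reference, a listing like the one above could be regenerated with a query along the following lines. This is a minimal sketch only: it assumes SemOpenAlex exposes a public SPARQL endpoint (commonly given as https://semopenalex.org/sparql) and that statements are stored in named graphs, which is why the pattern in the header binds a graph variable ?g.

    # Retrieve every predicate, object, and containing graph for work W84800558
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W84800558> ?p ?o .
      }
    }

Each "- W84800558 <predicate> <object>" line in the listing corresponds to one ?p/?o binding; the trailing "@default" indicates the graph the statement was matched in.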