Matches in SemOpenAlex for { <https://semopenalex.org/work/W2394871320> ?p ?o ?g. }
Showing items 1 to 91 of 91, with 100 items per page.
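The listing below is the result of evaluating the quad pattern above against the SemOpenAlex SPARQL endpoint. A minimal sketch of reproducing it programmatically follows; the endpoint URL (https://semopenalex.org/sparql) and the JSON results format are assumptions, and the `?g` position of the pattern is expressed here with a standard SPARQL `GRAPH` clause.

```python
# Sketch: re-run the quad pattern from the listing against SemOpenAlex.
# Assumption: the public endpoint is https://semopenalex.org/sparql and
# speaks the standard SPARQL 1.1 protocol with JSON results.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2394871320> ?p ?o .
  }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print one predicate/object pair per line, mirroring the listing below.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```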
- W2394871320 abstract "Motion in Vision and Language: Seeing Visual Motion can influence Processing of Motion Verbs Carolin Dudschig (carolin.dudschig@uni-tuebingen.de) Department of Psychology, Schleichstr. 4 72076 Tübingen, Germany Jan Souman (jansouman@hotmail.com) Department of Psychology, Schleichstr. 4 72076 Tübingen, Germany Barbara Kaup (barbara.kaup@uni-tuebingen.de) Department of Psychology, Schleichstr. 4 72076 Tübingen, Germany Abstract In contrast to symbolic models of language understanding, embodied models of language comprehension suggest that language is closely connected with visual and motor processing. In the current study we show that motion words, such as rise or fall, are processed faster if displayed against a background of compatible motion (e.g., upward vs. downward random dot motion with 60% motion coherence). However, this interaction between semantic processing and visual processing only occurred if the word and the motion display were presented simultaneously. If the visual motion display was short-lived and occurred 100 or 200 ms after word-onset, no interactions between language and visual motion were found. We suggest that only in situations that do not allow ignoring or strategically suppressing the visual motion display, supra-threshold visual motion can affect language comprehension. Keywords: Language processing; motion verbs; vision; visual motion processing; embodiment; grounding. Introduction Embodied models of language understanding propose a close connection between language and perceptuomotor processes in the brain (e.g., Barsalou, 1999). Recently, compelling evidence supported the close association between language and other cognitive functions (e.g., Zwaan, Stanfield & Yaxley, 2002). In the motor domain converging evidence suggests that language facilitates compatible motor actions (e.g., Glenberg & Kaschak, 2002) and that language comprehension involves cortical motor areas that are also involved in performing the described actions (e.g., Hauk, Johnsrude, & Pulvermüller, 2004). For example, Glenberg and Kaschak showed that processing sentences such as “Close the drawer” can interfere with motoric responses incompatible with the motion implied in the sentence (e.g., arm movement towards my body). Similar effects have been reported in studies using motion verbs (e.g., rise, climb) or nouns implicitly implying a location (e.g., bird vs. shoe), whereby upward verbs and nouns facilitate upward arm movements (Dudschig, Lachmair, de la Vega, De Filippis, & Kaup, 2012a; Lachmair, Dudschig, De Filippis, de la Vega & Kaup, 2011). In contrast to the effects of language on motor processing, in the perceptual domain there is rather mixed evidence regarding the relation between language and visual processing. In particular, evidence regarding the influence of non-linguistic factors on language processing is rare. This direction of cause is particularly important, as these findings would suggest that mechanisms underlying non-linguistic processes are required and recruited during language processing. Studies in the visual domain typically investigate the influence of language on perceptual detection or discrimination tasks. For example, it has been shown that words referring to entities with a typical location (e.g., hat vs. shoe) can influence visual target perception in upper or lower screen locations (e.g., Dudschig, Lachmair, de la Vega, De Filippis, & Kaup, 2012b; Estes, Verges & Barsalou, 2008). Similar results have been reported for valence words (e.g., Meier & Robinson, 2004) and religious concepts (e.g., Chasteen, Burdzy & Pratt, 2010). Additionally, there have been studies demonstrating that visual simulation can also occur during sentence processing and subsequently affect visual discrimination performance (Bergen, Lindsay, Matlock & Narayanan, 2007). Recently, it has been shown that not only visual discrimination performance but also eye-movements can be affected by words referring to entities in the upper or lower field of vision (Dudschig, Souman, Lachmair, de la Vega, & Kaup, 2013). More specifically, upward saccades are faster following words referring to entities in the upper visual field (e.g., bird) and in contrast, downward saccades are faster following words referring to entities in the lower visual field (e.g., shoe). Importantly, the relation between language and visual processing was also reported in the other causal direction: Perceiving visual motion patterns can affect language processing. For example, Kaschak, Madden, Therriault, Yaxley, Aveyard, Blanchard and Zwaan (2005) first reported the effects of visual motion perception on language comprehension. In their study, participants viewed visual motion patterns (e.g., upward vs. downward moving" @default.
- W2394871320 created "2016-06-24" @default.
- W2394871320 creator A5032594189 @default.
- W2394871320 creator A5077765751 @default.
- W2394871320 creator A5079026374 @default.
- W2394871320 date "2013-01-01" @default.
- W2394871320 modified "2023-09-26" @default.
- W2394871320 title "Motion in Vision and Language: Seeing Visual Motion can influence Processing of Motion Verbs" @default.
- W2394871320 cites W1970573088 @default.
- W2394871320 cites W1975909437 @default.
- W2394871320 cites W1977236022 @default.
- W2394871320 cites W1998208785 @default.
- W2394871320 cites W2024686567 @default.
- W2394871320 cites W2034947045 @default.
- W2394871320 cites W2056805596 @default.
- W2394871320 cites W2065548002 @default.
- W2394871320 cites W2108724306 @default.
- W2394871320 cites W2115658287 @default.
- W2394871320 cites W2133743411 @default.
- W2394871320 cites W2144920283 @default.
- W2394871320 cites W2148707512 @default.
- W2394871320 cites W2148848230 @default.
- W2394871320 cites W2159812997 @default.
- W2394871320 cites W2160310735 @default.
- W2394871320 cites W2167293745 @default.
- W2394871320 hasPublicationYear "2013" @default.
- W2394871320 type Work @default.
- W2394871320 sameAs 2394871320 @default.
- W2394871320 citedByCount "1" @default.
- W2394871320 countsByYear W23948713202015 @default.
- W2394871320 crossrefType "journal-article" @default.
- W2394871320 hasAuthorship W2394871320A5032594189 @default.
- W2394871320 hasAuthorship W2394871320A5077765751 @default.
- W2394871320 hasAuthorship W2394871320A5079026374 @default.
- W2394871320 hasConcept C100609095 @default.
- W2394871320 hasConcept C104114177 @default.
- W2394871320 hasConcept C154945302 @default.
- W2394871320 hasConcept C15744967 @default.
- W2394871320 hasConcept C169760540 @default.
- W2394871320 hasConcept C169900460 @default.
- W2394871320 hasConcept C180747234 @default.
- W2394871320 hasConcept C199360897 @default.
- W2394871320 hasConcept C26760741 @default.
- W2394871320 hasConcept C2778251979 @default.
- W2394871320 hasConcept C41008148 @default.
- W2394871320 hasConcept C46312422 @default.
- W2394871320 hasConcept C48575856 @default.
- W2394871320 hasConcept C511192102 @default.
- W2394871320 hasConceptScore W2394871320C100609095 @default.
- W2394871320 hasConceptScore W2394871320C104114177 @default.
- W2394871320 hasConceptScore W2394871320C154945302 @default.
- W2394871320 hasConceptScore W2394871320C15744967 @default.
- W2394871320 hasConceptScore W2394871320C169760540 @default.
- W2394871320 hasConceptScore W2394871320C169900460 @default.
- W2394871320 hasConceptScore W2394871320C180747234 @default.
- W2394871320 hasConceptScore W2394871320C199360897 @default.
- W2394871320 hasConceptScore W2394871320C26760741 @default.
- W2394871320 hasConceptScore W2394871320C2778251979 @default.
- W2394871320 hasConceptScore W2394871320C41008148 @default.
- W2394871320 hasConceptScore W2394871320C46312422 @default.
- W2394871320 hasConceptScore W2394871320C48575856 @default.
- W2394871320 hasConceptScore W2394871320C511192102 @default.
- W2394871320 hasIssue "35" @default.
- W2394871320 hasLocation W23948713201 @default.
- W2394871320 hasOpenAccess W2394871320 @default.
- W2394871320 hasPrimaryLocation W23948713201 @default.
- W2394871320 hasRelatedWork W1554239800 @default.
- W2394871320 hasRelatedWork W1984050099 @default.
- W2394871320 hasRelatedWork W2019030296 @default.
- W2394871320 hasRelatedWork W2034307116 @default.
- W2394871320 hasRelatedWork W2043074908 @default.
- W2394871320 hasRelatedWork W2065084799 @default.
- W2394871320 hasRelatedWork W2071022079 @default.
- W2394871320 hasRelatedWork W2083750631 @default.
- W2394871320 hasRelatedWork W2087440290 @default.
- W2394871320 hasRelatedWork W2094321928 @default.
- W2394871320 hasRelatedWork W2095887773 @default.
- W2394871320 hasRelatedWork W2109544244 @default.
- W2394871320 hasRelatedWork W2112158592 @default.
- W2394871320 hasRelatedWork W2114527337 @default.
- W2394871320 hasRelatedWork W2122115729 @default.
- W2394871320 hasRelatedWork W2129591733 @default.
- W2394871320 hasRelatedWork W2164330927 @default.
- W2394871320 hasRelatedWork W249449331 @default.
- W2394871320 hasRelatedWork W2799544851 @default.
- W2394871320 hasRelatedWork W1926411199 @default.
- W2394871320 hasVolume "35" @default.
- W2394871320 isParatext "false" @default.
- W2394871320 isRetracted "false" @default.
- W2394871320 magId "2394871320" @default.
- W2394871320 workType "article" @default.
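To read individual properties (for example the title, citation count, or related works) out of a result set like the one above without hard-coding the ontology namespaces, the bindings can be grouped client-side by each predicate's local name. This is a sketch under the same endpoint assumption as the query above; it further assumes the short predicate labels shown in the listing match the IRIs' local names, and the helper names are illustrative.

```python
# Sketch: group the ?p/?o bindings from the query above by the predicate's
# local name (the part after the last '#' or '/'), then read off fields
# such as title, citedByCount, or hasRelatedWork.
from collections import defaultdict

def local_name(iri: str) -> str:
    """Return the fragment after the last '#' or '/' of an IRI."""
    return iri.rsplit("#", 1)[-1].rsplit("/", 1)[-1]

def group_bindings(bindings):
    """Map predicate local names to lists of object values."""
    grouped = defaultdict(list)
    for row in bindings:
        grouped[local_name(row["p"]["value"])].append(row["o"]["value"])
    return grouped

# Assuming `resp` is the response from the earlier sketch:
# grouped = group_bindings(resp.json()["results"]["bindings"])
# print(grouped["title"])               # ["Motion in Vision and Language: ..."]
# print(grouped["citedByCount"])        # ["1"]
# print(len(grouped["hasRelatedWork"])) # 20 related works in this listing
```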