Matches in SemOpenAlex for { <https://semopenalex.org/work/W2576877007> ?p ?o ?g. }
Showing items 1 to 77 of 77, with 100 items per page.
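The pattern in the header is the quad form used by the SemOpenAlex triple browser. As a minimal sketch, the same listing can be reproduced programmatically; this assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql, which is not shown on this page, so verify the URL against the SemOpenAlex documentation:

```python
# Hedged sketch: fetch all predicate/object pairs for work W2576877007,
# mirroring the { <work> ?p ?o ?g . } pattern shown above.
# Assumes the SemOpenAlex SPARQL endpoint lives at /sparql.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://semopenalex.org/sparql")
sparql.setQuery("""
    SELECT ?p ?o
    WHERE { <https://semopenalex.org/work/W2576877007> ?p ?o }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each row corresponds to one "- W2576877007 <predicate> <object>" line below.
    print(binding["p"]["value"], binding["o"]["value"])
```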
- W2576877007 abstract "Gesture Production during Spatial Tasks: Its Not All About Difficulty Autumn B. Hostetter (Autumn.Hostetter@kzoo.edu) Kalamazoo College, Department of Psychology, 1200 Academy Street Kalamazoo, MI 49006 USA Erin L. Sullivan (Erin.Sullivan08@kzoo.edu) Kalamazoo College, Department of Psychology, 1200 Academy Street Kalamazoo, MI 49006 USA Abstract Previous research has shown that speakers gesture more when describing imagistic information than when describing non- imagistic information. One explanation for this finding is that spatial information is more difficult to describe verbally than non-spatial information. In the present study, we designed two novel tasks, one verbal and one spatial, in which difficulty and spatiality were not confounded. In Experiment 1, we demonstrated that the spatial task is actually less difficult and leads to fewer errors than the verbal task. In Experiment 2, we found that speakers produced more representational gestures on the spatial task than the verbal task, even though it was not more difficult. Results suggest that speakers do not gesture on spatial tasks simply because they are more difficult. Keywords: gesture; mental imagery; task difficulty Introduction Representational gestures are movements of the hands and arms that represent motor or spatial concepts (e.g., moving a hand in an arc to show the trajectory of a falling object or drawing the shape of a triangle in the air to indicate a triangle). There is accumulating evidence that speakers are particularly likely to produce representational gestures when they are describing spatial information. For example, Beattie and Shovelton (2002) found that a property they termed imageability affected the probability that a clause of speech would be accompanied by gesture. Essentially, clauses that were most frequently accompanied by gesture were also most evocative of a mental image. Similarly, Feyereisen and Havard (1999) found that participants gestured significantly more when asked questions that would presumably activate mental images (e.g., Could you describe the room in which you live most often?) than questions that focused on abstract concepts (e.g., What do you think about the death penalty in the United States?). There are a variety of explanations for the co- occurrence of gestures with speech about spatial ideas. First, speakers may gesture with spatial concepts because gestures about spatial concepts are particularly likely to benefit a listener’s comprehension. However, Pine, Gurney, and Fletcher (2010) found that speakers gestured primarily with spatial concepts even when a listener could not see them, suggesting that the co-occurrence of gestures with spatial information is likely more than communicative. Second, gestures may occur with spatial information because spatial information affords gesture in a way that non-spatial information does not. For example, if a speaker is describing how big his fish was, he can easily display the size of the fish in gesture. If the speaker is describing how the fish tasted, on the other hand, it is much more difficult for him to represent the taste of the fish in his gesture. Indeed, Krauss, Dushay, Chen, and Rauscher (1995) found that very few of the speakers in their study gestured when describing the taste of a particular tea. While this explanation is often ignored by gesture researchers, it is likely part of the reason for the frequent occurrence of gesture with speech about spatial information. 
However, in addition to the fact that spatial information likely affords more gestures than non-spatial information, spatial representations in the mind of the speaker may also lead more naturally to gesture than non-spatial representations. When speakers think about spatial information, areas of visual and motor cortex are activated in much the same way as when the speakers actually interact with the physical world. When speakers imagine themselves performing a particular action, for example, the same areas of their motor cortex are activated as when they actually perform the action (Willems, Hagoort, & Casasanto, 2010). Further, imagining how something rotates relies on the same motor areas involved in physically rotating an object (Wexler, Kosslyn, & Berthoz, 1998). Such evidence suggests that spatial cognition is embodied, or rooted in the way our bodies interact with the physical world (see Wilson, 2002). A third possibility is thus that the gestures people produce when they speak about spatial information are a manifestation of the embodied cognitive processes that are involved in thinking about spatial information. Several current theories about the cognitive origin of gestures propose that gestures arise from imagistic representations in the mind of the speaker (e.g., Hostetter & Alibali, 2008; Kita & Özyürek, 2003). One such view, termed the Gesture as Simulated Action (GSA) framework, suggests that people produce representational gestures when their motor cortex is activated as the result of visual and motor simulations during thinking and speaking. When this motor activation is strong enough to cross a certain threshold (Hostetter & Alibali, 2008), the speaker produces a gesture. This threshold varies between individuals and situations, so what" @default.
- W2576877007 created "2017-01-26" @default.
- W2576877007 creator A5067255652 @default.
- W2576877007 creator A5071710930 @default.
- W2576877007 date "2011-01-01" @default.
- W2576877007 modified "2023-09-23" @default.
- W2576877007 title "Gestures are Produced During Spatial Tasks" @default.
- W2576877007 hasPublicationYear "2011" @default.
- W2576877007 type Work @default.
- W2576877007 sameAs 2576877007 @default.
- W2576877007 citedByCount "0" @default.
- W2576877007 crossrefType "journal-article" @default.
- W2576877007 hasAuthorship W2576877007A5067255652 @default.
- W2576877007 hasAuthorship W2576877007A5071710930 @default.
- W2576877007 hasConcept C145633318 @default.
- W2576877007 hasConcept C154945302 @default.
- W2576877007 hasConcept C15744967 @default.
- W2576877007 hasConcept C159620131 @default.
- W2576877007 hasConcept C162324750 @default.
- W2576877007 hasConcept C169760540 @default.
- W2576877007 hasConcept C169900460 @default.
- W2576877007 hasConcept C180747234 @default.
- W2576877007 hasConcept C187736073 @default.
- W2576877007 hasConcept C205649164 @default.
- W2576877007 hasConcept C207347870 @default.
- W2576877007 hasConcept C2777371692 @default.
- W2576877007 hasConcept C2780451532 @default.
- W2576877007 hasConcept C2781238097 @default.
- W2576877007 hasConcept C41008148 @default.
- W2576877007 hasConcept C46312422 @default.
- W2576877007 hasConcept C62649853 @default.
- W2576877007 hasConceptScore W2576877007C145633318 @default.
- W2576877007 hasConceptScore W2576877007C154945302 @default.
- W2576877007 hasConceptScore W2576877007C15744967 @default.
- W2576877007 hasConceptScore W2576877007C159620131 @default.
- W2576877007 hasConceptScore W2576877007C162324750 @default.
- W2576877007 hasConceptScore W2576877007C169760540 @default.
- W2576877007 hasConceptScore W2576877007C169900460 @default.
- W2576877007 hasConceptScore W2576877007C180747234 @default.
- W2576877007 hasConceptScore W2576877007C187736073 @default.
- W2576877007 hasConceptScore W2576877007C205649164 @default.
- W2576877007 hasConceptScore W2576877007C207347870 @default.
- W2576877007 hasConceptScore W2576877007C2777371692 @default.
- W2576877007 hasConceptScore W2576877007C2780451532 @default.
- W2576877007 hasConceptScore W2576877007C2781238097 @default.
- W2576877007 hasConceptScore W2576877007C41008148 @default.
- W2576877007 hasConceptScore W2576877007C46312422 @default.
- W2576877007 hasConceptScore W2576877007C62649853 @default.
- W2576877007 hasIssue "33" @default.
- W2576877007 hasLocation W25768770071 @default.
- W2576877007 hasOpenAccess W2576877007 @default.
- W2576877007 hasPrimaryLocation W25768770071 @default.
- W2576877007 hasRelatedWork W1992903665 @default.
- W2576877007 hasRelatedWork W2000196606 @default.
- W2576877007 hasRelatedWork W2086860589 @default.
- W2576877007 hasRelatedWork W2258362684 @default.
- W2576877007 hasRelatedWork W2406303455 @default.
- W2576877007 hasRelatedWork W2544772610 @default.
- W2576877007 hasRelatedWork W2567175992 @default.
- W2576877007 hasRelatedWork W2581416028 @default.
- W2576877007 hasRelatedWork W2734417770 @default.
- W2576877007 hasRelatedWork W2766755810 @default.
- W2576877007 hasRelatedWork W2782907791 @default.
- W2576877007 hasRelatedWork W2785035223 @default.
- W2576877007 hasRelatedWork W2891098396 @default.
- W2576877007 hasRelatedWork W2922080701 @default.
- W2576877007 hasRelatedWork W3024364909 @default.
- W2576877007 hasRelatedWork W3028451740 @default.
- W2576877007 hasRelatedWork W3083011872 @default.
- W2576877007 hasRelatedWork W3083460844 @default.
- W2576877007 hasRelatedWork W3122645841 @default.
- W2576877007 hasRelatedWork W3185245952 @default.
- W2576877007 hasVolume "33" @default.
- W2576877007 isParatext "false" @default.
- W2576877007 isRetracted "false" @default.
- W2576877007 magId "2576877007" @default.
- W2576877007 workType "article" @default.
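The hasRelatedWork links above only give opaque work IDs. As a hedged follow-up sketch, one query can resolve them to titles in a single round trip; matching predicates by local name with STRENDS avoids hard-coding the SemOpenAlex ontology namespace, which this listing does not spell out and is therefore treated as unknown here:

```python
# Hedged sketch: list the titles of the 20 related works shown above.
# Predicates are matched by local-name suffix rather than full URI,
# since the exact namespaces are an assumption, not given on this page.
from SPARQLWrapper import SPARQLWrapper, JSON

query = """
SELECT ?related ?title
WHERE {
  <https://semopenalex.org/work/W2576877007> ?rel ?related .
  FILTER(STRENDS(STR(?rel), "hasRelatedWork"))
  ?related ?titleProp ?title .
  FILTER(STRENDS(STR(?titleProp), "title"))
}
"""

sparql = SPARQLWrapper("https://semopenalex.org/sparql")
sparql.setQuery(query)
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["related"]["value"], "->", row["title"]["value"])
```

The same suffix-matching trick works for any of the predicates in the listing (hasConcept, hasAuthorship, citedByCount, and so on) if the full ontology URIs are not at hand.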