Matches in SemOpenAlex for { <https://semopenalex.org/work/W2767410147> ?p ?o ?g. }
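The pattern above is a quad pattern (subject, predicate, object, graph) anchored on the work <https://semopenalex.org/work/W2767410147>. Below is a minimal sketch of how one might retrieve the matches listed underneath over the standard SPARQL protocol; the endpoint URL and the use of `GRAPH ?g { ... }` to bind `?g` are assumptions, not taken from this listing, so adjust them to whatever endpoint SemOpenAlex currently exposes.

```python
# Sketch: fetch all (?p, ?o, ?g) matches for the work from a SPARQL endpoint.
# The endpoint URL below is an assumption; replace it if SemOpenAlex documents a different one.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2767410147> ?p ?o .
  }
}
"""

response = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=60,
)
response.raise_for_status()

# Print each match as "predicate object graph"; ?g may be unbound on some endpoints.
for binding in response.json()["results"]["bindings"]:
    print(
        binding["p"]["value"],
        binding["o"]["value"],
        binding.get("g", {}).get("value", ""),
    )
```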
- W2767410147 abstract "Feature-Semantic Gradients in Lexical Categorization Revealed by Graded Manual Responses. Rick Dale (rad28@cornell.edu), Nicholas C. Hindy (nch24@cornell.edu), Michael J. Spivey (spivey@cornell.edu). Department of Psychology, Cornell University, Ithaca, NY, 14853.
Abstract: Participants performed a categorization task in which basic-level animal names (e.g., cat) were assigned to their superordinate categories (e.g., mammal). Manual motor output was measured by sampling computer-mouse movement while participants clicked on the correct superordinate category label, and not on a simultaneously presented incorrect category. Animal names were selected from the concept-name set of McRae, de Sa, & Seidenberg (1997), in which each concept is associated with a sparse semantic feature vector. If the competing category label draws motor attraction during the categorization task, this attraction should be predicted by feature-semantic measures based on animals' proximity to the incorrect category. This proximity was computed by comparing each animal's feature vector to the mean vector of alternative category choices (e.g., cat's vector to the central tendency of all reptile vectors). Dependent measures were computed from mouse-movement trajectories. Degree of trajectory curvature correlated with the proximity of an animal's vector to the mean vector of alternative categories, but only in a particular feature-semantic space. Results suggest that continuous motor output may systematically reflect underlying cognitive processing.
Keywords: Categorization, representation, typicality, motor output, semantics.
Introduction: An increasing amount of research reveals that dynamic characteristics of motor output reflect underlying cognitive processing, rather than simply reflecting the discrete decision resulting from that processing. For example, when the cognitive system directs manual output amidst an array of graspable objects, the arm's movement does not always proceed in ballistic fashion toward a single selected object, but may vary continuously depending on the nature of underlying processing. Both manual output and oculomotor responses demonstrate these dynamic characteristics intrinsic to the temporal extent of a response, not just the final outcome of the response. For example, Doyle and Walker (2001) demonstrate that saccadic eye movements reflect attentional processing of visual cues in a simple fixation experiment. Saccade trajectories to the same location exhibit very subtle differential curvature depending on the position of distractor or cue stimuli (see also Sheliga, Riggio, & Rizzolatti, 1995). Additionally, considerable research over the past 10 years has shown that eye movements offer a semi-continuous measure of ongoing cognitive processing (Ballard, Hayhoe, & Pelz, 1995; Tanenhaus, Spivey-Knowlton, Eberhard, & Sedivy, 1995; Underwood, 2005). Aggregate data from eye movements often indicate a graded nature inherent to cognition in general. Similar findings demonstrate that manual motor output can reveal graded representations. The force and velocity of manual responses vary concomitantly with frequency in a lexical decision task (Abrams & Balota, 1991; Balota & Abrams, 1995), and with response and stimulus probability in simple reaction-time tasks (Mattes, Ulrich, & Miller, 2002; Ulrich, Mattes, & Miller, 1999; see also Osman, Kornblum, & Meyer, 1986; Balota, Boland, & Shields, 1989). And in experimental work similar to the saccade trajectory experiments described above, Tipper, Howard, and Jackson (1997) have shown that arm trajectories can curve depending on the visual distractor context in which reaching motions are made (see also Tipper et al., 1992; Sheliga et al., 1997). More recently, Spivey, Grosjean, and Knoblich (2005) and Dale, Kehoe, and Spivey (in press) used computer-mouse trajectories to show that graded manual output reveals temporal continuity in the underlying cognitive processes in spoken word recognition and categorization. In the latter two studies, manual trajectories were measured through streaming x-y coordinates of computer-mouse movement, and revealed attraction to other response choices in the visual display. For example, in Dale et al. (in press), mouse trajectories were recorded during lexical and pictorial categorization of animal exemplars. Participants categorized an animal by clicking the mouse on one of two category choices. Mouse-movement trajectories consisted of a movement from the bottom center of the screen to the correct target in the upper left- or right-hand corner of the screen (beside which was a competing category label). Target trials used atypical animals (e.g., whale) with an incorrect competitor category that had considerable overlap in terms of semantic and visual features (e.g., fish). Though participants responded by clicking the appropriate category (e.g., mammal), mouse-movement trajectories exhibited substantial attraction toward the competitor category. Competing activation of the incorrect category in these trials was evident even in the properties of the resultant motor" @default.
- W2767410147 created "2017-11-17" @default.
- W2767410147 creator A5009359203 @default.
- W2767410147 creator A5068731124 @default.
- W2767410147 creator A5084091340 @default.
- W2767410147 date "2006-01-01" @default.
- W2767410147 modified "2023-09-22" @default.
- W2767410147 title "Feature-Semantic Gradients in Lexical Categorization Revealed by Graded Manual Responses" @default.
- W2767410147 cites W1774048885 @default.
- W2767410147 cites W1967729480 @default.
- W2767410147 cites W1981206913 @default.
- W2767410147 cites W1983578042 @default.
- W2767410147 cites W1989304657 @default.
- W2767410147 cites W1991216449 @default.
- W2767410147 cites W2009498295 @default.
- W2767410147 cites W2011827429 @default.
- W2767410147 cites W2036691323 @default.
- W2767410147 cites W2038839297 @default.
- W2767410147 cites W2042938519 @default.
- W2767410147 cites W2053127376 @default.
- W2767410147 cites W2067223903 @default.
- W2767410147 cites W2067867446 @default.
- W2767410147 cites W2083124176 @default.
- W2767410147 cites W2083303100 @default.
- W2767410147 cites W2093458032 @default.
- W2767410147 cites W2102707045 @default.
- W2767410147 cites W2107958481 @default.
- W2767410147 cites W2108807532 @default.
- W2767410147 cites W2113881303 @default.
- W2767410147 cites W2119844732 @default.
- W2767410147 cites W2130130286 @default.
- W2767410147 cites W2136797351 @default.
- W2767410147 cites W2152444902 @default.
- W2767410147 cites W2169529893 @default.
- W2767410147 cites W2171235789 @default.
- W2767410147 hasPublicationYear "2006" @default.
- W2767410147 type Work @default.
- W2767410147 sameAs 2767410147 @default.
- W2767410147 citedByCount "0" @default.
- W2767410147 crossrefType "journal-article" @default.
- W2767410147 hasAuthorship W2767410147A5009359203 @default.
- W2767410147 hasAuthorship W2767410147A5068731124 @default.
- W2767410147 hasAuthorship W2767410147A5084091340 @default.
- W2767410147 hasConcept C126398093 @default.
- W2767410147 hasConcept C151730666 @default.
- W2767410147 hasConcept C153050134 @default.
- W2767410147 hasConcept C154945302 @default.
- W2767410147 hasConcept C15744967 @default.
- W2767410147 hasConcept C169760540 @default.
- W2767410147 hasConcept C169900460 @default.
- W2767410147 hasConcept C180747234 @default.
- W2767410147 hasConcept C188147891 @default.
- W2767410147 hasConcept C2779343474 @default.
- W2767410147 hasConcept C2779524336 @default.
- W2767410147 hasConcept C41008148 @default.
- W2767410147 hasConcept C46312422 @default.
- W2767410147 hasConcept C86803240 @default.
- W2767410147 hasConcept C94124525 @default.
- W2767410147 hasConceptScore W2767410147C126398093 @default.
- W2767410147 hasConceptScore W2767410147C151730666 @default.
- W2767410147 hasConceptScore W2767410147C153050134 @default.
- W2767410147 hasConceptScore W2767410147C154945302 @default.
- W2767410147 hasConceptScore W2767410147C15744967 @default.
- W2767410147 hasConceptScore W2767410147C169760540 @default.
- W2767410147 hasConceptScore W2767410147C169900460 @default.
- W2767410147 hasConceptScore W2767410147C180747234 @default.
- W2767410147 hasConceptScore W2767410147C188147891 @default.
- W2767410147 hasConceptScore W2767410147C2779343474 @default.
- W2767410147 hasConceptScore W2767410147C2779524336 @default.
- W2767410147 hasConceptScore W2767410147C41008148 @default.
- W2767410147 hasConceptScore W2767410147C46312422 @default.
- W2767410147 hasConceptScore W2767410147C86803240 @default.
- W2767410147 hasConceptScore W2767410147C94124525 @default.
- W2767410147 hasIssue "28" @default.
- W2767410147 hasLocation W27674101471 @default.
- W2767410147 hasOpenAccess W2767410147 @default.
- W2767410147 hasPrimaryLocation W27674101471 @default.
- W2767410147 hasRelatedWork W113673499 @default.
- W2767410147 hasRelatedWork W1540293728 @default.
- W2767410147 hasRelatedWork W1895824185 @default.
- W2767410147 hasRelatedWork W1973899547 @default.
- W2767410147 hasRelatedWork W197730313 @default.
- W2767410147 hasRelatedWork W2012714258 @default.
- W2767410147 hasRelatedWork W2013429447 @default.
- W2767410147 hasRelatedWork W2033158692 @default.
- W2767410147 hasRelatedWork W2062094064 @default.
- W2767410147 hasRelatedWork W2088696690 @default.
- W2767410147 hasRelatedWork W2140519006 @default.
- W2767410147 hasRelatedWork W2167079075 @default.
- W2767410147 hasRelatedWork W2395842061 @default.
- W2767410147 hasRelatedWork W2573537852 @default.
- W2767410147 hasRelatedWork W2576452649 @default.
- W2767410147 hasRelatedWork W2765325433 @default.
- W2767410147 hasRelatedWork W2768809476 @default.
- W2767410147 hasRelatedWork W2902663244 @default.
- W2767410147 hasRelatedWork W3139431667 @default.
- W2767410147 hasRelatedWork W91064846 @default.
- W2767410147 hasVolume "28" @default.
- W2767410147 isParatext "false" @default.
- W2767410147 isRetracted "false" @default.
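The abstract stored above describes computing an animal's proximity to a competing category by comparing its semantic feature vector with the mean (central tendency) of that category's member vectors. The abstract does not specify a distance metric, so the cosine-similarity choice below, the toy feature vectors, and all names are illustrative assumptions, not the authors' implementation or the McRae et al. (1997) norms.

```python
# Illustrative sketch of the proximity measure described in the abstract:
# compare one concept's sparse feature vector to the mean vector of a
# competing category. Metric and data are assumptions for illustration only.
import numpy as np

def proximity_to_category(concept_vector: np.ndarray,
                          category_vectors: np.ndarray) -> float:
    """Cosine similarity between a concept vector and a category's mean vector."""
    mean_vector = category_vectors.mean(axis=0)  # central tendency of the category
    num = float(concept_vector @ mean_vector)
    denom = float(np.linalg.norm(concept_vector) * np.linalg.norm(mean_vector))
    return num / denom if denom else 0.0

# Toy example: a hypothetical 'cat' vector compared against hypothetical reptile vectors.
cat = np.array([1, 0, 1, 1, 0], dtype=float)
reptiles = np.array([[0, 1, 1, 0, 1],
                     [0, 1, 0, 0, 1]], dtype=float)
print(proximity_to_category(cat, reptiles))
```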