Matches in SemOpenAlex for { <https://semopenalex.org/work/W2767282675> ?p ?o ?g. }
- W2767282675 abstract "Diagnostic Object Motion Weakens Representations of Static Form. Benjamin Balas (bjbalas@mit.edu), Pawan Sinha (psinha@mit.edu), Department of Brain and Cognitive Sciences, 43 Vassar St., Cambridge, MA 02140 USA.
Abstract: Past studies have shown that information about how objects move can play an important role in their recognition. Flow-fields associated with an object’s intrinsic motion, and also the sequence of views it presents over time, can be used to identify the object and also link its disparate appearances. In the current study, we demonstrate that diagnostic object motion is such a perceptually significant cue that it can actually impair classification by de-emphasizing static figural information. Our stimuli comprise exemplars from a synthetic object category. The exemplars can be distinguished from each other on the basis of both static and dynamic cues. When object dynamics perfectly correlate with category membership during training, observers tested at static image classification display significantly longer RTs than observers trained with non-diagnostic object motion. This demonstrates that object motion is a particularly salient aspect of object appearance, capable of suppressing equally useful qualities such as static form, color, or texture.
Keywords: Object recognition; object motion; categorization.
Introduction: To what extent does object motion play a role in object recognition? This apparently simple question has a complicated answer. In particular, while there is a great deal of evidence suggesting human observers can and do use intrinsic object motion as a cue for identity, it remains unclear how motion and form interact during the acquisition of object concepts. In the current study, we attempt to address this issue by investigating the effects of diagnostic and non-diagnostic motion on the categorization of static objects.
Observers do use object motion to categorize stimuli. Though this can be seen in the results of studies using clearly viewed objects (Newell, Wallraven, & Huber, 2004), it is particularly evident when static form is degraded. An extreme version of this is the perception of “point-light walkers” (Johansson, 1973). In the absence of static cues for identity and gender, observers make good use of dynamic input to categorize walkers (Kozlowski & Cutting, 1977). A similar result obtains for face recognition. An “average” face that is made to undergo the idiosyncratic motions of a particular individual can be identified as that individual by naive observers (Hill & Johnston, 2001; Knappmeyer, Thornton, & Bulthoff, 2003). Finally, there are many studies suggesting that observation of a familiar moving face or body facilitates recognition under degraded viewing conditions (Burton, 1999; Knight & Johnson, 1997; Lander & Bruce, 2000). There remain several open issues, especially the existence of a motion benefit for unfamiliar faces and the possible differences between rigid and non-rigid motion (Christie & Bruce, 1988; Pike, Kemp, Towell, & Phillips, 1997; Schiff, 1986). The overall picture appears to be quite complex, but it seems fair to say that in some circumstances object motion is relied upon for categorization when static form is impoverished.
A second issue regarding the use of motion and form for recognition relates to what happens when motion cues and form cues conflict somehow. By setting motion and form against one another, we can determine the relative weight allotted to each under clear viewing conditions. Currently, there is some evidence that the motion of an object may take precedence over static form cues. For example, a “chimeric” point-light walker with static cues indicative of one gender (as defined by shoulder-hip ratio) and dynamic cues indicative of the other is categorized according to its movement rather than its form (Thornton, Vuong, & Bulthoff, 2003). Similarly, in face perception there is evidence that infants use dynamic information more than static form as a cue for identity (Spencer, O'Brien, Johnston, & Hill, 2006). Infants will not dishabituate to an old motion pattern superimposed on a new face, indicating that the novelty of the form does not compensate for the familiarity of the motion. Finally, there are several results demonstrating that the direction of rotation for an unfamiliar object becomes an important cue for recognition after relatively little training (Stone, 1998; Vuong & Tarr, 2004). Specifically, reversing the direction of rotation has a strong impact on recognition ability, despite the fact that the same static information is available during training and test periods. Object motion overshadows form in this task, in that the violation of expected object motion has strong consequences even though form is preserved.
These lines of work indicate that observers use object motion for recognition, and even suggest that it is given more importance than static form. In the current study, we extend this idea by examining whether or not observed object motion during training can affect test performance with static images. If object motion provides independent features for recognition, the absence of dynamic features at test should eliminate the effects of dynamic training. However, if dynamic training can affect later static performance, that provides good evidence for an interaction between object motion and the encoding of static form. Presently, it is unclear whether or not observed object motion can affect static recognition. During rigid rotation, it has been suggested that “structure-from-motion” might" @default.
- W2767282675 created "2017-11-17" @default.
- W2767282675 creator A5022839055 @default.
- W2767282675 creator A5057633538 @default.
- W2767282675 date "2007-01-01" @default.
- W2767282675 modified "2023-09-23" @default.
- W2767282675 title "Diagnostic Object Motion Weakens Representations of Static Form" @default.
- W2767282675 cites W1975998885 @default.
- W2767282675 cites W1977324640 @default.
- W2767282675 cites W1985520156 @default.
- W2767282675 cites W1986340760 @default.
- W2767282675 cites W1986874010 @default.
- W2767282675 cites W1989201386 @default.
- W2767282675 cites W1997501629 @default.
- W2767282675 cites W2000364989 @default.
- W2767282675 cites W2011960721 @default.
- W2767282675 cites W2016351257 @default.
- W2767282675 cites W2028762931 @default.
- W2767282675 cites W2049428132 @default.
- W2767282675 cites W2053931351 @default.
- W2767282675 cites W2057623852 @default.
- W2767282675 cites W2099634219 @default.
- W2767282675 cites W2106995791 @default.
- W2767282675 cites W2112372463 @default.
- W2767282675 cites W2116848344 @default.
- W2767282675 cites W2120630712 @default.
- W2767282675 cites W2141573556 @default.
- W2767282675 cites W2144543461 @default.
- W2767282675 cites W2156154638 @default.
- W2767282675 cites W2167636130 @default.
- W2767282675 cites W2025175823 @default.
- W2767282675 hasPublicationYear "2007" @default.
- W2767282675 type Work @default.
- W2767282675 sameAs 2767282675 @default.
- W2767282675 citedByCount "0" @default.
- W2767282675 crossrefType "journal-article" @default.
- W2767282675 hasAuthorship W2767282675A5022839055 @default.
- W2767282675 hasAuthorship W2767282675A5057633538 @default.
- W2767282675 hasConcept C104114177 @default.
- W2767282675 hasConcept C138885662 @default.
- W2767282675 hasConcept C154945302 @default.
- W2767282675 hasConcept C15744967 @default.
- W2767282675 hasConcept C169760540 @default.
- W2767282675 hasConcept C180747234 @default.
- W2767282675 hasConcept C20864712 @default.
- W2767282675 hasConcept C26760741 @default.
- W2767282675 hasConcept C2778738651 @default.
- W2767282675 hasConcept C2779304628 @default.
- W2767282675 hasConcept C2781238097 @default.
- W2767282675 hasConcept C31972630 @default.
- W2767282675 hasConcept C41008148 @default.
- W2767282675 hasConcept C41895202 @default.
- W2767282675 hasConcept C46312422 @default.
- W2767282675 hasConcept C48575856 @default.
- W2767282675 hasConcept C77805123 @default.
- W2767282675 hasConcept C94124525 @default.
- W2767282675 hasConceptScore W2767282675C104114177 @default.
- W2767282675 hasConceptScore W2767282675C138885662 @default.
- W2767282675 hasConceptScore W2767282675C154945302 @default.
- W2767282675 hasConceptScore W2767282675C15744967 @default.
- W2767282675 hasConceptScore W2767282675C169760540 @default.
- W2767282675 hasConceptScore W2767282675C180747234 @default.
- W2767282675 hasConceptScore W2767282675C20864712 @default.
- W2767282675 hasConceptScore W2767282675C26760741 @default.
- W2767282675 hasConceptScore W2767282675C2778738651 @default.
- W2767282675 hasConceptScore W2767282675C2779304628 @default.
- W2767282675 hasConceptScore W2767282675C2781238097 @default.
- W2767282675 hasConceptScore W2767282675C31972630 @default.
- W2767282675 hasConceptScore W2767282675C41008148 @default.
- W2767282675 hasConceptScore W2767282675C41895202 @default.
- W2767282675 hasConceptScore W2767282675C46312422 @default.
- W2767282675 hasConceptScore W2767282675C48575856 @default.
- W2767282675 hasConceptScore W2767282675C77805123 @default.
- W2767282675 hasConceptScore W2767282675C94124525 @default.
- W2767282675 hasIssue "29" @default.
- W2767282675 hasLocation W27672826751 @default.
- W2767282675 hasOpenAccess W2767282675 @default.
- W2767282675 hasPrimaryLocation W27672826751 @default.
- W2767282675 hasRelatedWork W188900068 @default.
- W2767282675 hasRelatedWork W1964611493 @default.
- W2767282675 hasRelatedWork W1973807968 @default.
- W2767282675 hasRelatedWork W1999249890 @default.
- W2767282675 hasRelatedWork W2020360005 @default.
- W2767282675 hasRelatedWork W2025182937 @default.
- W2767282675 hasRelatedWork W2093859439 @default.
- W2767282675 hasRelatedWork W2095550040 @default.
- W2767282675 hasRelatedWork W2109626523 @default.
- W2767282675 hasRelatedWork W2120414517 @default.
- W2767282675 hasRelatedWork W2147992717 @default.
- W2767282675 hasRelatedWork W2314568509 @default.
- W2767282675 hasRelatedWork W2592484920 @default.
- W2767282675 hasRelatedWork W2767345573 @default.
- W2767282675 hasRelatedWork W2954367099 @default.
- W2767282675 hasRelatedWork W2990733171 @default.
- W2767282675 hasRelatedWork W3203421553 @default.
- W2767282675 hasRelatedWork W361316 @default.
- W2767282675 hasRelatedWork W46123646 @default.
- W2767282675 hasRelatedWork W2783323118 @default.
- W2767282675 hasVolume "29" @default.
- W2767282675 isParatext "false" @default.
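
The listing above corresponds to the quad pattern shown in the header line. A minimal SPARQL sketch that reproduces it, assuming the public SemOpenAlex endpoint at https://semopenalex.org/sparql:

```sparql
# List every predicate (?p), object (?o), and named graph (?g)
# recorded for the work W2767282675.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2767282675> ?p ?o .
  }
}
```

The GRAPH clause binds ?g to the named graph holding each statement, mirroring the fourth position (?g) in the header's { ?s ?p ?o ?g } pattern.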