Matches in SemOpenAlex for { <https://semopenalex.org/work/W2564496691> ?p ?o ?g. }
Showing items 1 to 85 of 85, with 100 items per page.
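The listing below can also be retrieved programmatically. Here is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (endpoint URL and availability are assumptions, not stated on this page) and the SPARQLWrapper Python package; the SELECT query is one reading of the `{ <https://semopenalex.org/work/W2564496691> ?p ?o ?g. }` pattern in the header, with ?g bound to the named graph containing each statement.

```python
# Sketch: fetch every (?p, ?o, ?g) triple asserted about work W2564496691.
# The endpoint URL below is an assumption; adjust if SemOpenAlex documents a
# different SPARQL service location.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2564496691> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each row mirrors one line of the listing below: predicate, object, graph.
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```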
- W2564496691 abstract "Multisensory Associative-Pair Learning: Evidence for ‘Unitization’ as a specialized mechanism. Elan Barenholtz (elan.barenholtz@fau.edu), Department of Psychology, 777 Glades Road, Boca Raton, FL 33433; Meredith Davidson (mdavid14@fau.edu), Department of Psychology, 777 Glades Road, Boca Raton, FL 33433; David Lewkowicz (lewkowic@fau.edu), Department of Psychology, 777 Glades Road, Boca Raton, FL 33433. Abstract: Learning about objects typically involves the association of multisensory attributes. Here, we present three experiments supporting the existence of a specialized form of associative learning that depends on ‘unitization’. When multisensory pairs (e.g. faces and voices) were likely to both belong to a single object, learning was superior than when the pairs were not likely to belong to the same object. Experiment 1 found that learning of face-voice pairs was superior when the members of each pair were the same gender vs. opposite gender. Experiment 2 found a similar result when the paired associates were pictures and vocalizations of the same species vs. different species (dogs and birds). In Experiment 3, gender-incongruent video and audio stimuli were dubbed, producing an artificially unitized stimulus reducing the congruency advantage. Overall, these results suggest that unitizing multisensory attributes into a single object or identity is a specialized form of associative learning. Introduction: Learning about objects typically involves the detection and association of multisensory attributes. For example, we may be able to identify certain foods based on their visual, gustatory, tactile as well as olfactory properties. Likewise, ‘knowing’ a person typically means being able to associate his or her face with his or her voice. How do we encode the multisensory properties of objects? One possibility is that such “object knowledge” simply consists of a network of associations among each of an object’s unisensory properties. According to this view, our knowledge about unitary objects may depend on the same learning mechanisms as other types of object memory, such as associations between different objects or between objects and other properties of the environments in which they appear. A second possibility is that multiple unisensory object properties are all linked via an intermediate ‘supramodal’ representation of the object (Mesulam, 1998). According to this view, associating intra-object information is a special class of associative learning, involving the creation of a ‘unitized’ representation (Cohen, Poldrack, & Eichenbaum, 1997; Eichenbaum, 1997; Eichenbaum & Bunsey, 1995). This view is represented in a number of theories of face recognition which hold that associating the face and voice of an individual depends on integrating distinct informational streams into a single, ‘Personal Identity Node’, or PIN (Bruce & Young, 1986; Burton, Bruce, & Johnston, 1990; Ellis, Jones, & Mosdell, 1997). Unitizing multisensory properties may make multisensory object-knowledge more efficient, since each observed property of that object may be associated with all other, previously observed, properties via a single link, rather than maintaining associations among many disparate properties. An additional potential advantage to a unitized representation, implicit in the PIN model, is that it may help to organize associations that go beyond specific stimulus-stimulus pairings to more abstract properties of an underlying ‘object’. For example, if one has encountered a specific auditory utterance of an individual, along with his or her face, it would be advantageous to associate a different utterance by the same individual with that face. Presumably, this depends on extracting ‘invariant’ properties of the underlying voice from the sample. Representing individual face and voice stimuli as properties of the same underlying individual may facilitate this process. Despite the potential theoretical advantages to unitization, there has been no direct behavioral support for the idea that multisensory unitization is a specialized form of associative learning. In the current study, we compared associative learning of visual/auditory pairs under conditions where the members of the pair were either likely or unlikely to belong to the same object by virtue of their membership in the same or different category. Specifically, we compared face/voice learning when the members of each pair were of the same or opposite gender (Experiment 1) or the same or different species (Experiment 2). We reasoned that since only congruent pairs are consistent with belonging to the same object (for example, our experience is that people with male faces always have male voices) they would be likely to be" @default.
- W2564496691 created "2017-01-06" @default.
- W2564496691 creator A5029263332 @default.
- W2564496691 creator A5035096794 @default.
- W2564496691 creator A5055719182 @default.
- W2564496691 date "2011-01-01" @default.
- W2564496691 modified "2023-09-24" @default.
- W2564496691 title "Multisensory Associative-Pair Learning: Evidence for 'Unitization' as a specialized mechanism" @default.
- W2564496691 cites W1497204367 @default.
- W2564496691 cites W1873033581 @default.
- W2564496691 cites W1978109396 @default.
- W2564496691 cites W2011096173 @default.
- W2564496691 cites W2019111214 @default.
- W2564496691 cites W2050379054 @default.
- W2564496691 cites W2058580374 @default.
- W2564496691 cites W2073367016 @default.
- W2564496691 cites W2114033154 @default.
- W2564496691 cites W2123341385 @default.
- W2564496691 cites W2126883639 @default.
- W2564496691 cites W2148601705 @default.
- W2564496691 cites W2149095485 @default.
- W2564496691 cites W2158422865 @default.
- W2564496691 cites W312185384 @default.
- W2564496691 cites W578997765 @default.
- W2564496691 hasPublicationYear "2011" @default.
- W2564496691 type Work @default.
- W2564496691 sameAs 2564496691 @default.
- W2564496691 citedByCount "0" @default.
- W2564496691 crossrefType "journal-article" @default.
- W2564496691 hasAuthorship W2564496691A5029263332 @default.
- W2564496691 hasAuthorship W2564496691A5035096794 @default.
- W2564496691 hasAuthorship W2564496691A5055719182 @default.
- W2564496691 hasConcept C154945302 @default.
- W2564496691 hasConcept C15744967 @default.
- W2564496691 hasConcept C159423971 @default.
- W2564496691 hasConcept C180747234 @default.
- W2564496691 hasConcept C188147891 @default.
- W2564496691 hasConcept C202444582 @default.
- W2564496691 hasConcept C2775852435 @default.
- W2564496691 hasConcept C2779918689 @default.
- W2564496691 hasConcept C2983526489 @default.
- W2564496691 hasConcept C33923547 @default.
- W2564496691 hasConcept C41008148 @default.
- W2564496691 hasConcept C46312422 @default.
- W2564496691 hasConceptScore W2564496691C154945302 @default.
- W2564496691 hasConceptScore W2564496691C15744967 @default.
- W2564496691 hasConceptScore W2564496691C159423971 @default.
- W2564496691 hasConceptScore W2564496691C180747234 @default.
- W2564496691 hasConceptScore W2564496691C188147891 @default.
- W2564496691 hasConceptScore W2564496691C202444582 @default.
- W2564496691 hasConceptScore W2564496691C2775852435 @default.
- W2564496691 hasConceptScore W2564496691C2779918689 @default.
- W2564496691 hasConceptScore W2564496691C2983526489 @default.
- W2564496691 hasConceptScore W2564496691C33923547 @default.
- W2564496691 hasConceptScore W2564496691C41008148 @default.
- W2564496691 hasConceptScore W2564496691C46312422 @default.
- W2564496691 hasIssue "33" @default.
- W2564496691 hasLocation W25644966911 @default.
- W2564496691 hasOpenAccess W2564496691 @default.
- W2564496691 hasPrimaryLocation W25644966911 @default.
- W2564496691 hasRelatedWork W1493549571 @default.
- W2564496691 hasRelatedWork W1968702067 @default.
- W2564496691 hasRelatedWork W2011690062 @default.
- W2564496691 hasRelatedWork W2044261945 @default.
- W2564496691 hasRelatedWork W2047311563 @default.
- W2564496691 hasRelatedWork W2050890463 @default.
- W2564496691 hasRelatedWork W2060398448 @default.
- W2564496691 hasRelatedWork W2074663372 @default.
- W2564496691 hasRelatedWork W2076150739 @default.
- W2564496691 hasRelatedWork W2086402272 @default.
- W2564496691 hasRelatedWork W2093844615 @default.
- W2564496691 hasRelatedWork W2112445545 @default.
- W2564496691 hasRelatedWork W2142393318 @default.
- W2564496691 hasRelatedWork W2142505231 @default.
- W2564496691 hasRelatedWork W2302647434 @default.
- W2564496691 hasRelatedWork W2419453286 @default.
- W2564496691 hasRelatedWork W2531369708 @default.
- W2564496691 hasRelatedWork W2572733356 @default.
- W2564496691 hasRelatedWork W2786082069 @default.
- W2564496691 hasRelatedWork W2999159143 @default.
- W2564496691 hasVolume "33" @default.
- W2564496691 isParatext "false" @default.
- W2564496691 isRetracted "false" @default.
- W2564496691 magId "2564496691" @default.
- W2564496691 workType "article" @default.