Matches in SemOpenAlex for { <https://semopenalex.org/work/W2111860260> ?p ?o ?g. }
- W2111860260 abstract "Michael Esterman and Steven Yantis, Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD 21218-2686, USA

Selective visual attention directed to a location (even in the absence of a stimulus) increases activity in the corresponding regions of visual cortex and enhances the speed and accuracy of target perception. We further explored top-down influences on perceptual representations by manipulating observers' expectations about the category of an upcoming target. Observers viewed a display in which an object (either a face or a house) gradually emerged from a state of phase-scrambled noise; a cue established expectation about the object category. Observers were faster to categorize faces (gender discrimination) or houses (structural discrimination) when the category of the partially scrambled object matched their expectation. Functional magnetic resonance imaging revealed that this expectation was associated with anticipatory increases in category-specific visual cortical activity, even in the absence of object- or category-specific visual information. Expecting a face evoked increased activity in face-selective cortical regions in the fusiform gyrus and superior temporal sulcus. Conversely, expecting a house increased activity in parahippocampal gyrus. These results suggest that visual anticipation facilitates subsequent perception by recruiting, in advance, the same cortical mechanisms as those involved in perception.

Keywords: attention, expectation, fMRI, object perception

Introduction

The perception and interpretation of the visual world is always colored by previous experience and by current expectations. Helmholtz (1867/1910) argued that observers routinely generate hypotheses about what they expect to see, based on prior experience and current goals. These expectations can be rich with spatial, featural, and object-specific content. Many studies have demonstrated that perception is facilitated when a target appears at an expected location (Posner et al. 1980; Prinzmetal et al. 2005). Similarly, expectation of visual features such as shape, color, and motion facilitates recognition and discrimination (e.g., Ball and Sekuler 1981; Corbetta et al. 1990; Leonard and Egeth 2008).

Higher level visual priming of complex stimuli such as faces and houses has recently been shown to facilitate object recognition as well. For example, Puri and Wojciulik (2008) had participants discriminate between normal and distorted images of faces and houses. When they were told in advance to expect a particular famous face, participants were faster and more accurate at the discrimination when the target matched their expectation. In many of these examples, not only did correct expectation lead to better performance but also incorrect expectation impaired performance. In addition to knowing 'where' and 'what' to expect, we can also generate hypotheses about 'when' a stimulus will appear. This kind of temporal cueing or expectation also facilitates visual perception (e.g., Correa et al. 2005; for review, see Nobre et al. 2007).

The neural mechanisms of spatial and feature-based visual expectation have been studied extensively. For example, directing attention to a location in anticipation of a target leads to increased neural activity in spatially specific regions of visual cortex (Luck et al. 1997; Kastner et al. 1999; Hopfinger et al. 2000). Furthermore, this effect is modulated by task difficulty, such that anticipating a more difficult discrimination leads to a greater baseline shift in cortical activity even in the absence of visual stimulation (Serences et al. 2004). Changes in baseline activity have also been observed in the feature domain: anticipation of the onset of motion leads to anticipatory activity in motion-selective visual cortex, area MT/MST (Chawla et al. 1999; Shulman et al. 1999), and conversely activity in color-selective V4 increases, in the absence of color, when a color task is anticipated. Together, these studies suggest that one of the mechanisms of perceptual expectation is anticipatory neural activity in brain regions associated with stimulus-specific information. It has been demonstrated that these anticipatory effects in the visual brain are associated with, and presumably controlled by, regions of parietal and frontal cortices, such as intraparietal sulcus (IPS), frontal eye fields, and lateral prefrontal cortex (PFC) (e.g., Kastner et al. 1999; Hopfinger et al. 2000). However, it has not clearly been demonstrated that anticipatory effects extend to more abstract categorical visual representations.

Object recognition in the real world is often ambiguous and continuously unfolding in time. One way in which this aspect of the recognition process has been studied is with ambiguous images that gradually become disambiguated over several seconds. In the 1960s, Bruner and Potter (1964) found that when gradually reducing the blurriness of an image, the blurrier the image was when the sequence began, the longer it took for participants to recognize the object. Bruner and Potter argued that longer exposure to blurry objects led to 'false' and premature hypotheses or expectations, impairing the recognition process.

This method has usefully been applied to investigate the neural basis of expectation on object recognition using functional magnetic resonance imaging (fMRI). For example, Eger et al. (2007) found that word cues did speed the recognition of objects that were slowly coming into focus, but this was not accompanied by anticipatory changes in visual cortex when controlling for speed of recognition. However, in that study, the primes were verbal rather than visual. For example, expecting a 'rabbit' to appear does not necessarily lead to a useful visual template for object recognition because there is a nearly infinite variety of specific visual images that could correspond to 'rabbit'." @default.
- W2111860260 created "2016-06-24" @default.
- W2111860260 creator A5003593297 @default.
- W2111860260 creator A5028963810 @default.
- W2111860260 date "2010-01-01" @default.
- W2111860260 modified "2023-09-27" @default.
- W2111860260 title "Perceptual Expectation Evokes Category-Selective Cortical Activity" @default.
- W2111860260 cites W1481105901 @default.
- W2111860260 cites W1588833135 @default.
- W2111860260 cites W1619178479 @default.
- W2111860260 cites W1708732123 @default.
- W2111860260 cites W1831445803 @default.
- W2111860260 cites W1873033581 @default.
- W2111860260 cites W1886182875 @default.
- W2111860260 cites W1968635050 @default.
- W2111860260 cites W1980914952 @default.
- W2111860260 cites W1985153917 @default.
- W2111860260 cites W1985236820 @default.
- W2111860260 cites W1996398666 @default.
- W2111860260 cites W1999110283 @default.
- W2111860260 cites W2014870330 @default.
- W2111860260 cites W2017856636 @default.
- W2111860260 cites W2022613290 @default.
- W2111860260 cites W2024268742 @default.
- W2111860260 cites W2037652044 @default.
- W2111860260 cites W2044179284 @default.
- W2111860260 cites W2048335925 @default.
- W2111860260 cites W2056989262 @default.
- W2111860260 cites W2068718477 @default.
- W2111860260 cites W2081923799 @default.
- W2111860260 cites W2085663684 @default.
- W2111860260 cites W2087044777 @default.
- W2111860260 cites W2095909460 @default.
- W2111860260 cites W2096955942 @default.
- W2111860260 cites W2098982334 @default.
- W2111860260 cites W2100879297 @default.
- W2111860260 cites W2111609296 @default.
- W2111860260 cites W2121449541 @default.
- W2111860260 cites W2123260548 @default.
- W2111860260 cites W2123341385 @default.
- W2111860260 cites W2124275365 @default.
- W2111860260 cites W2124413776 @default.
- W2111860260 cites W2124807472 @default.
- W2111860260 cites W2125523734 @default.
- W2111860260 cites W2127870911 @default.
- W2111860260 cites W2128294552 @default.
- W2111860260 cites W2129071654 @default.
- W2111860260 cites W2131744263 @default.
- W2111860260 cites W2131998780 @default.
- W2111860260 cites W2134927309 @default.
- W2111860260 cites W2145458017 @default.
- W2111860260 cites W2149577107 @default.
- W2111860260 cites W2150171475 @default.
- W2111860260 cites W2156417566 @default.
- W2111860260 cites W2164239909 @default.
- W2111860260 cites W2165582963 @default.
- W2111860260 cites W2167957356 @default.
- W2111860260 hasPublicationYear "2010" @default.
- W2111860260 type Work @default.
- W2111860260 sameAs 2111860260 @default.
- W2111860260 citedByCount "0" @default.
- W2111860260 crossrefType "journal-article" @default.
- W2111860260 hasAuthorship W2111860260A5003593297 @default.
- W2111860260 hasAuthorship W2111860260A5028963810 @default.
- W2111860260 hasConcept C154945302 @default.
- W2111860260 hasConcept C15744967 @default.
- W2111860260 hasConcept C161657702 @default.
- W2111860260 hasConcept C169760540 @default.
- W2111860260 hasConcept C178253425 @default.
- W2111860260 hasConcept C180747234 @default.
- W2111860260 hasConcept C198313034 @default.
- W2111860260 hasConcept C200928527 @default.
- W2111860260 hasConcept C26760741 @default.
- W2111860260 hasConcept C2777655717 @default.
- W2111860260 hasConcept C2779226451 @default.
- W2111860260 hasConcept C2779345533 @default.
- W2111860260 hasConcept C2779918689 @default.
- W2111860260 hasConcept C2781238097 @default.
- W2111860260 hasConcept C41008148 @default.
- W2111860260 hasConcept C46312422 @default.
- W2111860260 hasConceptScore W2111860260C154945302 @default.
- W2111860260 hasConceptScore W2111860260C15744967 @default.
- W2111860260 hasConceptScore W2111860260C161657702 @default.
- W2111860260 hasConceptScore W2111860260C169760540 @default.
- W2111860260 hasConceptScore W2111860260C178253425 @default.
- W2111860260 hasConceptScore W2111860260C180747234 @default.
- W2111860260 hasConceptScore W2111860260C198313034 @default.
- W2111860260 hasConceptScore W2111860260C200928527 @default.
- W2111860260 hasConceptScore W2111860260C26760741 @default.
- W2111860260 hasConceptScore W2111860260C2777655717 @default.
- W2111860260 hasConceptScore W2111860260C2779226451 @default.
- W2111860260 hasConceptScore W2111860260C2779345533 @default.
- W2111860260 hasConceptScore W2111860260C2779918689 @default.
- W2111860260 hasConceptScore W2111860260C2781238097 @default.
- W2111860260 hasConceptScore W2111860260C41008148 @default.
- W2111860260 hasConceptScore W2111860260C46312422 @default.
- W2111860260 hasLocation W21118602601 @default.
- W2111860260 hasOpenAccess W2111860260 @default.
- W2111860260 hasPrimaryLocation W21118602601 @default.
- W2111860260 hasRelatedWork W146946113 @default.
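The listing above can be regenerated by running the graph pattern from the header against a SemOpenAlex SPARQL endpoint. A minimal sketch follows, assuming the public endpoint at https://semopenalex.org/sparql and expressing the quad pattern { <work> ?p ?o ?g } in standard SPARQL via a GRAPH clause; the endpoint URL and graph layout are assumptions, not part of the listing itself:

    # Enumerate every predicate/object pair for this work, together with
    # the named graph each triple is stored in.
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W2111860260> ?p ?o .
      }
    }

Submitting this query (e.g., with an HTTP GET to the endpoint's /sparql path and the query as a URL parameter) should return one row per line of the match listing above.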