Matches in SemOpenAlex for { <https://semopenalex.org/work/W2019970108> ?p ?o ?g. }
Showing 78 matches.
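The header above shows only the triple pattern used. For readers who want to reproduce this listing programmatically, here is a minimal sketch of the corresponding query run against SemOpenAlex's public SPARQL endpoint; the endpoint URL and JSON result handling follow the standard SPARARQL protocol conventions, so adjust if the service differs:

```python
import requests

# Triple pattern from the header: all predicates and objects for this work.
QUERY = """
SELECT ?p ?o
WHERE { <https://semopenalex.org/work/W2019970108> ?p ?o . }
"""

# Assumed public endpoint for SemOpenAlex.
ENDPOINT = "https://semopenalex.org/sparql"

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},  # standard SPARQL JSON results
    timeout=30,
)
resp.raise_for_status()

for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```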
- W2019970108 endingPage "R543" @default.
- W2019970108 startingPage "R542" @default.
- W2019970108 abstract "Motion of the image of an object across the retina (or a camera sensor) may be due to movement of the object, movement of the observation point or a combination of the two. Humans are able to routinely distinguish between these causes and correctly perceive whether an object of interest is in motion or scene-stationary. The important question is how this ubiquitous and difficult problem is solved. We have investigated whether the brain can resolve the ambiguity by comparing the retinal motion of the object of interest — the target — to that of scene objects. We find that relative retinal motion can indeed be used, and suggest that the processing may be done by cortical areas sensitive to optic flow [1Warren W.H. Hannon D.J. Direction of self-motion is perceived from optical flow.Nature. 1988; 336: 162-168Crossref Scopus (408) Google Scholar, 2Vaina L.M. Beardsley S.A. Rushton S.K. Optic Flow and Beyond. Kluwer, 2004Crossref Google Scholar]. Moving the scene relative to a stationary observer — rather than the observer within the scene — provides a way to focus on the role of relative retinal motion by excluding the contribution of other sources of movement information. Some particularly ingenious researchers have found ways to move physical rooms or their ‘virtual’ equivalents [3Lee D.N. Lishman J.R. Visual proprioceptive control of stance.J. Human Movement Studies. 1975; 1: 87-95Google Scholar, 4Howard I.P. Childerson L. The contribution of motion, the visual frame and visual polarity to sensations of body tilt.Perception. 1994; 23: 753-762Crossref PubMed Scopus (112) Google Scholar, 5Harris L.R. Jenkin M.R. Dyde R.T. Jenkin H.L. Failure to update spatial location correctly using visual cues alone.J. Vis. 2004; 4: 381aCrossref Google Scholar] around static observers. In this study, we employed an alternative, simpler solution and moved a virtual scene composed of an array of cubes (Figure 1, right panel), presented on a CRT to a stationary observer. The scene was rendered in stereo and viewed through shutter glasses which produced a compelling percept of three-dimensional objects floating in space. In every presentation, a stationary target object was placed directly ahead of the stationary observer. Scene objects were moved over the screen to produce the pattern of retinal motion that would result if the observer undertook the natural action of maintaining fixation on the target object while moving sideways (Figure 1, left panel). This is a particularly interesting action for two reasons: firstly, it produces a complex pattern of retinal motion (Figure 1, right panel); and secondly, because of the geometrical consequences of the relationship between target distance and scene-relative movement (Figure 1, left panel). Examine the left panel illustrating the simulated observer movement and note that the target object is shown at three distances: F1, F2 and N (filled circles). Because the target remains directly ahead of the observer during the simulated movement, it must be moving within the scene. Geometry dictates that a target object at F1 is moving faster through the scene than a target object at F2. Further, a target object at N is moving in the opposite direction to target objects at F1 and F2. 
This is a very useful relationship because, if the brain does use relative motion to calculate scene-relative movement, then by simply changing the distance of the target object in our experiment we should be able to produce predictable changes in perceived target velocity. Rather than rely on a subjective report of perceived speed, we attempted to tap the observer’s immediate percept of movement. To do so, we made use of a measure employed in a similar situation by Smeets and Brenner [6]. It has previously been shown that the time it takes to detect movement is a function of speed; fast movements are detected more quickly [6]. Because of the geometric considerations discussed above, if the brain uses relative motion and an observer is asked to press a button as soon as target movement is detected, then the button press should occur sooner when the target is at distance F1 than when it is at F2. Furthermore, we should be able to manipulate which of two buttons, indicating target direction, an observer will press by placing the target at either N or F1. Note that the motion on the retina is identical in all three cases; only the binocular disparity of the target differs. The relative motion predictions were supported by the data. The average response time (the mean of the five observers’ median response times) at F1 (629 msec) was significantly shorter than at F2 (664 msec; t(4) = -2.77; p < 0.05; one-tailed), consistent with the target being perceived as moving faster at F1. Furthermore, the reported direction of movement was consistent with the perception of the target moving in opposite directions at F1 and N: in the with-head direction 95% of the time at N compared to 6% of the time at F1 (t(4) = -17.03; p < 0.001; one-tailed). These results are in line with the geometric predictions and indicate that observers can indeed use the relative retinal motion of scene objects to detect movement of an object of interest during self-movement. The results do not isolate an underlying mechanism or algorithm responsible for this ability but, as noted earlier, a candidate may already have been identified. Researchers interested in the visual guidance of locomotion have demonstrated the brain’s sensitivity to optic flow, the patterns of relative motion that are characteristic of self-movement (see [2] for a recent review). If the retinal motion due to self-movement could be identified and isolated by such a mechanism, then only a simple calculation is required to separate it out. Any remaining motion can then be attributed to movement of an object within the scene. We thank Tom Freeman, Alex Holcombe, Bob Snowden and two anonymous referees for comments on earlier versions of this paper. References: [1] Warren W.H., Hannon D.J. Direction of self-motion is perceived from optical flow. Nature 1988; 336: 162-168. [2] Vaina L.M., Beardsley S.A., Rushton S.K. Optic Flow and Beyond. Kluwer, 2004. [3] Lee D.N., Lishman J.R. Visual proprioceptive control of stance. J. Human Movement Studies 1975; 1: 87-95. [4] Howard I.P., Childerson L. The contribution of motion, the visual frame and visual polarity to sensations of body tilt. Perception 1994; 23: 753-762. [5] Harris L.R., Jenkin M.R., Dyde R.T., Jenkin H.L. Failure to update spatial location correctly using visual cues alone. J. Vis. 2004; 4: 381a. [6] Smeets J.B.J., Brenner E. The difference between the perception of absolute and relative motion: a reaction time study. Vision Res. 1994; 34: 191-195." @default.
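The geometric argument in the abstract can be made concrete with a small sketch. The notation here is ours, not the paper's: assume the scene flow specifies a lateral translation speed v and an effective fixation distance z_fix, and that the target's disparity places it at distance z. Under a small-angle approximation, a target whose image stays still on the retina must be moving through the scene at u = v(1 - z/z_fix), which reproduces the three qualitative predictions (faster at F1 than at F2, direction reversal at N). All numeric values below are hypothetical; the paper gives no distances.

```python
def implied_scene_velocity(v, z_target, z_fix):
    """
    Lateral scene velocity a target must have for its image to remain
    still on the retina while the (simulated) observer translates
    sideways at speed v and counter-rotates the eye to hold fixation
    at distance z_fix. Small-angle approximation:
        retinal slip ~ (u - v)/z_target + v/z_fix = 0
        =>  u = v * (1 - z_target/z_fix)
    """
    return v * (1.0 - z_target / z_fix)

v = 0.10      # simulated rightward head speed (m/s) -- hypothetical
z_fix = 1.0   # fixation distance implied by the scene flow -- hypothetical

# N nearer than fixation; F2 and F1 progressively farther -- hypothetical.
for label, z in [("N", 0.5), ("F2", 1.5), ("F1", 2.0)]:
    u = implied_scene_velocity(v, z, z_fix)
    direction = "with head" if u * v > 0 else "against head"
    print(f"{label}: z = {z} m, implied u = {u:+.3f} m/s ({direction})")
```

Running this prints +0.050 m/s (with head) at N, -0.050 m/s at F2 and -0.100 m/s at F1 (both against head), matching the reported pattern: a faster percept at F1 than F2, and a reversed direction at N.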
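The reaction-time comparison is a paired, one-tailed t-test across the five observers' median response times (df = 4). The sketch below shows the form of that test with scipy; the per-observer medians are invented for illustration (chosen only so the group means come out near the reported 629 and 664 msec), since the paper does not report individual data. `alternative="less"` requires scipy >= 1.6.

```python
import numpy as np
from scipy import stats

# Hypothetical per-observer median reaction times (msec); illustrative
# values only -- not the authors' data.
rt_f1 = np.array([610.0, 655.0, 620.0, 640.0, 620.0])
rt_f2 = np.array([650.0, 680.0, 655.0, 670.0, 665.0])

# Paired one-tailed test of the directional prediction RT(F1) < RT(F2).
t, p = stats.ttest_rel(rt_f1, rt_f2, alternative="less")
print(f"mean F1 = {rt_f1.mean():.0f} msec, mean F2 = {rt_f2.mean():.0f} msec")
print(f"t(4) = {t:.2f}, one-tailed p = {p:.4f}")
```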
- W2019970108 created "2016-06-24" @default.
- W2019970108 creator A5027406448 @default.
- W2019970108 creator A5090697753 @default.
- W2019970108 date "2005-07-01" @default.
- W2019970108 modified "2023-10-13" @default.
- W2019970108 title "Moving observers, relative retinal motion and the detection of object movement" @default.
- W2019970108 cites W1965267253 @default.
- W2019970108 cites W2007156328 @default.
- W2019970108 cites W2086378563 @default.
- W2019970108 cites W2117863599 @default.
- W2019970108 cites W2140643899 @default.
- W2019970108 doi "https://doi.org/10.1016/j.cub.2005.07.020" @default.
- W2019970108 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/16051158" @default.
- W2019970108 hasPublicationYear "2005" @default.
- W2019970108 type Work @default.
- W2019970108 sameAs 2019970108 @default.
- W2019970108 citedByCount "88" @default.
- W2019970108 countsByYear W20199701082012 @default.
- W2019970108 countsByYear W20199701082013 @default.
- W2019970108 countsByYear W20199701082014 @default.
- W2019970108 countsByYear W20199701082015 @default.
- W2019970108 countsByYear W20199701082016 @default.
- W2019970108 countsByYear W20199701082017 @default.
- W2019970108 countsByYear W20199701082018 @default.
- W2019970108 countsByYear W20199701082019 @default.
- W2019970108 countsByYear W20199701082020 @default.
- W2019970108 countsByYear W20199701082021 @default.
- W2019970108 countsByYear W20199701082022 @default.
- W2019970108 countsByYear W20199701082023 @default.
- W2019970108 crossrefType "journal-article" @default.
- W2019970108 hasAuthorship W2019970108A5027406448 @default.
- W2019970108 hasAuthorship W2019970108A5090697753 @default.
- W2019970108 hasBestOaLocation W20199701081 @default.
- W2019970108 hasConcept C104114177 @default.
- W2019970108 hasConcept C121332964 @default.
- W2019970108 hasConcept C154945302 @default.
- W2019970108 hasConcept C24890656 @default.
- W2019970108 hasConcept C2780226923 @default.
- W2019970108 hasConcept C2780827179 @default.
- W2019970108 hasConcept C2781238097 @default.
- W2019970108 hasConcept C31972630 @default.
- W2019970108 hasConcept C41008148 @default.
- W2019970108 hasConcept C55493867 @default.
- W2019970108 hasConcept C86803240 @default.
- W2019970108 hasConceptScore W2019970108C104114177 @default.
- W2019970108 hasConceptScore W2019970108C121332964 @default.
- W2019970108 hasConceptScore W2019970108C154945302 @default.
- W2019970108 hasConceptScore W2019970108C24890656 @default.
- W2019970108 hasConceptScore W2019970108C2780226923 @default.
- W2019970108 hasConceptScore W2019970108C2780827179 @default.
- W2019970108 hasConceptScore W2019970108C2781238097 @default.
- W2019970108 hasConceptScore W2019970108C31972630 @default.
- W2019970108 hasConceptScore W2019970108C41008148 @default.
- W2019970108 hasConceptScore W2019970108C55493867 @default.
- W2019970108 hasConceptScore W2019970108C86803240 @default.
- W2019970108 hasIssue "14" @default.
- W2019970108 hasLocation W20199701081 @default.
- W2019970108 hasLocation W20199701082 @default.
- W2019970108 hasOpenAccess W2019970108 @default.
- W2019970108 hasPrimaryLocation W20199701081 @default.
- W2019970108 hasRelatedWork W1512699712 @default.
- W2019970108 hasRelatedWork W2087879686 @default.
- W2019970108 hasRelatedWork W2348328675 @default.
- W2019970108 hasRelatedWork W2353407213 @default.
- W2019970108 hasRelatedWork W2479613937 @default.
- W2019970108 hasRelatedWork W3032712071 @default.
- W2019970108 hasRelatedWork W3034884163 @default.
- W2019970108 hasRelatedWork W4387265590 @default.
- W2019970108 hasRelatedWork W2184114188 @default.
- W2019970108 hasRelatedWork W2738039334 @default.
- W2019970108 hasVolume "15" @default.
- W2019970108 isParatext "false" @default.
- W2019970108 isRetracted "false" @default.
- W2019970108 magId "2019970108" @default.
- W2019970108 workType "article" @default.