Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312181736> ?p ?o ?g. }
- W4312181736 endingPage "146.e4" @default.
- W4312181736 startingPage "134" @default.
- W4312181736 abstract "•Color-biased regions in the ventral visual pathway are food selective •Two ventral food streams begin in V4 and diverge medially and laterally of the FFA •Food-selective streams use both visual form and color to represent food

Color-biased regions have been found between face- and place-selective areas in the ventral visual pathway. To investigate the function of the color-biased regions in a pathway responsible for object recognition, we analyzed the natural scenes dataset (NSD), a large 7T fMRI dataset from 8 participants who each viewed up to 30,000 trials of images of colored natural scenes over more than 30 scanning sessions. In a whole-brain analysis, we correlated the average color saturation of the images with voxel responses, revealing color-biased regions that diverge into two streams, beginning in V4 and extending medially and laterally relative to the fusiform face area in both hemispheres. We drew regions of interest (ROIs) for the two streams and found that the images for each ROI that evoked the largest responses had certain characteristics: they contained food, circular objects, warmer hues, and had higher color saturation. Further analyses showed that food images were the strongest predictor of activity in these regions, implying the existence of medial and lateral ventral food streams (VFSs). We found that color also contributed independently to voxel responses, suggesting that the medial and lateral VFSs use both color and form to represent food. Our findings illustrate how high-resolution datasets such as the NSD can be used to disentangle the multifaceted contributions of many visual features to the neural representations of natural scenes.

The ventral visual pathway is specialized for the perception and recognition of visual objects, e.g., faces,1,2 places,3,4 bodies,5,6 and words.7,8 Color is an important feature of objects,9,10 and color-biased regions have been found in the ventral visual pathway anterior to V4.11,12,13,14,15,16,17,18,19 Are there supraordinate object specialisms associated with the color biases observed in these regions?

The processing of color information begins in the retina with a comparison of the activities of the three classes of cone that are sensitive to short (S), medium (M), and long (L) wavelengths of light. Subsequently, different classes of retinal ganglion cells send luminance and color information to the lateral geniculate nucleus, which projects to V1.20 In early visual cortices such as V1, V2, V3, and V4v, responsiveness to hue and saturation as color attributes has been studied using functional magnetic resonance imaging (fMRI).16,21,22,23,24,25,26,27,28 V1 to V3 respond to color among other features,29,30 whereas V4 and the ventral occipital region (VO; anterior to V4) are thought to be specialized for processing color.31 Voxel activity patterns in V4, VO1, and VO2 can strongly distinguish chromatic from achromatic stimuli,32 and clustering and representational similarity analyses have provided evidence for a representation of color in these areas.32,33,34 More cognitive color tasks, such as mental imagery for color23 and color memory,24 are also associated with V4. As color information progresses through visual cortical regions, its representation is likely transformed to aid cognitive tasks such as object perception,12,14,35,36 and color representations in these regions are known to be modulated by other object features such as shape and animacy.36 In particular, Rosenthal et al.36 found that the color tuning properties of neurons in macaque IT correlated with the warm colors typical of salient objects.37

Most studies of color perception present simple stimuli such as color patches, rather than color as it occurs in natural scenes. However, in daily life, our visual systems encounter colors as parts of conjunctions of object features integrated in context within natural scenes. With simple stimuli, color is dissociated from its regular context and meaning: such stimuli have basic spatial form, may be selected from a restricted color gamut, and are typically presented on a uniform surround. Visual responses to carefully controlled colored stimuli might be quite different from responses to colors in their complex, naturalistic settings. For example, for colored patches, decoding accuracy drops progressively from V1 to V4,22,23 whereas for colored object categories, decoding accuracy increases through the same areas.35 To understand how the brain represents color in its usual contexts, and to understand the functions of the color-biased regions in the ventral visual pathway, it is therefore crucial to use complex stimuli containing a variety of object categories, such as natural scenes.11,12,13

We aimed to characterize the neural representation of color and its association with the representation of objects and other image properties as they are encountered in natural scenes. The natural scenes dataset (NSD)38 provides a unique opportunity for this endeavor. It is an unprecedented large-scale fMRI dataset in which each participant viewed thousands of colored (and some grayscale) natural scenes over 30–40 sessions in a 7T scanner. The dataset therefore has impressively high signal-to-noise ratio and statistical power.39 However, images of natural scenes are high dimensional, and visual features can correlate strongly with one another, making it challenging to accurately disentangle the contributions of different features. Nonetheless, with its huge number of well-characterized and segmented stimulus images, the NSD is one of the best datasets currently available for uncovering the neural representations underlying the perception of natural scenes.38,40

Our analyses revealed two streams in the ventral visual pathway that exhibit responses to color in the NSD images. We found that both streams were primarily responsive to food objects, implying that color is a key part of the neural representation of food in these ventral visual areas. Our findings are bolstered by two recent papers that also found strong evidence for food selectivity in these regions of the ventral visual pathway using distinct data-driven approaches with the NSD41,42 and an additional fMRI study presenting isolated food images.42

To isolate responses to chromatic compared with achromatic information in the NSD images, we conducted a whole-brain correlation between the average color saturation of each NSD image and the BOLD signal change observed at each voxel (Figure 1A). Since saturation and luminance (Figures 2A and S1A) are correlated in natural scenes,43 we used the mean luminance of each image as a covariate. The correlations were Bonferroni corrected for each participant based on the number of voxels in participant-native space. We also measured split-half reliability by correlating, over the whole brain, the voxel-by-voxel correlation coefficients between average saturation and voxel responses for odd and even images.

Figure 2. Montages of images evoking the lowest and highest ROI responses and showing variation in each image statistic. (A) Montages of the 68 images with the highest (right montages) and lowest (left montages) values for the image statistics included in the correlation analysis: average saturation and average luminance for participant 1. See Figure S1A for montages for the other participants. (B) Montages of the 68 images for the lateral and medial ROIs for both hemispheres that evoked the highest (right montages) and lowest (left montages) averaged Z-scored voxel responses for participant 1. See Figure S1B for montages for the other participants. (C) Montages of the 68 images with the lowest and highest values for four further image statistics for participant 1: number of food pixels (food), number of pixels forming circular objects (circle), average warmth rating of pixel colors (warmth), and luminance entropy. The right column contains montages of images with the highest values for each image statistic, and the left column contains montages of images with the lowest values. See Figure S1A for montages for the other participants.

For all participants, we found areas showing positive correlations between saturation and voxel responses in the ventral visual pathway (Figure 1), with strong correlations in V4 diverging into two distinct streams, which we divided into medial and lateral regions of interest (ROIs). The medial ROI is located between face and place areas (fLoc-defined areas are shown in Figure 1A and the ROI boundaries in Figure 1B; see the fLoc experiment by Allen et al.38) and roughly agrees with the location of the color-biased regions identified by Lafer-Sousa et al.11 (Figure 1B). Our whole-brain split-half reliability analysis of the correlation between voxel responses and saturation showed high reliability, with r = 0.82 (range = 0.71–0.89 across participants). For all 8 participants, there were also areas showing negative correlations between saturation and voxel responses, specifically the PPA and the region located between the lateral and medial ROIs that showed positive correlations (Figure 1A). For seven participants, there was an area of negative correlation lateral of the lateral ROI, roughly corresponding to area MT. For six participants (and one further participant in the left hemisphere only), there was a positive correlation with saturation in prefrontal regions (Figure 1A), reminiscent of other findings on color processing in the prefrontal cortex.44,45,46 Several participants also showed significant correlations between saturation and voxel responses in earlier visual areas V1–V3.

Our correlation analysis between BOLD and saturation revealed areas responsive to color in the ventral visual pathway for all participants. To better understand stimulus representation in these areas, we created montages of the images that evoked the highest and lowest voxel responses, split into four ROIs (medial and lateral, left and right hemispheres; Figure 2B for participant 1 and Figure S1B for the other participants). By inspecting the montages, we identified multiple image properties present in images evoking the highest responses but not in images evoking the lowest responses. These properties were food, such as bananas, donuts, and pizzas; circular objects, such as plates, clocks, and stop signs; warm colors, such as reds and oranges; and luminance entropy (how well or poorly luminance values in one location predict the values in nearby locations47,48). These image characteristics were consistent across all participants, the medial and lateral ROIs, and both hemispheres, suggesting that the four ROIs all process a similar type of visual information.

To allow a quantitative analysis of voxel responses to the image properties that appeared to distinguish the images evoking the highest and lowest voxel responses in our ROIs, we calculated an image statistic for each image property. We also included mean luminance as an image statistic because it was used as a covariate in the correlation analysis with saturation. Our image statistics were mean saturation, pixel count for food objects, pixel count for circular objects, mean warmth ratings over the colors of all pixels, luminance entropy, and mean luminance (see STAR Methods for a detailed description). For food and circular objects, we used the pixel count contained within the segmented objects to create continuous variables that could be entered into further analyses along with the other continuous variables. Our assumption was that there is a monotonic relationship between the pixel sizes of these objects and voxel responses, although we did not assume that the relationship has any particular form. There is some evidence to suggest that this is a reasonable assumption,49" @default.
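The analysis described in the abstract text above (correlating each image's mean saturation with voxel responses while controlling for mean luminance, then checking split-half reliability on odd versus even images) can be sketched as follows. This is a minimal NumPy illustration, not the authors' code: the array names and shapes are assumptions, and simple least-squares residualization stands in for whatever covariate adjustment the paper's pipeline used.

```python
import numpy as np


def _residualize(v, covariate):
    # Regress the covariate (plus an intercept) out of v; return the residuals.
    design = np.column_stack([np.ones_like(covariate), covariate])
    beta, *_ = np.linalg.lstsq(design, v, rcond=None)
    return v - design @ beta


def partial_corr(x, y, covariate):
    # Pearson correlation between x and y after removing the covariate from both.
    return np.corrcoef(_residualize(x, covariate), _residualize(y, covariate))[0, 1]


def saturation_correlation_map(voxel_responses, saturation, luminance):
    # Per-voxel partial correlation between mean image saturation and the voxel's
    # response, with mean luminance as covariate.
    # voxel_responses: (n_images, n_voxels); saturation, luminance: (n_images,)
    return np.array([
        partial_corr(saturation, voxel_responses[:, v], luminance)
        for v in range(voxel_responses.shape[1])
    ])


def split_half_reliability(voxel_responses, saturation, luminance):
    # Correlate the whole-brain correlation maps computed separately from
    # odd- and even-numbered images, as a reliability check.
    odd, even = slice(0, None, 2), slice(1, None, 2)
    r_odd = saturation_correlation_map(voxel_responses[odd], saturation[odd], luminance[odd])
    r_even = saturation_correlation_map(voxel_responses[even], saturation[even], luminance[even])
    return np.corrcoef(r_odd, r_even)[0, 1]
```

The Bonferroni step from the text would then threshold each voxel's correlation at a per-voxel alpha of alpha/n_voxels, which is left to the caller here.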
- W4312181736 created "2023-01-04" @default.
- W4312181736 creator A5012667644 @default.
- W4312181736 creator A5018802558 @default.
- W4312181736 creator A5024434620 @default.
- W4312181736 creator A5062313576 @default.
- W4312181736 creator A5063867123 @default.
- W4312181736 creator A5065009127 @default.
- W4312181736 creator A5073966731 @default.
- W4312181736 creator A5075909735 @default.
- W4312181736 date "2023-01-01" @default.
- W4312181736 modified "2023-10-16" @default.
- W4312181736 title "Color-biased regions in the ventral visual pathway are food selective" @default.
- W4312181736 cites W1511448892 @default.
- W4312181736 cites W1617511168 @default.
- W4312181736 cites W1619178479 @default.
- W4312181736 cites W1965282326 @default.
- W4312181736 cites W1967326917 @default.
- W4312181736 cites W1971147682 @default.
- W4312181736 cites W1974297074 @default.
- W4312181736 cites W1977834787 @default.
- W4312181736 cites W1979480974 @default.
- W4312181736 cites W2000089809 @default.
- W4312181736 cites W2003386556 @default.
- W4312181736 cites W2007228552 @default.
- W4312181736 cites W2027963110 @default.
- W4312181736 cites W2046073295 @default.
- W4312181736 cites W2046486037 @default.
- W4312181736 cites W2052986837 @default.
- W4312181736 cites W2065671757 @default.
- W4312181736 cites W2066722744 @default.
- W4312181736 cites W2074394981 @default.
- W4312181736 cites W2078252209 @default.
- W4312181736 cites W2088927213 @default.
- W4312181736 cites W2094048962 @default.
- W4312181736 cites W2097076392 @default.
- W4312181736 cites W2097963807 @default.
- W4312181736 cites W2098256125 @default.
- W4312181736 cites W2098547130 @default.
- W4312181736 cites W2100789735 @default.
- W4312181736 cites W2106362337 @default.
- W4312181736 cites W2108101506 @default.
- W4312181736 cites W2112620393 @default.
- W4312181736 cites W2119125877 @default.
- W4312181736 cites W2119429959 @default.
- W4312181736 cites W2123341385 @default.
- W4312181736 cites W2125935651 @default.
- W4312181736 cites W2133002477 @default.
- W4312181736 cites W2134516218 @default.
- W4312181736 cites W2136132248 @default.
- W4312181736 cites W2136590778 @default.
- W4312181736 cites W2137468158 @default.
- W4312181736 cites W2137576217 @default.
- W4312181736 cites W2142768220 @default.
- W4312181736 cites W2144251503 @default.
- W4312181736 cites W2149414844 @default.
- W4312181736 cites W2155857287 @default.
- W4312181736 cites W2156503254 @default.
- W4312181736 cites W2165834371 @default.
- W4312181736 cites W2271018268 @default.
- W4312181736 cites W2499800833 @default.
- W4312181736 cites W2517612884 @default.
- W4312181736 cites W2556283591 @default.
- W4312181736 cites W2591032847 @default.
- W4312181736 cites W2754844311 @default.
- W4312181736 cites W2766719700 @default.
- W4312181736 cites W2791717679 @default.
- W4312181736 cites W2883476439 @default.
- W4312181736 cites W2886520251 @default.
- W4312181736 cites W2886637971 @default.
- W4312181736 cites W2894638638 @default.
- W4312181736 cites W2903069655 @default.
- W4312181736 cites W2972983554 @default.
- W4312181736 cites W3003344648 @default.
- W4312181736 cites W3015245853 @default.
- W4312181736 cites W3033737070 @default.
- W4312181736 cites W3110534329 @default.
- W4312181736 cites W3127523145 @default.
- W4312181736 cites W3174137462 @default.
- W4312181736 cites W3176215240 @default.
- W4312181736 cites W3188133518 @default.
- W4312181736 cites W3193174058 @default.
- W4312181736 cites W4200613736 @default.
- W4312181736 cites W4220997173 @default.
- W4312181736 cites W4293044477 @default.
- W4312181736 cites W4310113667 @default.
- W4312181736 cites W4321003712 @default.
- W4312181736 cites W617939786 @default.
- W4312181736 doi "https://doi.org/10.1016/j.cub.2022.11.063" @default.
- W4312181736 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/36574774" @default.
- W4312181736 hasPublicationYear "2023" @default.
- W4312181736 type Work @default.
- W4312181736 citedByCount "13" @default.
- W4312181736 countsByYear W43121817362023 @default.
- W4312181736 crossrefType "journal-article" @default.
- W4312181736 hasAuthorship W4312181736A5012667644 @default.
- W4312181736 hasAuthorship W4312181736A5018802558 @default.
- W4312181736 hasAuthorship W4312181736A5024434620 @default.