Matches in SemOpenAlex for { <https://semopenalex.org/work/W4220795574> ?p ?o ?g. }
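The quad pattern above can also be retrieved programmatically. Below is a minimal sketch in Python, assuming SemOpenAlex exposes a public SPARQL endpoint at `https://semopenalex.org/sparql` (the endpoint URL and response handling are assumptions, not taken from this listing; the graph component shown as `@default` here would additionally require a `GRAPH` pattern, depending on how the store treats its default graph).

```python
# Sketch: fetch all (?p, ?o) pairs for work W4220795574 from SemOpenAlex.
# Assumption: the public SPARQL endpoint lives at https://semopenalex.org/sparql.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4220795574> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object binding, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```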
- W4220795574 abstract "Animals have evolved sophisticated visual circuits to solve a vital inference problem: detecting whether or not a visual signal corresponds to an object on a collision course. Such events are detected by specific circuits sensitive to visual looming, or objects increasing in size. Various computational models have been developed for these circuits, but how the collision-detection inference problem itself shapes the computational structures of these circuits remains unknown. Here, inspired by the distinctive structures of LPLC2 neurons in the visual system of Drosophila, we build anatomically-constrained shallow neural network models and train them to identify visual signals that correspond to impending collisions. Surprisingly, the optimization arrives at two distinct, opposing solutions, only one of which matches the actual dendritic weighting of LPLC2 neurons. Both solutions can solve the inference problem with high accuracy when the population size is large enough. The LPLC2-like solution reproduces experimentally observed LPLC2 neuron responses for many stimuli, and reproduces canonical tuning of loom-sensitive neurons, even though the models are never trained on neural data. Thus, LPLC2 neuron properties and tuning are predicted by optimizing an anatomically-constrained neural network to detect impending collisions. More generally, these results illustrate how optimizing inference tasks that are important for an animal's perceptual goals can reveal and explain computational properties of specific sensory neurons." @default.
- W4220795574 created "2022-04-03" @default.
- W4220795574 creator A5058533213 @default.
- W4220795574 creator A5060219657 @default.
- W4220795574 creator A5060234658 @default.
- W4220795574 creator A5061843649 @default.
- W4220795574 creator A5067424300 @default.
- W4220795574 date "2022-01-13" @default.
- W4220795574 modified "2023-10-03" @default.
- W4220795574 title "Shallow neural networks trained to detect collisions recover features of visual loom-selective neurons" @default.
- W4220795574 cites W1538983388 @default.
- W4220795574 cites W1637700972 @default.
- W4220795574 cites W1911565940 @default.
- W4220795574 cites W1965871702 @default.
- W4220795574 cites W1976526581 @default.
- W4220795574 cites W1977675569 @default.
- W4220795574 cites W1984430718 @default.
- W4220795574 cites W1985604572 @default.
- W4220795574 cites W1986323200 @default.
- W4220795574 cites W2005108691 @default.
- W4220795574 cites W2007263747 @default.
- W4220795574 cites W2027654885 @default.
- W4220795574 cites W2033522594 @default.
- W4220795574 cites W2040739363 @default.
- W4220795574 cites W2042755403 @default.
- W4220795574 cites W2045998272 @default.
- W4220795574 cites W2048807455 @default.
- W4220795574 cites W2053604779 @default.
- W4220795574 cites W2056356727 @default.
- W4220795574 cites W2058616551 @default.
- W4220795574 cites W2060336927 @default.
- W4220795574 cites W2062074945 @default.
- W4220795574 cites W2077132224 @default.
- W4220795574 cites W2085927826 @default.
- W4220795574 cites W2093647388 @default.
- W4220795574 cites W2105464873 @default.
- W4220795574 cites W2121684702 @default.
- W4220795574 cites W2129036096 @default.
- W4220795574 cites W2130491336 @default.
- W4220795574 cites W2142025603 @default.
- W4220795574 cites W2145889472 @default.
- W4220795574 cites W2153436178 @default.
- W4220795574 cites W2157825442 @default.
- W4220795574 cites W2202142365 @default.
- W4220795574 cites W2251407979 @default.
- W4220795574 cites W2252248415 @default.
- W4220795574 cites W2267456908 @default.
- W4220795574 cites W2274405424 @default.
- W4220795574 cites W2280498916 @default.
- W4220795574 cites W2302230753 @default.
- W4220795574 cites W2318490268 @default.
- W4220795574 cites W2418359239 @default.
- W4220795574 cites W2528241240 @default.
- W4220795574 cites W2528662291 @default.
- W4220795574 cites W2566875547 @default.
- W4220795574 cites W2605475473 @default.
- W4220795574 cites W2623401085 @default.
- W4220795574 cites W2699457337 @default.
- W4220795574 cites W2753297919 @default.
- W4220795574 cites W2767680158 @default.
- W4220795574 cites W2781555497 @default.
- W4220795574 cites W2802438942 @default.
- W4220795574 cites W2899567519 @default.
- W4220795574 cites W2902499998 @default.
- W4220795574 cites W2906697496 @default.
- W4220795574 cites W2910016083 @default.
- W4220795574 cites W2919639595 @default.
- W4220795574 cites W2954634850 @default.
- W4220795574 cites W2963561338 @default.
- W4220795574 cites W2978368159 @default.
- W4220795574 cites W2981051561 @default.
- W4220795574 cites W2995517184 @default.
- W4220795574 cites W2998856166 @default.
- W4220795574 cites W2999804811 @default.
- W4220795574 cites W3006089990 @default.
- W4220795574 cites W3008003211 @default.
- W4220795574 cites W3036614154 @default.
- W4220795574 cites W3081065394 @default.
- W4220795574 cites W3102686699 @default.
- W4220795574 cites W3103145119 @default.
- W4220795574 cites W3184823478 @default.
- W4220795574 cites W4246039580 @default.
- W4220795574 doi "https://doi.org/10.7554/elife.72067" @default.
- W4220795574 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/35023828" @default.
- W4220795574 hasPublicationYear "2022" @default.
- W4220795574 type Work @default.
- W4220795574 citedByCount "11" @default.
- W4220795574 countsByYear W42207955742022 @default.
- W4220795574 countsByYear W42207955742023 @default.
- W4220795574 crossrefType "journal-article" @default.
- W4220795574 hasAuthorship W4220795574A5058533213 @default.
- W4220795574 hasAuthorship W4220795574A5060219657 @default.
- W4220795574 hasAuthorship W4220795574A5060234658 @default.
- W4220795574 hasAuthorship W4220795574A5061843649 @default.
- W4220795574 hasAuthorship W4220795574A5067424300 @default.
- W4220795574 hasBestOaLocation W42207955741 @default.
- W4220795574 hasConcept C118403218 @default.
- W4220795574 hasConcept C119857082 @default.
- W4220795574 hasConcept C120665830 @default.
- W4220795574 hasConcept C121332964 @default.