Matches in SemOpenAlex for { <https://semopenalex.org/work/W2886712131> ?p ?o ?g. }
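The pattern above is the WHERE clause of a SPARQL quad query. As a minimal, hedged sketch, the listing below shows how such a query might be run programmatically; the endpoint URL is an assumption (check https://semopenalex.org for the current SPARQL endpoint), while the query body and the standard SPARQL-over-HTTP protocol are as shown in the pattern.

```python
import requests

# Assumed endpoint; SemOpenAlex documents its SPARQL service on its site.
ENDPOINT = "https://semopenalex.org/sparql"

# Same triple pattern as above, with ?g bound via a GRAPH clause.
QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2886712131> ?p ?o .
  }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object/graph binding, one match per line,
# mirroring the listing that follows.
for b in resp.json()["results"]["bindings"]:
    print(b["p"]["value"], b["o"]["value"], b["g"]["value"])
```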
- W2886712131 abstract "Abstract When listening to music, humans can easily identify and move to the beat. Numerous experimental studies have identified brain regions that may be involved with beat perception and representation. Several theoretical and algorithmic approaches have been proposed to account for this ability. Related to, but different from the issue of how we perceive a beat, is the question of how we learn to generate and hold a beat. In this paper, we introduce a neuronal framework for a beat generator that is capable of learning isochronous rhythms over a range of frequencies that are relevant to music and speech. Our approach combines ideas from error-correction and entrainment models to investigate the dynamics of how a biophysically-based neuronal network model synchronizes its period and phase to match that of an external stimulus. The model makes novel use of on-going faster gamma rhythms to form a set of discrete clocks that provide estimates, but not exact information, of how well the beat generator spike times match those of a stimulus sequence. The beat generator is endowed with plasticity allowing it to quickly learn and thereby adjust its spike times to achieve synchronization. Our model makes generalizable predictions about the existence of asymmetries in the synchronization process, as well as specific predictions about resynchronization times after changes in stimulus tempo or phase. Analysis of the model demonstrates that accurate rhythmic time keeping can be achieved over a range of frequencies relevant to music, in a manner that is robust to changes in parameters and to the presence of noise. Author summary Music is integral to human experience and is appreciated across a wide range of cultures. Although many features distinguish different musical traditions, rhythm is central to nearly all. Most humans can detect and move along to the beat through finger or foot tapping, hand clapping or other bodily movements. But many people have a hard time “keeping a beat”, or say they have “no sense of rhythm”. There appears to be a disconnect between our ability to perceive a beat versus our ability to produce a beat, as a drummer would do as part of a musical group. Producing a beat requires beat generation, the process by which we learn how to keep track of the specific time intervals between beats, as well as executing the motor movement needed to produce the sound associated with a beat. In this paper, we begin to explore neural mechanisms that may be responsible for our ability to generate and keep a beat. We develop a computational model that includes different neurons and shows how they cooperate to learn a beat and keep it, even after the stimulus is removed, across a range of frequencies relevant to music. Our dynamical systems model leads to predictions for how the brain may react when learning a beat. Our findings and techniques should be widely applicable to those interested in understanding how the brain processes time, particularly in the context of music." @default.
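The abstract describes a beat generator that combines error correction and entrainment: an internal oscillator adjusts both its period and its phase toward an external stimulus, using noisy (gamma-clock) estimates of timing error rather than exact asynchronies. The sketch below is a generic two-process (phase plus period) error-correction model illustrating that idea; it is not the authors' biophysical spiking network, and the function name `entrain` and the gains `alpha`, `beta`, and `noise_sd` are illustrative assumptions.

```python
import random

def entrain(stimulus_times, period0, alpha=0.3, beta=0.3, noise_sd=0.005):
    """Generic period-and-phase error-correction entrainment.

    alpha: phase-correction gain; beta: period-correction gain.
    noise_sd crudely stands in for the coarse, discrete "gamma clock"
    timing estimates described in the abstract (values illustrative).
    """
    period = period0
    t = stimulus_times[0]        # align the first generated beat to onset
    beats = [t]
    for s in stimulus_times[1:]:
        t += period              # predict the next beat
        err = t - s              # asynchrony, observed only approximately
        err += random.gauss(0.0, noise_sd)
        t -= alpha * err         # phase correction toward the stimulus
        period -= beta * err     # period (tempo) correction
        beats.append(t)
    return beats, period

# Example: an isochronous stimulus at 2 Hz (500 ms inter-onset interval),
# with the generator starting 20% too slow; the period converges to ~0.5 s.
stim = [0.5 * k for k in range(20)]
beats, learned_period = entrain(stim, period0=0.6)
print(f"learned period ~ {learned_period:.3f} s")
```

In this simplified setting, the asymmetries and resynchronization behavior the abstract predicts correspond to how quickly `t` and `period` recover after a step change in the stimulus tempo or phase.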
- W2886712131 created "2018-08-22" @default.
- W2886712131 creator A5010485768 @default.
- W2886712131 creator A5061116254 @default.
- W2886712131 creator A5064999229 @default.
- W2886712131 date "2018-08-21" @default.
- W2886712131 modified "2023-09-23" @default.
- W2886712131 title "A neuromechanistic model for rhythmic beat generation" @default.
- W2886712131 cites W1494552191 @default.
- W2886712131 cites W1571518191 @default.
- W2886712131 cites W1647536797 @default.
- W2886712131 cites W1964849140 @default.
- W2886712131 cites W1979271203 @default.
- W2886712131 cites W1986164882 @default.
- W2886712131 cites W1989524512 @default.
- W2886712131 cites W1991942861 @default.
- W2886712131 cites W1997001665 @default.
- W2886712131 cites W2003890426 @default.
- W2886712131 cites W2005466063 @default.
- W2886712131 cites W2012109990 @default.
- W2886712131 cites W2012493789 @default.
- W2886712131 cites W2014032901 @default.
- W2886712131 cites W2022956279 @default.
- W2886712131 cites W2027203739 @default.
- W2886712131 cites W2028606042 @default.
- W2886712131 cites W2029342452 @default.
- W2886712131 cites W2034780273 @default.
- W2886712131 cites W2040237110 @default.
- W2886712131 cites W2044852871 @default.
- W2886712131 cites W2049527244 @default.
- W2886712131 cites W2050694047 @default.
- W2886712131 cites W2050899337 @default.
- W2886712131 cites W2051974675 @default.
- W2886712131 cites W2052440003 @default.
- W2886712131 cites W2061349228 @default.
- W2886712131 cites W2063634322 @default.
- W2886712131 cites W2067975977 @default.
- W2886712131 cites W2070681963 @default.
- W2886712131 cites W2076825472 @default.
- W2886712131 cites W2077798418 @default.
- W2886712131 cites W2084537927 @default.
- W2886712131 cites W2086360606 @default.
- W2886712131 cites W2098125043 @default.
- W2886712131 cites W2102132106 @default.
- W2886712131 cites W2106286592 @default.
- W2886712131 cites W2107397892 @default.
- W2886712131 cites W2118607666 @default.
- W2886712131 cites W2119092010 @default.
- W2886712131 cites W2124335201 @default.
- W2886712131 cites W2129968619 @default.
- W2886712131 cites W2147101007 @default.
- W2886712131 cites W2150319510 @default.
- W2886712131 cites W2157264145 @default.
- W2886712131 cites W2161205205 @default.
- W2886712131 cites W2162013747 @default.
- W2886712131 cites W2164146725 @default.
- W2886712131 cites W2167362617 @default.
- W2886712131 cites W2242098063 @default.
- W2886712131 cites W2265214182 @default.
- W2886712131 cites W2283536859 @default.
- W2886712131 cites W2338467621 @default.
- W2886712131 cites W2460489485 @default.
- W2886712131 cites W2464490662 @default.
- W2886712131 cites W2484602058 @default.
- W2886712131 cites W2598735547 @default.
- W2886712131 cites W26662613 @default.
- W2886712131 cites W2776229758 @default.
- W2886712131 cites W2804780080 @default.
- W2886712131 cites W2952631121 @default.
- W2886712131 cites W316919195 @default.
- W2886712131 cites W40563048 @default.
- W2886712131 cites W4251409354 @default.
- W2886712131 doi "https://doi.org/10.1101/397075" @default.
- W2886712131 hasPublicationYear "2018" @default.
- W2886712131 type Work @default.
- W2886712131 sameAs 2886712131 @default.
- W2886712131 citedByCount "0" @default.
- W2886712131 crossrefType "posted-content" @default.
- W2886712131 hasAuthorship W2886712131A5010485768 @default.
- W2886712131 hasAuthorship W2886712131A5061116254 @default.
- W2886712131 hasAuthorship W2886712131A5064999229 @default.
- W2886712131 hasBestOaLocation W28867121311 @default.
- W2886712131 hasConcept C121332964 @default.
- W2886712131 hasConcept C135343436 @default.
- W2886712131 hasConcept C154945302 @default.
- W2886712131 hasConcept C15744967 @default.
- W2886712131 hasConcept C169760540 @default.
- W2886712131 hasConcept C180747234 @default.
- W2886712131 hasConcept C189809214 @default.
- W2886712131 hasConcept C24890656 @default.
- W2886712131 hasConcept C26760741 @default.
- W2886712131 hasConcept C2779918689 @default.
- W2886712131 hasConcept C28490314 @default.
- W2886712131 hasConcept C41008148 @default.
- W2886712131 hasConceptScore W2886712131C121332964 @default.
- W2886712131 hasConceptScore W2886712131C135343436 @default.
- W2886712131 hasConceptScore W2886712131C154945302 @default.
- W2886712131 hasConceptScore W2886712131C15744967 @default.
- W2886712131 hasConceptScore W2886712131C169760540 @default.
- W2886712131 hasConceptScore W2886712131C180747234 @default.