Matches in SemOpenAlex for { <https://semopenalex.org/work/W2619501984> ?p ?o ?g. }
Showing items 1 to 95 of 95, with 100 items per page.
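The `?p ?o ?g` pattern in the header reads as a quad query: every predicate/object pair for the work, plus the named graph each statement lives in. A minimal sketch of the underlying SPARQL, built as a Python string (the `SELECT` form is an assumption; only the graph pattern comes from the header above):

```python
# Illustrative reconstruction of the query behind this listing.
# Only the subject URI and the ?p ?o ?g pattern come from the header;
# the SELECT/GRAPH framing is an assumption about how the viewer queries.
WORK = "https://semopenalex.org/work/W2619501984"

QUERY = f"""
SELECT ?p ?o ?g
WHERE {{
  GRAPH ?g {{
    <{WORK}> ?p ?o .
  }}
}}
"""
```

Sending this to a SPARQL endpoint would return one row per item shown below, with `?g` bound to the default graph for each.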
- W2619501984 abstract "The Sparsity of Simple Recurrent Networks in Musical Structure Learning Kat R. Agres (kra9@cornell.edu) Department of Psychology, Cornell University, 211 Uris Hall Ithaca, NY 14853 USA Jordan E. DeLong (jed245@cornell.edu) Department of Psychology, Cornell University, 211 Uris Hall Ithaca, NY 14853 USA Michael Spivey (spivey@ucmerced.edu) School of Social Sciences, Humanities, and Arts, UC Merced, P.O. Box 2039 Merced, CA 95344 USA Abstract Evidence suggests that sparse coding allows for a more efficient and effective way to distill structural information about the environment. Our simple recurrent network has demonstrated the same to be true of learning musical structure. Two experiments are presented that examine the learning trajectory of a simple recurrent network exposed to musical input. Both experiments compare the network’s internal representations to behavioral data: Listeners rate the network’s own novel musical output from different points along the learning trajectory. The first study focused on learning the tonal relationships inherent in five simple melodies. The developmental trajectory of the network was studied by examining sparseness of the hidden layer activations and the sophistication of the network’s compositions. The second study used more complex musical input and focused on both tonal and rhythmic relationships in music. We found that increasing sparseness of the hidden layer activations strongly correlated with the increasing sophistication of the network’s output. Interestingly, sparseness was not programmed into the network; this property simply arose from learning the musical input. We argue that sparseness underlies the network’s success: It is the mechanism through which musical characteristics are learned and distilled, and facilitates the network’s ability to produce more complex and stylistic novel compositions over time. Keywords: Musical structure; Simple Recurrent Network; Sparsity. 
Introduction Work in the field of neural network modeling has been useful in creating simulations of functional machinations of human cognition and behavior. While many different architectures and learning algorithms exist, this paper will primarily focus on Elman’s Simple Recurrent Network (SRN) (1990), which was originally developed to process and predict the appearance of sequentially ordered stimuli. This feature makes the SRN a prime candidate for processing the structure of music. Modeling aspects of musical composition has shown that networks can be trained to ‘compose’ music after learning from many examples. One such network is Mozer’s CONCERT, which is a modified Elman network that is trained on input stimuli and attempts to extract two key features: which notes in the scale are musically appropriate, and which of those selected notes is the best stylistically. While ratings of this network were better than compositions chosen from a transition table, they still were compositions only their mother could love (Mozer, 1994). Other approaches have included aspects such as evolutionary algorithms (Todd, 1999) as well as utilizing self-organizing networks instead of relying on learning rules (Page, 1993). While most studies have concentrated on the success of these networks’ compositions, the studies in this paper will concentrate on the internal state of the network as it learns. Additionally, subjects’ ratings of the network’s compositions over time will be examined, as well as other network statistics, such as sparse coding. Sparse coding is a strategy in which a population of neurons completely encode a stimulus using a low number of active units. Taken to an extreme, this strategy is similar to the concept of a ‘Grandmother Cell’ that responds robustly to only one stimulus, and thus has a very low average firing rate. 
This is directly in contrast to a fully distributed system where every neuron takes part in encoding every stimulus and fires an average of half of the time. Sparse coding allows for the possibility that as a distributed system learns the structure of the world, it begins encoding in a more sparse and efficient manner. The benefits of sparse coding have been reviewed in depth (Field, 1994; Olshausen and Field, 2004), however this paper will concentrate on two of them. The first reason is that encoding stimuli using fewer neurons allows for a complete representation without the biological demands of having every neuron firing (Levy, 1996). The second reason, which is highlighted in these studies, is that a sparse code develops in order to efficiently mirror the structure of the world. By examining the neural network architecture over the learning trajectory, we can investigate how network sparsity changes with experience. Given the conventions of Western tonality in music (e.g. common chord progressions), as outlined by music theory, the progression of tones in music" @default.
- W2619501984 created "2017-06-05" @default.
- W2619501984 creator A5009359203 @default.
- W2619501984 creator A5056457346 @default.
- W2619501984 creator A5078786239 @default.
- W2619501984 date "2009-01-01" @default.
- W2619501984 modified "2023-09-23" @default.
- W2619501984 title "The Sparsity of Simple Recurrent Networks in Musical Structure Learning" @default.
- W2619501984 cites W134527144 @default.
- W2619501984 cites W1892402995 @default.
- W2619501984 cites W2023723978 @default.
- W2619501984 cites W2067621398 @default.
- W2619501984 cites W2074376560 @default.
- W2619501984 cites W2085927826 @default.
- W2619501984 cites W2087946919 @default.
- W2619501984 cites W2110485445 @default.
- W2619501984 cites W2152435897 @default.
- W2619501984 cites W2174443198 @default.
- W2619501984 cites W834204875 @default.
- W2619501984 hasPublicationYear "2009" @default.
- W2619501984 type Work @default.
- W2619501984 sameAs 2619501984 @default.
- W2619501984 citedByCount "1" @default.
- W2619501984 countsByYear W26195019842020 @default.
- W2619501984 crossrefType "journal-article" @default.
- W2619501984 hasAuthorship W2619501984A5009359203 @default.
- W2619501984 hasAuthorship W2619501984A5056457346 @default.
- W2619501984 hasAuthorship W2619501984A5078786239 @default.
- W2619501984 hasConcept C107038049 @default.
- W2619501984 hasConcept C108583219 @default.
- W2619501984 hasConcept C111472728 @default.
- W2619501984 hasConcept C119857082 @default.
- W2619501984 hasConcept C121332964 @default.
- W2619501984 hasConcept C1276947 @default.
- W2619501984 hasConcept C13662910 @default.
- W2619501984 hasConcept C138885662 @default.
- W2619501984 hasConcept C142362112 @default.
- W2619501984 hasConcept C153349607 @default.
- W2619501984 hasConcept C154945302 @default.
- W2619501984 hasConcept C15744967 @default.
- W2619501984 hasConcept C168725872 @default.
- W2619501984 hasConcept C180747234 @default.
- W2619501984 hasConcept C188147891 @default.
- W2619501984 hasConcept C2780586882 @default.
- W2619501984 hasConcept C41008148 @default.
- W2619501984 hasConcept C43803900 @default.
- W2619501984 hasConcept C558565934 @default.
- W2619501984 hasConceptScore W2619501984C107038049 @default.
- W2619501984 hasConceptScore W2619501984C108583219 @default.
- W2619501984 hasConceptScore W2619501984C111472728 @default.
- W2619501984 hasConceptScore W2619501984C119857082 @default.
- W2619501984 hasConceptScore W2619501984C121332964 @default.
- W2619501984 hasConceptScore W2619501984C1276947 @default.
- W2619501984 hasConceptScore W2619501984C13662910 @default.
- W2619501984 hasConceptScore W2619501984C138885662 @default.
- W2619501984 hasConceptScore W2619501984C142362112 @default.
- W2619501984 hasConceptScore W2619501984C153349607 @default.
- W2619501984 hasConceptScore W2619501984C154945302 @default.
- W2619501984 hasConceptScore W2619501984C15744967 @default.
- W2619501984 hasConceptScore W2619501984C168725872 @default.
- W2619501984 hasConceptScore W2619501984C180747234 @default.
- W2619501984 hasConceptScore W2619501984C188147891 @default.
- W2619501984 hasConceptScore W2619501984C2780586882 @default.
- W2619501984 hasConceptScore W2619501984C41008148 @default.
- W2619501984 hasConceptScore W2619501984C43803900 @default.
- W2619501984 hasConceptScore W2619501984C558565934 @default.
- W2619501984 hasIssue "31" @default.
- W2619501984 hasLocation W26195019841 @default.
- W2619501984 hasOpenAccess W2619501984 @default.
- W2619501984 hasPrimaryLocation W26195019841 @default.
- W2619501984 hasRelatedWork W134527144 @default.
- W2619501984 hasRelatedWork W1529023101 @default.
- W2619501984 hasRelatedWork W154123674 @default.
- W2619501984 hasRelatedWork W1577013761 @default.
- W2619501984 hasRelatedWork W158274995 @default.
- W2619501984 hasRelatedWork W1906285364 @default.
- W2619501984 hasRelatedWork W1982673570 @default.
- W2619501984 hasRelatedWork W2034161986 @default.
- W2619501984 hasRelatedWork W2059927004 @default.
- W2619501984 hasRelatedWork W2067621398 @default.
- W2619501984 hasRelatedWork W2095428299 @default.
- W2619501984 hasRelatedWork W2142980624 @default.
- W2619501984 hasRelatedWork W2185636028 @default.
- W2619501984 hasRelatedWork W2471733366 @default.
- W2619501984 hasRelatedWork W2546667588 @default.
- W2619501984 hasRelatedWork W2559726422 @default.
- W2619501984 hasRelatedWork W2568624200 @default.
- W2619501984 hasRelatedWork W2619494429 @default.
- W2619501984 hasRelatedWork W3204794647 @default.
- W2619501984 hasRelatedWork W47651435 @default.
- W2619501984 hasVolume "31" @default.
- W2619501984 isParatext "false" @default.
- W2619501984 isRetracted "false" @default.
- W2619501984 magId "2619501984" @default.
- W2619501984 workType "article" @default.
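Each row above abbreviates one statement plus its graph marker (the trailing `@default.` names the default graph). A minimal Python sketch, assuming this exact row layout, for splitting a row back into its subject, predicate, and object:

```python
def parse_row(row: str):
    """Parse one listing row of the form
    '- <subject> <predicate> <object...> @default.'
    into a (subject, predicate, object) tuple.

    Assumes the row layout used in this listing; quoted literal
    objects (e.g. "31") are returned with their quotes intact.
    """
    body = row.lstrip("- ").rstrip()
    # Drop the graph marker at the end of the row.
    if body.endswith("@default."):
        body = body[: -len("@default.")].rstrip()
    # Subject and predicate are single tokens; the object may
    # contain spaces (e.g. a quoted title), so split at most twice.
    subject, predicate, obj = body.split(" ", 2)
    return subject, predicate, obj
```

For example, `parse_row('- W2619501984 hasVolume "31" @default.')` yields `('W2619501984', 'hasVolume', '"31"')`.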