Matches in SemOpenAlex for { <https://semopenalex.org/work/W4281661597> ?p ?o ?g. }
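The listing below enumerates the outgoing statements (predicate, object, graph) of this work. As a reference, here is a minimal sketch of how the same matches could be retrieved programmatically; it assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper Python package, neither of which is stated in the listing itself.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Assumption: https://semopenalex.org/sparql is the public SemOpenAlex endpoint.
endpoint = SPARQLWrapper("https://semopenalex.org/sparql")
endpoint.setQuery("""
    SELECT ?p ?o WHERE {
      <https://semopenalex.org/work/W4281661597> ?p ?o .
    }
""")
endpoint.setReturnFormat(JSON)

# Print one line per matched statement, similar to the listing below.
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```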
- W4281661597 abstract "Neuroscience models commonly have a high number of degrees of freedom, and only specific regions within the parameter space produce dynamics of interest. This makes the development of tools and strategies to efficiently find these regions highly important for advancing brain research. Exploring the high-dimensional parameter space using numerical simulations has become a frequently used technique in recent years in many areas of computational neuroscience. Today, high performance computing (HPC) can provide a powerful infrastructure to speed up explorations and increase our general understanding of the behavior of the model in reasonable time. Learning to learn (L2L) is a well-known concept in machine learning (ML) and a specific method for acquiring constraints to improve learning performance. This concept can be decomposed into a two-loop optimization process where the target of optimization can consist of any program, such as an artificial neural network, a spiking network, a single-cell model, or a whole-brain simulation. In this work, we present L2L as an easy-to-use and flexible framework to perform parameter and hyper-parameter space exploration of neuroscience models on HPC infrastructure. The framework is an implementation of the L2L concept written in Python. This open-source software allows several instances of an optimization target to be executed with different parameters in an embarrassingly parallel fashion on HPC. L2L provides a set of built-in optimizer algorithms, which make adaptive and efficient exploration of parameter spaces possible. Unlike other optimization toolboxes, L2L provides maximum flexibility for the way the optimization target can be executed. In this paper, we show a variety of examples of neuroscience models being optimized within the L2L framework to execute different types of tasks. The tasks used to illustrate the concept range from reproducing empirical data to learning how to solve a problem in a dynamic environment. We particularly focus on simulations with models ranging from the single cell to the whole brain, using a variety of simulation engines such as NEST, Arbor, TVB, OpenAI Gym, and NetLogo." @default.
- W4281661597 created "2022-06-13" @default.
- W4281661597 creator A5001402143 @default.
- W4281661597 creator A5014488498 @default.
- W4281661597 creator A5016175782 @default.
- W4281661597 creator A5017780892 @default.
- W4281661597 creator A5030773854 @default.
- W4281661597 creator A5036437857 @default.
- W4281661597 creator A5050527512 @default.
- W4281661597 creator A5073083912 @default.
- W4281661597 creator A5075499749 @default.
- W4281661597 creator A5077669626 @default.
- W4281661597 date "2022-05-27" @default.
- W4281661597 modified "2023-10-05" @default.
- W4281661597 title "Exploring Parameter and Hyper-Parameter Spaces of Neuroscience Models on High Performance Computers With Learning to Learn" @default.
- W4281661597 cites W1596936080 @default.
- W4281661597 cites W1663942731 @default.
- W4281661597 cites W2067749416 @default.
- W4281661597 cites W2087861759 @default.
- W4281661597 cites W2100260601 @default.
- W4281661597 cites W2103179919 @default.
- W4281661597 cites W2107262494 @default.
- W4281661597 cites W2123491442 @default.
- W4281661597 cites W2131181615 @default.
- W4281661597 cites W2143534834 @default.
- W4281661597 cites W2145749918 @default.
- W4281661597 cites W2152195021 @default.
- W4281661597 cites W2161460637 @default.
- W4281661597 cites W2417642972 @default.
- W4281661597 cites W2786890178 @default.
- W4281661597 cites W2788296660 @default.
- W4281661597 cites W2807113328 @default.
- W4281661597 cites W2906697496 @default.
- W4281661597 cites W2912474515 @default.
- W4281661597 cites W2947476713 @default.
- W4281661597 cites W2963150511 @default.
- W4281661597 cites W2966284335 @default.
- W4281661597 cites W2991232432 @default.
- W4281661597 cites W3017769710 @default.
- W4281661597 cites W3099967679 @default.
- W4281661597 cites W3102551889 @default.
- W4281661597 cites W3119079027 @default.
- W4281661597 cites W3134038167 @default.
- W4281661597 cites W3199034141 @default.
- W4281661597 cites W4212851507 @default.
- W4281661597 cites W4213308398 @default.
- W4281661597 cites W4301318588 @default.
- W4281661597 doi "https://doi.org/10.3389/fncom.2022.885207" @default.
- W4281661597 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/35720775" @default.
- W4281661597 hasPublicationYear "2022" @default.
- W4281661597 type Work @default.
- W4281661597 citedByCount "3" @default.
- W4281661597 countsByYear W42816615972022 @default.
- W4281661597 countsByYear W42816615972023 @default.
- W4281661597 crossrefType "journal-article" @default.
- W4281661597 hasAuthorship W4281661597A5001402143 @default.
- W4281661597 hasAuthorship W4281661597A5014488498 @default.
- W4281661597 hasAuthorship W4281661597A5016175782 @default.
- W4281661597 hasAuthorship W4281661597A5017780892 @default.
- W4281661597 hasAuthorship W4281661597A5030773854 @default.
- W4281661597 hasAuthorship W4281661597A5036437857 @default.
- W4281661597 hasAuthorship W4281661597A5050527512 @default.
- W4281661597 hasAuthorship W4281661597A5073083912 @default.
- W4281661597 hasAuthorship W4281661597A5075499749 @default.
- W4281661597 hasAuthorship W4281661597A5077669626 @default.
- W4281661597 hasBestOaLocation W42816615971 @default.
- W4281661597 hasConcept C105795698 @default.
- W4281661597 hasConcept C111919701 @default.
- W4281661597 hasConcept C119857082 @default.
- W4281661597 hasConcept C15286952 @default.
- W4281661597 hasConcept C154945302 @default.
- W4281661597 hasConcept C177264268 @default.
- W4281661597 hasConcept C199360897 @default.
- W4281661597 hasConcept C2777904410 @default.
- W4281661597 hasConcept C2778572836 @default.
- W4281661597 hasConcept C2780598303 @default.
- W4281661597 hasConcept C33923547 @default.
- W4281661597 hasConcept C41008148 @default.
- W4281661597 hasConcept C50644808 @default.
- W4281661597 hasConcept C519991488 @default.
- W4281661597 hasConcept C73586568 @default.
- W4281661597 hasConcept C83283714 @default.
- W4281661597 hasConceptScore W4281661597C105795698 @default.
- W4281661597 hasConceptScore W4281661597C111919701 @default.
- W4281661597 hasConceptScore W4281661597C119857082 @default.
- W4281661597 hasConceptScore W4281661597C15286952 @default.
- W4281661597 hasConceptScore W4281661597C154945302 @default.
- W4281661597 hasConceptScore W4281661597C177264268 @default.
- W4281661597 hasConceptScore W4281661597C199360897 @default.
- W4281661597 hasConceptScore W4281661597C2777904410 @default.
- W4281661597 hasConceptScore W4281661597C2778572836 @default.
- W4281661597 hasConceptScore W4281661597C2780598303 @default.
- W4281661597 hasConceptScore W4281661597C33923547 @default.
- W4281661597 hasConceptScore W4281661597C41008148 @default.
- W4281661597 hasConceptScore W4281661597C50644808 @default.
- W4281661597 hasConceptScore W4281661597C519991488 @default.
- W4281661597 hasConceptScore W4281661597C73586568 @default.
- W4281661597 hasConceptScore W4281661597C83283714 @default.
- W4281661597 hasLocation W42816615971 @default.
- W4281661597 hasLocation W42816615972 @default.
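The abstract quoted above describes L2L as a two-loop optimization: an outer loop (the optimizer) proposes parameter sets, and an inner loop executes independent instances of the optimization target (the optimizee) in an embarrassingly parallel fashion, each returning a fitness. The sketch below illustrates only that generic structure; it is not the actual L2L API, and the toy optimizee, the Gaussian proposal step, and the local process pool standing in for an HPC scheduler are all illustrative assumptions.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def optimizee(params):
    """Stand-in for a neuroscience simulation: returns a fitness for one parameter set.
    (Illustrative only; a real optimizee could wrap a NEST, Arbor, or TVB run.)"""
    x, y = params["x"], params["y"]
    return -((x - 1.0) ** 2 + (y + 2.0) ** 2)  # higher is better

def propose(best, step):
    """Toy 'optimizer' step: sample new candidates around the current best."""
    return [{"x": best["x"] + random.gauss(0, step),
             "y": best["y"] + random.gauss(0, step)} for _ in range(8)]

if __name__ == "__main__":
    best = {"x": 0.0, "y": 0.0}
    for generation in range(20):                 # outer loop: the optimizer
        candidates = propose(best, step=0.5)
        with ProcessPoolExecutor() as pool:      # inner loop: parallel optimizee runs
            fitnesses = list(pool.map(optimizee, candidates))
        best = max(zip(candidates, fitnesses), key=lambda cf: cf[1])[0]
    print("best parameters:", best)
```

In the framework the paper describes, the built-in optimizer algorithms would take the place of the toy proposal step, and the parallel optimizee runs would be dispatched to HPC resources rather than a local process pool.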