Matches in SemOpenAlex for { <https://semopenalex.org/work/W2998008288> ?p ?o ?g. }
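The pattern above asks for every predicate `?p`, object `?o`, and graph `?g` attached to the work IRI. A minimal sketch of fetching the same triples programmatically — assuming SemOpenAlex exposes a public SPARQL endpoint at `https://semopenalex.org/sparql` (an assumption, not stated on this page), and interpreting `?g` as the named graph of each triple:

```python
# Sketch: building the quad-pattern query shown above and a GET request URL
# for a SPARQL endpoint. The endpoint URL is an assumption; the helper names
# (build_query, build_request_url) are hypothetical, not a SemOpenAlex API.
from urllib.parse import urlencode

WORK = "https://semopenalex.org/work/W2998008288"

def build_query(work_iri: str) -> str:
    """SELECT all predicate/object pairs for one work, per named graph."""
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{work_iri}> ?p ?o }} }}"

def build_request_url(endpoint: str, work_iri: str) -> str:
    """URL-encode the query as a GET request, asking for JSON results."""
    params = {"query": build_query(work_iri),
              "format": "application/sparql-results+json"}
    return endpoint + "?" + urlencode(params)

url = build_request_url("https://semopenalex.org/sparql", WORK)
```

Sending that URL with any HTTP client should return the 92 bindings listed below as SPARQL JSON results.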
Showing items 1 to 92 of 92, with 100 items per page.
- W2998008288 abstract "Mapping all the neurons in the brain requires automatic reconstruction of entire cells from volume electron microscopy data. The flood-filling network (FFN) architecture has demonstrated leading performance for segmenting structures from this data. However, the training of the network is computationally expensive. In order to reduce the training time, we implemented synchronous and data-parallel distributed training using the Horovod library, which is different from the asynchronous training scheme used in the published FFN code. We demonstrated that our distributed training scaled well up to 2048 Intel Knights Landing (KNL) nodes on the Theta supercomputer. Our trained models achieved similar level of inference performance, but took less training time compared to previous methods. Our study on the effects of different batch sizes on FFN training suggests ways to further improve training efficiency. Our findings on optimal learning rate and batch sizes agree with previous works." @default.
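The abstract describes synchronous, data-parallel training: each worker computes a gradient on its own data shard, the gradients are averaged across workers (Horovod's allreduce), and every worker then applies the same update. A minimal pure-Python sketch of that averaging step — all names here are hypothetical stand-ins, not the paper's FFN/Horovod code:

```python
# Illustrative sketch of synchronous data-parallel SGD: each "worker" holds a
# shard of the data, computes a local gradient, and an allreduce-style average
# replaces the per-worker gradients before a single shared update.

def local_gradient(weight: float, shard: list) -> float:
    """Gradient of mean squared error for y ~ weight * x on one data shard."""
    return sum(2 * (weight * x - y) * x for x, y in shard) / len(shard)

def allreduce_mean(values: list) -> float:
    """Stand-in for Horovod's ring allreduce: average across workers."""
    return sum(values) / len(values)

def synchronous_step(weight: float, shards: list, lr: float = 0.05) -> float:
    """One synchronous update: every worker contributes before anyone moves."""
    grads = [local_gradient(weight, s) for s in shards]
    return weight - lr * allreduce_mean(grads)

# Two workers, data drawn from y = 3x; the shared weight converges toward 3.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(200):
    w = synchronous_step(w, shards)
```

The key property the abstract relies on is that every worker sees the same averaged gradient each step, so the model replicas never diverge — unlike the asynchronous scheme in the published FFN code, where workers apply updates independently.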
- W2998008288 created "2020-01-10" @default.
- W2998008288 creator A5001934097 @default.
- W2998008288 creator A5019168041 @default.
- W2998008288 creator A5027469892 @default.
- W2998008288 creator A5036112655 @default.
- W2998008288 creator A5047558324 @default.
- W2998008288 creator A5062747815 @default.
- W2998008288 creator A5072963231 @default.
- W2998008288 creator A5073368437 @default.
- W2998008288 creator A5075500139 @default.
- W2998008288 creator A5076548549 @default.
- W2998008288 creator A5081838975 @default.
- W2998008288 creator A5091300565 @default.
- W2998008288 date "2019-11-01" @default.
- W2998008288 modified "2023-10-01" @default.
- W2998008288 title "Scaling Distributed Training of Flood-Filling Networks on HPC Infrastructure for Brain Mapping" @default.
- W2998008288 cites W1513082520 @default.
- W2998008288 cites W1898703532 @default.
- W2998008288 cites W2026354831 @default.
- W2998008288 cites W2057332538 @default.
- W2998008288 cites W2072566913 @default.
- W2998008288 cites W2080858319 @default.
- W2998008288 cites W2161515775 @default.
- W2998008288 cites W2232480453 @default.
- W2998008288 cites W2760861884 @default.
- W2998008288 cites W2962747323 @default.
- W2998008288 cites W2962941615 @default.
- W2998008288 cites W2963417518 @default.
- W2998008288 doi "https://doi.org/10.1109/dls49591.2019.00012" @default.
- W2998008288 hasPublicationYear "2019" @default.
- W2998008288 type Work @default.
- W2998008288 sameAs 2998008288 @default.
- W2998008288 citedByCount "5" @default.
- W2998008288 countsByYear W29980082882020 @default.
- W2998008288 countsByYear W29980082882021 @default.
- W2998008288 countsByYear W29980082882022 @default.
- W2998008288 countsByYear W29980082882023 @default.
- W2998008288 crossrefType "proceedings-article" @default.
- W2998008288 hasAuthorship W2998008288A5001934097 @default.
- W2998008288 hasAuthorship W2998008288A5019168041 @default.
- W2998008288 hasAuthorship W2998008288A5027469892 @default.
- W2998008288 hasAuthorship W2998008288A5036112655 @default.
- W2998008288 hasAuthorship W2998008288A5047558324 @default.
- W2998008288 hasAuthorship W2998008288A5062747815 @default.
- W2998008288 hasAuthorship W2998008288A5072963231 @default.
- W2998008288 hasAuthorship W2998008288A5073368437 @default.
- W2998008288 hasAuthorship W2998008288A5075500139 @default.
- W2998008288 hasAuthorship W2998008288A5076548549 @default.
- W2998008288 hasAuthorship W2998008288A5081838975 @default.
- W2998008288 hasAuthorship W2998008288A5091300565 @default.
- W2998008288 hasBestOaLocation W29980082882 @default.
- W2998008288 hasConcept C120314980 @default.
- W2998008288 hasConcept C121332964 @default.
- W2998008288 hasConcept C153294291 @default.
- W2998008288 hasConcept C166957645 @default.
- W2998008288 hasConcept C205649164 @default.
- W2998008288 hasConcept C2524010 @default.
- W2998008288 hasConcept C2777211547 @default.
- W2998008288 hasConcept C33923547 @default.
- W2998008288 hasConcept C41008148 @default.
- W2998008288 hasConcept C74256435 @default.
- W2998008288 hasConcept C99844830 @default.
- W2998008288 hasConceptScore W2998008288C120314980 @default.
- W2998008288 hasConceptScore W2998008288C121332964 @default.
- W2998008288 hasConceptScore W2998008288C153294291 @default.
- W2998008288 hasConceptScore W2998008288C166957645 @default.
- W2998008288 hasConceptScore W2998008288C205649164 @default.
- W2998008288 hasConceptScore W2998008288C2524010 @default.
- W2998008288 hasConceptScore W2998008288C2777211547 @default.
- W2998008288 hasConceptScore W2998008288C33923547 @default.
- W2998008288 hasConceptScore W2998008288C41008148 @default.
- W2998008288 hasConceptScore W2998008288C74256435 @default.
- W2998008288 hasConceptScore W2998008288C99844830 @default.
- W2998008288 hasLocation W29980082881 @default.
- W2998008288 hasLocation W29980082882 @default.
- W2998008288 hasOpenAccess W2998008288 @default.
- W2998008288 hasPrimaryLocation W29980082881 @default.
- W2998008288 hasRelatedWork W1485627940 @default.
- W2998008288 hasRelatedWork W1587227328 @default.
- W2998008288 hasRelatedWork W1596201972 @default.
- W2998008288 hasRelatedWork W1880774266 @default.
- W2998008288 hasRelatedWork W1986253068 @default.
- W2998008288 hasRelatedWork W2028061998 @default.
- W2998008288 hasRelatedWork W2152433827 @default.
- W2998008288 hasRelatedWork W2160425906 @default.
- W2998008288 hasRelatedWork W2998813341 @default.
- W2998008288 hasRelatedWork W3089338843 @default.
- W2998008288 isParatext "false" @default.
- W2998008288 isRetracted "false" @default.
- W2998008288 magId "2998008288" @default.
- W2998008288 workType "article" @default.