Matches in SemOpenAlex for { <https://semopenalex.org/work/W2895891814> ?p ?o ?g. }
- W2895891814 endingPage "976" @default.
- W2895891814 startingPage "965" @default.
- W2895891814 abstract "Benefitting from large-scale training datasets and complex training networks, Convolutional Neural Networks (CNNs) are widely applied in various fields with high accuracy. However, the training process of CNNs is very time-consuming, as large amounts of training samples and iterative operations are required to obtain high-quality weight parameters. In this paper, we focus on the time-consuming training process of large-scale CNNs and propose a Bi-layered Parallel Training (BPT-CNN) architecture for distributed computing environments. BPT-CNN consists of two main components: (a) an outer-layer parallel training for multiple CNN subnetworks on separate data subsets, and (b) an inner-layer parallel training for each subnetwork. In the outer-layer parallelism, we address critical issues of distributed and parallel computing, including data communication, synchronization, and workload balance. A heterogeneity-aware Incremental Data Partitioning and Allocation (IDPA) strategy is proposed, where large-scale training datasets are partitioned and allocated to the computing nodes in batches according to their computing power. To minimize synchronization waiting during the global weight update process, an Asynchronous Global Weight Update (AGWU) strategy is proposed. In the inner-layer parallelism, we further accelerate the training process of each CNN subnetwork on each computer, where the computation steps of the convolutional layer and the local weight training are parallelized based on task parallelism. We introduce task decomposition and scheduling strategies with the objectives of thread-level load balancing and minimum waiting time for critical paths. Extensive experimental results indicate that the proposed BPT-CNN effectively improves the training performance of CNNs while maintaining accuracy." @default.
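The abstract's IDPA strategy allocates training data to nodes in proportion to their computing power. As a rough illustration of that proportional-allocation idea only (the function name, node weights, and rounding scheme below are assumptions for this sketch, not taken from the paper), one partitioning step might look like:

```python
# Hypothetical sketch of one heterogeneity-aware partitioning step, loosely
# inspired by the IDPA idea in the abstract: each node receives a share of
# the training samples proportional to its assumed relative computing power.

def partition_by_power(num_samples, node_powers):
    """Split sample indices into contiguous chunks whose sizes are
    proportional to each node's computing power."""
    total = sum(node_powers)
    # Ideal (fractional) share for each node.
    shares = [num_samples * p / total for p in node_powers]
    # Round down, then hand the remainder out one sample at a time to the
    # nodes with the largest fractional parts.
    sizes = [int(s) for s in shares]
    remainder = num_samples - sum(sizes)
    order = sorted(range(len(node_powers)),
                   key=lambda i: shares[i] - sizes[i], reverse=True)
    for i in order[:remainder]:
        sizes[i] += 1
    # Convert chunk sizes into index ranges.
    partitions, start = [], 0
    for size in sizes:
        partitions.append(range(start, start + size))
        start += size
    return partitions

# Node 0 is assumed twice as fast as nodes 1 and 2.
parts = partition_by_power(10, [4, 2, 2])
print([len(p) for p in parts])  # -> [5, 3, 2]
```

The paper's actual IDPA strategy additionally allocates data incrementally in batches; this sketch shows only the power-proportional split within one such batch.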
- W2895891814 created "2018-10-26" @default.
- W2895891814 creator A5036357902 @default.
- W2895891814 creator A5039639491 @default.
- W2895891814 creator A5055197093 @default.
- W2895891814 creator A5077980579 @default.
- W2895891814 creator A5084696350 @default.
- W2895891814 creator A5091496905 @default.
- W2895891814 date "2019-05-01" @default.
- W2895891814 modified "2023-10-13" @default.
- W2895891814 title "A Bi-layered Parallel Training Architecture for Large-Scale Convolutional Neural Networks" @default.
- W2895891814 cites W1978660892 @default.
- W2895891814 cites W1980147176 @default.
- W2895891814 cites W2009832130 @default.
- W2895891814 cites W2026883296 @default.
- W2895891814 cites W2043039667 @default.
- W2895891814 cites W2070167224 @default.
- W2895891814 cites W2085105049 @default.
- W2895891814 cites W2096645269 @default.
- W2895891814 cites W2108598243 @default.
- W2895891814 cites W2135229999 @default.
- W2895891814 cites W2155893237 @default.
- W2895891814 cites W2289252105 @default.
- W2895891814 cites W2403646140 @default.
- W2895891814 cites W2488745305 @default.
- W2895891814 cites W2555706138 @default.
- W2895891814 cites W274829104 @default.
- W2895891814 cites W2766694763 @default.
- W2895891814 cites W2767020884 @default.
- W2895891814 doi "https://doi.org/10.1109/tpds.2018.2877359" @default.
- W2895891814 hasPublicationYear "2019" @default.
- W2895891814 type Work @default.
- W2895891814 sameAs 2895891814 @default.
- W2895891814 citedByCount "126" @default.
- W2895891814 countsByYear W28958918142019 @default.
- W2895891814 countsByYear W28958918142020 @default.
- W2895891814 countsByYear W28958918142021 @default.
- W2895891814 countsByYear W28958918142022 @default.
- W2895891814 countsByYear W28958918142023 @default.
- W2895891814 crossrefType "journal-article" @default.
- W2895891814 hasAuthorship W2895891814A5036357902 @default.
- W2895891814 hasAuthorship W2895891814A5039639491 @default.
- W2895891814 hasAuthorship W2895891814A5055197093 @default.
- W2895891814 hasAuthorship W2895891814A5077980579 @default.
- W2895891814 hasAuthorship W2895891814A5084696350 @default.
- W2895891814 hasAuthorship W2895891814A5091496905 @default.
- W2895891814 hasBestOaLocation W28958918142 @default.
- W2895891814 hasConcept C120314980 @default.
- W2895891814 hasConcept C127162648 @default.
- W2895891814 hasConcept C138959212 @default.
- W2895891814 hasConcept C151319957 @default.
- W2895891814 hasConcept C154945302 @default.
- W2895891814 hasConcept C162324750 @default.
- W2895891814 hasConcept C173608175 @default.
- W2895891814 hasConcept C187691185 @default.
- W2895891814 hasConcept C206729178 @default.
- W2895891814 hasConcept C21547014 @default.
- W2895891814 hasConcept C2524010 @default.
- W2895891814 hasConcept C2778562939 @default.
- W2895891814 hasConcept C2780186347 @default.
- W2895891814 hasConcept C2781172179 @default.
- W2895891814 hasConcept C31258907 @default.
- W2895891814 hasConcept C33923547 @default.
- W2895891814 hasConcept C41008148 @default.
- W2895891814 hasConcept C61483411 @default.
- W2895891814 hasConcept C81363708 @default.
- W2895891814 hasConcept C83283714 @default.
- W2895891814 hasConceptScore W2895891814C120314980 @default.
- W2895891814 hasConceptScore W2895891814C127162648 @default.
- W2895891814 hasConceptScore W2895891814C138959212 @default.
- W2895891814 hasConceptScore W2895891814C151319957 @default.
- W2895891814 hasConceptScore W2895891814C154945302 @default.
- W2895891814 hasConceptScore W2895891814C162324750 @default.
- W2895891814 hasConceptScore W2895891814C173608175 @default.
- W2895891814 hasConceptScore W2895891814C187691185 @default.
- W2895891814 hasConceptScore W2895891814C206729178 @default.
- W2895891814 hasConceptScore W2895891814C21547014 @default.
- W2895891814 hasConceptScore W2895891814C2524010 @default.
- W2895891814 hasConceptScore W2895891814C2778562939 @default.
- W2895891814 hasConceptScore W2895891814C2780186347 @default.
- W2895891814 hasConceptScore W2895891814C2781172179 @default.
- W2895891814 hasConceptScore W2895891814C31258907 @default.
- W2895891814 hasConceptScore W2895891814C33923547 @default.
- W2895891814 hasConceptScore W2895891814C41008148 @default.
- W2895891814 hasConceptScore W2895891814C61483411 @default.
- W2895891814 hasConceptScore W2895891814C81363708 @default.
- W2895891814 hasConceptScore W2895891814C83283714 @default.
- W2895891814 hasFunder F4320306076 @default.
- W2895891814 hasFunder F4320321001 @default.
- W2895891814 hasFunder F4320321543 @default.
- W2895891814 hasIssue "5" @default.
- W2895891814 hasLocation W28958918141 @default.
- W2895891814 hasLocation W28958918142 @default.
- W2895891814 hasOpenAccess W2895891814 @default.
- W2895891814 hasPrimaryLocation W28958918141 @default.
- W2895891814 hasRelatedWork W1513409726 @default.
- W2895891814 hasRelatedWork W1572108542 @default.
- W2895891814 hasRelatedWork W159927083 @default.