Matches in SemOpenAlex for { <https://semopenalex.org/work/W3205721846> ?p ?o ?g. }
Showing items 1 to 93 of 93, with 100 items per page.
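Results like the triples below are typically returned by a SPARQL endpoint in the W3C SPARQL 1.1 Query Results JSON format. A minimal sketch of parsing such a response into (predicate, object) pairs follows; the embedded payload is illustrative sample data, not a live response, and the property IRIs in it are assumptions rather than verified SemOpenAlex vocabulary.

```python
import json

# Illustrative SPARQL SELECT response in the W3C
# "SPARQL 1.1 Query Results JSON Format" (head + results.bindings).
# The predicate/graph IRIs here are placeholders for this sketch.
sample = json.dumps({
    "head": {"vars": ["p", "o", "g"]},
    "results": {"bindings": [
        {"p": {"type": "uri",
               "value": "https://semopenalex.org/ontology/citedByCount"},
         "o": {"type": "literal", "value": "2"},
         "g": {"type": "uri",
               "value": "https://semopenalex.org/graph/default"}},
        {"p": {"type": "uri", "value": "http://purl.org/dc/terms/created"},
         "o": {"type": "literal", "value": "2021-10-25"},
         "g": {"type": "uri",
               "value": "https://semopenalex.org/graph/default"}},
    ]},
})

def predicate_object_pairs(results_json: str):
    """Extract (?p, ?o) value pairs from a SPARQL JSON result set."""
    data = json.loads(results_json)
    return [(b["p"]["value"], b["o"]["value"])
            for b in data["results"]["bindings"]]

pairs = predicate_object_pairs(sample)
print(pairs)
```

In a real session the same function would be applied to the body of an HTTP request against the endpoint (commonly `https://semopenalex.org/sparql`, though confirm the URL against the SemOpenAlex documentation) with `Accept: application/sparql-results+json`.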
- W3205721846 endingPage "124" @default.
- W3205721846 startingPage "1" @default.
- W3205721846 abstract "The growing energy and performance costs of deep learning have driven the community to reduce the size of neural networks by selectively pruning components. Similarly to their biological counterparts, sparse networks generalize just as well, if not better than, the original dense networks. Sparsity can reduce the memory footprint of regular networks to fit mobile devices, as well as shorten training time for ever growing networks. In this paper, we survey prior work on sparsity in deep learning and provide an extensive tutorial of sparsification for both inference and training. We describe approaches to remove and add elements of neural networks, different training strategies to achieve model sparsity, and mechanisms to exploit sparsity in practice. Our work distills ideas from more than 300 research papers and provides guidance to practitioners who wish to utilize sparsity today, as well as to researchers whose goal is to push the frontier forward. We include the necessary background on mathematical methods in sparsification, describe phenomena such as early structure adaptation, the intricate relations between sparsity and the training process, and show techniques for achieving acceleration on real hardware. We also define a metric of pruned parameter efficiency that could serve as a baseline for comparison of different sparse networks. We close by speculating on how sparsity can improve future workloads and outline major open problems in the field." @default.
- W3205721846 created "2021-10-25" @default.
- W3205721846 creator A5002615744 @default.
- W3205721846 creator A5026990786 @default.
- W3205721846 creator A5066306669 @default.
- W3205721846 creator A5076938722 @default.
- W3205721846 creator A5083822059 @default.
- W3205721846 date "2021-01-01" @default.
- W3205721846 modified "2023-09-26" @default.
- W3205721846 title "Sparsity in Deep Learning: Pruning and growth for efficient inference and training in neural networks" @default.
- W3205721846 hasPublicationYear "2021" @default.
- W3205721846 type Work @default.
- W3205721846 sameAs 3205721846 @default.
- W3205721846 citedByCount "2" @default.
- W3205721846 countsByYear W32057218462020 @default.
- W3205721846 countsByYear W32057218462021 @default.
- W3205721846 crossrefType "journal-article" @default.
- W3205721846 hasAuthorship W3205721846A5002615744 @default.
- W3205721846 hasAuthorship W3205721846A5026990786 @default.
- W3205721846 hasAuthorship W3205721846A5066306669 @default.
- W3205721846 hasAuthorship W3205721846A5076938722 @default.
- W3205721846 hasAuthorship W3205721846A5083822059 @default.
- W3205721846 hasConcept C108010975 @default.
- W3205721846 hasConcept C108583219 @default.
- W3205721846 hasConcept C111919701 @default.
- W3205721846 hasConcept C119857082 @default.
- W3205721846 hasConcept C154945302 @default.
- W3205721846 hasConcept C162324750 @default.
- W3205721846 hasConcept C165696696 @default.
- W3205721846 hasConcept C176217482 @default.
- W3205721846 hasConcept C202444582 @default.
- W3205721846 hasConcept C21547014 @default.
- W3205721846 hasConcept C2776214188 @default.
- W3205721846 hasConcept C33923547 @default.
- W3205721846 hasConcept C38652104 @default.
- W3205721846 hasConcept C41008148 @default.
- W3205721846 hasConcept C50644808 @default.
- W3205721846 hasConcept C6557445 @default.
- W3205721846 hasConcept C74912251 @default.
- W3205721846 hasConcept C86803240 @default.
- W3205721846 hasConcept C9652623 @default.
- W3205721846 hasConcept C98045186 @default.
- W3205721846 hasConceptScore W3205721846C108010975 @default.
- W3205721846 hasConceptScore W3205721846C108583219 @default.
- W3205721846 hasConceptScore W3205721846C111919701 @default.
- W3205721846 hasConceptScore W3205721846C119857082 @default.
- W3205721846 hasConceptScore W3205721846C154945302 @default.
- W3205721846 hasConceptScore W3205721846C162324750 @default.
- W3205721846 hasConceptScore W3205721846C165696696 @default.
- W3205721846 hasConceptScore W3205721846C176217482 @default.
- W3205721846 hasConceptScore W3205721846C202444582 @default.
- W3205721846 hasConceptScore W3205721846C21547014 @default.
- W3205721846 hasConceptScore W3205721846C2776214188 @default.
- W3205721846 hasConceptScore W3205721846C33923547 @default.
- W3205721846 hasConceptScore W3205721846C38652104 @default.
- W3205721846 hasConceptScore W3205721846C41008148 @default.
- W3205721846 hasConceptScore W3205721846C50644808 @default.
- W3205721846 hasConceptScore W3205721846C6557445 @default.
- W3205721846 hasConceptScore W3205721846C74912251 @default.
- W3205721846 hasConceptScore W3205721846C86803240 @default.
- W3205721846 hasConceptScore W3205721846C9652623 @default.
- W3205721846 hasConceptScore W3205721846C98045186 @default.
- W3205721846 hasIssue "241" @default.
- W3205721846 hasLocation W32057218461 @default.
- W3205721846 hasOpenAccess W3205721846 @default.
- W3205721846 hasPrimaryLocation W32057218461 @default.
- W3205721846 hasRelatedWork W1486687522 @default.
- W3205721846 hasRelatedWork W2263490141 @default.
- W3205721846 hasRelatedWork W2733070018 @default.
- W3205721846 hasRelatedWork W2765390540 @default.
- W3205721846 hasRelatedWork W2766953637 @default.
- W3205721846 hasRelatedWork W2788686132 @default.
- W3205721846 hasRelatedWork W2914802228 @default.
- W3205721846 hasRelatedWork W2949626996 @default.
- W3205721846 hasRelatedWork W2952914312 @default.
- W3205721846 hasRelatedWork W3000156901 @default.
- W3205721846 hasRelatedWork W3008124487 @default.
- W3205721846 hasRelatedWork W3008314020 @default.
- W3205721846 hasRelatedWork W3014879845 @default.
- W3205721846 hasRelatedWork W3025504802 @default.
- W3205721846 hasRelatedWork W3093382707 @default.
- W3205721846 hasRelatedWork W3125151176 @default.
- W3205721846 hasRelatedWork W3126611227 @default.
- W3205721846 hasRelatedWork W3129093240 @default.
- W3205721846 hasRelatedWork W3165282941 @default.
- W3205721846 hasRelatedWork W3174479481 @default.
- W3205721846 hasVolume "22" @default.
- W3205721846 isParatext "false" @default.
- W3205721846 isRetracted "false" @default.
- W3205721846 magId "3205721846" @default.
- W3205721846 workType "article" @default.