Matches in SemOpenAlex for { <https://semopenalex.org/work/W2990007648> ?p ?o ?g. }
Showing items 1 to 78 of 78, with 100 items per page.
- W2990007648 abstract "Deep neural networks have revolutionized the field of machine learning by dramatically improving the state of the art in various domains. The sizes of deep neural networks (DNNs) are rapidly outgrowing the capacity of hardware to store and train them quickly. Over the past few decades, researchers have explored the prospect of sparse DNNs before, during, and after training by pruning edges from the underlying topology; the resulting network is known as a sparse neural network. More recent work has demonstrated that certain sparse DNNs can be trained to the same accuracy as dense DNNs at lower runtime and storage cost. Although existing methods ease the high demand for computational resources that severely hinders the deployment of large-scale DNNs on resource-constrained devices, DNNs can still be trained faster and at lower cost. In this work, we propose a Fine-tune Structured Sparsity Learning (FSSL) method to regularize the structures of DNNs and accelerate their training. FSSL can: (1) learn a compact structure from a large sparse DNN to reduce computation cost; (2) obtain a hardware-friendly structure to accelerate DNN evaluation efficiently. Experimental results on training time and compression rate show superior performance and efficiency compared with the MATLAB example code, with speedups roughly twice those of non-structured sparsity." @default.
- W2990007648 created "2019-12-05" @default.
- W2990007648 creator A5000146423 @default.
- W2990007648 creator A5016038454 @default.
- W2990007648 creator A5019534297 @default.
- W2990007648 creator A5034232964 @default.
- W2990007648 creator A5044757881 @default.
- W2990007648 creator A5058439988 @default.
- W2990007648 creator A5074472751 @default.
- W2990007648 date "2019-09-01" @default.
- W2990007648 modified "2023-09-24" @default.
- W2990007648 title "Performance of Training Sparse Deep Neural Networks on GPUs" @default.
- W2990007648 cites W1965034778 @default.
- W2990007648 cites W1965248225 @default.
- W2990007648 cites W1989337816 @default.
- W2990007648 cites W2015103469 @default.
- W2990007648 cites W2088866486 @default.
- W2990007648 cites W2104636679 @default.
- W2990007648 cites W2124807415 @default.
- W2990007648 cites W2141003547 @default.
- W2990007648 cites W2194775991 @default.
- W2990007648 cites W2257979135 @default.
- W2990007648 cites W2902093742 @default.
- W2990007648 cites W316935178 @default.
- W2990007648 cites W4247712932 @default.
- W2990007648 doi "https://doi.org/10.1109/hpec.2019.8916506" @default.
- W2990007648 hasPublicationYear "2019" @default.
- W2990007648 type Work @default.
- W2990007648 sameAs 2990007648 @default.
- W2990007648 citedByCount "8" @default.
- W2990007648 countsByYear W29900076482020 @default.
- W2990007648 countsByYear W29900076482021 @default.
- W2990007648 countsByYear W29900076482022 @default.
- W2990007648 crossrefType "proceedings-article" @default.
- W2990007648 hasAuthorship W2990007648A5000146423 @default.
- W2990007648 hasAuthorship W2990007648A5016038454 @default.
- W2990007648 hasAuthorship W2990007648A5019534297 @default.
- W2990007648 hasAuthorship W2990007648A5034232964 @default.
- W2990007648 hasAuthorship W2990007648A5044757881 @default.
- W2990007648 hasAuthorship W2990007648A5058439988 @default.
- W2990007648 hasAuthorship W2990007648A5074472751 @default.
- W2990007648 hasConcept C108583219 @default.
- W2990007648 hasConcept C118524514 @default.
- W2990007648 hasConcept C121332964 @default.
- W2990007648 hasConcept C153294291 @default.
- W2990007648 hasConcept C154945302 @default.
- W2990007648 hasConcept C173608175 @default.
- W2990007648 hasConcept C2777211547 @default.
- W2990007648 hasConcept C2984842247 @default.
- W2990007648 hasConcept C41008148 @default.
- W2990007648 hasConcept C50644808 @default.
- W2990007648 hasConceptScore W2990007648C108583219 @default.
- W2990007648 hasConceptScore W2990007648C118524514 @default.
- W2990007648 hasConceptScore W2990007648C121332964 @default.
- W2990007648 hasConceptScore W2990007648C153294291 @default.
- W2990007648 hasConceptScore W2990007648C154945302 @default.
- W2990007648 hasConceptScore W2990007648C173608175 @default.
- W2990007648 hasConceptScore W2990007648C2777211547 @default.
- W2990007648 hasConceptScore W2990007648C2984842247 @default.
- W2990007648 hasConceptScore W2990007648C41008148 @default.
- W2990007648 hasConceptScore W2990007648C50644808 @default.
- W2990007648 hasLocation W29900076481 @default.
- W2990007648 hasOpenAccess W2990007648 @default.
- W2990007648 hasPrimaryLocation W29900076481 @default.
- W2990007648 hasRelatedWork W2279398222 @default.
- W2990007648 hasRelatedWork W2620920084 @default.
- W2990007648 hasRelatedWork W2915754718 @default.
- W2990007648 hasRelatedWork W2950066684 @default.
- W2990007648 hasRelatedWork W3082895349 @default.
- W2990007648 hasRelatedWork W3124304076 @default.
- W2990007648 hasRelatedWork W3139644427 @default.
- W2990007648 hasRelatedWork W4298388782 @default.
- W2990007648 hasRelatedWork W4299822940 @default.
- W2990007648 hasRelatedWork W1829305295 @default.
- W2990007648 isParatext "false" @default.
- W2990007648 isRetracted "false" @default.
- W2990007648 magId "2990007648" @default.
- W2990007648 workType "article" @default.
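The listing above corresponds to a basic graph pattern query over the work's IRI. As a minimal sketch of how one might issue such a query programmatically (assuming a public SPARQL endpoint at `https://semopenalex.org/sparql`, which is an assumption here and should be checked against the SemOpenAlex documentation), using only the Python standard library:

```python
import json
import urllib.parse
import urllib.request

# The work IRI from the listing above.
WORK_IRI = "https://semopenalex.org/work/W2990007648"

# Basic graph pattern mirroring { <...W2990007648> ?p ?o ?g. }:
# select every predicate/object pair attached to the work.
QUERY = f"""
SELECT ?p ?o WHERE {{
  <{WORK_IRI}> ?p ?o .
}}
"""

def fetch_triples(endpoint="https://semopenalex.org/sparql"):
    """Send the query via SPARQL-over-HTTP GET and return the JSON bindings.

    The endpoint URL is an assumption; verify it before relying on this.
    """
    params = urllib.parse.urlencode({"query": QUERY})
    req = urllib.request.Request(
        f"{endpoint}?{params}",
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]

# Example usage (requires network access):
#   for row in fetch_triples():
#       print(row["p"]["value"], row["o"]["value"])
```

Each returned binding carries the predicate and object for one row of the listing, e.g. the `title` or a `cites` target.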