Matches in SemOpenAlex for { <https://semopenalex.org/work/W3012022301> ?p ?o ?g. }
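The listing below shows the predicate/object pairs matched by the basic graph pattern above. As a hedged illustration of how such a listing could be reproduced, the sketch below queries SemOpenAlex's public SPARQL endpoint (assumed here to be https://semopenalex.org/sparql) with SPARQLWrapper; the variable names and the dropped `?g` graph variable are simplifications, not part of the original record.

```python
# Illustrative sketch: fetch all (predicate, object) pairs for W3012022301
# from the SemOpenAlex SPARQL endpoint. The endpoint URL is an assumption.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

query = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W3012022301> ?p ?o .
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each matched predicate/object pair, one per line.
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```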
- W3012022301 abstract "Accelerating the inference speed of CNNs is critical to their deployment in real-world applications. Among all the pruning approaches, those implementing a sparsity learning framework have been shown to be effective as they learn and prune the models in an end-to-end data-driven manner. However, these works impose the same sparsity regularization on all filters indiscriminately, which can hardly result in an optimal structure-sparse network. In this paper, we propose a Saliency-Adaptive Sparsity Learning (SASL) approach for further optimization. A novel and effective estimation of each filter, i.e., saliency, is designed, which is measured from two aspects: the importance for the prediction performance and the consumed computational resources. During sparsity learning, the regularization strength is adjusted according to the saliency, so our optimized format can better preserve the prediction performance while zeroing out more computation-heavy filters. The calculation of saliency introduces minimal overhead to the training process, which means our SASL is very efficient. During the pruning phase, in order to optimize the proposed data-dependent criterion, a hard sample mining strategy is utilized, which shows higher effectiveness and efficiency. Extensive experiments demonstrate the superior performance of our method. Notably, on the ILSVRC-2012 dataset, our approach can reduce 49.7% of the FLOPs of ResNet-50 with negligible 0.39% top-1 and 0.05% top-5 accuracy degradation." @default.
- W3012022301 created "2020-03-23" @default.
- W3012022301 creator A5037315478 @default.
- W3012022301 creator A5044264346 @default.
- W3012022301 creator A5079572598 @default.
- W3012022301 creator A5079581959 @default.
- W3012022301 date "2020-03-12" @default.
- W3012022301 modified "2023-09-26" @default.
- W3012022301 title "SASL: Saliency-Adaptive Sparsity Learning for Neural Network Acceleration" @default.
- W3012022301 cites W104184427 @default.
- W3012022301 cites W1677182931 @default.
- W3012022301 cites W1686810756 @default.
- W3012022301 cites W1799366690 @default.
- W3012022301 cites W1903029394 @default.
- W3012022301 cites W2097117768 @default.
- W3012022301 cites W2114766824 @default.
- W3012022301 cites W2117539524 @default.
- W3012022301 cites W2125389748 @default.
- W3012022301 cites W2163605009 @default.
- W3012022301 cites W2194775991 @default.
- W3012022301 cites W2331143823 @default.
- W3012022301 cites W2495425901 @default.
- W3012022301 cites W2515385951 @default.
- W3012022301 cites W2520760693 @default.
- W3012022301 cites W2582745083 @default.
- W3012022301 cites W2613718673 @default.
- W3012022301 cites W2758000438 @default.
- W3012022301 cites W2788715907 @default.
- W3012022301 cites W2798275680 @default.
- W3012022301 cites W2805003733 @default.
- W3012022301 cites W2807961551 @default.
- W3012022301 cites W2886851211 @default.
- W3012022301 cites W2891462450 @default.
- W3012022301 cites W2893585013 @default.
- W3012022301 cites W2899771611 @default.
- W3012022301 cites W2909680570 @default.
- W3012022301 cites W2924515500 @default.
- W3012022301 cites W2924888702 @default.
- W3012022301 cites W2928560789 @default.
- W3012022301 cites W2928762566 @default.
- W3012022301 cites W2947963429 @default.
- W3012022301 cites W2950220847 @default.
- W3012022301 cites W2951886768 @default.
- W3012022301 cites W2951977814 @default.
- W3012022301 cites W2962851801 @default.
- W3012022301 cites W2963000224 @default.
- W3012022301 cites W2963094099 @default.
- W3012022301 cites W2963140066 @default.
- W3012022301 cites W2963145730 @default.
- W3012022301 cites W2963223345 @default.
- W3012022301 cites W2963363373 @default.
- W3012022301 cites W2963382930 @default.
- W3012022301 cites W2963516811 @default.
- W3012022301 cites W2963828549 @default.
- W3012022301 cites W2964217527 @default.
- W3012022301 cites W2964233199 @default.
- W3012022301 cites W2964266063 @default.
- W3012022301 cites W2964288706 @default.
- W3012022301 cites W2964299589 @default.
- W3012022301 cites W2965862774 @default.
- W3012022301 cites W2970072941 @default.
- W3012022301 cites W2970500560 @default.
- W3012022301 cites W2970562249 @default.
- W3012022301 cites W2971275988 @default.
- W3012022301 cites W2976540144 @default.
- W3012022301 cites W2978081181 @default.
- W3012022301 cites W2984618279 @default.
- W3012022301 cites W2990761096 @default.
- W3012022301 cites W3034513523 @default.
- W3012022301 cites W3118608800 @default.
- W3012022301 doi "https://doi.org/10.48550/arxiv.2003.05891" @default.
- W3012022301 hasPublicationYear "2020" @default.
- W3012022301 type Work @default.
- W3012022301 sameAs 3012022301 @default.
- W3012022301 citedByCount "0" @default.
- W3012022301 crossrefType "posted-content" @default.
- W3012022301 hasAuthorship W3012022301A5037315478 @default.
- W3012022301 hasAuthorship W3012022301A5044264346 @default.
- W3012022301 hasAuthorship W3012022301A5079572598 @default.
- W3012022301 hasAuthorship W3012022301A5079581959 @default.
- W3012022301 hasBestOaLocation W30120223011 @default.
- W3012022301 hasConcept C108010975 @default.
- W3012022301 hasConcept C111919701 @default.
- W3012022301 hasConcept C11413529 @default.
- W3012022301 hasConcept C119857082 @default.
- W3012022301 hasConcept C154945302 @default.
- W3012022301 hasConcept C173608175 @default.
- W3012022301 hasConcept C22019652 @default.
- W3012022301 hasConcept C2776135515 @default.
- W3012022301 hasConcept C2776214188 @default.
- W3012022301 hasConcept C2779960059 @default.
- W3012022301 hasConcept C3826847 @default.
- W3012022301 hasConcept C41008148 @default.
- W3012022301 hasConcept C45374587 @default.
- W3012022301 hasConcept C50644808 @default.
- W3012022301 hasConcept C6557445 @default.
- W3012022301 hasConcept C86803240 @default.
- W3012022301 hasConceptScore W3012022301C108010975 @default.
- W3012022301 hasConceptScore W3012022301C111919701 @default.
- W3012022301 hasConceptScore W3012022301C11413529 @default.
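The abstract stored above describes the paper's core idea: each filter receives a saliency estimate combining its importance for prediction and its computational cost, and the per-filter sparsity regularization strength is adjusted according to that saliency. The following is only a rough, hedged reading of that description; the penalty formula, the use of per-filter scaling factors as the sparsity targets, and all names (compute_saliency, sasl_regularizer, lambda_base) are illustrative assumptions, not the paper's actual formulation.

```python
# Rough sketch of saliency-adaptive sparsity regularization, based only on the
# abstract above. Formulas and names are illustrative assumptions.
import torch

def compute_saliency(importance, flops):
    """One plausible reading: filters that matter little for prediction but cost
    many FLOPs get a high score and hence a stronger sparsity penalty."""
    importance = importance / (importance.sum() + 1e-12)   # normalize importance
    flops = flops / (flops.sum() + 1e-12)                  # normalize FLOPs cost
    return flops / (importance + 1e-12)

def sasl_regularizer(scaling_factors, saliency, lambda_base=1e-4):
    """Per-filter L1 penalty on scaling factors (e.g. BN gammas), with the
    regularization strength scaled per filter instead of being uniform."""
    weights = lambda_base * saliency / (saliency.mean() + 1e-12)
    return (weights * scaling_factors.abs()).sum()

# Toy usage: 8 filters with stand-in importance, FLOPs cost, and scaling factors.
torch.manual_seed(0)
importance = torch.rand(8)            # stand-in for estimated prediction importance
flops = torch.rand(8) * 1e6           # stand-in for per-filter FLOPs
gammas = torch.randn(8, requires_grad=True)

saliency = compute_saliency(importance, flops)
reg_loss = sasl_regularizer(gammas, saliency)
reg_loss.backward()                   # would be added to the task loss in training
print(reg_loss.item(), gammas.grad)
```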