Matches in SemOpenAlex for { <https://semopenalex.org/work/W2963223345> ?p ?o ?g. }
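The triple pattern above can be run against a SPARQL endpoint to retrieve the matches listed below. A minimal sketch using Python's SPARQLWrapper follows; the endpoint URL is an assumption and may need adjusting to the actual SemOpenAlex SPARQL service.

```python
# Minimal sketch: fetch all (?p ?o ?g) matches for the work IRI via SPARQLWrapper.
# The endpoint URL below is an assumption, not confirmed by the listing above.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://semopenalex.org/sparql")  # assumed endpoint URL
endpoint.setQuery("""
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W2963223345> ?p ?o }
    }
""")
endpoint.setReturnFormat(JSON)

# Print each predicate/object pair in the result set
for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```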
- W2963223345 endingPage "588" @default.
- W2963223345 startingPage "574" @default.
- W2963223345 abstract "The success of convolutional neural networks (CNNs) in computer vision applications has been accompanied by a significant increase in computation and memory costs, which prohibits their use in resource-limited environments, such as mobile systems or embedded devices. To this end, research on CNN compression has recently emerged. In this paper, we propose a novel filter pruning scheme, termed structured sparsity regularization (SSR), to simultaneously speed up the computation and reduce the memory overhead of CNNs, which can be well supported by various off-the-shelf deep learning libraries. Concretely, the proposed scheme incorporates two different regularizers of structured sparsity into the original objective function of filter pruning, which fully coordinates the global output and local pruning operations to adaptively prune filters. We further propose an alternative updating with Lagrange multipliers (AULM) scheme to efficiently solve its optimization. AULM follows the principle of the alternating direction method of multipliers (ADMM) and alternates between promoting the structured sparsity of CNNs and optimizing the recognition loss, which leads to a very efficient solver (2.5× faster than the most recent work that directly solves the group sparsity-based regularization). Moreover, by imposing the structured sparsity, the online inference is extremely memory-light, since the number of filters and the output feature maps are simultaneously reduced. The proposed scheme has been applied to a variety of state-of-the-art CNN structures, including LeNet, AlexNet, VGGNet, ResNet, and GoogLeNet, over different data sets. Quantitative results demonstrate that the proposed scheme achieves superior performance over the state-of-the-art methods. We further demonstrate the proposed compression scheme for the task of transfer learning, including domain adaptation and object detection, which also shows exciting performance gains over state-of-the-art filter pruning methods." @default.
- W2963223345 created "2019-07-30" @default.
- W2963223345 creator A5015874725 @default.
- W2963223345 creator A5016080094 @default.
- W2963223345 creator A5043643513 @default.
- W2963223345 creator A5068918243 @default.
- W2963223345 creator A5070551517 @default.
- W2963223345 date "2020-02-01" @default.
- W2963223345 modified "2023-10-14" @default.
- W2963223345 title "Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning" @default.
- W2963223345 cites W1536680647 @default.
- W2963223345 cites W2097117768 @default.
- W2963223345 cites W2102605133 @default.
- W2963223345 cites W2112796928 @default.
- W2963223345 cites W2117539524 @default.
- W2963223345 cites W2138019504 @default.
- W2963223345 cites W2155893237 @default.
- W2963223345 cites W2162409952 @default.
- W2963223345 cites W2163922914 @default.
- W2963223345 cites W2194775991 @default.
- W2963223345 cites W2285660444 @default.
- W2963223345 cites W2531409750 @default.
- W2963223345 cites W2549139847 @default.
- W2963223345 cites W2554302513 @default.
- W2963223345 cites W2758000438 @default.
- W2963223345 cites W2764289073 @default.
- W2963223345 cites W2775811337 @default.
- W2963223345 cites W2790852735 @default.
- W2963223345 cites W2807961551 @default.
- W2963223345 cites W2808168148 @default.
- W2963223345 cites W2895561155 @default.
- W2963223345 cites W2913081068 @default.
- W2963223345 cites W2962851801 @default.
- W2963223345 cites W2963125010 @default.
- W2963223345 cites W2963163009 @default.
- W2963223345 cites W2963446712 @default.
- W2963223345 cites W2963993763 @default.
- W2963223345 cites W2964217848 @default.
- W2963223345 cites W2964233199 @default.
- W2963223345 cites W4292363360 @default.
- W2963223345 cites W566555209 @default.
- W2963223345 doi "https://doi.org/10.1109/tnnls.2019.2906563" @default.
- W2963223345 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/30990448" @default.
- W2963223345 hasPublicationYear "2020" @default.
- W2963223345 type Work @default.
- W2963223345 sameAs 2963223345 @default.
- W2963223345 citedByCount "93" @default.
- W2963223345 countsByYear W29632233452019 @default.
- W2963223345 countsByYear W29632233452020 @default.
- W2963223345 countsByYear W29632233452021 @default.
- W2963223345 countsByYear W29632233452022 @default.
- W2963223345 countsByYear W29632233452023 @default.
- W2963223345 crossrefType "journal-article" @default.
- W2963223345 hasAuthorship W2963223345A5015874725 @default.
- W2963223345 hasAuthorship W2963223345A5016080094 @default.
- W2963223345 hasAuthorship W2963223345A5043643513 @default.
- W2963223345 hasAuthorship W2963223345A5068918243 @default.
- W2963223345 hasAuthorship W2963223345A5070551517 @default.
- W2963223345 hasConcept C106131492 @default.
- W2963223345 hasConcept C108010975 @default.
- W2963223345 hasConcept C111919701 @default.
- W2963223345 hasConcept C11413529 @default.
- W2963223345 hasConcept C138885662 @default.
- W2963223345 hasConcept C153180895 @default.
- W2963223345 hasConcept C154945302 @default.
- W2963223345 hasConcept C199360897 @default.
- W2963223345 hasConcept C2776135515 @default.
- W2963223345 hasConcept C2776214188 @default.
- W2963223345 hasConcept C2776401178 @default.
- W2963223345 hasConcept C2778770139 @default.
- W2963223345 hasConcept C2779960059 @default.
- W2963223345 hasConcept C31972630 @default.
- W2963223345 hasConcept C41008148 @default.
- W2963223345 hasConcept C41895202 @default.
- W2963223345 hasConcept C45374587 @default.
- W2963223345 hasConcept C6557445 @default.
- W2963223345 hasConcept C81363708 @default.
- W2963223345 hasConcept C86803240 @default.
- W2963223345 hasConceptScore W2963223345C106131492 @default.
- W2963223345 hasConceptScore W2963223345C108010975 @default.
- W2963223345 hasConceptScore W2963223345C111919701 @default.
- W2963223345 hasConceptScore W2963223345C11413529 @default.
- W2963223345 hasConceptScore W2963223345C138885662 @default.
- W2963223345 hasConceptScore W2963223345C153180895 @default.
- W2963223345 hasConceptScore W2963223345C154945302 @default.
- W2963223345 hasConceptScore W2963223345C199360897 @default.
- W2963223345 hasConceptScore W2963223345C2776135515 @default.
- W2963223345 hasConceptScore W2963223345C2776214188 @default.
- W2963223345 hasConceptScore W2963223345C2776401178 @default.
- W2963223345 hasConceptScore W2963223345C2778770139 @default.
- W2963223345 hasConceptScore W2963223345C2779960059 @default.
- W2963223345 hasConceptScore W2963223345C31972630 @default.
- W2963223345 hasConceptScore W2963223345C41008148 @default.
- W2963223345 hasConceptScore W2963223345C41895202 @default.
- W2963223345 hasConceptScore W2963223345C45374587 @default.
- W2963223345 hasConceptScore W2963223345C6557445 @default.
- W2963223345 hasConceptScore W2963223345C81363708 @default.
- W2963223345 hasConceptScore W2963223345C86803240 @default.
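The abstract above describes pruning whole filters by adding structured-sparsity regularizers to the training objective. The following is a minimal, hypothetical PyTorch sketch of a plain group-lasso penalty over convolution filters; it only illustrates the general idea of filter-level structured sparsity and is not the paper's SSR objective or AULM solver.

```python
import torch
import torch.nn as nn

def filter_group_lasso(model: nn.Module, strength: float = 1e-4) -> torch.Tensor:
    """Sum of L2 norms of whole conv filters (group lasso over output filters).

    Adding this to the task loss pushes entire filters toward zero so they can
    later be removed as a group. This is a generic sketch, not the SSR/AULM
    formulation from the paper."""
    penalty = torch.zeros(())
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # weight shape: (out_channels, in_channels, kH, kW);
            # flatten each output filter and take its L2 norm
            penalty = penalty + module.weight.flatten(1).norm(p=2, dim=1).sum()
    return strength * penalty

# Toy usage: regularized loss = task loss + group-lasso penalty
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
x, y = torch.randn(4, 3, 32, 32), torch.randn(4, 32, 28, 28)
loss = nn.functional.mse_loss(model(x), y) + filter_group_lasso(model)
loss.backward()
```

Filters whose norms collapse under such a penalty can be removed outright, reducing both the filter count and the output feature maps, which is the source of the memory savings described in the abstract.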