Matches in SemOpenAlex for { <https://semopenalex.org/work/W3009043393> ?p ?o ?g. }
Showing items 1 to 75 of 75, with 100 items per page.
- W3009043393 abstract "Modern deep networks have millions to billions of parameters, which leads to high memory and energy requirements during training as well as during inference on resource-constrained edge devices. Consequently, pruning techniques have been proposed that remove less significant weights in deep networks, thereby reducing their memory and computational requirements. Pruning is usually performed after training the original network and is followed by further retraining to compensate for the accuracy loss incurred during pruning. The prune-and-retrain procedure is repeated iteratively until an optimum tradeoff between accuracy and efficiency is reached. However, such iterative retraining adds to the overall training complexity of the network. In this work, we propose a dynamic pruning-while-training procedure, wherein we prune filters of the convolutional layers of a deep network during training itself, thereby precluding the need for separate retraining. We evaluate our dynamic pruning-while-training approach with three different pre-existing pruning strategies, viz. mean activation-based pruning, random pruning, and L1 normalization-based pruning. Our results for VGG-16 trained on CIFAR10 show that L1 normalization provides the best performance among all the techniques explored in this work, with less than 1% drop in accuracy after pruning 80% of the filters compared to the original network. We further evaluated the L1 normalization-based pruning mechanism on CIFAR100. Results indicate that pruning while training yields a compressed network with almost no accuracy loss after pruning 50% of the filters compared to the original network, and ~5% loss for high pruning rates (>80%). The proposed pruning methodology yields a 41% reduction in the number of computations and memory accesses during training for CIFAR10, CIFAR100 and ImageNet compared to training with retraining for 10 epochs." @default.
- W3009043393 created "2020-03-13" @default.
- W3009043393 creator A5011164681 @default.
- W3009043393 creator A5050310538 @default.
- W3009043393 creator A5065766721 @default.
- W3009043393 creator A5083686495 @default.
- W3009043393 date "2020-03-05" @default.
- W3009043393 modified "2023-09-27" @default.
- W3009043393 title "Pruning Filters while Training for Efficiently Optimizing Deep Learning Networks" @default.
- W3009043393 cites W1522301498 @default.
- W3009043393 cites W2469490737 @default.
- W3009043393 cites W2612445135 @default.
- W3009043393 cites W2707890836 @default.
- W3009043393 cites W2752037867 @default.
- W3009043393 cites W2949650786 @default.
- W3009043393 cites W2964019666 @default.
- W3009043393 cites W2964299589 @default.
- W3009043393 hasPublicationYear "2020" @default.
- W3009043393 type Work @default.
- W3009043393 sameAs 3009043393 @default.
- W3009043393 citedByCount "0" @default.
- W3009043393 crossrefType "posted-content" @default.
- W3009043393 hasAuthorship W3009043393A5011164681 @default.
- W3009043393 hasAuthorship W3009043393A5050310538 @default.
- W3009043393 hasAuthorship W3009043393A5065766721 @default.
- W3009043393 hasAuthorship W3009043393A5083686495 @default.
- W3009043393 hasConcept C108010975 @default.
- W3009043393 hasConcept C114290370 @default.
- W3009043393 hasConcept C119857082 @default.
- W3009043393 hasConcept C136886441 @default.
- W3009043393 hasConcept C144024400 @default.
- W3009043393 hasConcept C154945302 @default.
- W3009043393 hasConcept C173801870 @default.
- W3009043393 hasConcept C19165224 @default.
- W3009043393 hasConcept C41008148 @default.
- W3009043393 hasConcept C6557445 @default.
- W3009043393 hasConcept C86803240 @default.
- W3009043393 hasConceptScore W3009043393C108010975 @default.
- W3009043393 hasConceptScore W3009043393C114290370 @default.
- W3009043393 hasConceptScore W3009043393C119857082 @default.
- W3009043393 hasConceptScore W3009043393C136886441 @default.
- W3009043393 hasConceptScore W3009043393C144024400 @default.
- W3009043393 hasConceptScore W3009043393C154945302 @default.
- W3009043393 hasConceptScore W3009043393C173801870 @default.
- W3009043393 hasConceptScore W3009043393C19165224 @default.
- W3009043393 hasConceptScore W3009043393C41008148 @default.
- W3009043393 hasConceptScore W3009043393C6557445 @default.
- W3009043393 hasConceptScore W3009043393C86803240 @default.
- W3009043393 hasLocation W30090433931 @default.
- W3009043393 hasOpenAccess W3009043393 @default.
- W3009043393 hasPrimaryLocation W30090433931 @default.
- W3009043393 hasRelatedWork W2751597414 @default.
- W3009043393 hasRelatedWork W2771111398 @default.
- W3009043393 hasRelatedWork W2791533388 @default.
- W3009043393 hasRelatedWork W2803543130 @default.
- W3009043393 hasRelatedWork W2910280493 @default.
- W3009043393 hasRelatedWork W2920823150 @default.
- W3009043393 hasRelatedWork W2970958999 @default.
- W3009043393 hasRelatedWork W2972364631 @default.
- W3009043393 hasRelatedWork W2989815941 @default.
- W3009043393 hasRelatedWork W2990619046 @default.
- W3009043393 hasRelatedWork W3005165565 @default.
- W3009043393 hasRelatedWork W3048959830 @default.
- W3009043393 hasRelatedWork W3088108839 @default.
- W3009043393 hasRelatedWork W3091661482 @default.
- W3009043393 hasRelatedWork W3117731702 @default.
- W3009043393 hasRelatedWork W3163244872 @default.
- W3009043393 hasRelatedWork W3172801005 @default.
- W3009043393 hasRelatedWork W3191447923 @default.
- W3009043393 hasRelatedWork W3197204236 @default.
- W3009043393 hasRelatedWork W3197252688 @default.
- W3009043393 isParatext "false" @default.
- W3009043393 isRetracted "false" @default.
- W3009043393 magId "3009043393" @default.
- W3009043393 workType "article" @default.
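
The abstract above describes pruning convolutional filters by their L1 norms periodically during training, instead of the usual prune-then-retrain loop. Below is a minimal PyTorch sketch of that general idea, not the authors' implementation: the `l1_filter_mask` and `prune_while_training` names, the `prune_every` schedule, and the choice to zero out filters (rather than structurally remove them) are all illustrative assumptions.

```python
import torch
import torch.nn as nn


def l1_filter_mask(conv: nn.Conv2d, prune_ratio: float) -> torch.Tensor:
    """Boolean mask keeping the filters with the largest L1 norms.

    conv.weight has shape (out_channels, in_channels, kH, kW); each
    output filter's L1 norm is the sum of its absolute weights.
    """
    norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_prune = int(prune_ratio * norms.numel())
    if n_prune == 0:
        return torch.ones_like(norms, dtype=torch.bool)
    # kthvalue(k) gives the k-th smallest value; keep norms strictly above it.
    threshold = norms.kthvalue(n_prune).values
    return norms > threshold


def prune_while_training(model, loader, optimizer, loss_fn,
                         prune_ratio=0.8, prune_every=1000):
    """Train normally, but periodically zero out low-L1-norm filters,
    so no separate retraining pass is needed afterwards."""
    model.train()
    step = 0
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        optimizer.step()
        step += 1
        if step % prune_every == 0:
            with torch.no_grad():
                for m in model.modules():
                    if isinstance(m, nn.Conv2d):
                        mask = l1_filter_mask(m, prune_ratio)
                        m.weight[~mask] = 0.0  # prune whole output filters
                        if m.bias is not None:
                            m.bias[~mask] = 0.0
```

Note that in this simplified form, subsequent gradient updates (and optimizer momentum) can revive zeroed filters between pruning steps; a fuller implementation would re-apply the mask every step or mask the gradients as well.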