Matches in SemOpenAlex for { <https://semopenalex.org/work/W3133641570> ?p ?o ?g. }
- W3133641570 abstract "Neural network pruning is a popular technique used to reduce the inference costs of modern, potentially overparameterized, networks. Starting from a pre-trained network, the process is as follows: remove redundant parameters, retrain, and repeat while maintaining the same test accuracy. The result is a model that is a fraction of the size of the original with comparable predictive performance (test accuracy). Here, we reassess and evaluate whether the use of test accuracy alone in the terminating condition is sufficient to ensure that the resulting model performs well across a wide spectrum of harder metrics such as generalization to out-of-distribution data and resilience to noise. Across evaluations on varying architectures and data sets, we find that pruned networks effectively approximate the unpruned model; however, the prune ratio at which pruned networks achieve commensurate performance varies significantly across tasks. These results call into question the extent of genuine overparameterization in deep learning and raise concerns about the practicability of deploying pruned networks, specifically in the context of safety-critical systems, unless they are widely evaluated beyond test accuracy to reliably predict their performance. Our code is available at this https URL." @default.
- W3133641570 created "2021-03-15" @default.
- W3133641570 creator A5006739892 @default.
- W3133641570 creator A5055979952 @default.
- W3133641570 creator A5064147145 @default.
- W3133641570 creator A5066830185 @default.
- W3133641570 creator A5082442738 @default.
- W3133641570 date "2021-03-04" @default.
- W3133641570 modified "2023-09-27" @default.
- W3133641570 title "Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy" @default.
- W3133641570 cites W2037227137 @default.
- W3133641570 cites W2117539524 @default.
- W3133641570 cites W2144794286 @default.
- W3133641570 cites W2145607950 @default.
- W3133641570 cites W2194775991 @default.
- W3133641570 cites W2401231614 @default.
- W3133641570 cites W2515385951 @default.
- W3133641570 cites W2622263826 @default.
- W3133641570 cites W2630837129 @default.
- W3133641570 cites W2798127509 @default.
- W3133641570 cites W2806021783 @default.
- W3133641570 cites W2807299122 @default.
- W3133641570 cites W2808168148 @default.
- W3133641570 cites W2891710009 @default.
- W3133641570 cites W2896409484 @default.
- W3133641570 cites W2899771611 @default.
- W3133641570 cites W2905741102 @default.
- W3133641570 cites W2910643916 @default.
- W3133641570 cites W2912260645 @default.
- W3133641570 cites W2915589364 @default.
- W3133641570 cites W2918775908 @default.
- W3133641570 cites W2927163560 @default.
- W3133641570 cites W2945785363 @default.
- W3133641570 cites W2950630935 @default.
- W3133641570 cites W2952344559 @default.
- W3133641570 cites W2952892739 @default.
- W3133641570 cites W2961540362 @default.
- W3133641570 cites W2962835968 @default.
- W3133641570 cites W2962900737 @default.
- W3133641570 cites W2963060032 @default.
- W3133641570 cites W2963236897 @default.
- W3133641570 cites W2963247446 @default.
- W3133641570 cites W2963433148 @default.
- W3133641570 cites W2963446712 @default.
- W3133641570 cites W2963518130 @default.
- W3133641570 cites W2963695615 @default.
- W3133641570 cites W2963773358 @default.
- W3133641570 cites W2963813662 @default.
- W3133641570 cites W2964161337 @default.
- W3133641570 cites W2964224652 @default.
- W3133641570 cites W2964253222 @default.
- W3133641570 cites W2964299589 @default.
- W3133641570 cites W2970330753 @default.
- W3133641570 cites W2970692043 @default.
- W3133641570 cites W2971229607 @default.
- W3133641570 cites W2979789219 @default.
- W3133641570 cites W2989457543 @default.
- W3133641570 cites W2989808579 @default.
- W3133641570 cites W2995492258 @default.
- W3133641570 cites W2996603747 @default.
- W3133641570 cites W3000181403 @default.
- W3133641570 cites W3005273253 @default.
- W3133641570 cites W3007702589 @default.
- W3133641570 cites W3009751875 @default.
- W3133641570 cites W3012435359 @default.
- W3133641570 cites W3028304412 @default.
- W3133641570 cites W3037301072 @default.
- W3133641570 cites W3037475485 @default.
- W3133641570 cites W3038041907 @default.
- W3133641570 cites W3042879175 @default.
- W3133641570 cites W3137695714 @default.
- W3133641570 cites W3210815045 @default.
- W3133641570 cites W3034877463 @default.
- W3133641570 hasPublicationYear "2021" @default.
- W3133641570 type Work @default.
- W3133641570 sameAs 3133641570 @default.
- W3133641570 citedByCount "1" @default.
- W3133641570 countsByYear W31336415702021 @default.
- W3133641570 crossrefType "posted-content" @default.
- W3133641570 hasAuthorship W3133641570A5006739892 @default.
- W3133641570 hasAuthorship W3133641570A5055979952 @default.
- W3133641570 hasAuthorship W3133641570A5064147145 @default.
- W3133641570 hasAuthorship W3133641570A5066830185 @default.
- W3133641570 hasAuthorship W3133641570A5082442738 @default.
- W3133641570 hasConcept C108010975 @default.
- W3133641570 hasConcept C119857082 @default.
- W3133641570 hasConcept C134306372 @default.
- W3133641570 hasConcept C149629883 @default.
- W3133641570 hasConcept C151730666 @default.
- W3133641570 hasConcept C154945302 @default.
- W3133641570 hasConcept C16910744 @default.
- W3133641570 hasConcept C177148314 @default.
- W3133641570 hasConcept C177264268 @default.
- W3133641570 hasConcept C178790620 @default.
- W3133641570 hasConcept C185592680 @default.
- W3133641570 hasConcept C199360897 @default.
- W3133641570 hasConcept C2776214188 @default.
- W3133641570 hasConcept C2776760102 @default.
- W3133641570 hasConcept C2779343474 @default.
- W3133641570 hasConcept C2984842247 @default.
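The triple pattern at the top of this listing can be retrieved programmatically. A minimal sketch using only the Python standard library, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL is an assumption, and the graph variable `?g` / `@default` marker is omitted for brevity):

```python
# Build the SPARQL query for { <work> ?p ?o . } and (optionally) POST it to
# the SemOpenAlex endpoint. Only run_query() needs network access.
import json
import urllib.parse
import urllib.request

# Assumed endpoint URL; adjust if the service is hosted elsewhere.
SEMOPENALEX_ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_id: str) -> str:
    """Return a SELECT query matching { <work IRI> ?p ?o . } for a work ID."""
    iri = f"https://semopenalex.org/work/{work_id}"
    return f"SELECT ?p ?o WHERE {{ <{iri}> ?p ?o . }}"


def run_query(query: str) -> dict:
    """POST the query and parse SPARQL JSON results (requires network)."""
    data = urllib.parse.urlencode({"query": query}).encode()
    req = urllib.request.Request(
        SEMOPENALEX_ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


query = build_query("W3133641570")
```

Each row of the JSON results then corresponds to one predicate/object pair in the listing above (e.g. `title`, `creator`, `cites`).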