Matches in SemOpenAlex for { <https://semopenalex.org/work/W4380479950> ?p ?o ?g. }
- W4380479950 endingPage "32" @default.
- W4380479950 startingPage "1" @default.
- W4380479950 abstract "Deep neural networks have achieved remarkable performance on artificial intelligence tasks. The success of intelligent systems often relies on large-scale models with high computational complexity and storage costs. Over-parameterized networks are often easier to optimize and can achieve better performance. However, it is challenging to deploy them on resource-limited edge devices. Knowledge distillation (KD) aims to optimize a lightweight network from the perspective of over-parameterized training. Traditional offline KD transfers knowledge from a cumbersome teacher to a small and fast student network. When a sizeable pre-trained teacher network is unavailable, online KD can improve a group of models through collaborative or mutual learning. Without needing extra models, self-KD boosts a network itself using attached auxiliary architectures. KD mainly involves two aspects: knowledge extraction and distillation strategies. Beyond KD schemes, various KD algorithms are widely used in practical applications, such as multi-teacher KD, cross-modal KD, attention-based KD, data-free KD and adversarial KD. This paper provides a comprehensive KD survey, including knowledge categories, distillation schemes and algorithms, as well as empirical studies on performance comparison. Finally, we discuss the open challenges of existing KD works and prospect future directions." @default.
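The offline teacher-to-student transfer described in the abstract is usually realized as a response-based distillation loss: a weighted sum of the hard-label cross-entropy and the KL divergence between temperature-softened teacher and student outputs. The sketch below is a minimal, dependency-free illustration of that objective; the logit values, temperature, and weighting `alpha` are illustrative assumptions, not taken from the surveyed work.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax over logits softened by a temperature T (numerically stabilized)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, temperature=4.0, alpha=0.7):
    """Classic response-based KD objective:
    (1 - alpha) * cross-entropy(student, hard label)
    + alpha * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_s = softmax(student_logits)
    ce = -math.log(p_s[label])
    q_t = softmax(teacher_logits, temperature)
    q_s = softmax(student_logits, temperature)
    kl = sum(t * math.log(t / s) for t, s in zip(q_t, q_s))
    return (1 - alpha) * ce + alpha * temperature ** 2 * kl

# Hypothetical logits: the teacher is more confident than the student on class 0.
loss = kd_loss([2.0, 1.0, 0.1], [4.0, 1.0, -1.0], label=0)
```

A higher temperature flattens both distributions, so the student is pushed to match the teacher's relative ranking of wrong classes (the "dark knowledge"), not just its top prediction.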
- W4380479950 created "2023-06-14" @default.
- W4380479950 creator A5046293929 @default.
- W4380479950 creator A5047270197 @default.
- W4380479950 creator A5055016576 @default.
- W4380479950 creator A5085502749 @default.
- W4380479950 date "2023-01-01" @default.
- W4380479950 modified "2023-10-16" @default.
- W4380479950 title "Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation" @default.
- W4380479950 cites W2108598243 @default.
- W4380479950 cites W2183341477 @default.
- W4380479950 cites W2194775991 @default.
- W4380479950 cites W2620998106 @default.
- W4380479950 cites W2739879705 @default.
- W4380479950 cites W2743289088 @default.
- W4380479950 cites W2747909401 @default.
- W4380479950 cites W2752782242 @default.
- W4380479950 cites W2788836009 @default.
- W4380479950 cites W2885895075 @default.
- W4380479950 cites W2886756692 @default.
- W4380479950 cites W2887783173 @default.
- W4380479950 cites W2904170036 @default.
- W4380479950 cites W2904340070 @default.
- W4380479950 cites W2936864631 @default.
- W4380479950 cites W2947696193 @default.
- W4380479950 cites W2948582784 @default.
- W4380479950 cites W2952787292 @default.
- W4380479950 cites W2955192706 @default.
- W4380479950 cites W2956027310 @default.
- W4380479950 cites W2963140444 @default.
- W4380479950 cites W2963163009 @default.
- W4380479950 cites W2963468606 @default.
- W4380479950 cites W2963785012 @default.
- W4380479950 cites W2963920537 @default.
- W4380479950 cites W2964111476 @default.
- W4380479950 cites W2964137095 @default.
- W4380479950 cites W2964268168 @default.
- W4380479950 cites W2966796096 @default.
- W4380479950 cites W2966861238 @default.
- W4380479950 cites W2971047694 @default.
- W4380479950 cites W2981441441 @default.
- W4380479950 cites W2981694290 @default.
- W4380479950 cites W2981819252 @default.
- W4380479950 cites W2982157312 @default.
- W4380479950 cites W2982242214 @default.
- W4380479950 cites W2986015886 @default.
- W4380479950 cites W2986349107 @default.
- W4380479950 cites W2986854316 @default.
- W4380479950 cites W2987861506 @default.
- W4380479950 cites W2994742485 @default.
- W4380479950 cites W2996970889 @default.
- W4380479950 cites W2997006708 @default.
- W4380479950 cites W2997563695 @default.
- W4380479950 cites W3004127093 @default.
- W4380479950 cites W3013601111 @default.
- W4380479950 cites W3030204405 @default.
- W4380479950 cites W3034169498 @default.
- W4380479950 cites W3034200289 @default.
- W4380479950 cites W3034368386 @default.
- W4380479950 cites W3034695001 @default.
- W4380479950 cites W3034756453 @default.
- W4380479950 cites W3034957837 @default.
- W4380479950 cites W3035099063 @default.
- W4380479950 cites W3035163969 @default.
- W4380479950 cites W3035204081 @default.
- W4380479950 cites W3035321581 @default.
- W4380479950 cites W3035524453 @default.
- W4380479950 cites W3091093673 @default.
- W4380479950 cites W3091981646 @default.
- W4380479950 cites W3092497160 @default.
- W4380479950 cites W3096121526 @default.
- W4380479950 cites W3097836310 @default.
- W4380479950 cites W3105966348 @default.
- W4380479950 cites W3106974277 @default.
- W4380479950 cites W3107016329 @default.
- W4380479950 cites W3107220607 @default.
- W4380479950 cites W3108075360 @default.
- W4380479950 cites W3108124733 @default.
- W4380479950 cites W3108411658 @default.
- W4380479950 cites W3108491103 @default.
- W4380479950 cites W3109689953 @default.
- W4380479950 cites W3110179775 @default.
- W4380479950 cites W3113223504 @default.
- W4380479950 cites W3127215335 @default.
- W4380479950 cites W3137609883 @default.
- W4380479950 cites W3159481202 @default.
- W4380479950 cites W3159551428 @default.
- W4380479950 cites W3161758233 @default.
- W4380479950 cites W3171007011 @default.
- W4380479950 cites W3174102142 @default.
- W4380479950 cites W3174655058 @default.
- W4380479950 cites W3177008256 @default.
- W4380479950 cites W3177196641 @default.
- W4380479950 cites W3177378457 @default.
- W4380479950 cites W3185613252 @default.
- W4380479950 cites W3187295906 @default.
- W4380479950 cites W3192946406 @default.
- W4380479950 cites W3197813359 @default.