Matches in SemOpenAlex for { <https://semopenalex.org/work/W3124328990> ?p ?o ?g. } (a runnable SPARQL form of this pattern is sketched after the listing)
- W3124328990 abstract "We propose self-adaptive training -- a unified training algorithm that dynamically calibrates and enhances training processes by model predictions without incurring an extra computational cost -- to advance both supervised and self-supervised learning of deep neural networks. We analyze the training dynamics of deep networks on training data that are corrupted by, e.g., random noise and adversarial examples. Our analysis shows that model predictions are able to magnify useful underlying information in data and this phenomenon occurs broadly even in the absence of any label information, highlighting that model predictions could substantially benefit the training processes: self-adaptive training improves the generalization of deep networks under noise and enhances the self-supervised representation learning. The analysis also sheds light on understanding deep learning, e.g., a potential explanation of the recently-discovered double-descent phenomenon in empirical risk minimization and the collapsing issue of the state-of-the-art self-supervised learning algorithms. Experiments on the CIFAR, STL, and ImageNet datasets verify the effectiveness of our approach in three applications: classification with label noise, selective classification, and linear evaluation. To facilitate future research, the code has been made publicly available at https://github.com/LayneH/self-adaptive-training." @default.
- W3124328990 created "2021-02-01" @default.
- W3124328990 creator A5018348928 @default.
- W3124328990 creator A5038521996 @default.
- W3124328990 creator A5089966579 @default.
- W3124328990 date "2021-01-21" @default.
- W3124328990 modified "2023-09-26" @default.
- W3124328990 title "Self-Adaptive Training: Bridging Supervised and Self-Supervised Learning" @default.
- W3124328990 cites W123339444 @default.
- W3124328990 cites W1564764609 @default.
- W3124328990 cites W1665214252 @default.
- W3124328990 cites W1836465849 @default.
- W3124328990 cites W1898031563 @default.
- W3124328990 cites W1903029394 @default.
- W3124328990 cites W1959608418 @default.
- W3124328990 cites W2025768430 @default.
- W3124328990 cites W2047581872 @default.
- W3124328990 cites W2095705004 @default.
- W3124328990 cites W2099471712 @default.
- W3124328990 cites W2101946573 @default.
- W3124328990 cites W2102605133 @default.
- W3124328990 cites W2108598243 @default.
- W3124328990 cites W2118858186 @default.
- W3124328990 cites W2129249398 @default.
- W3124328990 cites W2163605009 @default.
- W3124328990 cites W2183341477 @default.
- W3124328990 cites W2194775991 @default.
- W3124328990 cites W2321533354 @default.
- W3124328990 cites W2335728318 @default.
- W3124328990 cites W2401231614 @default.
- W3124328990 cites W2558661413 @default.
- W3124328990 cites W2618574054 @default.
- W3124328990 cites W2622263826 @default.
- W3124328990 cites W2798991696 @default.
- W3124328990 cites W2802198257 @default.
- W3124328990 cites W2842511635 @default.
- W3124328990 cites W2883725317 @default.
- W3124328990 cites W2914735679 @default.
- W3124328990 cites W2941964676 @default.
- W3124328990 cites W2950300355 @default.
- W3124328990 cites W2951418570 @default.
- W3124328990 cites W2959995783 @default.
- W3124328990 cites W2962729158 @default.
- W3124328990 cites W2962742544 @default.
- W3124328990 cites W2962762541 @default.
- W3124328990 cites W2962835968 @default.
- W3124328990 cites W2963060032 @default.
- W3124328990 cites W2963081269 @default.
- W3124328990 cites W2963096987 @default.
- W3124328990 cites W2963263347 @default.
- W3124328990 cites W2963341956 @default.
- W3124328990 cites W2963371670 @default.
- W3124328990 cites W2963399829 @default.
- W3124328990 cites W2963420272 @default.
- W3124328990 cites W2963495051 @default.
- W3124328990 cites W2963518130 @default.
- W3124328990 cites W2963613748 @default.
- W3124328990 cites W2963759070 @default.
- W3124328990 cites W2963772355 @default.
- W3124328990 cites W2964121744 @default.
- W3124328990 cites W2964153729 @default.
- W3124328990 cites W2964222566 @default.
- W3124328990 cites W2964253222 @default.
- W3124328990 cites W2964274690 @default.
- W3124328990 cites W2964292098 @default.
- W3124328990 cites W2970236443 @default.
- W3124328990 cites W2970290137 @default.
- W3124328990 cites W2970971581 @default.
- W3124328990 cites W2978544343 @default.
- W3124328990 cites W2981873476 @default.
- W3124328990 cites W2995181141 @default.
- W3124328990 cites W2995624272 @default.
- W3124328990 cites W2996603747 @default.
- W3124328990 cites W3022061250 @default.
- W3124328990 cites W3034978746 @default.
- W3124328990 cites W3034994123 @default.
- W3124328990 cites W3035160371 @default.
- W3124328990 cites W3035524453 @default.
- W3124328990 cites W3037144731 @default.
- W3124328990 cites W3083720136 @default.
- W3124328990 cites W3095121901 @default.
- W3124328990 cites W3098903812 @default.
- W3124328990 cites W3100156752 @default.
- W3124328990 cites W3100570787 @default.
- W3124328990 cites W3100593864 @default.
- W3124328990 cites W3101821705 @default.
- W3124328990 cites W3102583815 @default.
- W3124328990 cites W3108655343 @default.
- W3124328990 cites W3118608800 @default.
- W3124328990 cites W3137695714 @default.
- W3124328990 doi "https://doi.org/10.48550/arxiv.2101.08732" @default.
- W3124328990 hasPublicationYear "2021" @default.
- W3124328990 type Work @default.
- W3124328990 sameAs 3124328990 @default.
- W3124328990 citedByCount "2" @default.
- W3124328990 countsByYear W31243289902021 @default.
- W3124328990 crossrefType "posted-content" @default.
- W3124328990 hasAuthorship W3124328990A5018348928 @default.
- W3124328990 hasAuthorship W3124328990A5038521996 @default.
- W3124328990 hasAuthorship W3124328990A5089966579 @default.
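The { <work> ?p ?o ?g. } pattern in the heading is quad notation: subject, predicate, object, plus the named graph each statement belongs to. A minimal SPARQL sketch that should reproduce the listing above, assuming the public SemOpenAlex endpoint at https://semopenalex.org/sparql (the endpoint location is an assumption, not part of this listing):

    # List every predicate/object pair for the work, together with the
    # named graph (?g) that holds each statement.
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W3124328990> ?p ?o .
      }
    }

Binding ?p to a concrete property IRI instead of a variable would narrow the output to a single field, such as only the abstract or only the citedByCount value. The @default marker on each line of the listing is the graph name returned for ?g.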