Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386083088> ?p ?o ?g. }
- W4386083088 abstract "Learning with large-scale unlabeled data has become a powerful tool for pre-training Visual Transformers (VTs). However, prior works tend to overlook that, in real-world scenarios, the input data may be corrupted and unreliable. Pre-training VTs on such corrupted data can be challenging, especially when we pre-train via the masked autoencoding approach, where both the inputs and masked “ground truth” targets can potentially be unreliable in this case. To address this limitation, we introduce the Token Boosting Module (TBM) as a plug-and-play component for VTs that effectively allows the VT to learn to extract clean and robust features during masked autoencoding pre-training. We provide theoretical analysis to show how TBM improves model pre-training with more robust and generalizable representations, thus benefiting downstream tasks. We conduct extensive experiments to analyze TBM's effectiveness, and results on four corrupted datasets demonstrate that TBM consistently improves performance on downstream tasks." @default.
- W4386083088 created "2023-08-23" @default.
- W4386083088 creator A5017342089 @default.
- W4386083088 creator A5018551679 @default.
- W4386083088 creator A5039592275 @default.
- W4386083088 creator A5044237605 @default.
- W4386083088 creator A5049079081 @default.
- W4386083088 creator A5066289448 @default.
- W4386083088 creator A5067110746 @default.
- W4386083088 date "2023-06-01" @default.
- W4386083088 modified "2023-09-26" @default.
- W4386083088 title "Token Boosting for Robust Self-Supervised Visual Transformer Pre-training" @default.
- W4386083088 cites W2108598243 @default.
- W4386083088 cites W2156222070 @default.
- W4386083088 cites W2787919227 @default.
- W4386083088 cites W2860627718 @default.
- W4386083088 cites W2895494475 @default.
- W4386083088 cites W2911648799 @default.
- W4386083088 cites W2911788474 @default.
- W4386083088 cites W2940457086 @default.
- W4386083088 cites W2944006115 @default.
- W4386083088 cites W2948058585 @default.
- W4386083088 cites W2948246283 @default.
- W4386083088 cites W2963032410 @default.
- W4386083088 cites W2963076818 @default.
- W4386083088 cites W2963901718 @default.
- W4386083088 cites W2964134613 @default.
- W4386083088 cites W2978968642 @default.
- W4386083088 cites W2982024564 @default.
- W4386083088 cites W2984006054 @default.
- W4386083088 cites W3034548564 @default.
- W4386083088 cites W3034902810 @default.
- W4386083088 cites W3034999503 @default.
- W4386083088 cites W3035050855 @default.
- W4386083088 cites W3035225512 @default.
- W4386083088 cites W3046339366 @default.
- W4386083088 cites W3094488141 @default.
- W4386083088 cites W3098337560 @default.
- W4386083088 cites W3100186782 @default.
- W4386083088 cites W3105195350 @default.
- W4386083088 cites W3130277620 @default.
- W4386083088 cites W3138516171 @default.
- W4386083088 cites W3143373604 @default.
- W4386083088 cites W3145185940 @default.
- W4386083088 cites W3145450063 @default.
- W4386083088 cites W3159481202 @default.
- W4386083088 cites W3169413442 @default.
- W4386083088 cites W3171007011 @default.
- W4386083088 cites W3203227473 @default.
- W4386083088 cites W3203634062 @default.
- W4386083088 cites W3206084531 @default.
- W4386083088 cites W3215030504 @default.
- W4386083088 cites W3216270236 @default.
- W4386083088 cites W4300030381 @default.
- W4386083088 cites W4306729030 @default.
- W4386083088 cites W4312257792 @default.
- W4386083088 cites W4312270234 @default.
- W4386083088 cites W4312957757 @default.
- W4386083088 doi "https://doi.org/10.1109/cvpr52729.2023.02301" @default.
- W4386083088 hasPublicationYear "2023" @default.
- W4386083088 type Work @default.
- W4386083088 citedByCount "1" @default.
- W4386083088 countsByYear W43860830882023 @default.
- W4386083088 crossrefType "proceedings-article" @default.
- W4386083088 hasAuthorship W4386083088A5017342089 @default.
- W4386083088 hasAuthorship W4386083088A5018551679 @default.
- W4386083088 hasAuthorship W4386083088A5039592275 @default.
- W4386083088 hasAuthorship W4386083088A5044237605 @default.
- W4386083088 hasAuthorship W4386083088A5049079081 @default.
- W4386083088 hasAuthorship W4386083088A5066289448 @default.
- W4386083088 hasAuthorship W4386083088A5067110746 @default.
- W4386083088 hasConcept C104317684 @default.
- W4386083088 hasConcept C119599485 @default.
- W4386083088 hasConcept C119857082 @default.
- W4386083088 hasConcept C127413603 @default.
- W4386083088 hasConcept C146849305 @default.
- W4386083088 hasConcept C154945302 @default.
- W4386083088 hasConcept C165801399 @default.
- W4386083088 hasConcept C169258074 @default.
- W4386083088 hasConcept C185592680 @default.
- W4386083088 hasConcept C38652104 @default.
- W4386083088 hasConcept C41008148 @default.
- W4386083088 hasConcept C46686674 @default.
- W4386083088 hasConcept C48145219 @default.
- W4386083088 hasConcept C51632099 @default.
- W4386083088 hasConcept C55493867 @default.
- W4386083088 hasConcept C63479239 @default.
- W4386083088 hasConcept C66322947 @default.
- W4386083088 hasConcept C70153297 @default.
- W4386083088 hasConceptScore W4386083088C104317684 @default.
- W4386083088 hasConceptScore W4386083088C119599485 @default.
- W4386083088 hasConceptScore W4386083088C119857082 @default.
- W4386083088 hasConceptScore W4386083088C127413603 @default.
- W4386083088 hasConceptScore W4386083088C146849305 @default.
- W4386083088 hasConceptScore W4386083088C154945302 @default.
- W4386083088 hasConceptScore W4386083088C165801399 @default.
- W4386083088 hasConceptScore W4386083088C169258074 @default.
- W4386083088 hasConceptScore W4386083088C185592680 @default.
- W4386083088 hasConceptScore W4386083088C38652104 @default.
- W4386083088 hasConceptScore W4386083088C41008148 @default.