Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313461217> ?p ?o ?g. }
- W4313461217 endingPage "749" @default.
- W4313461217 startingPage "738" @default.
- W4313461217 abstract "Remote sensing images (RSIs) are characterized by complex spatial layouts and ground object structures. The vision transformer (ViT) can be a good choice for scene classification owing to its ability to capture long-range interactions between patches of the input image. However, because it lacks some of the inductive biases inherent to CNNs, such as locality and translation equivariance, ViT does not generalize well when trained on insufficient amounts of data. Compared with training ViT from scratch, transferring a large-scale pretrained model is more cost-efficient and performs better even when the target dataset is small. In addition, the cross-entropy (CE) loss frequently used in scene classification has low robustness to noisy labels and poor generalization performance across different scenes. In this article, a ViT-based model combined with supervised contrastive learning (CL), named ViT-CL, is proposed. For CL, the supervised contrastive (SupCon) loss, developed by extending the self-supervised contrastive approach to the fully supervised setting, can exploit the label information of RSIs in the embedding space and improve robustness to common image corruptions. In ViT-CL, a joint loss function combining the CE loss and the SupCon loss is developed to encourage the model to learn more discriminative features. A two-stage optimization framework is also introduced to enhance the controllability of the optimization process of the ViT-CL model. Extensive experiments on the AID, NWPU-RESISC45, and UCM datasets verified the superior performance of ViT-CL, which achieved the highest accuracies of 97.42%, 94.54%, and 99.76%, respectively, among all competing methods." @default.
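For context on the joint objective described in the abstract, below is a minimal PyTorch-style sketch of a CE + SupCon loss. This is not the paper's code: the temperature, the weighting factor `lam`, and the function names are assumptions for illustration only; the paper's exact formulation and two-stage optimization schedule may differ.

```python
import torch
import torch.nn.functional as F

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive (SupCon) loss over a batch of embeddings (sketch)."""
    device = features.device
    features = F.normalize(features, dim=1)        # project embeddings onto the unit sphere
    sim = features @ features.T / temperature      # pairwise cosine similarities scaled by temperature

    # Positives: samples sharing the anchor's label, excluding the anchor itself.
    labels = labels.view(-1, 1)
    pos_mask = torch.eq(labels, labels.T).float().to(device)
    self_mask = torch.eye(pos_mask.size(0), device=device)
    pos_mask = pos_mask - self_mask

    # Log-softmax over all other samples (the anchor is removed from the denominator).
    logits = sim - 1e9 * self_mask
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # Average log-probability of positives per anchor; anchors without positives are skipped.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (pos_mask * log_prob).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()

def joint_loss(logits, features, labels, lam=0.5, temperature=0.07):
    """Weighted sum of cross-entropy and SupCon losses; the weighting scheme is an assumption."""
    return F.cross_entropy(logits, labels) + lam * supcon_loss(features, labels, temperature)
```

In a typical setup, `logits` would come from the classification head and `features` from a projection of the ViT's embedding; how ViT-CL wires these together is described in the paper itself, not in this sketch.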
- W4313461217 created "2023-01-06" @default.
- W4313461217 creator A5018455596 @default.
- W4313461217 creator A5040009629 @default.
- W4313461217 creator A5073806049 @default.
- W4313461217 creator A5075013625 @default.
- W4313461217 date "2023-01-01" @default.
- W4313461217 modified "2023-10-15" @default.
- W4313461217 title "Vision Transformer With Contrastive Learning for Remote Sensing Image Scene Classification" @default.
- W4313461217 cites W1606858007 @default.
- W4313461217 cites W1980038761 @default.
- W4313461217 cites W2005112351 @default.
- W4313461217 cites W2044465660 @default.
- W4313461217 cites W2097117768 @default.
- W4313461217 cites W2108598243 @default.
- W4313461217 cites W2176673053 @default.
- W4313461217 cites W2194775991 @default.
- W4313461217 cites W2291068538 @default.
- W4313461217 cites W2294802479 @default.
- W4313461217 cites W2598998899 @default.
- W4313461217 cites W2620429297 @default.
- W4313461217 cites W2621526417 @default.
- W4313461217 cites W2783165089 @default.
- W4313461217 cites W2786225488 @default.
- W4313461217 cites W2798991696 @default.
- W4313461217 cites W2899198451 @default.
- W4313461217 cites W2914885528 @default.
- W4313461217 cites W2940939359 @default.
- W4313461217 cites W2963163009 @default.
- W4313461217 cites W2963745697 @default.
- W4313461217 cites W2974770574 @default.
- W4313461217 cites W3022140654 @default.
- W4313461217 cites W3035524453 @default.
- W4313461217 cites W3047443805 @default.
- W4313461217 cites W3048631361 @default.
- W4313461217 cites W3080181119 @default.
- W4313461217 cites W3103294617 @default.
- W4313461217 cites W3103856189 @default.
- W4313461217 cites W3128592650 @default.
- W4313461217 cites W3135445258 @default.
- W4313461217 cites W3201623325 @default.
- W4313461217 cites W3205886311 @default.
- W4313461217 cites W4205127446 @default.
- W4313461217 cites W4206470192 @default.
- W4313461217 cites W4226285265 @default.
- W4313461217 cites W4226291728 @default.
- W4313461217 cites W4285220262 @default.
- W4313461217 cites W4295308382 @default.
- W4313461217 cites W4312807131 @default.
- W4313461217 cites W4312981150 @default.
- W4313461217 cites W4313506322 @default.
- W4313461217 doi "https://doi.org/10.1109/jstars.2022.3230835" @default.
- W4313461217 hasPublicationYear "2023" @default.
- W4313461217 type Work @default.
- W4313461217 citedByCount "4" @default.
- W4313461217 countsByYear W43134612172023 @default.
- W4313461217 crossrefType "journal-article" @default.
- W4313461217 hasAuthorship W4313461217A5018455596 @default.
- W4313461217 hasAuthorship W4313461217A5040009629 @default.
- W4313461217 hasAuthorship W4313461217A5073806049 @default.
- W4313461217 hasAuthorship W4313461217A5075013625 @default.
- W4313461217 hasBestOaLocation W43134612171 @default.
- W4313461217 hasConcept C104317684 @default.
- W4313461217 hasConcept C115961682 @default.
- W4313461217 hasConcept C121332964 @default.
- W4313461217 hasConcept C127313418 @default.
- W4313461217 hasConcept C138885662 @default.
- W4313461217 hasConcept C153180895 @default.
- W4313461217 hasConcept C154945302 @default.
- W4313461217 hasConcept C165801399 @default.
- W4313461217 hasConcept C167981619 @default.
- W4313461217 hasConcept C185592680 @default.
- W4313461217 hasConcept C2779808786 @default.
- W4313461217 hasConcept C41008148 @default.
- W4313461217 hasConcept C41608201 @default.
- W4313461217 hasConcept C41895202 @default.
- W4313461217 hasConcept C55493867 @default.
- W4313461217 hasConcept C62520636 @default.
- W4313461217 hasConcept C62649853 @default.
- W4313461217 hasConcept C63479239 @default.
- W4313461217 hasConcept C66322947 @default.
- W4313461217 hasConcept C75294576 @default.
- W4313461217 hasConcept C97931131 @default.
- W4313461217 hasConceptScore W4313461217C104317684 @default.
- W4313461217 hasConceptScore W4313461217C115961682 @default.
- W4313461217 hasConceptScore W4313461217C121332964 @default.
- W4313461217 hasConceptScore W4313461217C127313418 @default.
- W4313461217 hasConceptScore W4313461217C138885662 @default.
- W4313461217 hasConceptScore W4313461217C153180895 @default.
- W4313461217 hasConceptScore W4313461217C154945302 @default.
- W4313461217 hasConceptScore W4313461217C165801399 @default.
- W4313461217 hasConceptScore W4313461217C167981619 @default.
- W4313461217 hasConceptScore W4313461217C185592680 @default.
- W4313461217 hasConceptScore W4313461217C2779808786 @default.
- W4313461217 hasConceptScore W4313461217C41008148 @default.
- W4313461217 hasConceptScore W4313461217C41608201 @default.
- W4313461217 hasConceptScore W4313461217C41895202 @default.
- W4313461217 hasConceptScore W4313461217C55493867 @default.