Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285220262> ?p ?o ?g. }
- W4285220262 endingPage "13" @default.
- W4285220262 startingPage "1" @default.
- W4285220262 abstract "As very-high-resolution (VHR) remote sensing images capture increasingly detailed spatial information, stringent requirements are imposed on accurate image classification. Because diverse land objects exhibit intraclass variation and interclass similarity, efficient and fine-grained classification of VHR images, especially in complex scenes, is challenging. Even in popular deep learning (DL) frameworks, geometric details of land objects may be lost at deep feature levels, so it is difficult to preserve highly detailed spatial information (e.g., edges, small objects) by relying only on the last high-level layer. Moreover, many newly developed DL methods require massive numbers of well-labeled samples, which inevitably degrades model generalization in few-shot learning settings. Therefore, in this paper, a lightweight shallow-to-deep feature fusion network (SDF2N) is proposed for VHR image classification, in which traditional machine learning (ML) and DL schemes are integrated to learn rich and representative information that improves classification accuracy. In particular, shallow spectral-spatial features are first extracted, and a novel triple-stage fusion (TSF) module is then designed to learn salient and discriminative information at different levels for classification. The TSF module includes three feature fusion stages, i.e., low-level spectral-spatial feature fusion, middle-level multi-scale feature fusion, and high-level multi-layer feature fusion. The proposed SDF2N takes advantage of shallow-to-deep features, extracting representative and complementary information across layers. Notably, even with limited training samples, SDF2N can still achieve satisfactory classification performance. Experimental results on three real VHR remote sensing data sets (two multispectral and one airborne hyperspectral image) covering complex urban scenarios confirm the effectiveness of the proposed approach compared with state-of-the-art methods." @default.
- W4285220262 created "2022-07-14" @default.
- W4285220262 creator A5001395664 @default.
- W4285220262 creator A5002046606 @default.
- W4285220262 creator A5006095323 @default.
- W4285220262 creator A5033017179 @default.
- W4285220262 creator A5037069703 @default.
- W4285220262 creator A5039348214 @default.
- W4285220262 creator A5077813266 @default.
- W4285220262 creator A5089631478 @default.
- W4285220262 date "2022-01-01" @default.
- W4285220262 modified "2023-10-16" @default.
- W4285220262 title "A Shallow-to-Deep Feature Fusion Network for VHR Remote Sensing Image Classification" @default.
- W4285220262 cites W1976416886 @default.
- W4285220262 cites W2005672614 @default.
- W4285220262 cites W2008258179 @default.
- W4285220262 cites W2056302425 @default.
- W4285220262 cites W2085529604 @default.
- W4285220262 cites W2114819256 @default.
- W4285220262 cites W2115451191 @default.
- W4285220262 cites W2127199143 @default.
- W4285220262 cites W2320846209 @default.
- W4285220262 cites W2342652911 @default.
- W4285220262 cites W2611868724 @default.
- W4285220262 cites W2747638294 @default.
- W4285220262 cites W2764276316 @default.
- W4285220262 cites W2765939201 @default.
- W4285220262 cites W2768309288 @default.
- W4285220262 cites W2790694152 @default.
- W4285220262 cites W2791006446 @default.
- W4285220262 cites W2800371750 @default.
- W4285220262 cites W2803057685 @default.
- W4285220262 cites W2822065499 @default.
- W4285220262 cites W2914331134 @default.
- W4285220262 cites W2944413439 @default.
- W4285220262 cites W2953308875 @default.
- W4285220262 cites W2963420686 @default.
- W4285220262 cites W2969238677 @default.
- W4285220262 cites W2969881582 @default.
- W4285220262 cites W2981625567 @default.
- W4285220262 cites W2982124515 @default.
- W4285220262 cites W3011625001 @default.
- W4285220262 cites W3012042051 @default.
- W4285220262 cites W3047443805 @default.
- W4285220262 cites W3083675032 @default.
- W4285220262 cites W3088464175 @default.
- W4285220262 cites W3100561351 @default.
- W4285220262 cites W3106450183 @default.
- W4285220262 cites W3106967300 @default.
- W4285220262 cites W3109750868 @default.
- W4285220262 cites W3113297088 @default.
- W4285220262 cites W3134321581 @default.
- W4285220262 cites W3146366485 @default.
- W4285220262 cites W3206197202 @default.
- W4285220262 cites W3213272555 @default.
- W4285220262 cites W3214821343 @default.
- W4285220262 cites W4240485910 @default.
- W4285220262 doi "https://doi.org/10.1109/tgrs.2022.3179288" @default.
- W4285220262 hasPublicationYear "2022" @default.
- W4285220262 type Work @default.
- W4285220262 citedByCount "6" @default.
- W4285220262 countsByYear W42852202622022 @default.
- W4285220262 countsByYear W42852202622023 @default.
- W4285220262 crossrefType "journal-article" @default.
- W4285220262 hasAuthorship W4285220262A5001395664 @default.
- W4285220262 hasAuthorship W4285220262A5002046606 @default.
- W4285220262 hasAuthorship W4285220262A5006095323 @default.
- W4285220262 hasAuthorship W4285220262A5033017179 @default.
- W4285220262 hasAuthorship W4285220262A5037069703 @default.
- W4285220262 hasAuthorship W4285220262A5039348214 @default.
- W4285220262 hasAuthorship W4285220262A5077813266 @default.
- W4285220262 hasAuthorship W4285220262A5089631478 @default.
- W4285220262 hasConcept C108583219 @default.
- W4285220262 hasConcept C115961682 @default.
- W4285220262 hasConcept C127313418 @default.
- W4285220262 hasConcept C138885662 @default.
- W4285220262 hasConcept C153180895 @default.
- W4285220262 hasConcept C154945302 @default.
- W4285220262 hasConcept C2776401178 @default.
- W4285220262 hasConcept C41008148 @default.
- W4285220262 hasConcept C41895202 @default.
- W4285220262 hasConcept C62649853 @default.
- W4285220262 hasConcept C69744172 @default.
- W4285220262 hasConcept C75294576 @default.
- W4285220262 hasConcept C97931131 @default.
- W4285220262 hasConceptScore W4285220262C108583219 @default.
- W4285220262 hasConceptScore W4285220262C115961682 @default.
- W4285220262 hasConceptScore W4285220262C127313418 @default.
- W4285220262 hasConceptScore W4285220262C138885662 @default.
- W4285220262 hasConceptScore W4285220262C153180895 @default.
- W4285220262 hasConceptScore W4285220262C154945302 @default.
- W4285220262 hasConceptScore W4285220262C2776401178 @default.
- W4285220262 hasConceptScore W4285220262C41008148 @default.
- W4285220262 hasConceptScore W4285220262C41895202 @default.
- W4285220262 hasConceptScore W4285220262C62649853 @default.
- W4285220262 hasConceptScore W4285220262C69744172 @default.
- W4285220262 hasConceptScore W4285220262C75294576 @default.
- W4285220262 hasConceptScore W4285220262C97931131 @default.