Matches in SemOpenAlex for { <https://semopenalex.org/work/W2792857687> ?p ?o ?g. }
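As a hedged illustration, the matches listed below can be retrieved programmatically with a SPARQL SELECT query against the SemOpenAlex service. The endpoint URL and the rewriting of the quad pattern's `?g` variable as a `GRAPH` clause are assumptions about how the query was actually run, so adjust them to the deployment you are querying.

```python
# Hedged sketch: fetch the predicate/object/graph matches for work
# W2792857687 from the SemOpenAlex SPARQL endpoint.
# The endpoint URL below is an assumption; point it at the service
# that actually produced this listing.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed SemOpenAlex endpoint

QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2792857687> ?p ?o .
  }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=60,
)
resp.raise_for_status()

# Print each match roughly in the form used in the listing below.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```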
- W2792857687 endingPage "444" @default.
- W2792857687 startingPage "444" @default.
- W2792857687 abstract "With the large number of high-resolution images now being acquired, high spatial resolution (HSR) remote sensing imagery scene classification has drawn great attention but remains a challenging task due to the complex arrangements of the ground objects in HSR imagery, which leads to a semantic gap between low-level features and high-level semantic concepts. As a feature representation method for automatically learning essential features from image data, convolutional neural networks (CNNs) have been introduced for HSR remote sensing image scene classification due to their excellent performance in natural image classification. However, some scene classes of remote sensing images are object-centered, i.e., the scene class of an image is decided by the objects it contains. Although previous CNN-based methods have achieved higher classification accuracies than the traditional methods based on handcrafted features, they do not consider the scale variation of the objects in the scenes. This makes it difficult to directly apply CNNs to those remote sensing images belonging to object-centered classes to extract features that are robust to scale variation, leading to misclassified scene images. To solve this problem, scene classification based on a deep random-scale stretched convolutional neural network (SRSCNN) for HSR remote sensing imagery is proposed in this paper. In the proposed method, patches with a random scale are cropped from the image and stretched to a specified scale as the input to train the CNN. This forces the CNN to extract features that are robust to scale variation. Furthermore, to improve the performance of the CNN, a robust scene classification strategy, multi-perspective fusion, is adopted. The experimental results obtained using three datasets—the UC Merced dataset, the Google dataset of SIRI-WHU, and the Wuhan IKONOS dataset—confirm that the proposed method performs better than the traditional scene classification methods." @default. (the random-scale cropping and stretching step described here is sketched in code after this listing)
- W2792857687 created "2018-03-29" @default.
- W2792857687 creator A5029093505 @default.
- W2792857687 creator A5033347732 @default.
- W2792857687 creator A5035548891 @default.
- W2792857687 creator A5075903928 @default.
- W2792857687 creator A5079862010 @default.
- W2792857687 date "2018-03-12" @default.
- W2792857687 modified "2023-09-24" @default.
- W2792857687 title "Scene Classification Based on a Deep Random-Scale Stretched Convolutional Neural Network" @default.
- W2792857687 cites W1526295910 @default.
- W2792857687 cites W1885185971 @default.
- W2792857687 cites W1958291604 @default.
- W2792857687 cites W1968591910 @default.
- W2792857687 cites W1983364832 @default.
- W2792857687 cites W2001123951 @default.
- W2792857687 cites W2005112351 @default.
- W2792857687 cites W2006603039 @default.
- W2792857687 cites W2015386604 @default.
- W2792857687 cites W2029316659 @default.
- W2792857687 cites W2085625911 @default.
- W2792857687 cites W2086866337 @default.
- W2792857687 cites W2098676252 @default.
- W2792857687 cites W2105032938 @default.
- W2792857687 cites W2115973703 @default.
- W2792857687 cites W2121915926 @default.
- W2792857687 cites W2179290474 @default.
- W2792857687 cites W2213075807 @default.
- W2792857687 cites W2283168383 @default.
- W2792857687 cites W2291068538 @default.
- W2792857687 cites W2294802479 @default.
- W2792857687 cites W2303475025 @default.
- W2792857687 cites W2325982591 @default.
- W2792857687 cites W2344884875 @default.
- W2792857687 cites W2345128667 @default.
- W2792857687 cites W2532691318 @default.
- W2792857687 cites W2660018007 @default.
- W2792857687 cites W2761600867 @default.
- W2792857687 cites W2771635595 @default.
- W2792857687 cites W2789784903 @default.
- W2792857687 doi "https://doi.org/10.3390/rs10030444" @default.
- W2792857687 hasPublicationYear "2018" @default.
- W2792857687 type Work @default.
- W2792857687 sameAs 2792857687 @default.
- W2792857687 citedByCount "70" @default.
- W2792857687 countsByYear W27928576872018 @default.
- W2792857687 countsByYear W27928576872019 @default.
- W2792857687 countsByYear W27928576872020 @default.
- W2792857687 countsByYear W27928576872021 @default.
- W2792857687 countsByYear W27928576872022 @default.
- W2792857687 countsByYear W27928576872023 @default.
- W2792857687 crossrefType "journal-article" @default.
- W2792857687 hasAuthorship W2792857687A5029093505 @default.
- W2792857687 hasAuthorship W2792857687A5033347732 @default.
- W2792857687 hasAuthorship W2792857687A5035548891 @default.
- W2792857687 hasAuthorship W2792857687A5075903928 @default.
- W2792857687 hasAuthorship W2792857687A5079862010 @default.
- W2792857687 hasBestOaLocation W27928576871 @default.
- W2792857687 hasConcept C108583219 @default.
- W2792857687 hasConcept C115961682 @default.
- W2792857687 hasConcept C138885662 @default.
- W2792857687 hasConcept C153180895 @default.
- W2792857687 hasConcept C154945302 @default.
- W2792857687 hasConcept C17744445 @default.
- W2792857687 hasConcept C199539241 @default.
- W2792857687 hasConcept C205649164 @default.
- W2792857687 hasConcept C2776359362 @default.
- W2792857687 hasConcept C2776401178 @default.
- W2792857687 hasConcept C2778755073 @default.
- W2792857687 hasConcept C2781238097 @default.
- W2792857687 hasConcept C31972630 @default.
- W2792857687 hasConcept C41008148 @default.
- W2792857687 hasConcept C41895202 @default.
- W2792857687 hasConcept C58640448 @default.
- W2792857687 hasConcept C62649853 @default.
- W2792857687 hasConcept C75294576 @default.
- W2792857687 hasConcept C81363708 @default.
- W2792857687 hasConcept C94625758 @default.
- W2792857687 hasConceptScore W2792857687C108583219 @default.
- W2792857687 hasConceptScore W2792857687C115961682 @default.
- W2792857687 hasConceptScore W2792857687C138885662 @default.
- W2792857687 hasConceptScore W2792857687C153180895 @default.
- W2792857687 hasConceptScore W2792857687C154945302 @default.
- W2792857687 hasConceptScore W2792857687C17744445 @default.
- W2792857687 hasConceptScore W2792857687C199539241 @default.
- W2792857687 hasConceptScore W2792857687C205649164 @default.
- W2792857687 hasConceptScore W2792857687C2776359362 @default.
- W2792857687 hasConceptScore W2792857687C2776401178 @default.
- W2792857687 hasConceptScore W2792857687C2778755073 @default.
- W2792857687 hasConceptScore W2792857687C2781238097 @default.
- W2792857687 hasConceptScore W2792857687C31972630 @default.
- W2792857687 hasConceptScore W2792857687C41008148 @default.
- W2792857687 hasConceptScore W2792857687C41895202 @default.
- W2792857687 hasConceptScore W2792857687C58640448 @default.
- W2792857687 hasConceptScore W2792857687C62649853 @default.
- W2792857687 hasConceptScore W2792857687C75294576 @default.
- W2792857687 hasConceptScore W2792857687C81363708 @default.
- W2792857687 hasConceptScore W2792857687C94625758 @default.
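The abstract above describes the core idea of SRSCNN: patches are cropped from the scene image at a random scale and then stretched to a fixed input scale, so that the trained CNN becomes robust to object scale variation. The following is a minimal sketch of that augmentation step, not the authors' implementation; the input size, scale range, and use of square patches are illustrative assumptions.

```python
# Minimal sketch of random-scale stretching: crop a patch whose side
# length is sampled at random, then stretch (resize) it to a fixed
# CNN input size. The default values below are assumptions, not
# values taken from the paper.
import random
from PIL import Image


def random_scale_stretched_patch(image: Image.Image,
                                 input_size: int = 224,
                                 min_scale: float = 0.5,
                                 max_scale: float = 1.0) -> Image.Image:
    """Crop a square patch at a random scale and stretch it to input_size."""
    width, height = image.size
    # Sample the patch side length as a random fraction of the short side.
    side = int(min(width, height) * random.uniform(min_scale, max_scale))
    left = random.randint(0, width - side)
    top = random.randint(0, height - side)
    patch = image.crop((left, top, left + side, top + side))
    # Stretch the patch to the fixed scale expected by the CNN.
    return patch.resize((input_size, input_size), Image.BILINEAR)


if __name__ == "__main__":
    # Tiny demo on a synthetic image; a real pipeline would load HSR scenes.
    img = Image.new("RGB", (600, 600), color=(120, 160, 90))
    patch = random_scale_stretched_patch(img)
    print(patch.size)  # (224, 224)
```

In a PyTorch pipeline, torchvision.transforms.RandomResizedCrop performs a similar random-scale crop followed by a resize to a fixed size, and the multi-perspective fusion mentioned in the abstract can be approximated at test time by averaging the predictions over several such patches.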