Matches in SemOpenAlex for { <https://semopenalex.org/work/W3185308247> ?p ?o ?g. }
- W3185308247 endingPage "102441" @default.
- W3185308247 startingPage "102441" @default.
- W3185308247 abstract "As important components of modern facility agriculture, plastic greenhouses and mulching films are widely used in agricultural production. Owing to the similarity of their spectral signatures, separating plastic greenhouses and mulching films from each other remains a challenging task. Meanwhile, deep learning has achieved strong performance in many computer vision tasks and has become a research hotspot in remote sensing image analysis. However, deep learning has rarely been applied to the accurate mapping of agricultural plastic covers, especially to the long-neglected problem of separating plastic greenhouses from mulching films. This study therefore proposes a deep learning model to detect and separate plastic greenhouses and mulching films in very high resolution (VHR) remotely sensed data, providing agricultural plastic cover maps for relevant decision-makers. Specifically, the proposed model is a dilated and non-local convolutional neural network (DNCNN), which consists of several multi-scale dilated convolution blocks and a non-local feature extraction module. The former contains a series of dilated convolutions with various dilation rates that aggregate multi-level spatial features to account for the scale variations of land objects, while the latter uses a non-local module to extract global contextual features that further enhance inter-class separability. Experimental results from Shenxian, China and Al-Kharj, Saudi Arabia show that the DNCNN achieves overall accuracies of 89.6% and 92.6%, respectively. Compared with standard convolution, the inclusion of dilated convolution raises the classification accuracy by 2.7%. In addition, ablation analysis shows that the non-local feature extraction module improves the classification accuracy by about 2%. This study demonstrates that the proposed DNCNN provides an effective approach for accurate agricultural plastic cover mapping from VHR remotely sensed imagery." @default.
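The abstract's key mechanism, dilated convolution, spaces kernel taps apart to widen the receptive field without adding parameters. The 1-D sketch below is purely illustrative of that idea (it is not the authors' DNCNN implementation, which uses 2-D convolutions in a deep network):

```python
def dilated_conv1d(signal, kernel, dilation=1):
    """1-D dilated convolution with 'valid' padding.

    Each kernel tap reads the input `dilation` steps apart, so a
    3-tap kernel with dilation=2 covers a span of 5 input samples
    while still using only 3 weights. With dilation=1 this reduces
    to an ordinary (cross-correlation style) convolution.
    """
    span = (len(kernel) - 1) * dilation  # receptive field minus one
    return [
        sum(kernel[k] * signal[i + k * dilation] for k in range(len(kernel)))
        for i in range(len(signal) - span)
    ]


# Ordinary convolution: each output sums 3 adjacent samples.
print(dilated_conv1d([1, 2, 3, 4, 5], [1, 1, 1], dilation=1))  # [6, 9, 12]
# Dilation 2: same 3 weights now span 5 samples (1 + 3 + 5).
print(dilated_conv1d([1, 2, 3, 4, 5], [1, 1, 1], dilation=2))  # [9]
```

Stacking such blocks with increasing dilation rates, as the paper's multi-scale dilated convolution blocks do, lets the network aggregate features at several spatial scales at once.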
- W3185308247 created "2021-08-02" @default.
- W3185308247 creator A5007776175 @default.
- W3185308247 creator A5007897707 @default.
- W3185308247 creator A5015733188 @default.
- W3185308247 creator A5071650569 @default.
- W3185308247 creator A5077901649 @default.
- W3185308247 creator A5078340301 @default.
- W3185308247 creator A5080044640 @default.
- W3185308247 creator A5086487604 @default.
- W3185308247 creator A5089998406 @default.
- W3185308247 date "2021-10-01" @default.
- W3185308247 modified "2023-10-11" @default.
- W3185308247 title "Mapping of plastic greenhouses and mulching films from very high resolution remote sensing imagery based on a dilated and non-local convolutional neural network" @default.
- W3185308247 cites W1677182931 @default.
- W3185308247 cites W1999857868 @default.
- W3185308247 cites W2064675550 @default.
- W3185308247 cites W2093072860 @default.
- W3185308247 cites W2121885753 @default.
- W3185308247 cites W2162423745 @default.
- W3185308247 cites W2395611524 @default.
- W3185308247 cites W2412588858 @default.
- W3185308247 cites W2412782625 @default.
- W3185308247 cites W2507974895 @default.
- W3185308247 cites W2601221592 @default.
- W3185308247 cites W2736678235 @default.
- W3185308247 cites W2782522152 @default.
- W3185308247 cites W2792827505 @default.
- W3185308247 cites W2801061313 @default.
- W3185308247 cites W2803946774 @default.
- W3185308247 cites W2811244448 @default.
- W3185308247 cites W2900587135 @default.
- W3185308247 cites W2908968031 @default.
- W3185308247 cites W2919115771 @default.
- W3185308247 cites W2943214363 @default.
- W3185308247 cites W2944971001 @default.
- W3185308247 cites W3104839310 @default.
- W3185308247 doi "https://doi.org/10.1016/j.jag.2021.102441" @default.
- W3185308247 hasPublicationYear "2021" @default.
- W3185308247 type Work @default.
- W3185308247 sameAs 3185308247 @default.
- W3185308247 citedByCount "9" @default.
- W3185308247 countsByYear W31853082472022 @default.
- W3185308247 countsByYear W31853082472023 @default.
- W3185308247 crossrefType "journal-article" @default.
- W3185308247 hasAuthorship W3185308247A5007776175 @default.
- W3185308247 hasAuthorship W3185308247A5007897707 @default.
- W3185308247 hasAuthorship W3185308247A5015733188 @default.
- W3185308247 hasAuthorship W3185308247A5071650569 @default.
- W3185308247 hasAuthorship W3185308247A5077901649 @default.
- W3185308247 hasAuthorship W3185308247A5078340301 @default.
- W3185308247 hasAuthorship W3185308247A5080044640 @default.
- W3185308247 hasAuthorship W3185308247A5086487604 @default.
- W3185308247 hasAuthorship W3185308247A5089998406 @default.
- W3185308247 hasBestOaLocation W31853082471 @default.
- W3185308247 hasConcept C108583219 @default.
- W3185308247 hasConcept C154945302 @default.
- W3185308247 hasConcept C175092762 @default.
- W3185308247 hasConcept C205372480 @default.
- W3185308247 hasConcept C205649164 @default.
- W3185308247 hasConcept C2778755073 @default.
- W3185308247 hasConcept C31972630 @default.
- W3185308247 hasConcept C32198211 @default.
- W3185308247 hasConcept C41008148 @default.
- W3185308247 hasConcept C58640448 @default.
- W3185308247 hasConcept C62649853 @default.
- W3185308247 hasConcept C6557445 @default.
- W3185308247 hasConcept C81363708 @default.
- W3185308247 hasConcept C86803240 @default.
- W3185308247 hasConceptScore W3185308247C108583219 @default.
- W3185308247 hasConceptScore W3185308247C154945302 @default.
- W3185308247 hasConceptScore W3185308247C175092762 @default.
- W3185308247 hasConceptScore W3185308247C205372480 @default.
- W3185308247 hasConceptScore W3185308247C205649164 @default.
- W3185308247 hasConceptScore W3185308247C2778755073 @default.
- W3185308247 hasConceptScore W3185308247C31972630 @default.
- W3185308247 hasConceptScore W3185308247C32198211 @default.
- W3185308247 hasConceptScore W3185308247C41008148 @default.
- W3185308247 hasConceptScore W3185308247C58640448 @default.
- W3185308247 hasConceptScore W3185308247C62649853 @default.
- W3185308247 hasConceptScore W3185308247C6557445 @default.
- W3185308247 hasConceptScore W3185308247C81363708 @default.
- W3185308247 hasConceptScore W3185308247C86803240 @default.
- W3185308247 hasFunder F4320321001 @default.
- W3185308247 hasFunder F4320321540 @default.
- W3185308247 hasFunder F4320335777 @default.
- W3185308247 hasLocation W31853082471 @default.
- W3185308247 hasOpenAccess W3185308247 @default.
- W3185308247 hasPrimaryLocation W31853082471 @default.
- W3185308247 hasRelatedWork W2731899572 @default.
- W3185308247 hasRelatedWork W2999805992 @default.
- W3185308247 hasRelatedWork W3011074480 @default.
- W3185308247 hasRelatedWork W3116150086 @default.
- W3185308247 hasRelatedWork W3133861977 @default.
- W3185308247 hasRelatedWork W4200173597 @default.
- W3185308247 hasRelatedWork W4200550458 @default.
- W3185308247 hasRelatedWork W4291897433 @default.
- W3185308247 hasRelatedWork W4312417841 @default.