Matches in SemOpenAlex for { <https://semopenalex.org/work/W4366579897> ?p ?o ?g. }
- W4366579897 endingPage "2132" @default.
- W4366579897 startingPage "2132" @default.
- W4366579897 abstract "Satellite imagery is the only feasible approach to annual monitoring and reporting on land cover change. Unfortunately, conventional pixel-based classification methods based on spectral response only (e.g., using random forests algorithms) have shown a lack of spatial and temporal stability due, for instance, to variability between individual pixels and changes in vegetation condition, respectively. Machine learning methods that consider spatial patterns in addition to reflectance can address some of these issues. In this study, a convolutional neural network (CNN) model, U-Net, was trained for a 500 km × 500 km region in southeast Australia using annual Landsat geomedian data for the relatively dry and wet years of 2018 and 2020, respectively. The label data for model training was an eight-class classification inferred from a static land-use map, enhanced using forest-extent mapping. Here, we wished to analyse the benefits of CNN-based land cover mapping and reporting over 34 years (1987–2020). We used the trained model to generate annual land cover maps for a 100 km × 100 km tile near the Australian Capital Territory. We developed innovative diagnostic methods to assess spatial and temporal stability, analysed how the CNN method differs from pixel-based mapping and compared it with two reference land cover products available for some years. Our U-Net CNN results showed better spatial and temporal stability with, respectively, overall accuracy of 89% versus 82% for reference pixel-based mapping, and 76% of pixels unchanged over 33 years. This gave a clearer insight into where and when land cover change occurred compared to reference mapping, where only 30% of pixels were conserved. Remaining issues include edge effects associated with the CNN method and a limited ability to distinguish some land cover types (e.g., broadacre crops vs. pasture). We conclude that the CNN model was better for understanding broad-scale land cover change, use in environmental accounting and natural resource management, whereas pixel-based approaches sometimes more accurately represented small-scale changes in land cover." @default.
- W4366579897 created "2023-04-23" @default.
- W4366579897 creator A5008882303 @default.
- W4366579897 creator A5026818785 @default.
- W4366579897 creator A5056400135 @default.
- W4366579897 date "2023-04-18" @default.
- W4366579897 modified "2023-09-30" @default.
- W4366579897 title "Convolutional Neural Network Shows Greater Spatial and Temporal Stability in Multi-Annual Land Cover Mapping Than Pixel-Based Methods" @default.
- W4366579897 cites W1595350222 @default.
- W4366579897 cites W1968114652 @default.
- W4366579897 cites W1984792953 @default.
- W4366579897 cites W2001581479 @default.
- W4366579897 cites W2031024973 @default.
- W4366579897 cites W2036632898 @default.
- W4366579897 cites W2053154970 @default.
- W4366579897 cites W2078619499 @default.
- W4366579897 cites W2082081125 @default.
- W4366579897 cites W2095410437 @default.
- W4366579897 cites W2106584184 @default.
- W4366579897 cites W2138408852 @default.
- W4366579897 cites W2164777277 @default.
- W4366579897 cites W2307094448 @default.
- W4366579897 cites W2499316477 @default.
- W4366579897 cites W2531168480 @default.
- W4366579897 cites W2742452911 @default.
- W4366579897 cites W2755013453 @default.
- W4366579897 cites W2782522152 @default.
- W4366579897 cites W2794382750 @default.
- W4366579897 cites W2900237898 @default.
- W4366579897 cites W2911964244 @default.
- W4366579897 cites W2940726923 @default.
- W4366579897 cites W2963131120 @default.
- W4366579897 cites W2983226425 @default.
- W4366579897 cites W3003266520 @default.
- W4366579897 cites W3027542479 @default.
- W4366579897 cites W3096936835 @default.
- W4366579897 cites W3099319035 @default.
- W4366579897 cites W3124539583 @default.
- W4366579897 cites W3127572194 @default.
- W4366579897 cites W3207200417 @default.
- W4366579897 cites W4213283144 @default.
- W4366579897 cites W4226427927 @default.
- W4366579897 cites W4232914504 @default.
- W4366579897 cites W4285594764 @default.
- W4366579897 cites W4285597156 @default.
- W4366579897 cites W4306318398 @default.
- W4366579897 doi "https://doi.org/10.3390/rs15082132" @default.
- W4366579897 hasPublicationYear "2023" @default.
- W4366579897 type Work @default.
- W4366579897 citedByCount "1" @default.
- W4366579897 countsByYear W43665798972023 @default.
- W4366579897 crossrefType "journal-article" @default.
- W4366579897 hasAuthorship W4366579897A5008882303 @default.
- W4366579897 hasAuthorship W4366579897A5026818785 @default.
- W4366579897 hasAuthorship W4366579897A5056400135 @default.
- W4366579897 hasBestOaLocation W43665798971 @default.
- W4366579897 hasConcept C112972136 @default.
- W4366579897 hasConcept C119857082 @default.
- W4366579897 hasConcept C127413603 @default.
- W4366579897 hasConcept C142724271 @default.
- W4366579897 hasConcept C147176958 @default.
- W4366579897 hasConcept C154945302 @default.
- W4366579897 hasConcept C160633673 @default.
- W4366579897 hasConcept C205649164 @default.
- W4366579897 hasConcept C2776133958 @default.
- W4366579897 hasConcept C2780648208 @default.
- W4366579897 hasConcept C39432304 @default.
- W4366579897 hasConcept C41008148 @default.
- W4366579897 hasConcept C4792198 @default.
- W4366579897 hasConcept C58640448 @default.
- W4366579897 hasConcept C62649853 @default.
- W4366579897 hasConcept C71924100 @default.
- W4366579897 hasConcept C81363708 @default.
- W4366579897 hasConceptScore W4366579897C112972136 @default.
- W4366579897 hasConceptScore W4366579897C119857082 @default.
- W4366579897 hasConceptScore W4366579897C127413603 @default.
- W4366579897 hasConceptScore W4366579897C142724271 @default.
- W4366579897 hasConceptScore W4366579897C147176958 @default.
- W4366579897 hasConceptScore W4366579897C154945302 @default.
- W4366579897 hasConceptScore W4366579897C160633673 @default.
- W4366579897 hasConceptScore W4366579897C205649164 @default.
- W4366579897 hasConceptScore W4366579897C2776133958 @default.
- W4366579897 hasConceptScore W4366579897C2780648208 @default.
- W4366579897 hasConceptScore W4366579897C39432304 @default.
- W4366579897 hasConceptScore W4366579897C41008148 @default.
- W4366579897 hasConceptScore W4366579897C4792198 @default.
- W4366579897 hasConceptScore W4366579897C58640448 @default.
- W4366579897 hasConceptScore W4366579897C62649853 @default.
- W4366579897 hasConceptScore W4366579897C71924100 @default.
- W4366579897 hasConceptScore W4366579897C81363708 @default.
- W4366579897 hasIssue "8" @default.
- W4366579897 hasLocation W43665798971 @default.
- W4366579897 hasOpenAccess W4366579897 @default.
- W4366579897 hasPrimaryLocation W43665798971 @default.
- W4366579897 hasRelatedWork W1990358015 @default.
- W4366579897 hasRelatedWork W2000051326 @default.
- W4366579897 hasRelatedWork W2082199675 @default.
- W4366579897 hasRelatedWork W2380921289 @default.
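The header above shows the graph pattern `{ <https://semopenalex.org/work/W4366579897> ?p ?o ?g. }` that produced this listing. A minimal sketch of how such a query could be constructed and URL-encoded for the SemOpenAlex SPARQL endpoint — the endpoint path, the `GRAPH` wrapping, and the `build_query` helper are assumptions for illustration; only the work URI comes from the listing itself:

```python
import urllib.parse

WORK_URI = "https://semopenalex.org/work/W4366579897"

def build_query(work_uri: str) -> str:
    """Build a SPARQL query returning every (predicate, object, graph)
    triple for a given work, mirroring the pattern shown in the header."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_uri}> ?p ?o . }} "
        "}"
    )

query = build_query(WORK_URI)
# URL-encode the query for a GET request (endpoint path is an assumption).
request_url = "https://semopenalex.org/sparql?query=" + urllib.parse.quote(query)
print(query)
```

Issuing `request_url` with an `Accept: application/sparql-results+json` header would, under these assumptions, return rows corresponding to the `?p ?o` pairs listed above.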
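The abstract reports temporal stability as the fraction of pixels whose class never changed across the annual maps (76% for the U-Net CNN versus 30% for the reference mapping). A minimal sketch of that calculation — the array layout, class codes, and function name are assumptions, not taken from the paper:

```python
import numpy as np

def fraction_unchanged(maps: np.ndarray) -> float:
    """Fraction of pixels with an identical class label in every annual map.

    maps: integer array of shape (years, height, width), one land cover
    class code per pixel per year.
    """
    # A pixel is "unchanged" if every year agrees with the first year.
    unchanged = np.all(maps == maps[0], axis=0)
    return float(unchanged.mean())

# Toy example: 3 annual 2x2 maps; one pixel changes class in the last year.
maps = np.array([
    [[1, 2], [3, 3]],
    [[1, 2], [3, 3]],
    [[1, 2], [3, 4]],
])
print(fraction_unchanged(maps))  # 0.75
```

The same statistic applied to the study's 33 annual maps would yield the stability percentages quoted in the abstract.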