Matches in SemOpenAlex for { <https://semopenalex.org/work/W4323543094> ?p ?o ?g. }
- W4323543094 endingPage "100034" @default.
- W4323543094 startingPage "100034" @default.
- W4323543094 abstract "Increasing tree mortality due to climate change has been observed globally. Remote sensing is a suitable means for detecting tree mortality and has been proven effective for the assessment of abrupt and large-scale stand-replacing disturbances, such as those caused by windthrow, clear-cut harvesting, or wildfire. Non-stand-replacing tree mortality events (e.g., due to drought) are more difficult to detect with satellite data – especially across regions and forest types. A common limitation for this is the availability of spatially explicit reference data. To address this issue, we propose an automated generation of reference data using uncrewed aerial vehicles (UAV) and deep learning-based pattern recognition. In this study, we used convolutional neural networks (CNN) to semantically segment crowns of standing dead trees from 176 UAV-based very high-resolution (<4 cm) RGB-orthomosaics that we acquired over six regions in Germany and Finland between 2017 and 2021. The local-level CNN predictions were then extrapolated to landscape level using Sentinel-1 (i.e., backscatter and interferometric coherence) and Sentinel-2 time series and long short-term memory (LSTM) networks to predict the cover fraction of standing deadwood per Sentinel pixel. The CNN-based segmentation of standing deadwood from UAV imagery was accurate (F1-score = 0.85) and consistent across the different study sites and years. Best results for the LSTM-based extrapolation of fractional cover of standing deadwood using Sentinel-1 and -2 time series were achieved using all available Sentinel-1 and -2 bands, the kernel normalized difference vegetation index (kNDVI), and the normalized difference water index (NDWI) (Pearson's r = 0.66, total least squares regression slope = 1.58). The landscape-level predictions showed high spatial detail and were transferable across regions and years.
Our results highlight the effectiveness of deep learning-based algorithms for the automated and rapid generation of reference data for large areas from UAV imagery. Potential for improving the presented upscaling approach was found particularly in ensuring the spatial and temporal consistency of the two data sources (e.g., co-registration of very high-resolution UAV data and medium-resolution satellite data). The growing amount of publicly available UAV imagery on sharing platforms, combined with automated and transferable deep learning-based mapping algorithms, will further increase the potential of such multi-scale approaches." @default.
- W4323543094 created "2023-03-09" @default.
- W4323543094 creator A5007461436 @default.
- W4323543094 creator A5026305066 @default.
- W4323543094 creator A5042728540 @default.
- W4323543094 creator A5051611972 @default.
- W4323543094 creator A5054741445 @default.
- W4323543094 creator A5059804544 @default.
- W4323543094 creator A5061880748 @default.
- W4323543094 creator A5072706908 @default.
- W4323543094 creator A5081706252 @default.
- W4323543094 date "2023-04-01" @default.
- W4323543094 modified "2023-10-16" @default.
- W4323543094 title "UAV-based reference data for the prediction of fractional cover of standing deadwood from Sentinel time series" @default.
- W4323543094 cites W1760422680 @default.
- W4323543094 cites W1902238321 @default.
- W4323543094 cites W1978617972 @default.
- W4323543094 cites W1981213426 @default.
- W4323543094 cites W1983523689 @default.
- W4323543094 cites W1986106091 @default.
- W4323543094 cites W1989203457 @default.
- W4323543094 cites W2036992772 @default.
- W4323543094 cites W2064675550 @default.
- W4323543094 cites W2081895269 @default.
- W4323543094 cites W2083536977 @default.
- W4323543094 cites W2084988036 @default.
- W4323543094 cites W2086154193 @default.
- W4323543094 cites W2131774270 @default.
- W4323543094 cites W2151103935 @default.
- W4323543094 cites W2161336494 @default.
- W4323543094 cites W2338024616 @default.
- W4323543094 cites W2416708566 @default.
- W4323543094 cites W2531213996 @default.
- W4323543094 cites W2560136348 @default.
- W4323543094 cites W2604292667 @default.
- W4323543094 cites W2617056706 @default.
- W4323543094 cites W2783706811 @default.
- W4323543094 cites W2802651079 @default.
- W4323543094 cites W2911964244 @default.
- W4323543094 cites W2913323966 @default.
- W4323543094 cites W2922476837 @default.
- W4323543094 cites W2939104056 @default.
- W4323543094 cites W2966350036 @default.
- W4323543094 cites W2972020684 @default.
- W4323543094 cites W2979348177 @default.
- W4323543094 cites W3012994627 @default.
- W4323543094 cites W3023449409 @default.
- W4323543094 cites W3023775819 @default.
- W4323543094 cites W3027542479 @default.
- W4323543094 cites W3035432101 @default.
- W4323543094 cites W3041849717 @default.
- W4323543094 cites W3045270849 @default.
- W4323543094 cites W3045976036 @default.
- W4323543094 cites W3085762746 @default.
- W4323543094 cites W3085784695 @default.
- W4323543094 cites W3097361954 @default.
- W4323543094 cites W3097822859 @default.
- W4323543094 cites W3104839310 @default.
- W4323543094 cites W3107192144 @default.
- W4323543094 cites W3124539583 @default.
- W4323543094 cites W3130461578 @default.
- W4323543094 cites W3163340372 @default.
- W4323543094 cites W3201650802 @default.
- W4323543094 cites W3206302840 @default.
- W4323543094 cites W4200565687 @default.
- W4323543094 cites W4206805496 @default.
- W4323543094 cites W4210492675 @default.
- W4323543094 cites W4210755971 @default.
- W4323543094 cites W4214866790 @default.
- W4323543094 cites W4220945761 @default.
- W4323543094 cites W4224309885 @default.
- W4323543094 cites W4302013472 @default.
- W4323543094 cites W892605381 @default.
- W4323543094 doi "https://doi.org/10.1016/j.ophoto.2023.100034" @default.
- W4323543094 hasPublicationYear "2023" @default.
- W4323543094 type Work @default.
- W4323543094 citedByCount "3" @default.
- W4323543094 countsByYear W43235430942023 @default.
- W4323543094 crossrefType "journal-article" @default.
- W4323543094 hasAuthorship W4323543094A5007461436 @default.
- W4323543094 hasAuthorship W4323543094A5026305066 @default.
- W4323543094 hasAuthorship W4323543094A5042728540 @default.
- W4323543094 hasAuthorship W4323543094A5051611972 @default.
- W4323543094 hasAuthorship W4323543094A5054741445 @default.
- W4323543094 hasAuthorship W4323543094A5059804544 @default.
- W4323543094 hasAuthorship W4323543094A5061880748 @default.
- W4323543094 hasAuthorship W4323543094A5072706908 @default.
- W4323543094 hasAuthorship W4323543094A5081706252 @default.
- W4323543094 hasBestOaLocation W43235430941 @default.
- W4323543094 hasConcept C113174947 @default.
- W4323543094 hasConcept C134306372 @default.
- W4323543094 hasConcept C154945302 @default.
- W4323543094 hasConcept C205649164 @default.
- W4323543094 hasConcept C33923547 @default.
- W4323543094 hasConcept C39432304 @default.
- W4323543094 hasConcept C41008148 @default.
- W4323543094 hasConcept C62649853 @default.
- W4323543094 hasConceptScore W4323543094C113174947 @default.