Matches in SemOpenAlex for { <https://semopenalex.org/work/W4378085651> ?p ?o ?g. }
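The matches listed below can be reproduced by running the same quad pattern against SemOpenAlex's public SPARQL endpoint. A minimal Python sketch of that query follows; the endpoint URL and the result handling are assumptions for illustration, not part of this record:

```python
# Minimal sketch: fetch all (?p, ?o, ?g) matches for the work above from the
# SemOpenAlex SPARQL endpoint. The endpoint URL below is an assumption.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W4378085651> ?p ?o . }
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object pair, mirroring the listing below.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```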
- W4378085651 endingPage "2697" @default.
- W4378085651 startingPage "2697" @default.
- W4378085651 abstract "Accurate information on dominant tree species and their spatial distribution in subtropical natural forests are key ecological monitoring factors for accurately characterizing forest biodiversity, depicting the tree competition mechanism and quantitatively evaluating forest ecosystem stability. In this study, the subtropical natural forest in northwest Yunnan province of China was selected as the study area. Firstly, an object-oriented multi-resolution segmentation (MRS) algorithm was used to segment individual tree crowns from the UAV RGB imagery and satellite multispectral imagery in the forests with different densities (low (547 n/ha), middle (753 n/ha) and high (1040 n/ha)), and parameters of the MRS algorithm were tested and optimized for accurately extracting the tree crown and position information of the individual tree. Secondly, the texture metrics of the UAV RGB imagery and the spectral metrics of the satellite multispectral imagery within the individual tree crown were extracted, and the random forest algorithm and three deep learning networks constructed in this study were utilized to classify the five dominant tree species. Finally, we compared and evaluated the performance of the random forest algorithm and three deep learning networks for dominant tree species classification using the field measurement data, and the influence of the number of training samples on the accuracy of dominant tree species classification using deep learning networks was investigated. The results showed that: (1) Stand density had little influence on individual tree segmentation using the object-oriented MRS algorithm. In the forests with different stand densities, the F1 score of individual tree segmentation based on satellite multispectral imagery was 71.3–74.7%, and that based on UAV high-resolution RGB imagery was 75.4–79.2%. (2) The overall accuracy of dominant tree species classification using the light-weight network MobileNetV2 (OA = 71.11–82.22%), residual network ResNet34 (OA = 78.89–91.11%) and dense network DenseNet121 (OA = 81.11–94.44%) was higher than that of the random forest algorithm (OA = 60.00–64.44%), among which DenseNet121 had the highest overall accuracy. Texture metrics improved the overall accuracy of dominant tree species classification. (3) For the three deep learning networks, the changes in overall accuracy of dominant tree species classification influenced by the number of training samples were 2.69–4.28%." @default.
- W4378085651 created "2023-05-25" @default.
- W4378085651 creator A5005121291 @default.
- W4378085651 creator A5031929340 @default.
- W4378085651 creator A5083073196 @default.
- W4378085651 date "2023-05-22" @default.
- W4378085651 modified "2023-10-18" @default.
- W4378085651 title "Tree Species Classification in Subtropical Natural Forests Using High-Resolution UAV RGB and SuperView-1 Multispectral Imageries Based on Deep Learning Network Approaches: A Case Study within the Baima Snow Mountain National Nature Reserve, China" @default.
- W4378085651 cites W1995234545 @default.
- W4378085651 cites W1997732436 @default.
- W4378085651 cites W2004553299 @default.
- W4378085651 cites W2008233110 @default.
- W4378085651 cites W2008469107 @default.
- W4378085651 cites W2013081398 @default.
- W4378085651 cites W2019549520 @default.
- W4378085651 cites W2026925128 @default.
- W4378085651 cites W2034650341 @default.
- W4378085651 cites W2037225341 @default.
- W4378085651 cites W2044084913 @default.
- W4378085651 cites W2055734610 @default.
- W4378085651 cites W2057084393 @default.
- W4378085651 cites W2061240006 @default.
- W4378085651 cites W2084857239 @default.
- W4378085651 cites W2096996101 @default.
- W4378085651 cites W2103317434 @default.
- W4378085651 cites W2125574854 @default.
- W4378085651 cites W2125724410 @default.
- W4378085651 cites W2145739509 @default.
- W4378085651 cites W2145830077 @default.
- W4378085651 cites W2161815745 @default.
- W4378085651 cites W2165796970 @default.
- W4378085651 cites W2194775991 @default.
- W4378085651 cites W2233456439 @default.
- W4378085651 cites W2337442676 @default.
- W4378085651 cites W2512351403 @default.
- W4378085651 cites W2515306179 @default.
- W4378085651 cites W2577537809 @default.
- W4378085651 cites W2581662761 @default.
- W4378085651 cites W2591466624 @default.
- W4378085651 cites W2595436381 @default.
- W4378085651 cites W2608436798 @default.
- W4378085651 cites W2626305371 @default.
- W4378085651 cites W2736508163 @default.
- W4378085651 cites W2765587043 @default.
- W4378085651 cites W2768921697 @default.
- W4378085651 cites W2791598887 @default.
- W4378085651 cites W2886384139 @default.
- W4378085651 cites W2893127089 @default.
- W4378085651 cites W2902668251 @default.
- W4378085651 cites W2908603697 @default.
- W4378085651 cites W2911261286 @default.
- W4378085651 cites W2911964244 @default.
- W4378085651 cites W2921401402 @default.
- W4378085651 cites W2930433608 @default.
- W4378085651 cites W2938586265 @default.
- W4378085651 cites W2948612393 @default.
- W4378085651 cites W2963163009 @default.
- W4378085651 cites W2963446712 @default.
- W4378085651 cites W3004674667 @default.
- W4378085651 cites W3024237441 @default.
- W4378085651 cites W3048194731 @default.
- W4378085651 cites W3097361954 @default.
- W4378085651 cites W3097971370 @default.
- W4378085651 cites W3110459943 @default.
- W4378085651 cites W3120119699 @default.
- W4378085651 cites W3121566766 @default.
- W4378085651 cites W3124539583 @default.
- W4378085651 cites W3127319645 @default.
- W4378085651 cites W3197961809 @default.
- W4378085651 cites W4224229372 @default.
- W4378085651 cites W4297009805 @default.
- W4378085651 cites W4322742038 @default.
- W4378085651 doi "https://doi.org/10.3390/rs15102697" @default.
- W4378085651 hasPublicationYear "2023" @default.
- W4378085651 type Work @default.
- W4378085651 citedByCount "0" @default.
- W4378085651 crossrefType "journal-article" @default.
- W4378085651 hasAuthorship W4378085651A5005121291 @default.
- W4378085651 hasAuthorship W4378085651A5031929340 @default.
- W4378085651 hasAuthorship W4378085651A5083073196 @default.
- W4378085651 hasBestOaLocation W43780856511 @default.
- W4378085651 hasConcept C113174947 @default.
- W4378085651 hasConcept C134306372 @default.
- W4378085651 hasConcept C154945302 @default.
- W4378085651 hasConcept C169258074 @default.
- W4378085651 hasConcept C173163844 @default.
- W4378085651 hasConcept C205649164 @default.
- W4378085651 hasConcept C2778102629 @default.
- W4378085651 hasConcept C33923547 @default.
- W4378085651 hasConcept C39432304 @default.
- W4378085651 hasConcept C41008148 @default.
- W4378085651 hasConcept C58640448 @default.
- W4378085651 hasConcept C62649853 @default.
- W4378085651 hasConcept C89600930 @default.
- W4378085651 hasConceptScore W4378085651C113174947 @default.
- W4378085651 hasConceptScore W4378085651C134306372 @default.
- W4378085651 hasConceptScore W4378085651C154945302 @default.
- W4378085651 hasConceptScore W4378085651C169258074 @default.
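The abstract above describes extracting per-crown texture metrics (UAV RGB) and spectral metrics (SuperView-1) and classifying five dominant species with a random forest baseline. As a rough illustration only, not the authors' implementation, a sketch of such a baseline with made-up feature counts, sample sizes and settings might look like this:

```python
# Minimal sketch of a random forest baseline on per-crown feature vectors.
# Feature dimensions, sample counts and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 450 segmented crowns, each with 8 texture + 4 spectral metrics,
# labelled with one of five dominant species (0-4).
X = rng.normal(size=(450, 12))
y = rng.integers(0, 5, size=450)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=500, random_state=0)  # assumed settings
clf.fit(X_train, y_train)
print(f"overall accuracy: {accuracy_score(y_test, clf.predict(X_test)):.3f}")
```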
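The abstract also reports that DenseNet121 achieved the highest overall accuracy among the tested networks. The following is a minimal sketch of a five-class DenseNet121 fine-tuning setup, with assumed hyperparameters and random tensors standing in for segmented crown image chips; it is not the authors' code:

```python
# Minimal sketch: five-class tree species classification with DenseNet121.
# Learning rate, input size and batch shape are assumptions for illustration.
import torch
import torch.nn as nn
from torchvision import models

NUM_SPECIES = 5  # five dominant tree species, per the abstract

# DenseNet121 backbone; replace the ImageNet classifier with a 5-way head.
model = models.densenet121()
model.classifier = nn.Linear(model.classifier.in_features, NUM_SPECIES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed learning rate

def train_step(crowns: torch.Tensor, labels: torch.Tensor) -> float:
    """Run one optimization step on a batch of crown chips of shape (N, 3, H, W)."""
    model.train()
    optimizer.zero_grad()
    logits = model(crowns)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random data standing in for segmented crown chips.
if __name__ == "__main__":
    dummy_crowns = torch.randn(8, 3, 224, 224)
    dummy_labels = torch.randint(0, NUM_SPECIES, (8,))
    print(f"loss after one step: {train_step(dummy_crowns, dummy_labels):.4f}")
```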