Matches in SemOpenAlex for { <https://semopenalex.org/work/W4304127943> ?p ?o ?g. }
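The matches below can be retrieved programmatically. Here is a minimal Python sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper package (`pip install sparqlwrapper`); the `GRAPH` form is the standard-SPARQL spelling of the quad pattern shown above.

```python
# Fetch all (predicate, object, graph) matches for the work above from
# SemOpenAlex. Endpoint URL is an assumption based on the public service.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://semopenalex.org/sparql")
endpoint.setQuery("""
    SELECT ?p ?o ?g WHERE {
      GRAPH ?g { <https://semopenalex.org/work/W4304127943> ?p ?o . }
    }
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for row in results["results"]["bindings"]:
    # Each binding maps a variable to {"type": ..., "value": ...}.
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```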
- W4304127943 endingPage "10167" @default.
- W4304127943 startingPage "10167" @default.
- W4304127943 abstract "Insect pests are a major factor limiting agricultural production. According to the Food and Agriculture Organization (FAO), pests destroy an estimated 20–40% of global crop production each year, making them a major challenge to crop production. These insect pests also cause sooty mold disease by sucking sap from the crop’s organs, especially leaves, fruits, stems, and roots. Pesticides are frequently used to control these pests because they are fast-acting and scalable, but growing awareness of environmental pollution and health risks argues for using them more sparingly. One salient approach is to replace blanket application with on-demand spot spraying, which requires that the location of the pest first be determined. More broadly, the growing population and increasing food demand call for novel agricultural production methods and systems that address environmental concerns while ensuring efficiency and sustainability. As a result, insect pest detection and classification methods that can accurately identify pests at an early stage are in high demand. This study therefore develops an object recognition system for the detection and classification of crop-damaging insect pests. The current work proposes an automatic system, deployable on a smartphone or IP camera, that detects insect pests in digital images and videos in order to reduce farmers’ reliance on pesticides. The proposed approach is based on YOLO object detection architectures, including YOLOv5 (n, s, m, l, and x), YOLOv3, YOLO-Lite, and YOLOR. For this purpose, we collected 7046 images in the wild under different illumination and background conditions to train the underlying object detection approaches. We trained and tested the object recognition system from scratch with different parameters. The eight models were compared and analyzed. The experimental results show that the average precision (AP@0.5) of the eight models, YOLO-Lite, YOLOv3, YOLOR, and YOLOv5 at five different scales (n, s, m, l, and x), reaches 51.7%, 97.6%, 96.80%, 83.85%, 94.61%, 97.18%, 97.04%, and 98.3%, respectively. In general, the larger the model, the higher the average precision on the detection validation results. We observed that the YOLOv5x model is fully functional and can correctly identify the twenty-three insect pest species with an inference time of 40.5 milliseconds (ms). The developed YOLOv5x model achieves state-of-the-art performance on our IP-23 dataset, with a mean average precision (mAP@0.5) of 98.3%, mAP@0.5:0.95 of 79.8%, precision of 94.5%, recall of 97.8%, and an F1-score of 96%. The results show that the system works efficiently and correctly detects and identifies insect pests, making it suitable for real-world applications in farming." @default.
- W4304127943 created "2022-10-11" @default.
- W4304127943 creator A5012984625 @default.
- W4304127943 creator A5019857727 @default.
- W4304127943 creator A5030976490 @default.
- W4304127943 creator A5038558768 @default.
- W4304127943 creator A5047254817 @default.
- W4304127943 creator A5049301997 @default.
- W4304127943 creator A5052116574 @default.
- W4304127943 creator A5072608669 @default.
- W4304127943 date "2022-10-10" @default.
- W4304127943 modified "2023-10-05" @default.
- W4304127943 title "Deep Learning Based Detector YOLOv5 for Identifying Insect Pests" @default.
- W4304127943 cites W2037227137 @default.
- W4304127943 cites W2238609161 @default.
- W4304127943 cites W2565639579 @default.
- W4304127943 cites W2753403518 @default.
- W4304127943 cites W2790979755 @default.
- W4304127943 cites W2800816754 @default.
- W4304127943 cites W2801216204 @default.
- W4304127943 cites W2884561390 @default.
- W4304127943 cites W2889024854 @default.
- W4304127943 cites W2938719104 @default.
- W4304127943 cites W2953592496 @default.
- W4304127943 cites W2954607988 @default.
- W4304127943 cites W2962766617 @default.
- W4304127943 cites W2963037989 @default.
- W4304127943 cites W2963144738 @default.
- W4304127943 cites W2963849369 @default.
- W4304127943 cites W2963857746 @default.
- W4304127943 cites W3003067960 @default.
- W4304127943 cites W3010149727 @default.
- W4304127943 cites W3022851742 @default.
- W4304127943 cites W3032016692 @default.
- W4304127943 cites W3034884618 @default.
- W4304127943 cites W3034971973 @default.
- W4304127943 cites W3042011474 @default.
- W4304127943 cites W3111983898 @default.
- W4304127943 cites W3121637738 @default.
- W4304127943 cites W3132971810 @default.
- W4304127943 cites W3151832061 @default.
- W4304127943 cites W3183824707 @default.
- W4304127943 cites W3196915754 @default.
- W4304127943 cites W3208506861 @default.
- W4304127943 cites W3210535920 @default.
- W4304127943 cites W3212737115 @default.
- W4304127943 cites W4206720985 @default.
- W4304127943 cites W4207057737 @default.
- W4304127943 cites W4214652236 @default.
- W4304127943 cites W4220920464 @default.
- W4304127943 doi "https://doi.org/10.3390/app121910167" @default.
- W4304127943 hasPublicationYear "2022" @default.
- W4304127943 type Work @default.
- W4304127943 citedByCount "31" @default.
- W4304127943 countsByYear W43041279432022 @default.
- W4304127943 countsByYear W43041279432023 @default.
- W4304127943 crossrefType "journal-article" @default.
- W4304127943 hasAuthorship W4304127943A5012984625 @default.
- W4304127943 hasAuthorship W4304127943A5019857727 @default.
- W4304127943 hasAuthorship W4304127943A5030976490 @default.
- W4304127943 hasAuthorship W4304127943A5038558768 @default.
- W4304127943 hasAuthorship W4304127943A5047254817 @default.
- W4304127943 hasAuthorship W4304127943A5049301997 @default.
- W4304127943 hasAuthorship W4304127943A5052116574 @default.
- W4304127943 hasAuthorship W4304127943A5072608669 @default.
- W4304127943 hasBestOaLocation W43041279431 @default.
- W4304127943 hasConcept C118518473 @default.
- W4304127943 hasConcept C123963621 @default.
- W4304127943 hasConcept C127413603 @default.
- W4304127943 hasConcept C128383755 @default.
- W4304127943 hasConcept C137580998 @default.
- W4304127943 hasConcept C144027150 @default.
- W4304127943 hasConcept C150903083 @default.
- W4304127943 hasConcept C161176658 @default.
- W4304127943 hasConcept C18903297 @default.
- W4304127943 hasConcept C22508944 @default.
- W4304127943 hasConcept C2908647359 @default.
- W4304127943 hasConcept C2994141551 @default.
- W4304127943 hasConcept C6557445 @default.
- W4304127943 hasConcept C71924100 @default.
- W4304127943 hasConcept C86803240 @default.
- W4304127943 hasConcept C88463610 @default.
- W4304127943 hasConcept C99454951 @default.
- W4304127943 hasConceptScore W4304127943C118518473 @default.
- W4304127943 hasConceptScore W4304127943C123963621 @default.
- W4304127943 hasConceptScore W4304127943C127413603 @default.
- W4304127943 hasConceptScore W4304127943C128383755 @default.
- W4304127943 hasConceptScore W4304127943C137580998 @default.
- W4304127943 hasConceptScore W4304127943C144027150 @default.
- W4304127943 hasConceptScore W4304127943C150903083 @default.
- W4304127943 hasConceptScore W4304127943C161176658 @default.
- W4304127943 hasConceptScore W4304127943C18903297 @default.
- W4304127943 hasConceptScore W4304127943C22508944 @default.
- W4304127943 hasConceptScore W4304127943C2908647359 @default.
- W4304127943 hasConceptScore W4304127943C2994141551 @default.
- W4304127943 hasConceptScore W4304127943C6557445 @default.
- W4304127943 hasConceptScore W4304127943C71924100 @default.
- W4304127943 hasConceptScore W4304127943C86803240 @default.
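The abstract above describes training YOLO detectors on the 23-class IP-23 dataset and reports YOLOv5x as the best-performing model. As a rough illustration, here is a minimal inference sketch assuming the ultralytics/yolov5 torch.hub interface; the weights file `ip23_best.pt` and the image `field_sample.jpg` are hypothetical placeholders, not artifacts released with the paper.

```python
# A minimal sketch of running a YOLOv5x detector like the one in the
# abstract, via the ultralytics/yolov5 torch.hub entry points.
import torch

# The 'custom' entry point loads user-trained checkpoints, e.g. weights
# trained on a 23-class pest dataset such as IP-23 (placeholder path).
model = torch.hub.load("ultralytics/yolov5", "custom", path="ip23_best.pt")
model.conf = 0.25  # confidence threshold for reported detections

# Run detection on a field image; YOLOv5 resizes and pads internally.
results = model("field_sample.jpg")
results.print()               # per-class counts and inference speed (ms)
df = results.pandas().xyxy[0] # detections as a pandas DataFrame
print(df[["name", "confidence", "xmin", "ymin", "xmax", "ymax"]])
```

Loading through the `custom` entry point is the usual way to serve user-trained YOLOv5 checkpoints; the returned model also exposes an IoU threshold and per-image box coordinates, which is what an on-demand spot-spraying system would consume.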