Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387020916> ?p ?o ?g. }
- W4387020916 abstract "Abstract Computer vision has found many applications in automatic wildlife data analytics and biodiversity monitoring. Automating tasks like animal recognition or animal detection usually requires machine learning models (e.g., deep neural networks) trained on annotated datasets. However, image datasets built for general purposes fail to capture the realistic conditions of ecological studies, and existing datasets collected with camera traps mainly focus on medium to large-sized animals. There is a lack of annotated small-sized animal datasets in the field. Small-sized animals (e.g., small mammals, frogs, lizards, arthropods) play an important role in ecosystems but are difficult to capture on camera traps. They also present additional challenges: small animals can be more difficult to identify and blend more easily with their surroundings. To fill this gap, we introduce in this paper a new dataset dedicated to ecological studies of small-sized animals, and provide benchmark results for computer vision-based wildlife monitoring. The novelty of our work lies in SAWIT (small-sized animal wild image dataset), the first real-world dataset of small-sized animals, collected from camera traps under realistic conditions. Our dataset consists of 34,434 images and is annotated by experts in the field with object-level annotations (bounding boxes), providing 34,820 annotated animals across seven animal categories. The dataset encompasses a wide range of challenging scenarios, such as occlusions, blurriness, and instances where animals blend into dense vegetation. Based on the dataset, we benchmark two prevailing object detection algorithms, Faster RCNN and YOLO, and their variants. Experimental results show that all the variants of YOLO (version 5) perform similarly, ranging from 59.3% to 62.6% overall mean Average Precision (mAP) across all the animal categories. Faster RCNN with ResNet50 and HRNet backbones achieves 61.7% mAP and 58.5% mAP, respectively. Through experiments, we identify challenges and suggest research directions for computer vision-based wildlife monitoring. We provide both the dataset and the animal detection code at https://github.com/dtnguyen0304/sawit ." @default.
- W4387020916 created "2023-09-26" @default.
- W4387020916 creator A5024634557 @default.
- W4387020916 creator A5031504400 @default.
- W4387020916 creator A5053434140 @default.
- W4387020916 creator A5059269684 @default.
- W4387020916 creator A5061102329 @default.
- W4387020916 creator A5085593383 @default.
- W4387020916 creator A5087107528 @default.
- W4387020916 creator A5088007702 @default.
- W4387020916 date "2023-09-25" @default.
- W4387020916 modified "2023-10-12" @default.
- W4387020916 title "SAWIT: A small-sized animal wild image dataset with annotations" @default.
- W4387020916 cites W1243811869 @default.
- W4387020916 cites W1833143043 @default.
- W4387020916 cites W1861492603 @default.
- W4387020916 cites W1874745161 @default.
- W4387020916 cites W1977295328 @default.
- W4387020916 cites W2031489346 @default.
- W4387020916 cites W2087612985 @default.
- W4387020916 cites W2108598243 @default.
- W4387020916 cites W2133339132 @default.
- W4387020916 cites W2194775991 @default.
- W4387020916 cites W2413367505 @default.
- W4387020916 cites W2495802977 @default.
- W4387020916 cites W2570343428 @default.
- W4387020916 cites W2737340643 @default.
- W4387020916 cites W2755633238 @default.
- W4387020916 cites W2759820149 @default.
- W4387020916 cites W2769210209 @default.
- W4387020916 cites W2782689936 @default.
- W4387020916 cites W2797977484 @default.
- W4387020916 cites W2883386984 @default.
- W4387020916 cites W2885976210 @default.
- W4387020916 cites W2914365678 @default.
- W4387020916 cites W2916798096 @default.
- W4387020916 cites W2918472578 @default.
- W4387020916 cites W2952113774 @default.
- W4387020916 cites W2969264236 @default.
- W4387020916 cites W3009364469 @default.
- W4387020916 cites W3012374685 @default.
- W4387020916 cites W3017145925 @default.
- W4387020916 cites W3030331853 @default.
- W4387020916 cites W3035263170 @default.
- W4387020916 cites W3093326439 @default.
- W4387020916 cites W3119653668 @default.
- W4387020916 cites W3196673604 @default.
- W4387020916 cites W4205668735 @default.
- W4387020916 cites W4224285690 @default.
- W4387020916 cites W4280534131 @default.
- W4387020916 cites W4282033102 @default.
- W4387020916 cites W4282968895 @default.
- W4387020916 cites W4286750465 @default.
- W4387020916 cites W4288083516 @default.
- W4387020916 cites W4289986171 @default.
- W4387020916 cites W4303446389 @default.
- W4387020916 cites W639708223 @default.
- W4387020916 doi "https://doi.org/10.1007/s11042-023-16673-3" @default.
- W4387020916 hasPublicationYear "2023" @default.
- W4387020916 type Work @default.
- W4387020916 citedByCount "0" @default.
- W4387020916 crossrefType "journal-article" @default.
- W4387020916 hasAuthorship W4387020916A5024634557 @default.
- W4387020916 hasAuthorship W4387020916A5031504400 @default.
- W4387020916 hasAuthorship W4387020916A5053434140 @default.
- W4387020916 hasAuthorship W4387020916A5059269684 @default.
- W4387020916 hasAuthorship W4387020916A5061102329 @default.
- W4387020916 hasAuthorship W4387020916A5085593383 @default.
- W4387020916 hasAuthorship W4387020916A5087107528 @default.
- W4387020916 hasAuthorship W4387020916A5088007702 @default.
- W4387020916 hasBestOaLocation W43870209161 @default.
- W4387020916 hasConcept C108583219 @default.
- W4387020916 hasConcept C119857082 @default.
- W4387020916 hasConcept C120665830 @default.
- W4387020916 hasConcept C121332964 @default.
- W4387020916 hasConcept C138885662 @default.
- W4387020916 hasConcept C153180895 @default.
- W4387020916 hasConcept C154945302 @default.
- W4387020916 hasConcept C185798385 @default.
- W4387020916 hasConcept C192209626 @default.
- W4387020916 hasConcept C197352329 @default.
- W4387020916 hasConcept C202444582 @default.
- W4387020916 hasConcept C205649164 @default.
- W4387020916 hasConcept C27206212 @default.
- W4387020916 hasConcept C2776151529 @default.
- W4387020916 hasConcept C2778738651 @default.
- W4387020916 hasConcept C33923547 @default.
- W4387020916 hasConcept C41008148 @default.
- W4387020916 hasConcept C58640448 @default.
- W4387020916 hasConcept C59822182 @default.
- W4387020916 hasConcept C63584917 @default.
- W4387020916 hasConcept C86803240 @default.
- W4387020916 hasConcept C9652623 @default.
- W4387020916 hasConceptScore W4387020916C108583219 @default.
- W4387020916 hasConceptScore W4387020916C119857082 @default.
- W4387020916 hasConceptScore W4387020916C120665830 @default.
- W4387020916 hasConceptScore W4387020916C121332964 @default.
- W4387020916 hasConceptScore W4387020916C138885662 @default.
- W4387020916 hasConceptScore W4387020916C153180895 @default.
- W4387020916 hasConceptScore W4387020916C154945302 @default.
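The matches above correspond to a SPARQL basic graph pattern over the work's URI. As a minimal sketch of how such a lookup could be issued programmatically (assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql — an assumption here, not stated in the listing), using only the Python standard library:

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint URL; verify against the SemOpenAlex documentation.
ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_uri: str) -> str:
    """Build the basic graph pattern shown above: all (?p, ?o) pairs
    for one work. The ?g graph variable from the listing is omitted
    for simplicity."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def run_query(work_uri: str, endpoint: str = ENDPOINT):
    """POST the query and parse SPARQL JSON results (needs network access)."""
    data = urllib.parse.urlencode({"query": build_query(work_uri)}).encode()
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    print(build_query("https://semopenalex.org/work/W4387020916"))
```

Each binding in the returned list maps the variable names (`p`, `o`) to `{"type": ..., "value": ...}` dictionaries, per the SPARQL JSON results format.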