Matches in SemOpenAlex for { <https://semopenalex.org/work/W4295308209> ?p ?o ?g. }
- W4295308209 endingPage "8154" @default.
- W4295308209 startingPage "8142" @default.
- W4295308209 abstract "With the accumulation and storage of remote sensing images in various satellite data centers, rapidly detecting objects of interest in large-scale remote sensing images is a current research focus and application requirement. Although some cutting-edge object detection algorithms for remote sensing images perform well in terms of accuracy (mAP), their inference is slow and demands high-end hardware, making them unsuitable for real-time object detection in large-scale remote sensing images. To address this issue, we propose a fast inference framework for object detection in large-scale remote sensing images. On the one hand, we introduce the α-IoU loss into the YWCSL model to implement adaptive weighting of the loss and gradient, achieving 64.62% and 79.54% mAP on the DIOR-R and DOTA test sets, respectively. More importantly, the YWCSL model reaches an inference speed of 60.74 FPS on a single NVIDIA GeForce RTX 3080 Ti, which is 2.87 times faster than the current state-of-the-art one-stage detector S²A-Net. On the other hand, we build a distributed inference framework to enable fast inference on large-scale remote sensing images. Specifically, we store the images on HDFS for distributed storage and deploy the pre-trained YWCSL model on a Spark cluster. In addition, we use a custom partitioner, RankPartition, to repartition the data and further improve cluster performance. With 5 nodes, the cluster achieves a speedup of 9.54, which is 90.80% higher than the theoretical linear speedup (5.00). Our distributed inference framework for large-scale remote sensing images significantly reduces the dependence of object detection on expensive hardware resources, which is of important research significance for the wide application of object detection in remote sensing images." @default.
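The abstract's α-IoU loss refers to the power-parameterized family of IoU losses, whose basic form is L = 1 − IoU^α; α > 1 re-weights the loss and gradient adaptively toward high-IoU examples. The sketch below illustrates that basic form for axis-aligned boxes only — the paper's YWCSL model works with oriented boxes and its exact loss terms are not given in the abstract, so the function names and the default α = 3 here are illustrative assumptions, not the authors' implementation:

```python
def box_iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def alpha_iou_loss(iou, alpha=3.0):
    """Basic alpha-IoU loss: L = 1 - IoU**alpha.

    With alpha = 1 this reduces to the plain IoU loss; larger alpha
    increases the relative loss/gradient weight of high-IoU boxes.
    """
    return 1.0 - iou ** alpha
```

For example, a predicted box overlapping the ground truth with IoU = 0.5 gives a loss of 1 − 0.5³ = 0.875 at α = 3, versus 0.5 for the plain IoU loss.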
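The abstract also mentions a custom partitioner, RankPartition, used to repartition data across the Spark cluster; its internals are not described here. One plausible reading is load balancing by work-size rank, which the hypothetical pure-Python sketch below illustrates (the name `rank_partition` and the greedy assignment strategy are assumptions for illustration, not the paper's algorithm):

```python
def rank_partition(sizes, num_partitions):
    """Greedy rank-based partitioning sketch.

    Sort items by size (descending rank), then repeatedly assign the
    next item to the currently least-loaded partition, so total work
    per worker stays balanced. Returns a list of index buckets.
    """
    buckets = [[] for _ in range(num_partitions)]
    loads = [0] * num_partitions
    for idx, size in sorted(enumerate(sizes), key=lambda p: -p[1]):
        target = loads.index(min(loads))  # lightest partition so far
        buckets[target].append(idx)
        loads[target] += size
    return buckets
```

In PySpark, a function like this could back `RDD.partitionBy` via a key-to-partition mapping, so that large remote sensing tiles do not pile up on a single executor.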
- W4295308209 created "2022-09-12" @default.
- W4295308209 creator A5013558939 @default.
- W4295308209 creator A5013763730 @default.
- W4295308209 creator A5038110449 @default.
- W4295308209 creator A5043295613 @default.
- W4295308209 creator A5047516173 @default.
- W4295308209 creator A5057269007 @default.
- W4295308209 creator A5057287134 @default.
- W4295308209 date "2022-01-01" @default.
- W4295308209 modified "2023-10-09" @default.
- W4295308209 title "Object Detection in Large-Scale Remote Sensing Images With a Distributed Deep Learning Framework" @default.
- W4295308209 cites W1536680647 @default.
- W4295308209 cites W2070600700 @default.
- W4295308209 cites W2282372241 @default.
- W4295308209 cites W2295320150 @default.
- W4295308209 cites W2547794587 @default.
- W4295308209 cites W2594177559 @default.
- W4295308209 cites W2797862701 @default.
- W4295308209 cites W2801431388 @default.
- W4295308209 cites W2884561390 @default.
- W4295308209 cites W2962749812 @default.
- W4295308209 cites W2962766617 @default.
- W4295308209 cites W2963037989 @default.
- W4295308209 cites W2963351448 @default.
- W4295308209 cites W2963849369 @default.
- W4295308209 cites W2963927307 @default.
- W4295308209 cites W2964979676 @default.
- W4295308209 cites W2970845903 @default.
- W4295308209 cites W2991359031 @default.
- W4295308209 cites W2991363140 @default.
- W4295308209 cites W2992240579 @default.
- W4295308209 cites W2997747012 @default.
- W4295308209 cites W3015331846 @default.
- W4295308209 cites W3034993937 @default.
- W4295308209 cites W3036479037 @default.
- W4295308209 cites W3044807506 @default.
- W4295308209 cites W3046174881 @default.
- W4295308209 cites W3106141888 @default.
- W4295308209 cites W3106228955 @default.
- W4295308209 cites W3112736773 @default.
- W4295308209 cites W3119345651 @default.
- W4295308209 cites W3170033848 @default.
- W4295308209 cites W3174389852 @default.
- W4295308209 cites W3174710842 @default.
- W4295308209 cites W3174873843 @default.
- W4295308209 cites W3175496347 @default.
- W4295308209 cites W3177105943 @default.
- W4295308209 cites W3186560493 @default.
- W4295308209 cites W3201797941 @default.
- W4295308209 cites W3203608457 @default.
- W4295308209 cites W4210707990 @default.
- W4295308209 cites W4210925408 @default.
- W4295308209 cites W4211173354 @default.
- W4295308209 cites W4213159881 @default.
- W4295308209 cites W4214648418 @default.
- W4295308209 cites W4225916372 @default.
- W4295308209 cites W4226191030 @default.
- W4295308209 cites W4226227310 @default.
- W4295308209 cites W639708223 @default.
- W4295308209 doi "https://doi.org/10.1109/jstars.2022.3206085" @default.
- W4295308209 hasPublicationYear "2022" @default.
- W4295308209 type Work @default.
- W4295308209 citedByCount "1" @default.
- W4295308209 countsByYear W42953082092023 @default.
- W4295308209 crossrefType "journal-article" @default.
- W4295308209 hasAuthorship W4295308209A5013558939 @default.
- W4295308209 hasAuthorship W4295308209A5013763730 @default.
- W4295308209 hasAuthorship W4295308209A5038110449 @default.
- W4295308209 hasAuthorship W4295308209A5043295613 @default.
- W4295308209 hasAuthorship W4295308209A5047516173 @default.
- W4295308209 hasAuthorship W4295308209A5057269007 @default.
- W4295308209 hasAuthorship W4295308209A5057287134 @default.
- W4295308209 hasBestOaLocation W42953082091 @default.
- W4295308209 hasConcept C120665830 @default.
- W4295308209 hasConcept C121332964 @default.
- W4295308209 hasConcept C153180895 @default.
- W4295308209 hasConcept C154945302 @default.
- W4295308209 hasConcept C192209626 @default.
- W4295308209 hasConcept C2776151529 @default.
- W4295308209 hasConcept C2776214188 @default.
- W4295308209 hasConcept C2778755073 @default.
- W4295308209 hasConcept C2781238097 @default.
- W4295308209 hasConcept C31972630 @default.
- W4295308209 hasConcept C33923547 @default.
- W4295308209 hasConcept C41008148 @default.
- W4295308209 hasConcept C45357846 @default.
- W4295308209 hasConcept C62520636 @default.
- W4295308209 hasConcept C76155785 @default.
- W4295308209 hasConcept C94375191 @default.
- W4295308209 hasConcept C94915269 @default.
- W4295308209 hasConceptScore W4295308209C120665830 @default.
- W4295308209 hasConceptScore W4295308209C121332964 @default.
- W4295308209 hasConceptScore W4295308209C153180895 @default.
- W4295308209 hasConceptScore W4295308209C154945302 @default.
- W4295308209 hasConceptScore W4295308209C192209626 @default.
- W4295308209 hasConceptScore W4295308209C2776151529 @default.
- W4295308209 hasConceptScore W4295308209C2776214188 @default.