Matches in SemOpenAlex for { <https://semopenalex.org/work/W3163238441> ?p ?o ?g. }
- W3163238441 endingPage "106140" @default.
- W3163238441 startingPage "106140" @default.
- W3163238441 abstract "For pig instance segmentation, the application of traditional computer vision techniques is constrained by barriers from sundries, overlapping pigs, and varying perspectives in the pig breeding environment. In recent years, attention-based methods have achieved remarkable performance. In this paper, we introduce two types of attention blocks into the feature pyramid network (FPN) framework, which encode semantic interdependencies in the channel dimension (channel attention block, CAB) and the spatial dimension (spatial attention block, SAB), respectively. By integrating associated features, the CAB selectively emphasizes interdependencies among channels. Meanwhile, the SAB selectively aggregates the features at each position through a weighted sum of the features at all positions. A dual attention block (DAB) is proposed to flexibly integrate CAB features with SAB information. A total of 45 pigs in 8 pens are captured as the experimental subjects. In comparison with such state-of-the-art attention modules as the convolutional block attention module (CBAM), the bottleneck attention module (BAM), and spatial-channel squeeze & excitation (SCSE), embedding the DAB yields the largest performance improvement across different task networks with distinct backbone networks. HTC-R101-DAB (hybrid task cascade) produces the best performance, with AP0.5 (average precision), AP0.75, AP0.5:0.95, and AP0.5:0.95-large reaching 93.1%, 84.1%, 69.4%, and 71.8%, respectively. Ablation experiments also indicate that the SAB contributes more than the CAB, and that predictive performance first increases and then decreases as more SABs are merged.
Besides, as revealed by the visualization of attention maps, the attention blocks can extract regions with similar semantic information. The attention-based models also produce outstanding segmentation performance on a public dataset, which evidences the practicality of our attention blocks. Our baseline models are available." @default.
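The abstract describes the SAB as a weighted sum over all spatial positions and the CAB as modeling interdependencies among channels, with the DAB combining the two. A minimal NumPy sketch of that idea is below; the shapes, the dot-product affinities, and the residual sum are illustrative assumptions, not the paper's exact implementation (which operates inside an FPN with learned projections).

```python
import numpy as np

def _softmax(e):
    """Row-wise softmax with max-subtraction for numerical stability."""
    e = e - e.max(axis=-1, keepdims=True)
    e = np.exp(e)
    return e / e.sum(axis=-1, keepdims=True)

def spatial_attention_block(x):
    """SAB sketch: each position's output is a weighted sum of the
    features at all positions, weighted by feature similarity.
    x: feature map of shape (C, H, W)."""
    C, H, W = x.shape
    feats = x.reshape(C, H * W)          # (C, N), N = H*W positions
    attn = _softmax(feats.T @ feats)     # (N, N) position-to-position weights
    out = feats @ attn.T                 # aggregate features from all positions
    return x + out.reshape(C, H, W)      # residual connection

def channel_attention_block(x):
    """CAB sketch: re-weights channels via a channel-to-channel
    affinity map. x: feature map of shape (C, H, W)."""
    C, H, W = x.shape
    feats = x.reshape(C, H * W)
    attn = _softmax(feats @ feats.T)     # (C, C) channel interdependencies
    out = attn @ feats                   # mix channels by affinity
    return x + out.reshape(C, H, W)

def dual_attention_block(x):
    """DAB sketch: fuse the two attention outputs by summation
    (one plausible integration strategy, assumed here)."""
    return spatial_attention_block(x) + channel_attention_block(x) - x
```

In this sketch the DAB subtracts one copy of `x` so the identity path is counted once after summing the two residual branches; the actual fusion in the paper may differ.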
- W3163238441 created "2021-05-24" @default.
- W3163238441 creator A5015943706 @default.
- W3163238441 creator A5052741041 @default.
- W3163238441 creator A5073886992 @default.
- W3163238441 date "2021-07-01" @default.
- W3163238441 modified "2023-09-30" @default.
- W3163238441 title "Dual attention-guided feature pyramid network for instance segmentation of group pigs" @default.
- W3163238441 cites W2015678706 @default.
- W3163238441 cites W210617966 @default.
- W3163238441 cites W2108581046 @default.
- W3163238441 cites W2343728412 @default.
- W3163238441 cites W2345833724 @default.
- W3163238441 cites W2412782625 @default.
- W3163238441 cites W2581974582 @default.
- W3163238441 cites W2618530766 @default.
- W3163238441 cites W2771389134 @default.
- W3163238441 cites W2775186784 @default.
- W3163238441 cites W2790979755 @default.
- W3163238441 cites W2791690647 @default.
- W3163238441 cites W2792559640 @default.
- W3163238441 cites W2794578160 @default.
- W3163238441 cites W2806070179 @default.
- W3163238441 cites W2845797600 @default.
- W3163238441 cites W2886201707 @default.
- W3163238441 cites W2887902433 @default.
- W3163238441 cites W2893064254 @default.
- W3163238441 cites W2898509049 @default.
- W3163238441 cites W2899609705 @default.
- W3163238441 cites W2901473977 @default.
- W3163238441 cites W2904452810 @default.
- W3163238441 cites W2904918578 @default.
- W3163238441 cites W2904939198 @default.
- W3163238441 cites W2913533968 @default.
- W3163238441 cites W2922182161 @default.
- W3163238441 cites W2936307272 @default.
- W3163238441 cites W2950288321 @default.
- W3163238441 cites W2953151638 @default.
- W3163238441 cites W2957697881 @default.
- W3163238441 cites W2969953806 @default.
- W3163238441 cites W2970576551 @default.
- W3163238441 cites W2971432438 @default.
- W3163238441 cites W2971546563 @default.
- W3163238441 cites W2976038271 @default.
- W3163238441 cites W2977012283 @default.
- W3163238441 cites W2981099980 @default.
- W3163238441 cites W2981609437 @default.
- W3163238441 cites W2995313058 @default.
- W3163238441 cites W2997707473 @default.
- W3163238441 cites W3004977026 @default.
- W3163238441 cites W3008125552 @default.
- W3163238441 cites W3017357154 @default.
- W3163238441 cites W3025656547 @default.
- W3163238441 cites W3035236910 @default.
- W3163238441 cites W3036237841 @default.
- W3163238441 cites W3042132036 @default.
- W3163238441 cites W3043977637 @default.
- W3163238441 cites W3048534840 @default.
- W3163238441 cites W639708223 @default.
- W3163238441 doi "https://doi.org/10.1016/j.compag.2021.106140" @default.
- W3163238441 hasPublicationYear "2021" @default.
- W3163238441 type Work @default.
- W3163238441 sameAs 3163238441 @default.
- W3163238441 citedByCount "19" @default.
- W3163238441 countsByYear W31632384412022 @default.
- W3163238441 countsByYear W31632384412023 @default.
- W3163238441 crossrefType "journal-article" @default.
- W3163238441 hasAuthorship W3163238441A5015943706 @default.
- W3163238441 hasAuthorship W3163238441A5052741041 @default.
- W3163238441 hasAuthorship W3163238441A5073886992 @default.
- W3163238441 hasConcept C121684516 @default.
- W3163238441 hasConcept C124101348 @default.
- W3163238441 hasConcept C127162648 @default.
- W3163238441 hasConcept C127413603 @default.
- W3163238441 hasConcept C138885662 @default.
- W3163238441 hasConcept C142575187 @default.
- W3163238441 hasConcept C153180895 @default.
- W3163238441 hasConcept C154945302 @default.
- W3163238441 hasConcept C201995342 @default.
- W3163238441 hasConcept C21442007 @default.
- W3163238441 hasConcept C2524010 @default.
- W3163238441 hasConcept C2776401178 @default.
- W3163238441 hasConcept C2777210771 @default.
- W3163238441 hasConcept C2780451532 @default.
- W3163238441 hasConcept C33923547 @default.
- W3163238441 hasConcept C41008148 @default.
- W3163238441 hasConcept C41895202 @default.
- W3163238441 hasConcept C45235069 @default.
- W3163238441 hasConcept C514705636 @default.
- W3163238441 hasConcept C58642233 @default.
- W3163238441 hasConcept C59822182 @default.
- W3163238441 hasConcept C76155785 @default.
- W3163238441 hasConcept C86803240 @default.
- W3163238441 hasConcept C89600930 @default.
- W3163238441 hasConceptScore W3163238441C121684516 @default.
- W3163238441 hasConceptScore W3163238441C124101348 @default.
- W3163238441 hasConceptScore W3163238441C127162648 @default.
- W3163238441 hasConceptScore W3163238441C127413603 @default.