Matches in SemOpenAlex for { <https://semopenalex.org/work/W3209732904> ?p ?o ?g. }
- W3209732904 endingPage "9042" @default.
- W3209732904 startingPage "9026" @default.
- W3209732904 abstract "How to effectively fuse cross-modal information is a key problem for RGB-D salient object detection (SOD). Early fusion and result fusion schemes fuse RGB and depth information at the input and output stages, respectively, and hence incur distribution gaps or information loss. Many models instead employ a feature fusion strategy, but they are limited by their use of low-order point-to-point fusion methods. In this paper, we propose a novel mutual attention model by fusing attention and context from different modalities. We use the non-local attention of one modality to propagate long-range contextual dependencies for the other, thus leveraging complementary attention cues to achieve high-order and trilinear cross-modal interaction. We also propose to induce contrast inference from the mutual attention and obtain a unified model. Considering that low-quality depth data may be detrimental to model performance, we further propose a selective attention mechanism to reweight the added depth cues. We embed the proposed modules in a two-stream CNN for RGB-D SOD. Experimental results demonstrate the effectiveness of our proposed model. Moreover, we construct a new, challenging, large-scale, high-quality RGB-D SOD dataset, which can promote both the training and evaluation of deep models." @default.
- W3209732904 created "2021-11-08" @default.
- W3209732904 creator A5001154049 @default.
- W3209732904 creator A5012529382 @default.
- W3209732904 creator A5036881486 @default.
- W3209732904 creator A5082634513 @default.
- W3209732904 date "2022-12-01" @default.
- W3209732904 modified "2023-10-16" @default.
- W3209732904 title "Learning Selective Mutual Attention and Contrast for RGB-D Saliency Detection" @default.
- W3209732904 cites W1903029394 @default.
- W3209732904 cites W1966025376 @default.
- W3209732904 cites W1976409045 @default.
- W3209732904 cites W1976754232 @default.
- W3209732904 cites W2000946514 @default.
- W3209732904 cites W2033959528 @default.
- W3209732904 cites W2039298799 @default.
- W3209732904 cites W2056898157 @default.
- W3209732904 cites W20683899 @default.
- W3209732904 cites W2070922051 @default.
- W3209732904 cites W2090518410 @default.
- W3209732904 cites W2108598243 @default.
- W3209732904 cites W2128272608 @default.
- W3209732904 cites W2149694199 @default.
- W3209732904 cites W2159797236 @default.
- W3209732904 cites W2194775991 @default.
- W3209732904 cites W2257456433 @default.
- W3209732904 cites W2293332611 @default.
- W3209732904 cites W2337762808 @default.
- W3209732904 cites W2412782625 @default.
- W3209732904 cites W2461475918 @default.
- W3209732904 cites W2514453564 @default.
- W3209732904 cites W2546696630 @default.
- W3209732904 cites W2549139847 @default.
- W3209732904 cites W2560474170 @default.
- W3209732904 cites W2620958690 @default.
- W3209732904 cites W2732026016 @default.
- W3209732904 cites W2765838470 @default.
- W3209732904 cites W2798373498 @default.
- W3209732904 cites W2798857366 @default.
- W3209732904 cites W2799213142 @default.
- W3209732904 cites W2887522866 @default.
- W3209732904 cites W2907643346 @default.
- W3209732904 cites W2909381593 @default.
- W3209732904 cites W2948300571 @default.
- W3209732904 cites W2955084925 @default.
- W3209732904 cites W2957414648 @default.
- W3209732904 cites W2959581809 @default.
- W3209732904 cites W2961348656 @default.
- W3209732904 cites W2962159375 @default.
- W3209732904 cites W2963091558 @default.
- W3209732904 cites W2963334022 @default.
- W3209732904 cites W2963446712 @default.
- W3209732904 cites W2963529609 @default.
- W3209732904 cites W2963685207 @default.
- W3209732904 cites W2963868681 @default.
- W3209732904 cites W2963897031 @default.
- W3209732904 cites W2963998427 @default.
- W3209732904 cites W2990984982 @default.
- W3209732904 cites W2996770795 @default.
- W3209732904 cites W2999343753 @default.
- W3209732904 cites W3002301267 @default.
- W3209732904 cites W3010616503 @default.
- W3209732904 cites W3019728440 @default.
- W3209732904 cites W3023562424 @default.
- W3209732904 cites W3034320133 @default.
- W3209732904 cites W3035284915 @default.
- W3209732904 cites W3035290198 @default.
- W3209732904 cites W3035357085 @default.
- W3209732904 cites W3035633116 @default.
- W3209732904 cites W3035687312 @default.
- W3209732904 cites W3044364325 @default.
- W3209732904 cites W3092729213 @default.
- W3209732904 cites W3096966254 @default.
- W3209732904 cites W3097053213 @default.
- W3209732904 cites W3097725659 @default.
- W3209732904 cites W3097996169 @default.
- W3209732904 cites W3099871687 @default.
- W3209732904 cites W3104979525 @default.
- W3209732904 cites W3108421143 @default.
- W3209732904 cites W3108608656 @default.
- W3209732904 cites W3108812909 @default.
- W3209732904 cites W3114651545 @default.
- W3209732904 cites W3118710621 @default.
- W3209732904 cites W4239147634 @default.
- W3209732904 doi "https://doi.org/10.1109/tpami.2021.3122139" @default.
- W3209732904 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/34699348" @default.
- W3209732904 hasPublicationYear "2022" @default.
- W3209732904 type Work @default.
- W3209732904 sameAs 3209732904 @default.
- W3209732904 citedByCount "15" @default.
- W3209732904 countsByYear W32097329042022 @default.
- W3209732904 countsByYear W32097329042023 @default.
- W3209732904 crossrefType "journal-article" @default.
- W3209732904 hasAuthorship W3209732904A5001154049 @default.
- W3209732904 hasAuthorship W3209732904A5012529382 @default.
- W3209732904 hasAuthorship W3209732904A5036881486 @default.
- W3209732904 hasAuthorship W3209732904A5082634513 @default.
- W3209732904 hasBestOaLocation W32097329042 @default.
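
The listing above enumerates every predicate-object pair stored for <https://semopenalex.org/work/W3209732904>. As a rough sketch of how such a listing could be reproduced programmatically, the Python snippet below sends the corresponding SPARQL query to the SemOpenAlex endpoint. The endpoint URL (https://semopenalex.org/sparql) and the use of a plain HTTP GET with a JSON results header are assumptions based on the standard SPARQL protocol, not part of the listing itself, and the graph variable ?g from the quad pattern is dropped for simplicity.

```python
import requests

# Assumed public SemOpenAlex SPARQL endpoint; adjust if the service differs.
ENDPOINT = "https://semopenalex.org/sparql"

# Same subject as in the listing above; the ?g graph variable from the quad
# pattern is omitted and only predicate/object pairs are selected.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3209732904> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each match in roughly the same "predicate object" form as the listing.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```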
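
The abstract above describes using the non-local attention of one modality to propagate long-range context for the other. The block below is a minimal, generic PyTorch sketch of that kind of cross-modal non-local attention; it illustrates the general idea only, not a reimplementation of the paper's selective mutual attention or contrast modules, and the names used here (CrossModalNonLocalAttention, feat_attn, feat_value, gamma) are placeholders chosen for this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrossModalNonLocalAttention(nn.Module):
    """Generic sketch: the affinity map is computed from one modality's
    features and used to aggregate the other modality's values, so each
    modality can borrow long-range context from its counterpart."""

    def __init__(self, channels, reduced=None):
        super().__init__()
        reduced = reduced or channels // 2
        self.query = nn.Conv2d(channels, reduced, kernel_size=1)
        self.key = nn.Conv2d(channels, reduced, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned residual weight

    def forward(self, feat_attn, feat_value):
        # feat_attn:  features of the modality providing the attention map
        # feat_value: features of the modality whose context is propagated
        b, c, h, w = feat_value.shape
        q = self.query(feat_attn).flatten(2).transpose(1, 2)   # (B, HW, C')
        k = self.key(feat_attn).flatten(2)                     # (B, C', HW)
        v = self.value(feat_value).flatten(2).transpose(1, 2)  # (B, HW, C)
        attn = F.softmax(torch.bmm(q, k), dim=-1)              # (B, HW, HW)
        out = torch.bmm(attn, v).transpose(1, 2).reshape(b, c, h, w)
        return feat_value + self.gamma * out                   # residual fusion
```

In a two-stream setting such a block would typically be applied symmetrically, e.g. `rgb_enh = block(depth_feat, rgb_feat)` and `depth_enh = block(rgb_feat, depth_feat)`, so that each stream receives contextual cues from the other modality.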