Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285176870> ?p ?o ?g. }
- W4285176870 endingPage "5102" @default.
- W4285176870 startingPage "5095" @default.
- W4285176870 abstract "In this letter, we propose an adaptive cost volume fusion algorithm for multi-modal depth estimation in changing environments. Our method takes measurements from multi-modal sensors to exploit their complementary characteristics and generates depth cues from each modality in the form of adaptive cost volumes using deep neural networks. The proposed adaptive cost volume considers sensor configurations and computational costs to resolve an imbalanced and redundant depth bases problem of conventional cost volumes. We further extend its role to a generalized depth representation and propose a geometry-aware cost fusion algorithm. Our unified and geometrically consistent depth representation leads to an accurate and efficient multi-modal sensor fusion, which is crucial for robustness to changing environments. To validate the proposed framework, we introduce a new multi-modal depth in changing environments (MMDCE) dataset. The dataset was collected by our own vehicular system with RGB, NIR, and LiDAR sensors in changing environments. Experimental results demonstrate that our method is robust, accurate, and reliable in changing environments. Our codes and dataset are available at our project page." @default.
- W4285176870 created "2022-07-14" @default.
- W4285176870 creator A5012455275 @default.
- W4285176870 creator A5016424347 @default.
- W4285176870 creator A5016637653 @default.
- W4285176870 creator A5042144079 @default.
- W4285176870 creator A5079380164 @default.
- W4285176870 date "2022-04-01" @default.
- W4285176870 modified "2023-09-25" @default.
- W4285176870 title "Adaptive Cost Volume Fusion Network for Multi-Modal Depth Estimation in Changing Environments" @default.
- W4285176870 cites W1923779427 @default.
- W4285176870 cites W2049981393 @default.
- W4285176870 cites W2150066425 @default.
- W4285176870 cites W2167667767 @default.
- W4285176870 cites W2418849765 @default.
- W4285176870 cites W2558027072 @default.
- W4285176870 cites W2565233142 @default.
- W4285176870 cites W2565639579 @default.
- W4285176870 cites W2789621390 @default.
- W4285176870 cites W2798881637 @default.
- W4285176870 cites W2885093229 @default.
- W4285176870 cites W2899479761 @default.
- W4285176870 cites W2904094572 @default.
- W4285176870 cites W2926429807 @default.
- W4285176870 cites W2954174912 @default.
- W4285176870 cites W2955189650 @default.
- W4285176870 cites W2961926014 @default.
- W4285176870 cites W2962793285 @default.
- W4285176870 cites W2963045776 @default.
- W4285176870 cites W2963292632 @default.
- W4285176870 cites W2963316641 @default.
- W4285176870 cites W2963619659 @default.
- W4285176870 cites W2964339842 @default.
- W4285176870 cites W2966927056 @default.
- W4285176870 cites W2968826970 @default.
- W4285176870 cites W2969202876 @default.
- W4285176870 cites W2973110888 @default.
- W4285176870 cites W2982809261 @default.
- W4285176870 cites W2986701260 @default.
- W4285176870 cites W2992464978 @default.
- W4285176870 cites W3003928900 @default.
- W4285176870 cites W3034514115 @default.
- W4285176870 cites W3034543232 @default.
- W4285176870 cites W3034604951 @default.
- W4285176870 cites W3034976873 @default.
- W4285176870 cites W3035574168 @default.
- W4285176870 cites W3089764790 @default.
- W4285176870 cites W3109128945 @default.
- W4285176870 cites W4200200310 @default.
- W4285176870 doi "https://doi.org/10.1109/lra.2022.3150868" @default.
- W4285176870 hasPublicationYear "2022" @default.
- W4285176870 type Work @default.
- W4285176870 citedByCount "2" @default.
- W4285176870 countsByYear W42851768702023 @default.
- W4285176870 crossrefType "journal-article" @default.
- W4285176870 hasAuthorship W4285176870A5012455275 @default.
- W4285176870 hasAuthorship W4285176870A5016424347 @default.
- W4285176870 hasAuthorship W4285176870A5016637653 @default.
- W4285176870 hasAuthorship W4285176870A5042144079 @default.
- W4285176870 hasAuthorship W4285176870A5079380164 @default.
- W4285176870 hasConcept C104317684 @default.
- W4285176870 hasConcept C11413529 @default.
- W4285176870 hasConcept C121332964 @default.
- W4285176870 hasConcept C138885662 @default.
- W4285176870 hasConcept C154945302 @default.
- W4285176870 hasConcept C158525013 @default.
- W4285176870 hasConcept C165696696 @default.
- W4285176870 hasConcept C17744445 @default.
- W4285176870 hasConcept C185592680 @default.
- W4285176870 hasConcept C188027245 @default.
- W4285176870 hasConcept C199539241 @default.
- W4285176870 hasConcept C20556612 @default.
- W4285176870 hasConcept C2776359362 @default.
- W4285176870 hasConcept C33954974 @default.
- W4285176870 hasConcept C38652104 @default.
- W4285176870 hasConcept C41008148 @default.
- W4285176870 hasConcept C41895202 @default.
- W4285176870 hasConcept C55493867 @default.
- W4285176870 hasConcept C62520636 @default.
- W4285176870 hasConcept C63479239 @default.
- W4285176870 hasConcept C71139939 @default.
- W4285176870 hasConcept C94625758 @default.
- W4285176870 hasConceptScore W4285176870C104317684 @default.
- W4285176870 hasConceptScore W4285176870C11413529 @default.
- W4285176870 hasConceptScore W4285176870C121332964 @default.
- W4285176870 hasConceptScore W4285176870C138885662 @default.
- W4285176870 hasConceptScore W4285176870C154945302 @default.
- W4285176870 hasConceptScore W4285176870C158525013 @default.
- W4285176870 hasConceptScore W4285176870C165696696 @default.
- W4285176870 hasConceptScore W4285176870C17744445 @default.
- W4285176870 hasConceptScore W4285176870C185592680 @default.
- W4285176870 hasConceptScore W4285176870C188027245 @default.
- W4285176870 hasConceptScore W4285176870C199539241 @default.
- W4285176870 hasConceptScore W4285176870C20556612 @default.
- W4285176870 hasConceptScore W4285176870C2776359362 @default.
- W4285176870 hasConceptScore W4285176870C33954974 @default.
- W4285176870 hasConceptScore W4285176870C38652104 @default.
- W4285176870 hasConceptScore W4285176870C41008148 @default.
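The header's graph pattern `{ <https://semopenalex.org/work/W4285176870> ?p ?o ?g. }` is a quad lookup: all predicate/object pairs for the work, plus the named graph they live in. A minimal Python sketch of building the corresponding SPARQL query (the public endpoint URL `https://semopenalex.org/sparql` is an assumption based on the project site, and the query here is an illustrative reconstruction, not the exact query that produced this listing):

```python
# Assumed public SPARQL endpoint for SemOpenAlex (not confirmed by this listing).
SEMOPENALEX_ENDPOINT = "https://semopenalex.org/sparql"


def build_work_query(work_id: str) -> str:
    """Build a SPARQL query listing every predicate (?p), object (?o),
    and containing named graph (?g) for a SemOpenAlex work entity."""
    iri = f"https://semopenalex.org/work/{work_id}"
    return (
        "SELECT ?p ?o ?g WHERE {\n"
        f"  GRAPH ?g {{ <{iri}> ?p ?o . }}\n"
        "}"
    )


query = build_work_query("W4285176870")
print(query)
```

The query string can then be sent to the endpoint with any HTTP client (e.g. a GET with a `query` parameter and an `Accept: application/sparql-results+json` header), which would return rows like the `title`, `doi`, `cites`, and `hasConcept` triples listed above.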