Matches in SemOpenAlex for { <https://semopenalex.org/work/W4304080585> ?p ?o ?g. }
Showing items 1 to 81 of 81, with 100 items per page.
- W4304080585 abstract "Accurately measuring the absolute depth of every pixel captured by an imaging sensor is of critical importance in real-time applications such as autonomous navigation, augmented reality and robotics. In order to predict dense depth, a general approach is to fuse sensor inputs from different modalities such as LiDAR, camera and other time-of-flight sensors. LiDAR and other time-of-flight sensors provide accurate depth data but are quite sparse, both spatially and temporally. To augment missing depth information, generally RGB guidance is leveraged due to its high resolution information. Due to the reliance on multiple sensor modalities, design for robustness and adaptation is essential. In this work, we propose a transformer-like self-attention based generative adversarial network to estimate dense depth using RGB and sparse depth data. We introduce a novel training recipe for making the model robust so that it works even when one of the input modalities is not available. The multi-head self-attention mechanism can dynamically attend to most salient parts of the RGB image or corresponding sparse depth data producing the most competitive results. Our proposed network also requires less memory for training and inference compared to other existing heavily residual connection based convolutional neural networks, making it more suitable for resource-constrained edge applications. The source code is available at: https://github.com/kocchop/robust-multimodal-fusion-gan" @default.
- W4304080585 created "2022-10-10" @default.
- W4304080585 creator A5053696889 @default.
- W4304080585 creator A5067814331 @default.
- W4304080585 creator A5071026265 @default.
- W4304080585 creator A5088564355 @default.
- W4304080585 date "2022-10-10" @default.
- W4304080585 modified "2023-09-27" @default.
- W4304080585 title "Robust Multimodal Depth Estimation using Transformer based Generative Adversarial Networks" @default.
- W4304080585 cites W1539811621 @default.
- W4304080585 cites W1803059841 @default.
- W4304080585 cites W1905829557 @default.
- W4304080585 cites W1967027087 @default.
- W4304080585 cites W2005441409 @default.
- W4304080585 cites W2412454633 @default.
- W4304080585 cites W2807828983 @default.
- W4304080585 cites W2886851716 @default.
- W4304080585 cites W2963045776 @default.
- W4304080585 cites W2963316641 @default.
- W4304080585 cites W2963591054 @default.
- W4304080585 cites W2963867516 @default.
- W4304080585 cites W2964326562 @default.
- W4304080585 cites W2980467688 @default.
- W4304080585 cites W2998293366 @default.
- W4304080585 cites W3034543232 @default.
- W4304080585 cites W3110653837 @default.
- W4304080585 cites W3160809918 @default.
- W4304080585 cites W3172863135 @default.
- W4304080585 cites W3174856432 @default.
- W4304080585 cites W3206335707 @default.
- W4304080585 cites W4213150748 @default.
- W4304080585 cites W4214520160 @default.
- W4304080585 doi "https://doi.org/10.1145/3503161.3548418" @default.
- W4304080585 hasPublicationYear "2022" @default.
- W4304080585 type Work @default.
- W4304080585 citedByCount "0" @default.
- W4304080585 crossrefType "proceedings-article" @default.
- W4304080585 hasAuthorship W4304080585A5053696889 @default.
- W4304080585 hasAuthorship W4304080585A5067814331 @default.
- W4304080585 hasAuthorship W4304080585A5071026265 @default.
- W4304080585 hasAuthorship W4304080585A5088564355 @default.
- W4304080585 hasConcept C104317684 @default.
- W4304080585 hasConcept C127313418 @default.
- W4304080585 hasConcept C131979681 @default.
- W4304080585 hasConcept C154945302 @default.
- W4304080585 hasConcept C185592680 @default.
- W4304080585 hasConcept C31972630 @default.
- W4304080585 hasConcept C41008148 @default.
- W4304080585 hasConcept C51399673 @default.
- W4304080585 hasConcept C55493867 @default.
- W4304080585 hasConcept C62649853 @default.
- W4304080585 hasConcept C63479239 @default.
- W4304080585 hasConcept C82990744 @default.
- W4304080585 hasConceptScore W4304080585C104317684 @default.
- W4304080585 hasConceptScore W4304080585C127313418 @default.
- W4304080585 hasConceptScore W4304080585C131979681 @default.
- W4304080585 hasConceptScore W4304080585C154945302 @default.
- W4304080585 hasConceptScore W4304080585C185592680 @default.
- W4304080585 hasConceptScore W4304080585C31972630 @default.
- W4304080585 hasConceptScore W4304080585C41008148 @default.
- W4304080585 hasConceptScore W4304080585C51399673 @default.
- W4304080585 hasConceptScore W4304080585C55493867 @default.
- W4304080585 hasConceptScore W4304080585C62649853 @default.
- W4304080585 hasConceptScore W4304080585C63479239 @default.
- W4304080585 hasConceptScore W4304080585C82990744 @default.
- W4304080585 hasLocation W43040805851 @default.
- W4304080585 hasOpenAccess W4304080585 @default.
- W4304080585 hasPrimaryLocation W43040805851 @default.
- W4304080585 hasRelatedWork W1982088786 @default.
- W4304080585 hasRelatedWork W2766011388 @default.
- W4304080585 hasRelatedWork W2921707373 @default.
- W4304080585 hasRelatedWork W2979718872 @default.
- W4304080585 hasRelatedWork W2980357211 @default.
- W4304080585 hasRelatedWork W2980953096 @default.
- W4304080585 hasRelatedWork W3002324236 @default.
- W4304080585 hasRelatedWork W3137401801 @default.
- W4304080585 hasRelatedWork W3206828132 @default.
- W4304080585 hasRelatedWork W4321064134 @default.
- W4304080585 isParatext "false" @default.
- W4304080585 isRetracted "false" @default.
- W4304080585 workType "article" @default.
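The listing above is the result of the quad pattern `{ <https://semopenalex.org/work/W4304080585> ?p ?o ?g. }`. A minimal sketch of reproducing it programmatically in Python, assuming SemOpenAlex exposes a public SPARQL endpoint at `https://semopenalex.org/sparql` (that URL, and the exact GRAPH wrapping of the quad pattern, are assumptions):

```python
# Sketch: query all triples (with their graph) for one SemOpenAlex work.
# Endpoint URL is an assumption; adjust if SemOpenAlex documents another one.
import json
import urllib.parse
import urllib.request


def build_quad_query(work_uri: str) -> str:
    """Build the ?p ?o ?g pattern shown above for a single work URI."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_uri}> ?p ?o . }} "
        "}"
    )


def fetch_bindings(work_uri: str,
                   endpoint: str = "https://semopenalex.org/sparql"):
    """POST the query and return the JSON result bindings (needs network)."""
    data = urllib.parse.urlencode(
        {"query": build_quad_query(work_uri)}
    ).encode()
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


# Build (but do not send) the query for the work in this listing.
query = build_quad_query("https://semopenalex.org/work/W4304080585")
print(query)
```

Each returned binding would correspond to one line of the listing above, e.g. a `cites` predicate paired with a cited work's identifier.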