Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312562010> ?p ?o ?g. }
- W4312562010 abstract "LiDAR and camera are two common sensors to collect data in time for 3D object detection under the autonomous driving context. Though the complementary information across sensors and time has great potential of benefiting 3D perception, taking full advantage of sequential cross-sensor data still remains challenging. In this paper, we propose a novel LiDAR Image Fusion Transformer (LIFT) to model the mutual interaction relationship of cross-sensor data over time. LIFT learns to align the input 4D sequential cross-sensor data to achieve multi-frame multi-modal information aggregation. To alleviate computational load, we project both point clouds and images into the bird-eye-view maps to compute sparse grid-wise self-attention. LIFT also benefits from a cross-sensor and cross-time data augmentation scheme. We evaluate the proposed approach on the challenging nuScenes and Waymo datasets, where our LIFT performs well over the state-of-the-art and strong baselines." @default.
- W4312562010 created "2023-01-05" @default.
- W4312562010 creator A5018639976 @default.
- W4312562010 creator A5019884153 @default.
- W4312562010 creator A5028120929 @default.
- W4312562010 creator A5050309514 @default.
- W4312562010 creator A5051172458 @default.
- W4312562010 creator A5059684490 @default.
- W4312562010 creator A5062284542 @default.
- W4312562010 creator A5063089557 @default.
- W4312562010 date "2022-06-01" @default.
- W4312562010 modified "2023-10-16" @default.
- W4312562010 title "LIFT: Learning 4D LiDAR Image Fusion Transformer for 3D Object Detection" @default.
- W4312562010 cites W2555618208 @default.
- W4312562010 cites W2798965597 @default.
- W4312562010 cites W2897529137 @default.
- W4312562010 cites W2951517617 @default.
- W4312562010 cites W2963323244 @default.
- W4312562010 cites W2963400571 @default.
- W4312562010 cites W2963727135 @default.
- W4312562010 cites W2964062501 @default.
- W4312562010 cites W2967324759 @default.
- W4312562010 cites W2968296999 @default.
- W4312562010 cites W2981949127 @default.
- W4312562010 cites W2991653934 @default.
- W4312562010 cites W3034314779 @default.
- W4312562010 cites W3035172746 @default.
- W4312562010 cites W3035346742 @default.
- W4312562010 cites W3035461736 @default.
- W4312562010 cites W3090166818 @default.
- W4312562010 cites W3170030651 @default.
- W4312562010 cites W3171377125 @default.
- W4312562010 cites W3172863135 @default.
- W4312562010 cites W3175859344 @default.
- W4312562010 cites W3176888779 @default.
- W4312562010 cites W3206704105 @default.
- W4312562010 cites W4200381380 @default.
- W4312562010 cites W4214763741 @default.
- W4312562010 doi "https://doi.org/10.1109/cvpr52688.2022.01666" @default.
- W4312562010 hasPublicationYear "2022" @default.
- W4312562010 type Work @default.
- W4312562010 citedByCount "5" @default.
- W4312562010 countsByYear W43125620102022 @default.
- W4312562010 countsByYear W43125620102023 @default.
- W4312562010 crossrefType "proceedings-article" @default.
- W4312562010 hasAuthorship W4312562010A5018639976 @default.
- W4312562010 hasAuthorship W4312562010A5019884153 @default.
- W4312562010 hasAuthorship W4312562010A5028120929 @default.
- W4312562010 hasAuthorship W4312562010A5050309514 @default.
- W4312562010 hasAuthorship W4312562010A5051172458 @default.
- W4312562010 hasAuthorship W4312562010A5059684490 @default.
- W4312562010 hasAuthorship W4312562010A5062284542 @default.
- W4312562010 hasAuthorship W4312562010A5063089557 @default.
- W4312562010 hasConcept C119599485 @default.
- W4312562010 hasConcept C124101348 @default.
- W4312562010 hasConcept C127413603 @default.
- W4312562010 hasConcept C131979681 @default.
- W4312562010 hasConcept C13280743 @default.
- W4312562010 hasConcept C139002025 @default.
- W4312562010 hasConcept C153180895 @default.
- W4312562010 hasConcept C154945302 @default.
- W4312562010 hasConcept C165801399 @default.
- W4312562010 hasConcept C187691185 @default.
- W4312562010 hasConcept C205649164 @default.
- W4312562010 hasConcept C2776151529 @default.
- W4312562010 hasConcept C31972630 @default.
- W4312562010 hasConcept C33954974 @default.
- W4312562010 hasConcept C41008148 @default.
- W4312562010 hasConcept C51399673 @default.
- W4312562010 hasConcept C62649853 @default.
- W4312562010 hasConcept C66322947 @default.
- W4312562010 hasConcept C79403827 @default.
- W4312562010 hasConceptScore W4312562010C119599485 @default.
- W4312562010 hasConceptScore W4312562010C124101348 @default.
- W4312562010 hasConceptScore W4312562010C127413603 @default.
- W4312562010 hasConceptScore W4312562010C131979681 @default.
- W4312562010 hasConceptScore W4312562010C13280743 @default.
- W4312562010 hasConceptScore W4312562010C139002025 @default.
- W4312562010 hasConceptScore W4312562010C153180895 @default.
- W4312562010 hasConceptScore W4312562010C154945302 @default.
- W4312562010 hasConceptScore W4312562010C165801399 @default.
- W4312562010 hasConceptScore W4312562010C187691185 @default.
- W4312562010 hasConceptScore W4312562010C205649164 @default.
- W4312562010 hasConceptScore W4312562010C2776151529 @default.
- W4312562010 hasConceptScore W4312562010C31972630 @default.
- W4312562010 hasConceptScore W4312562010C33954974 @default.
- W4312562010 hasConceptScore W4312562010C41008148 @default.
- W4312562010 hasConceptScore W4312562010C51399673 @default.
- W4312562010 hasConceptScore W4312562010C62649853 @default.
- W4312562010 hasConceptScore W4312562010C66322947 @default.
- W4312562010 hasConceptScore W4312562010C79403827 @default.
- W4312562010 hasFunder F4320321001 @default.
- W4312562010 hasFunder F4320321885 @default.
- W4312562010 hasFunder F4320335777 @default.
- W4312562010 hasLocation W43125620101 @default.
- W4312562010 hasOpenAccess W4312562010 @default.
- W4312562010 hasPrimaryLocation W43125620101 @default.
- W4312562010 hasRelatedWork W2922421953 @default.
- W4312562010 hasRelatedWork W2979718872 @default.
- W4312562010 hasRelatedWork W3002270006 @default.
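
The listing above is the result of matching the quad pattern `{ <https://semopenalex.org/work/W4312562010> ?p ?o ?g. }` against SemOpenAlex. A minimal sketch of issuing an equivalent SPARQL query from Python follows; the endpoint URL `https://semopenalex.org/sparql` and the `GRAPH`-based rewriting of the `?g` position are assumptions to be checked against the SemOpenAlex documentation, and only the standard library is used:

```python
# Sketch: retrieve all predicate/object pairs (and their named graph)
# for one SemOpenAlex work. Endpoint URL is an assumption.
import json
import urllib.parse
import urllib.request


def build_query(work_iri: str) -> str:
    """Build a SPARQL query equivalent to the quad pattern
    { <work_iri> ?p ?o ?g. }, expressed with an explicit GRAPH clause."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        "GRAPH ?g { <%s> ?p ?o . } "
        "}" % work_iri
    )


def fetch_triples(work_iri: str,
                  endpoint: str = "https://semopenalex.org/sparql"):
    """Send the query and return the parsed JSON result bindings.

    Requires network access; the endpoint URL is an assumption."""
    params = urllib.parse.urlencode({
        "query": build_query(work_iri),
        "format": "application/sparql-results+json",
    })
    req = urllib.request.Request(
        endpoint + "?" + params,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


# Build the query for the work shown in the listing above.
query = build_query("https://semopenalex.org/work/W4312562010")
```

Each binding in the JSON response would correspond to one line of the listing, e.g. a `?p` of `title` paired with the `?o` literal "LIFT: Learning 4D LiDAR Image Fusion Transformer for 3D Object Detection".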