Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313028649> ?p ?o ?g. }
- W4313028649 endingPage "5641" @default.
- W4313028649 startingPage "5628" @default.
- W4313028649 abstract "Multi-modal fusion overcomes the inherent limitations of single-sensor perception in 3D object detection for autonomous driving. Fusing 4D Radar and LiDAR can extend the detection range and improve robustness. Nevertheless, differing data characteristics and noise distributions between the two sensors hinder performance when they are integrated directly. Therefore, we are the first to propose a novel fusion method for 4D Radar and LiDAR, termed $M^{2}$-Fusion, based on Multi-modal and Multi-scale fusion. To better integrate the two sensors, we propose an Interaction-based Multi-Modal Fusion (IMMF) method that uses a self-attention mechanism to learn features from each modality and exchange intermediate-layer information. To address the precision-efficiency trade-off of current single-resolution voxel division, we also put forward a Center-based Multi-Scale Fusion (CMSF) method that first regresses the center points of objects and then extracts features at multiple resolutions. Furthermore, we present a data preprocessing method based on a Gaussian distribution that effectively decreases data noise, reducing errors caused by point-cloud divergence of 4D Radar data in the $x$-$z$ plane. To evaluate the proposed fusion method, we conducted a series of experiments on the Astyx HiRes 2019 dataset, which includes calibrated 4D Radar and 16-line LiDAR data. The results demonstrate that our fusion method compares favorably with state-of-the-art algorithms. Compared to PointPillars, our method achieves mAP (mean average precision) increases of 5.64% and 13.57% for 3D and BEV (bird's-eye-view) detection of the car class at the moderate difficulty level, respectively." @default.
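The abstract describes two concrete mechanisms: interaction-based fusion via self-attention over each modality with intermediate-feature exchange (IMMF), and Gaussian-distribution-based denoising of the 4D Radar point cloud. The paper's actual implementation is not part of this record, so the following is a minimal illustrative sketch in PyTorch of what such an interaction block could look like; the class name `InteractionFusion`, the feature dimension, and the head count are placeholders, not values from the paper.

```python
import torch
import torch.nn as nn

class InteractionFusion(nn.Module):
    """Illustrative sketch (not the paper's implementation) of an
    interaction-based multi-modal fusion block: each modality attends
    over its own features with self-attention, then the attended
    intermediate features are exchanged between the two branches."""

    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        self.radar_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.lidar_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, radar_feat: torch.Tensor, lidar_feat: torch.Tensor):
        # Self-attention within each modality; inputs are (B, N, dim).
        r, _ = self.radar_attn(radar_feat, radar_feat, radar_feat)
        l, _ = self.lidar_attn(lidar_feat, lidar_feat, lidar_feat)
        # Exchange intermediate information: each branch is concatenated
        # with the other branch's attended features before projection.
        radar_out = self.fuse(torch.cat([r, l], dim=-1))
        lidar_out = self.fuse(torch.cat([l, r], dim=-1))
        return radar_out, lidar_out
```

Likewise, a hedged sketch of the Gaussian-based preprocessing: assuming the divergence of radar returns in the $x$-$z$ plane is modeled as a per-axis Gaussian, points beyond a chosen sigma band around the mean are discarded. The `sigma_keep` threshold and the column layout are assumptions made for illustration.

```python
import numpy as np

def filter_radar_points(points: np.ndarray, sigma_keep: float = 3.0) -> np.ndarray:
    """Illustrative sketch: drop 4D-radar points whose x or z coordinate
    lies more than `sigma_keep` standard deviations from the mean,
    treating the x-z divergence as Gaussian. `points` is an (N, 3+)
    array with columns x, y, z; extra columns pass through unchanged."""
    xz = points[:, [0, 2]]                 # x and z coordinates
    mean = xz.mean(axis=0)
    std = xz.std(axis=0) + 1e-6            # guard against zero variance
    dist = np.abs(xz - mean) / std         # per-axis deviation in sigmas
    keep = (dist < sigma_keep).all(axis=1) # keep points inside the band
    return points[keep]
```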
- W4313028649 created "2023-01-06" @default.
- W4313028649 creator A5001135373 @default.
- W4313028649 creator A5027835055 @default.
- W4313028649 creator A5031621739 @default.
- W4313028649 creator A5035406850 @default.
- W4313028649 creator A5045865179 @default.
- W4313028649 creator A5054021570 @default.
- W4313028649 creator A5070453047 @default.
- W4313028649 creator A5086082119 @default.
- W4313028649 creator A5086992948 @default.
- W4313028649 date "2023-05-01" @default.
- W4313028649 modified "2023-10-03" @default.
- W4313028649 title "Multi-Modal and Multi-Scale Fusion 3D Object Detection of 4D Radar and LiDAR for Autonomous Driving" @default.
- W4313028649 cites W2104071580 @default.
- W4313028649 cites W2555618208 @default.
- W4313028649 cites W2560609797 @default.
- W4313028649 cites W2564140372 @default.
- W4313028649 cites W2565639579 @default.
- W4313028649 cites W2741069557 @default.
- W4313028649 cites W2774756930 @default.
- W4313028649 cites W2798965597 @default.
- W4313028649 cites W2891649842 @default.
- W4313028649 cites W2897529137 @default.
- W4313028649 cites W2902920832 @default.
- W4313028649 cites W2911486422 @default.
- W4313028649 cites W2920468273 @default.
- W4313028649 cites W2937816158 @default.
- W4313028649 cites W2944551804 @default.
- W4313028649 cites W2949708697 @default.
- W4313028649 cites W2951517617 @default.
- W4313028649 cites W2954174912 @default.
- W4313028649 cites W2963120444 @default.
- W4313028649 cites W2963351448 @default.
- W4313028649 cites W2963400571 @default.
- W4313028649 cites W2963576229 @default.
- W4313028649 cites W2963727135 @default.
- W4313028649 cites W2967324759 @default.
- W4313028649 cites W2968296999 @default.
- W4313028649 cites W2970095196 @default.
- W4313028649 cites W2973595938 @default.
- W4313028649 cites W2974922121 @default.
- W4313028649 cites W2982632025 @default.
- W4313028649 cites W2989604896 @default.
- W4313028649 cites W2998614245 @default.
- W4313028649 cites W3004351857 @default.
- W4313028649 cites W3021632667 @default.
- W4313028649 cites W3032008066 @default.
- W4313028649 cites W3034314779 @default.
- W4313028649 cites W3034602892 @default.
- W4313028649 cites W3035461736 @default.
- W4313028649 cites W3081674976 @default.
- W4313028649 cites W3103179390 @default.
- W4313028649 cites W3103612787 @default.
- W4313028649 cites W3117804044 @default.
- W4313028649 cites W3118132944 @default.
- W4313028649 cites W3118341329 @default.
- W4313028649 cites W3118479569 @default.
- W4313028649 cites W3128655704 @default.
- W4313028649 cites W3167214600 @default.
- W4313028649 cites W3208941688 @default.
- W4313028649 cites W4282049157 @default.
- W4313028649 doi "https://doi.org/10.1109/tvt.2022.3230265" @default.
- W4313028649 hasPublicationYear "2023" @default.
- W4313028649 type Work @default.
- W4313028649 citedByCount "0" @default.
- W4313028649 crossrefType "journal-article" @default.
- W4313028649 hasAuthorship W4313028649A5001135373 @default.
- W4313028649 hasAuthorship W4313028649A5027835055 @default.
- W4313028649 hasAuthorship W4313028649A5031621739 @default.
- W4313028649 hasAuthorship W4313028649A5035406850 @default.
- W4313028649 hasAuthorship W4313028649A5045865179 @default.
- W4313028649 hasAuthorship W4313028649A5054021570 @default.
- W4313028649 hasAuthorship W4313028649A5070453047 @default.
- W4313028649 hasAuthorship W4313028649A5086082119 @default.
- W4313028649 hasAuthorship W4313028649A5086992948 @default.
- W4313028649 hasConcept C11413529 @default.
- W4313028649 hasConcept C121332964 @default.
- W4313028649 hasConcept C127313418 @default.
- W4313028649 hasConcept C154945302 @default.
- W4313028649 hasConcept C2778755073 @default.
- W4313028649 hasConcept C33954974 @default.
- W4313028649 hasConcept C34736171 @default.
- W4313028649 hasConcept C41008148 @default.
- W4313028649 hasConcept C51399673 @default.
- W4313028649 hasConcept C554190296 @default.
- W4313028649 hasConcept C62520636 @default.
- W4313028649 hasConcept C62649853 @default.
- W4313028649 hasConcept C76155785 @default.
- W4313028649 hasConceptScore W4313028649C11413529 @default.
- W4313028649 hasConceptScore W4313028649C121332964 @default.
- W4313028649 hasConceptScore W4313028649C127313418 @default.
- W4313028649 hasConceptScore W4313028649C154945302 @default.
- W4313028649 hasConceptScore W4313028649C2778755073 @default.
- W4313028649 hasConceptScore W4313028649C33954974 @default.
- W4313028649 hasConceptScore W4313028649C34736171 @default.
- W4313028649 hasConceptScore W4313028649C41008148 @default.
- W4313028649 hasConceptScore W4313028649C51399673 @default.