Matches in SemOpenAlex for { <https://semopenalex.org/work/W2562309948> ?p ?o ?g. }
Showing items 1 to 89 of 89, with 100 items per page.
- W2562309948 abstract "Precise, robust, and consistent localization is an important subject in many areas of science such as vision-based control, path planning, and simultaneous localization and mapping (SLAM). To estimate the pose of a platform, sensors such as inertial measurement units (IMUs), global positioning system (GPS), and cameras are commonly employed. Each of these sensors has its strengths and weaknesses. Sensor fusion is a known approach that combines the data measured by different sensors to achieve a more accurate or complete pose estimation and to cope with sensor outages. In this paper, a three-dimensional (3D) pose estimation algorithm is presented for an unmanned aerial vehicle (UAV) in an unknown GPS-denied environment. A UAV can be fully localized by three position coordinates and three orientation angles. The proposed algorithm fuses the data from an IMU, a camera, and a two-dimensional (2D) light detection and ranging (LiDAR) using an extended Kalman filter (EKF) to achieve accurate localization. Among the employed sensors, LiDAR has not received proper attention in the past, mostly because a 2D LiDAR can only provide pose estimation in its scanning plane, and thus, it cannot obtain a full pose estimation in a 3D environment. A novel method is introduced in this paper that employs a 2D LiDAR to improve the full 3D pose estimation accuracy acquired from an IMU and a camera, and it is shown that this method can significantly improve the precision of the localization algorithm. The proposed approach is evaluated and justified by simulation and real-world experiments." @default.
- W2562309948 created "2017-01-06" @default.
- W2562309948 creator A5059916900 @default.
- W2562309948 creator A5060923133 @default.
- W2562309948 creator A5083681053 @default.
- W2562309948 date "2017-04-18" @default.
- W2562309948 modified "2023-09-24" @default.
- W2562309948 title "Heterogeneous Multisensor Fusion for Mobile Platform Three-Dimensional Pose Estimation" @default.
- W2562309948 cites W1964907491 @default.
- W2562309948 cites W1969792502 @default.
- W2562309948 cites W1987142562 @default.
- W2562309948 cites W2032160168 @default.
- W2562309948 cites W2033908922 @default.
- W2562309948 cites W2042154170 @default.
- W2562309948 cites W2042850147 @default.
- W2562309948 cites W2055069594 @default.
- W2562309948 cites W2056298239 @default.
- W2562309948 cites W2077013278 @default.
- W2562309948 cites W2098860918 @default.
- W2562309948 cites W2100494739 @default.
- W2562309948 cites W2114125542 @default.
- W2562309948 cites W2128739770 @default.
- W2562309948 cites W2140924050 @default.
- W2562309948 cites W2147536420 @default.
- W2562309948 cites W2161717870 @default.
- W2562309948 cites W2166429996 @default.
- W2562309948 cites W2167828580 @default.
- W2562309948 cites W3083706479 @default.
- W2562309948 doi "https://doi.org/10.1115/1.4035452" @default.
- W2562309948 hasPublicationYear "2017" @default.
- W2562309948 type Work @default.
- W2562309948 sameAs 2562309948 @default.
- W2562309948 citedByCount "2" @default.
- W2562309948 countsByYear W25623099482016 @default.
- W2562309948 countsByYear W25623099482020 @default.
- W2562309948 crossrefType "journal-article" @default.
- W2562309948 hasAuthorship W2562309948A5059916900 @default.
- W2562309948 hasAuthorship W2562309948A5060923133 @default.
- W2562309948 hasAuthorship W2562309948A5083681053 @default.
- W2562309948 hasConcept C127413603 @default.
- W2562309948 hasConcept C138885662 @default.
- W2562309948 hasConcept C154945302 @default.
- W2562309948 hasConcept C158525013 @default.
- W2562309948 hasConcept C201995342 @default.
- W2562309948 hasConcept C31972630 @default.
- W2562309948 hasConcept C33954974 @default.
- W2562309948 hasConcept C41008148 @default.
- W2562309948 hasConcept C41895202 @default.
- W2562309948 hasConcept C52102323 @default.
- W2562309948 hasConcept C96250715 @default.
- W2562309948 hasConceptScore W2562309948C127413603 @default.
- W2562309948 hasConceptScore W2562309948C138885662 @default.
- W2562309948 hasConceptScore W2562309948C154945302 @default.
- W2562309948 hasConceptScore W2562309948C158525013 @default.
- W2562309948 hasConceptScore W2562309948C201995342 @default.
- W2562309948 hasConceptScore W2562309948C31972630 @default.
- W2562309948 hasConceptScore W2562309948C33954974 @default.
- W2562309948 hasConceptScore W2562309948C41008148 @default.
- W2562309948 hasConceptScore W2562309948C41895202 @default.
- W2562309948 hasConceptScore W2562309948C52102323 @default.
- W2562309948 hasConceptScore W2562309948C96250715 @default.
- W2562309948 hasFunder F4320308157 @default.
- W2562309948 hasLocation W25623099481 @default.
- W2562309948 hasOpenAccess W2562309948 @default.
- W2562309948 hasPrimaryLocation W25623099481 @default.
- W2562309948 hasRelatedWork W1558339915 @default.
- W2562309948 hasRelatedWork W1599353667 @default.
- W2562309948 hasRelatedWork W1634692259 @default.
- W2562309948 hasRelatedWork W1866371250 @default.
- W2562309948 hasRelatedWork W2018364578 @default.
- W2562309948 hasRelatedWork W2087515342 @default.
- W2562309948 hasRelatedWork W2111830917 @default.
- W2562309948 hasRelatedWork W2123925325 @default.
- W2562309948 hasRelatedWork W2124453535 @default.
- W2562309948 hasRelatedWork W2128682470 @default.
- W2562309948 hasRelatedWork W2150409724 @default.
- W2562309948 hasRelatedWork W2154507130 @default.
- W2562309948 hasRelatedWork W2543931952 @default.
- W2562309948 hasRelatedWork W2571773278 @default.
- W2562309948 hasRelatedWork W2783929347 @default.
- W2562309948 hasRelatedWork W2898050462 @default.
- W2562309948 hasRelatedWork W3009781618 @default.
- W2562309948 hasRelatedWork W3016802079 @default.
- W2562309948 hasRelatedWork W3100828464 @default.
- W2562309948 hasRelatedWork W3151914659 @default.
- W2562309948 isParatext "false" @default.
- W2562309948 isRetracted "false" @default.
- W2562309948 magId "2562309948" @default.
- W2562309948 workType "article" @default.
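The listing above is the result of matching `{ <https://semopenalex.org/work/W2562309948> ?p ?o ?g. }` against SemOpenAlex. A minimal sketch of how such a query might be issued programmatically, using only the Python standard library — the endpoint URL `https://semopenalex.org/sparql` is an assumption and may need adjusting:

```python
import json
import urllib.parse
import urllib.request

# SemOpenAlex work whose triples are listed above.
WORK_URI = "https://semopenalex.org/work/W2562309948"

# Assumed public SPARQL endpoint; adjust if it differs.
ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_uri: str) -> str:
    """SELECT form of the match pattern shown in the heading."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str, endpoint: str = ENDPOINT):
    """Send the query over HTTP GET and return the JSON result bindings."""
    params = urllib.parse.urlencode({"query": build_query(work_uri)})
    req = urllib.request.Request(
        f"{endpoint}?{params}",
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Print each predicate/object pair, mirroring the listing above.
    for row in fetch_triples(WORK_URI):
        print(row["p"]["value"], row["o"]["value"])
```

The query is sent as an HTTP GET with an `Accept: application/sparql-results+json` header, the standard SPARQL Protocol way to request JSON results; the network call runs only when the script is executed directly.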