Matches in SemOpenAlex for { <https://semopenalex.org/work/W4308153125> ?p ?o ?g. }
- W4308153125 endingPage "103081" @default.
- W4308153125 startingPage "103081" @default.
- W4308153125 abstract "To generate a complete 3D point cloud model of a complex urban environment, data must be captured from both top and street views because of the limited field of view of the sensors. Unmanned aerial vehicle laser scanning (UAV-LS) point clouds are captured from a top view and mainly contain information about building roofs and tree crowns. In contrast, terrestrial point clouds, collected by terrestrial laser scanners (TLS) and mobile laser scanners (MLS), are captured from a street view and mainly contain information about building facades and tree stems. These two types of point clouds are challenging to register without manual effort or target-based assistance because of (1) the very limited overlap caused by the cross-view geometry and (2) the large search region required to automatically localize a terrestrial scan within a large-scale UAV scan. We therefore show that a semantic map built from invariant semantic and geometric relationships can automatically localize a local scan within a global scan. In our pipeline, building and tree regions are detected, labeled as semantic objects, and used to generate a 2.5D semantic map anchored to their actual locations. We propose a novel template-matching strategy combined with a penalty system to realize the localization: templates are produced from the obtained semantic maps, and the best localization is found by a graded penalty method based on the developed rules. Owing to the stability and continuity of the semantic information, the proposed method can register both multiple pre-aligned scans and even a single terrestrial scan to a UAV scan. In the experiments, we used a UAV to capture lidar point clouds over the campus of Southwest Jiaotong University and generated a global semantic map offline.
With the global map in hand, we collected street-view data on the same campus with TLS and MLS and then localized the TLS/MLS point clouds to the UAV point clouds. The evaluations are divided into two groups by degree of difficulty: (1) the regular group, which contains the multiple-registered TLS scans and the MLS scans, and (2) the challenging group, which contains independent TLS scans. Overall, the proposed method achieves a localization success rate of 100% in the regular group and 90% in the challenging group, meaning that only one in ten tests fails in the challenging group." @default.
- W4308153125 created "2022-11-08" @default.
- W4308153125 creator A5019636608 @default.
- W4308153125 creator A5033732336 @default.
- W4308153125 creator A5041512805 @default.
- W4308153125 creator A5061975305 @default.
- W4308153125 creator A5075782732 @default.
- W4308153125 creator A5079693039 @default.
- W4308153125 creator A5091049278 @default.
- W4308153125 date "2022-11-01" @default.
- W4308153125 modified "2023-09-26" @default.
- W4308153125 title "Semantic maps for cross-view relocalization of terrestrial to UAV point clouds" @default.
- W4308153125 cites W1520282253 @default.
- W4308153125 cites W1967186435 @default.
- W4308153125 cites W1973146673 @default.
- W4308153125 cites W1989625560 @default.
- W4308153125 cites W2025312932 @default.
- W4308153125 cites W2083681515 @default.
- W4308153125 cites W2151103935 @default.
- W4308153125 cites W2152864241 @default.
- W4308153125 cites W2160821342 @default.
- W4308153125 cites W2194206407 @default.
- W4308153125 cites W2436494909 @default.
- W4308153125 cites W2566265240 @default.
- W4308153125 cites W2610213140 @default.
- W4308153125 cites W2733476998 @default.
- W4308153125 cites W2735595030 @default.
- W4308153125 cites W2755780464 @default.
- W4308153125 cites W2781162456 @default.
- W4308153125 cites W2803591142 @default.
- W4308153125 cites W2804872164 @default.
- W4308153125 cites W2883357174 @default.
- W4308153125 cites W2905100780 @default.
- W4308153125 cites W2910489334 @default.
- W4308153125 cites W2923270926 @default.
- W4308153125 cites W2962998962 @default.
- W4308153125 cites W2965120006 @default.
- W4308153125 cites W2981995220 @default.
- W4308153125 cites W3004201182 @default.
- W4308153125 cites W3015072278 @default.
- W4308153125 cites W3031727955 @default.
- W4308153125 cites W3044144453 @default.
- W4308153125 cites W3104408604 @default.
- W4308153125 cites W3120967516 @default.
- W4308153125 cites W3143836099 @default.
- W4308153125 cites W3167684723 @default.
- W4308153125 cites W3185344423 @default.
- W4308153125 cites W4205104530 @default.
- W4308153125 cites W4237162356 @default.
- W4308153125 doi "https://doi.org/10.1016/j.jag.2022.103081" @default.
- W4308153125 hasPublicationYear "2022" @default.
- W4308153125 type Work @default.
- W4308153125 citedByCount "0" @default.
- W4308153125 crossrefType "journal-article" @default.
- W4308153125 hasAuthorship W4308153125A5019636608 @default.
- W4308153125 hasAuthorship W4308153125A5033732336 @default.
- W4308153125 hasAuthorship W4308153125A5041512805 @default.
- W4308153125 hasAuthorship W4308153125A5061975305 @default.
- W4308153125 hasAuthorship W4308153125A5075782732 @default.
- W4308153125 hasAuthorship W4308153125A5079693039 @default.
- W4308153125 hasAuthorship W4308153125A5091049278 @default.
- W4308153125 hasBestOaLocation W43081531251 @default.
- W4308153125 hasConcept C105795698 @default.
- W4308153125 hasConcept C113174947 @default.
- W4308153125 hasConcept C131979681 @default.
- W4308153125 hasConcept C134306372 @default.
- W4308153125 hasConcept C154945302 @default.
- W4308153125 hasConcept C165064840 @default.
- W4308153125 hasConcept C195958017 @default.
- W4308153125 hasConcept C205649164 @default.
- W4308153125 hasConcept C2524010 @default.
- W4308153125 hasConcept C28719098 @default.
- W4308153125 hasConcept C31972630 @default.
- W4308153125 hasConcept C33923547 @default.
- W4308153125 hasConcept C41008148 @default.
- W4308153125 hasConcept C51399673 @default.
- W4308153125 hasConcept C62649853 @default.
- W4308153125 hasConceptScore W4308153125C105795698 @default.
- W4308153125 hasConceptScore W4308153125C113174947 @default.
- W4308153125 hasConceptScore W4308153125C131979681 @default.
- W4308153125 hasConceptScore W4308153125C134306372 @default.
- W4308153125 hasConceptScore W4308153125C154945302 @default.
- W4308153125 hasConceptScore W4308153125C165064840 @default.
- W4308153125 hasConceptScore W4308153125C195958017 @default.
- W4308153125 hasConceptScore W4308153125C205649164 @default.
- W4308153125 hasConceptScore W4308153125C2524010 @default.
- W4308153125 hasConceptScore W4308153125C28719098 @default.
- W4308153125 hasConceptScore W4308153125C31972630 @default.
- W4308153125 hasConceptScore W4308153125C33923547 @default.
- W4308153125 hasConceptScore W4308153125C41008148 @default.
- W4308153125 hasConceptScore W4308153125C51399673 @default.
- W4308153125 hasConceptScore W4308153125C62649853 @default.
- W4308153125 hasLocation W43081531251 @default.
- W4308153125 hasLocation W43081531252 @default.
- W4308153125 hasOpenAccess W4308153125 @default.
- W4308153125 hasPrimaryLocation W43081531251 @default.
- W4308153125 hasRelatedWork W2194160504 @default.
- W4308153125 hasRelatedWork W2335177719 @default.
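The abstract above describes localizing a terrestrial scan by sliding a template derived from its 2.5D semantic map over the global UAV-derived map and scoring placements with a graded penalty system. The paper's actual rules, map resolution, and penalty grades are not given in this record, so the following is only a minimal illustrative sketch, assuming a hypothetical integer-grid encoding of the two semantic classes (buildings and trees); the names `match_score`, `localize`, and the penalty weight are invented for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical semantic labels for the 2.5D grid cells.
# The paper detects buildings and trees as semantic objects.
EMPTY, BUILDING, TREE = 0, 1, 2

def match_score(global_map, template, row, col, penalty=2.0):
    """Score one template placement: +1 for each cell where both maps
    carry the same semantic label, minus `penalty` for each cell where
    both carry labels that conflict (a crude stand-in for the paper's
    graded punishment rules). Empty cells are ignored, reflecting the
    limited cross-view overlap."""
    h, w = template.shape
    window = global_map[row:row + h, col:col + w]
    both = (template != EMPTY) & (window != EMPTY)
    score = np.sum(both & (template == window))
    score -= penalty * np.sum(both & (template != window))
    return score

def localize(global_map, template, penalty=2.0):
    """Exhaustively slide the template over the global semantic map
    and return the best-scoring position and its score."""
    H, W = global_map.shape
    h, w = template.shape
    best_score, best_pos = -np.inf, None
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            s = match_score(global_map, template, r, c, penalty)
            if s > best_score:
                best_score, best_pos = s, (r, c)
    return best_pos, best_score
```

A real implementation would work on rasterized point-cloud footprints at a chosen grid resolution and would likely also search over template rotations; the exhaustive scan here is only to make the matching-plus-penalty idea concrete.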