Matches in SemOpenAlex for { <https://semopenalex.org/work/W2329142159> ?p ?o ?g. }
- W2329142159 abstract "This paper describes methods for three different sensor driven guidance systems that have been developed, implemented, and tested. They represent advanced capabilities for utilizing onboard sensors, in this case vision systems, to drive guidance policies to steer an aircraft. The first includes utilizing an onboard camera image to enable a helicopter to automatically follow a moving ground vehicle. The second involves enabling an aircraft to automatically follow another aircraft. The third is to enable a helicopter to precisely place a ground robot in a window. In all three cases there is no direct communication between the target and the guidance system. In all cases all information used to guide the aircraft is obtained from onboard sensors. Common lessons are discussed from these very different applications of sensor-driven guidance systems for unmanned vehicles. I. Introduction THIS paper describes methods for three different sensor driven guidance systems that have been developed, implemented, and flight tested. They represent advanced capabilities for utilizing onboard sensors, in this case vision systems, to drive guidance policies to steer an aircraft to perform a useful mission. The use of onboard sensors to guide an aircraft – to choose its path – is an important emerging capability for unmanned aircraft. Contemporary unmanned aircraft are typically guided either in real time by a human operator or are flown autonomously with a prescribed flight path, such as waypoints provided before the flight or updated occasionally during a flight by human operators. In either case, a human operator is essentially determining the path flown. For onboard sensors to be utilized to generate a path, new issues arise. These are issues that are familiar to the developers of, for example, guided missiles: Ensuring the integrity of the onboard sensor information, generating estimates of the state of the environment, determining feasible/safe flight paths to take, and finding the appropriate (but different) role for human operators. In addition to these design related issues, one must also realize that systems that rely on generating their desired flight path from real sensor data are inherently more difficult to develop and test: Just because the elements of the system work individually does not ensure the entire closed system will work effectively. In this paper, three different systems that have been developed and tested are briefly described. Two of them have been reported previously, and one is reported here for the first time. They are presented together to highlight the common challenges and to provide useful information for developing sensor driven guidance systems in general. The first system described includes utilizing an onboard camera image to enable a helicopter to automatically follow a moving ground vehicle. The second involves enabling an aircraft to automatically follow another aircraft. The third is to enable a helicopter to precisely place a slung load on a target in order to deliver a ground robot into a window. In all three cases there is no direct communication between the target and the guidance system. In all cases all information used to guide the aircraft is obtained from onboard sensors. The conclusions that follow include common lessons from all three." @default.
- W2329142159 created "2016-06-24" @default.
- W2329142159 creator A5081353608 @default.
- W2329142159 date "2009-04-06" @default.
- W2329142159 modified "2023-10-05" @default.
- W2329142159 title "Sensor Driven Guidance Systems for Unmanned Vehicles" @default.
- W2329142159 cites W2026487592 @default.
- W2329142159 cites W2037623489 @default.
- W2329142159 cites W2067427595 @default.
- W2329142159 cites W2145456734 @default.
- W2329142159 doi "https://doi.org/10.2514/6.2009-1894" @default.
- W2329142159 hasPublicationYear "2009" @default.
- W2329142159 type Work @default.
- W2329142159 sameAs 2329142159 @default.
- W2329142159 citedByCount "0" @default.
- W2329142159 crossrefType "proceedings-article" @default.
- W2329142159 hasAuthorship W2329142159A5081353608 @default.
- W2329142159 hasConcept C127413603 @default.
- W2329142159 hasConcept C145424490 @default.
- W2329142159 hasConcept C146978453 @default.
- W2329142159 hasConcept C154945302 @default.
- W2329142159 hasConcept C178802073 @default.
- W2329142159 hasConcept C19966478 @default.
- W2329142159 hasConcept C41008148 @default.
- W2329142159 hasConcept C90509273 @default.
- W2329142159 hasConceptScore W2329142159C127413603 @default.
- W2329142159 hasConceptScore W2329142159C145424490 @default.
- W2329142159 hasConceptScore W2329142159C146978453 @default.
- W2329142159 hasConceptScore W2329142159C154945302 @default.
- W2329142159 hasConceptScore W2329142159C178802073 @default.
- W2329142159 hasConceptScore W2329142159C19966478 @default.
- W2329142159 hasConceptScore W2329142159C41008148 @default.
- W2329142159 hasConceptScore W2329142159C90509273 @default.
- W2329142159 hasLocation W23291421591 @default.
- W2329142159 hasOpenAccess W2329142159 @default.
- W2329142159 hasPrimaryLocation W23291421591 @default.
- W2329142159 hasRelatedWork W1440918713 @default.
- W2329142159 hasRelatedWork W2045929236 @default.
- W2329142159 hasRelatedWork W2089134126 @default.
- W2329142159 hasRelatedWork W2114213204 @default.
- W2329142159 hasRelatedWork W2805349488 @default.
- W2329142159 hasRelatedWork W2887130920 @default.
- W2329142159 hasRelatedWork W2990912121 @default.
- W2329142159 hasRelatedWork W4232828791 @default.
- W2329142159 hasRelatedWork W4252062074 @default.
- W2329142159 hasRelatedWork W804007918 @default.
- W2329142159 isParatext "false" @default.
- W2329142159 isRetracted "false" @default.
- W2329142159 magId "2329142159" @default.
- W2329142159 workType "article" @default.
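The listing above is what a linked-data browser returns for the quad pattern in the query header. As a minimal sketch, the same triples can be fetched programmatically; note that the endpoint URL https://semopenalex.org/sparql, the JSON results handling, and the fetch_triples helper are illustrative assumptions, not details confirmed by the listing itself.

```python
# Sketch: fetch all (predicate, object, graph) bindings for W2329142159
# from a SemOpenAlex SPARQL endpoint. Endpoint URL is an assumption.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

# SPARQL 1.1 form of the { <work> ?p ?o ?g } quad pattern in the header.
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2329142159> ?p ?o .
  }
}
"""

def fetch_triples():
    # Request SPARQL JSON results; most endpoints honor this Accept header.
    response = requests.get(
        ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]["bindings"]

if __name__ == "__main__":
    for row in fetch_triples():
        # Each binding carries the predicate, object, and named graph,
        # mirroring the one-triple-per-line layout of the listing above.
        print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```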