Matches in SemOpenAlex for { <https://semopenalex.org/work/W4382203442> ?p ?o ?g. }
Showing items 1 to 66 of 66, with 100 items per page.
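
The quad pattern in the header can be re-run against the public SemOpenAlex SPARQL endpoint (assumed here to be https://semopenalex.org/sparql). A minimal sketch, rendering the ?g variable of the pattern above through a standard GRAPH clause; this is illustrative, not necessarily the exact query the page executes:

    # List every predicate/object pair (and its named graph) for this work.
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W4382203442> ?p ?o .
      }
    }
    LIMIT 100
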
- W4382203442 endingPage "12" @default.
- W4382203442 startingPage "1" @default.
- W4382203442 abstract "Robot-human object handover has been extensively studied in recent years for a wide range of applications. However, it is still far from being as natural as human-human handovers, largely due to the robots’ limited sensing capabilities. Previous approaches in the literature typically simplify the handover scenarios, including one or more of (a) conducting handovers at fixed locations, (b) not adapting to human preferences, or (c) only focusing on single-arm handover with small objects due to the sensor occlusions caused by large objects. To advance the state of the art toward a human-human level of handover fluency, this paper investigates a bimanual handover scenario in a naturalistic, complex setup. Specifically, we target robot-to-human box transfer while the human partner is on a ladder, and ensure that the object is adaptively delivered based on human preferences. To address the occlusion problem that arises in a complex environment, we develop an onboard multi-sensor perception system for the bimanual robot, introduce a measurement confidence estimation technique, and propose an occlusion-resilient multi-sensor fusion technique by positioning visual perception sensors in distinct locations on the robot with different fields of view. In addition, we establish a Cartesian space controller with a quaternion approach and a leader-follower control structure for compliant motion. Four distinct experiments are conducted, covering different human preferences (such as the box delivered above or below the hands) and significant handover location changes once the process has begun. For validation, the proposed multi-sensor fusion technique was compared to a single-sensor approach for both top and bottom sensors separately, and to simple averaging of both sensors. 30 repetitions were performed for each experiment (four experiments, four methods), the equivalent of 480 handover repetitions in total. Multi-sensor fusion approach achieved a handover success rate above <inline-formula xmlns:mml=http://www.w3.org/1998/Math/MathML xmlns:xlink=http://www.w3.org/1999/xlink> <tex-math notation=LaTeX>$textbf{86.7%}$</tex-math> </inline-formula> for all experiments by successfully combining the strengths of both fields of view for human pose tracking under significant occlusions without sacrificing handover duration. In contrast, due to the occlusions, the single-sensor and simple averaging approaches completely failed during challenging experiments, illustrating the importance of multi-sensor fusion in complex handover scenarios. <italic xmlns:mml=http://www.w3.org/1998/Math/MathML xmlns:xlink=http://www.w3.org/1999/xlink>Note to Practitioners</i> —This paper is motivated by enabling naturalistic robot-to-human bimanual object handovers in complex environments, which is a challenging problem due to occlusions. Existing approaches in the literature do not benefit from multi-sensor fusion to handle occlusions, which is essential in such physical human-robot interaction scenarios. To this aim, we have developed a multi-sensor fusion technique to improve the perception capabilities of robots with respect to human co-workers. The developed framework has been tested with Microsoft Azure Kinect sensors and a bimanual mobile Baxter robot, but it can be adapted to any depth perception sensor and bimanual robotic platform. 
Furthermore, the introduced multi-sensor fusion technique is comprehensive and generic, as it can be applied to any intermittent sensor data, such as human pose tracking via RGBD sensors. The presented approach shows that increasing the field of view of robots‘ perception used with enhanced data fusion could drastically improve the robot‘s sensing capability. For future work, data fusion can be improved by introducing Bayesian filters, and the system can be validated with different sensors and robotic platforms. Moreover, the handover detection method of physical interaction could further benefit from the incorporation of force sensors." @default.
- W4382203442 created "2023-06-28" @default.
- W4382203442 creator A5018921347 @default.
- W4382203442 creator A5079395273 @default.
- W4382203442 date "2023-01-01" @default.
- W4382203442 modified "2023-09-25" @default.
- W4382203442 title "Naturalistic Robot-to-Human Bimanual Handover in Complex Environments Through Multi-Sensor Fusion" @default.
- W4382203442 doi "https://doi.org/10.1109/tase.2023.3284668" @default.
- W4382203442 hasPublicationYear "2023" @default.
- W4382203442 type Work @default.
- W4382203442 citedByCount "0" @default.
- W4382203442 crossrefType "journal-article" @default.
- W4382203442 hasAuthorship W4382203442A5018921347 @default.
- W4382203442 hasAuthorship W4382203442A5079395273 @default.
- W4382203442 hasConcept C107457646 @default.
- W4382203442 hasConcept C111852164 @default.
- W4382203442 hasConcept C111919701 @default.
- W4382203442 hasConcept C145460709 @default.
- W4382203442 hasConcept C154945302 @default.
- W4382203442 hasConcept C19966478 @default.
- W4382203442 hasConcept C203479927 @default.
- W4382203442 hasConcept C31258907 @default.
- W4382203442 hasConcept C31972630 @default.
- W4382203442 hasConcept C33954974 @default.
- W4382203442 hasConcept C41008148 @default.
- W4382203442 hasConcept C44154836 @default.
- W4382203442 hasConcept C6557445 @default.
- W4382203442 hasConcept C79403827 @default.
- W4382203442 hasConcept C86803240 @default.
- W4382203442 hasConcept C90509273 @default.
- W4382203442 hasConcept C98045186 @default.
- W4382203442 hasConceptScore W4382203442C107457646 @default.
- W4382203442 hasConceptScore W4382203442C111852164 @default.
- W4382203442 hasConceptScore W4382203442C111919701 @default.
- W4382203442 hasConceptScore W4382203442C145460709 @default.
- W4382203442 hasConceptScore W4382203442C154945302 @default.
- W4382203442 hasConceptScore W4382203442C19966478 @default.
- W4382203442 hasConceptScore W4382203442C203479927 @default.
- W4382203442 hasConceptScore W4382203442C31258907 @default.
- W4382203442 hasConceptScore W4382203442C31972630 @default.
- W4382203442 hasConceptScore W4382203442C33954974 @default.
- W4382203442 hasConceptScore W4382203442C41008148 @default.
- W4382203442 hasConceptScore W4382203442C44154836 @default.
- W4382203442 hasConceptScore W4382203442C6557445 @default.
- W4382203442 hasConceptScore W4382203442C79403827 @default.
- W4382203442 hasConceptScore W4382203442C86803240 @default.
- W4382203442 hasConceptScore W4382203442C90509273 @default.
- W4382203442 hasConceptScore W4382203442C98045186 @default.
- W4382203442 hasLocation W43822034421 @default.
- W4382203442 hasOpenAccess W4382203442 @default.
- W4382203442 hasPrimaryLocation W43822034421 @default.
- W4382203442 hasRelatedWork W1485735559 @default.
- W4382203442 hasRelatedWork W2020522033 @default.
- W4382203442 hasRelatedWork W2072060867 @default.
- W4382203442 hasRelatedWork W2079532573 @default.
- W4382203442 hasRelatedWork W2131998556 @default.
- W4382203442 hasRelatedWork W2143937365 @default.
- W4382203442 hasRelatedWork W2292491037 @default.
- W4382203442 hasRelatedWork W2361962498 @default.
- W4382203442 hasRelatedWork W2764092873 @default.
- W4382203442 hasRelatedWork W4312750324 @default.
- W4382203442 isParatext "false" @default.
- W4382203442 isRetracted "false" @default.
- W4382203442 workType "article" @default.
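
As a usage sketch, the hasRelatedWork links above can be followed to fetch the titles of the ten related works. The soa: ontology namespace and the dcterms:title predicate below are assumptions about the SemOpenAlex schema, not facts taken from this listing:

    # Fetch titles of works related to W4382203442 (assumed predicates).
    PREFIX soa:     <https://semopenalex.org/ontology/>
    PREFIX dcterms: <http://purl.org/dc/terms/>
    SELECT ?related ?title
    WHERE {
      <https://semopenalex.org/work/W4382203442> soa:hasRelatedWork ?related .
      ?related dcterms:title ?title .
    }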