Matches in SemOpenAlex for { <https://semopenalex.org/work/W2978088897> ?p ?o ?g. }
Showing items 1 to 93 of 93 with 100 items per page.
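The listing below can be reproduced programmatically against SemOpenAlex. The following is a minimal sketch, assuming the public SPARQL endpoint URL https://semopenalex.org/sparql and the standard SPARQL JSON results format (neither is confirmed by this listing); it queries the same subject IRI as the pattern above but omits the named-graph variable ?g for brevity.

```python
import requests

# Same subject as the match pattern above; ?g (named graph) is dropped for brevity.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W2978088897> ?p ?o .
}
"""

resp = requests.get(
    "https://semopenalex.org/sparql",   # assumed SemOpenAlex SPARQL endpoint
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each predicate/object pair, mirroring the triple listing below.
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```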
- W2978088897 endingPage "145613" @default.
- W2978088897 startingPage "145604" @default.
- W2978088897 abstract "Learning a task such as pushing something, where the constraints of both position and force have to be satisfied, is usually difficult for a collaborative robot. In this work, we propose a multimodal teaching-by-demonstration system that enables the robot to perform this kind of task. The basic idea is to transfer the adaptation of multimodal information from a human tutor to the robot by taking into account multiple sensor signals (i.e., motion trajectories, stiffness, and force profiles). The human tutor's stiffness is estimated from limb surface electromyography (EMG) signals obtained during the demonstration phase. The force profiles in Cartesian space are collected from a force/torque sensor mounted between the robot endpoint and the tool. Subsequently, a hidden semi-Markov model (HSMM) is used to encode the multiple signals in a unified manner. The correlations between position and the other three control variables (i.e., velocity, stiffness, and force) are encoded with separate HSMM models. Based on the estimated HSMM parameters, Gaussian mixture regression (GMR) is then used to generate the expected control variables. The learned variables are further mapped into a joint-space impedance controller through inverse kinematics for the reproduction of the task. Comparative tests have been conducted to verify the effectiveness of our approach on a Baxter robot." @default.
- W2978088897 created "2019-10-10" @default.
- W2978088897 creator A5006247366 @default.
- W2978088897 creator A5015243405 @default.
- W2978088897 creator A5030956545 @default.
- W2978088897 creator A5066117237 @default.
- W2978088897 date "2019-01-01" @default.
- W2978088897 modified "2023-10-03" @default.
- W2978088897 title "Encoding Multiple Sensor Data for Robotic Learning Skills From Multimodal Demonstration" @default.
- W2978088897 cites W133766508 @default.
- W2978088897 cites W1517823811 @default.
- W2978088897 cites W1969976050 @default.
- W2978088897 cites W1986024064 @default.
- W2978088897 cites W2003421285 @default.
- W2978088897 cites W2068923975 @default.
- W2978088897 cites W2080487795 @default.
- W2978088897 cites W2100993276 @default.
- W2978088897 cites W2111528514 @default.
- W2978088897 cites W2133932631 @default.
- W2978088897 cites W2146121659 @default.
- W2978088897 cites W2154543878 @default.
- W2978088897 cites W2197436471 @default.
- W2978088897 cites W2211544237 @default.
- W2978088897 cites W2300976231 @default.
- W2978088897 cites W2331138946 @default.
- W2978088897 cites W2471216432 @default.
- W2978088897 cites W2500624988 @default.
- W2978088897 cites W2562106483 @default.
- W2978088897 cites W2591999327 @default.
- W2978088897 cites W2722424650 @default.
- W2978088897 cites W2736573057 @default.
- W2978088897 cites W2744687549 @default.
- W2978088897 cites W2754133402 @default.
- W2978088897 cites W2766555673 @default.
- W2978088897 cites W2767508918 @default.
- W2978088897 cites W2791542133 @default.
- W2978088897 cites W2795550549 @default.
- W2978088897 cites W2796864868 @default.
- W2978088897 cites W2797679738 @default.
- W2978088897 cites W2801098918 @default.
- W2978088897 cites W4211008118 @default.
- W2978088897 doi "https://doi.org/10.1109/access.2019.2945484" @default.
- W2978088897 hasPublicationYear "2019" @default.
- W2978088897 type Work @default.
- W2978088897 sameAs 2978088897 @default.
- W2978088897 citedByCount "18" @default.
- W2978088897 countsByYear W29780888972020 @default.
- W2978088897 countsByYear W29780888972021 @default.
- W2978088897 countsByYear W29780888972022 @default.
- W2978088897 countsByYear W29780888972023 @default.
- W2978088897 crossrefType "journal-article" @default.
- W2978088897 hasAuthorship W2978088897A5006247366 @default.
- W2978088897 hasAuthorship W2978088897A5015243405 @default.
- W2978088897 hasAuthorship W2978088897A5030956545 @default.
- W2978088897 hasAuthorship W2978088897A5066117237 @default.
- W2978088897 hasBestOaLocation W29780888971 @default.
- W2978088897 hasConcept C154945302 @default.
- W2978088897 hasConcept C17816587 @default.
- W2978088897 hasConcept C23224414 @default.
- W2978088897 hasConcept C31972630 @default.
- W2978088897 hasConcept C41008148 @default.
- W2978088897 hasConcept C44154836 @default.
- W2978088897 hasConcept C90509273 @default.
- W2978088897 hasConceptScore W2978088897C154945302 @default.
- W2978088897 hasConceptScore W2978088897C17816587 @default.
- W2978088897 hasConceptScore W2978088897C23224414 @default.
- W2978088897 hasConceptScore W2978088897C31972630 @default.
- W2978088897 hasConceptScore W2978088897C41008148 @default.
- W2978088897 hasConceptScore W2978088897C44154836 @default.
- W2978088897 hasConceptScore W2978088897C90509273 @default.
- W2978088897 hasFunder F4320334627 @default.
- W2978088897 hasLocation W29780888971 @default.
- W2978088897 hasLocation W29780888972 @default.
- W2978088897 hasLocation W29780888973 @default.
- W2978088897 hasOpenAccess W2978088897 @default.
- W2978088897 hasPrimaryLocation W29780888971 @default.
- W2978088897 hasRelatedWork W1891287906 @default.
- W2978088897 hasRelatedWork W1969923398 @default.
- W2978088897 hasRelatedWork W2036807459 @default.
- W2978088897 hasRelatedWork W2166024367 @default.
- W2978088897 hasRelatedWork W2229312674 @default.
- W2978088897 hasRelatedWork W2392878237 @default.
- W2978088897 hasRelatedWork W2755342338 @default.
- W2978088897 hasRelatedWork W2772917594 @default.
- W2978088897 hasRelatedWork W2775347418 @default.
- W2978088897 hasRelatedWork W3116076068 @default.
- W2978088897 hasVolume "7" @default.
- W2978088897 isParatext "false" @default.
- W2978088897 isRetracted "false" @default.
- W2978088897 magId "2978088897" @default.
- W2978088897 workType "article" @default.
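The abstract above outlines a pipeline in which multimodal demonstration signals are encoded with an HSMM and the expected control variables (velocity, stiffness, force) are then generated by Gaussian mixture regression (GMR) conditioned on position. The sketch below illustrates only the generic GMR step: given mixture priors, means, and covariances over a joint [input, output] vector (as would be recovered from a GMM/HSMM encoding), it returns the conditional expectation of the output for a query input. All numbers are illustrative placeholders, not parameters from the paper.

```python
import numpy as np

def gmr(x, priors, means, covs, in_idx, out_idx):
    """Conditional expectation E[y | x] under a Gaussian mixture over [x; y].

    priors: (K,) mixture weights
    means:  (K, D) component means over the joint [input, output] vector
    covs:   (K, D, D) component covariances
    in_idx, out_idx: index lists selecting the input/output dimensions
    """
    K = len(priors)
    x = np.atleast_1d(x)
    h = np.zeros(K)
    y_k = np.zeros((K, len(out_idx)))
    for k in range(K):
        mu_x = means[k][in_idx]
        mu_y = means[k][out_idx]
        S_xx = covs[k][np.ix_(in_idx, in_idx)]
        S_yx = covs[k][np.ix_(out_idx, in_idx)]
        S_xx_inv = np.linalg.inv(S_xx)
        diff = x - mu_x
        # Responsibility of component k for the query input x.
        norm = np.sqrt((2 * np.pi) ** len(in_idx) * np.linalg.det(S_xx))
        h[k] = priors[k] * np.exp(-0.5 * diff @ S_xx_inv @ diff) / norm
        # Conditional mean of the output dimensions for component k.
        y_k[k] = mu_y + S_yx @ S_xx_inv @ diff
    h /= h.sum()
    return h @ y_k  # responsibility-weighted conditional mean

# Example: 1-D position input -> 1-D stiffness output, two placeholder components.
priors = np.array([0.5, 0.5])
means = np.array([[0.0, 100.0], [1.0, 300.0]])
covs = np.array([[[0.05, 0.0], [0.0, 10.0]],
                 [[0.05, 0.0], [0.0, 10.0]]])
print(gmr(0.7, priors, means, covs, in_idx=[0], out_idx=[1]))
```

In the paper's setting, the input dimensions would be the position and the outputs the remaining control variables (velocity, stiffness, force); the regressed profiles would then drive the joint-space impedance controller mentioned in the abstract.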