Matches in SemOpenAlex for { <https://semopenalex.org/work/W2784033568> ?p ?o ?g. }
Showing items 1 to 73 of 73, with 100 items per page.
- W2784033568 abstract "Industrial robots are delivering more and more manipulation services in manufacturing. However, when the task is complex, it is difficult to programme a robot to fulfil all the requirements, because even a relatively simple task such as a peg-in-hole insertion contains many uncertainties, e.g. clearance, initial grasping position and insertion path. Humans, on the other hand, can deal with these variations using their vision and haptic feedback. Although humans can adapt to uncertainties easily, most of the time the skill-based performances that relate to their tacit knowledge cannot be easily articulated. Even though an automation solution need not fully imitate human motion, since some motions are unnecessary, it would be useful if the skill-based performance of a human could first be interpreted and modelled, allowing it then to be transferred to the robot. This thesis aims to reduce robot programming effort significantly by developing a methodology to capture, model and transfer manual manufacturing skills from a human demonstrator to the robot. Recently, Learning from Demonstration (LfD) has been gaining interest as a framework for transferring skills from a human teacher to a robot, using probabilistic encoding approaches to model observations and state-transition uncertainties. In close- or actual-contact manipulation tasks, it is difficult to reliably record state-action examples without interfering with the human's senses and activities. Therefore, wearable sensors are investigated as a promising means of recording state-action examples without restricting the human experts during the skilled execution of their tasks. Firstly, to track human motions accurately and reliably in a defined three-dimensional workspace, a hybrid system of Vicon and IMUs is proposed to compensate for the known limitations of each individual system. The data-fusion method was able to overcome the occlusion and frame-flipping problems of the two-camera Vicon setup and the drift problem associated with the IMUs. The results indicated that the occlusion and frame-flipping problems associated with Vicon can be mitigated by using the IMU measurements. Furthermore, the proposed method improves Mean Square Error (MSE) tracking accuracy by between 0.8° and 6.4° compared with the IMU-only method. Secondly, to record haptic feedback from a teacher without physically obstructing their interactions with the workpiece, wearable surface electromyography (sEMG) armbands were used as an indirect method of indicating contact feedback during manual manipulations. A muscle-force model using a Time Delayed Neural Network (TDNN) was built to map the sEMG signals to the known contact force. The results indicated that the model was capable of estimating the force from the sEMG armbands in the applications of interest, namely peg-in-hole and beater-winding tasks, with MSEs of 2.75 N and 0.18 N respectively. Finally, given the force estimates and the motion trajectories, a Hidden Markov Model (HMM) based approach was utilised as a state-recognition method to encode and generalise the spatial and temporal information of the skilled executions. This method allows a more representative control policy to be derived. A modified Gaussian Mixture Regression (GMR) method was then applied to enable motion reproduction using the learned state-action policy. To simplify the validation procedure, instead of using the robot, additional demonstrations from the teacher were used to verify the reproduction performance of the policy, by assuming that the human teacher and the robot learner are physically identical systems. The results confirmed the generalisation capability of the HMM model across a number of demonstrations from different subjects, and the motions reproduced by GMR were acceptable in these additional tests. The proposed methodology provides a framework for producing a state-action model from skilled demonstrations that can be translated into robot kinematics and joint states for the robot to execute. The implication for industry is reduced effort and time in programming robots for applications where skilled human performance is required to cope robustly with various uncertainties during task execution." @default. (Illustrative sketches of the Vicon–IMU fusion, sEMG force model, and HMM/GMR steps described in this abstract appear after the listing below.)
- W2784033568 created "2018-01-26" @default.
- W2784033568 creator A5087513337 @default.
- W2784033568 date "2017-01-01" @default.
- W2784033568 modified "2023-09-26" @default.
- W2784033568 title "Human skill capturing and modelling using wearable devices" @default.
- W2784033568 hasPublicationYear "2017" @default.
- W2784033568 type Work @default.
- W2784033568 sameAs 2784033568 @default.
- W2784033568 citedByCount "0" @default.
- W2784033568 crossrefType "dissertation" @default.
- W2784033568 hasAuthorship W2784033568A5087513337 @default.
- W2784033568 hasConcept C104114177 @default.
- W2784033568 hasConcept C107457646 @default.
- W2784033568 hasConcept C115901376 @default.
- W2784033568 hasConcept C121332964 @default.
- W2784033568 hasConcept C127413603 @default.
- W2784033568 hasConcept C149635348 @default.
- W2784033568 hasConcept C150594956 @default.
- W2784033568 hasConcept C152086174 @default.
- W2784033568 hasConcept C154945302 @default.
- W2784033568 hasConcept C201995342 @default.
- W2784033568 hasConcept C2780451532 @default.
- W2784033568 hasConcept C2780791683 @default.
- W2784033568 hasConcept C41008148 @default.
- W2784033568 hasConcept C44154836 @default.
- W2784033568 hasConcept C62520636 @default.
- W2784033568 hasConcept C78519656 @default.
- W2784033568 hasConcept C90509273 @default.
- W2784033568 hasConceptScore W2784033568C104114177 @default.
- W2784033568 hasConceptScore W2784033568C107457646 @default.
- W2784033568 hasConceptScore W2784033568C115901376 @default.
- W2784033568 hasConceptScore W2784033568C121332964 @default.
- W2784033568 hasConceptScore W2784033568C127413603 @default.
- W2784033568 hasConceptScore W2784033568C149635348 @default.
- W2784033568 hasConceptScore W2784033568C150594956 @default.
- W2784033568 hasConceptScore W2784033568C152086174 @default.
- W2784033568 hasConceptScore W2784033568C154945302 @default.
- W2784033568 hasConceptScore W2784033568C201995342 @default.
- W2784033568 hasConceptScore W2784033568C2780451532 @default.
- W2784033568 hasConceptScore W2784033568C2780791683 @default.
- W2784033568 hasConceptScore W2784033568C41008148 @default.
- W2784033568 hasConceptScore W2784033568C44154836 @default.
- W2784033568 hasConceptScore W2784033568C62520636 @default.
- W2784033568 hasConceptScore W2784033568C78519656 @default.
- W2784033568 hasConceptScore W2784033568C90509273 @default.
- W2784033568 hasLocation W27840335681 @default.
- W2784033568 hasOpenAccess W2784033568 @default.
- W2784033568 hasPrimaryLocation W27840335681 @default.
- W2784033568 hasRelatedWork W1932845498 @default.
- W2784033568 hasRelatedWork W1983885373 @default.
- W2784033568 hasRelatedWork W24777700 @default.
- W2784033568 hasRelatedWork W2584987026 @default.
- W2784033568 hasRelatedWork W2737795549 @default.
- W2784033568 hasRelatedWork W2796176989 @default.
- W2784033568 hasRelatedWork W2886296216 @default.
- W2784033568 hasRelatedWork W2908543926 @default.
- W2784033568 hasRelatedWork W2947818707 @default.
- W2784033568 hasRelatedWork W2955818852 @default.
- W2784033568 hasRelatedWork W3004537488 @default.
- W2784033568 hasRelatedWork W3007715492 @default.
- W2784033568 hasRelatedWork W3096765004 @default.
- W2784033568 hasRelatedWork W3101552779 @default.
- W2784033568 hasRelatedWork W3120428840 @default.
- W2784033568 hasRelatedWork W3133750590 @default.
- W2784033568 hasRelatedWork W3135287423 @default.
- W2784033568 hasRelatedWork W3162532705 @default.
- W2784033568 hasRelatedWork W3196304306 @default.
- W2784033568 hasRelatedWork W3101614029 @default.
- W2784033568 isParatext "false" @default.
- W2784033568 isRetracted "false" @default.
- W2784033568 magId "2784033568" @default.
- W2784033568 workType "dissertation" @default.
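The abstract above describes fusing Vicon motion capture with IMU measurements so that IMU data bridges marker occlusions while Vicon corrects gyro drift. The thesis's actual fusion algorithm is not given in this listing; the complementary filter below is a minimal sketch under that interpretation, with illustrative names, sample rate, and blend gain.

```python
# Minimal sketch: complementary-filter fusion of a Vicon joint angle with an
# IMU angular rate. During Vicon occlusion (NaN samples) the estimate is
# propagated by IMU dead-reckoning; when Vicon is visible it pulls the
# estimate back, countering gyro drift. All parameters are assumptions.
import numpy as np

def fuse_vicon_imu(vicon_deg, gyro_dps, dt=0.01, gain=0.98):
    """vicon_deg : (N,) angles in degrees, np.nan where markers are occluded
    gyro_dps  : (N,) IMU angular rate in degrees/second
    dt        : common sample period in seconds
    gain      : weight on the gyro-propagated estimate"""
    fused = np.empty(len(gyro_dps), dtype=float)
    # Initialise from the first valid Vicon sample (assumed to exist).
    fused[0] = vicon_deg[~np.isnan(vicon_deg)][0]
    for k in range(1, len(fused)):
        predicted = fused[k - 1] + gyro_dps[k] * dt   # IMU dead-reckoning
        if np.isnan(vicon_deg[k]):                    # occluded: trust the IMU
            fused[k] = predicted
        else:                                         # blend to remove drift
            fused[k] = gain * predicted + (1.0 - gain) * vicon_deg[k]
    return fused
```

A high gain keeps the smooth IMU dynamics while still bounding drift; a full implementation would instead fuse 3-D orientations (e.g. quaternions) and handle the frame-flipping correction the abstract mentions.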
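The abstract also describes a Time Delayed Neural Network (TDNN) mapping sEMG armband signals to contact force. A TDNN's tapped delay lines are equivalent to 1-D convolutions over time, so the sketch below uses `torch.nn.Conv1d`; the channel count, window length, and layer sizes are assumptions, not the thesis's values.

```python
# Minimal sketch: a TDNN-style sEMG-to-force regressor in PyTorch.
import torch
import torch.nn as nn

class EmgForceTDNN(nn.Module):
    def __init__(self, n_channels=8, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=9),  # 9-tap delay line
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5),      # wider temporal context
            nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel_size=1),           # force at each time step
        )

    def forward(self, emg):            # emg: (batch, channels, time)
        return self.net(emg)           # (batch, 1, time - 12)

model = EmgForceTDNN()
emg = torch.randn(4, 8, 200)           # 4 windows, 8 sEMG channels, 200 samples
force = model(emg)                      # predicted force trace per window
loss = nn.MSELoss()(force, torch.zeros_like(force))  # trained against measured force
```

In practice the target would be the force measured during demonstrations (e.g. from an instrumented rig), and the MSE loss matches the abstract's reported evaluation metric.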
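Finally, the abstract encodes demonstrations with an HMM and reproduces motions with a modified Gaussian Mixture Regression (GMR). The sketch below follows the generic HMM + GMR recipe from the LfD literature rather than the thesis's specific modification: a Gaussian HMM (via `hmmlearn`) segments the demonstrations into states, and GMR regresses pose on time by conditioning a joint Gaussian mixture. Data, dimensions, and component counts are toy assumptions.

```python
# Minimal sketch: HMM state recognition plus GMR trajectory reproduction.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from sklearn.mixture import GaussianMixture

# Toy demonstrations: (time stamp, 1-D pose) pairs stacked as (n_samples, 2).
t = np.linspace(0, 1, 200)
demos = np.vstack([
    np.column_stack([t, np.sin(2 * np.pi * t) + 0.05 * np.random.randn(200)])
    for _ in range(5)
])

# 1) State recognition: segment the demonstrations into HMM states.
hmm = GaussianHMM(n_components=4, covariance_type="full", n_iter=50)
hmm.fit(demos, lengths=[200] * 5)
states = hmm.predict(demos[:200])       # per-sample state labels for one demo

# 2) GMR: fit a joint mixture over (t, x), then regress x on t.
gmm = GaussianMixture(n_components=4, covariance_type="full").fit(demos)

def gmr(t_query):
    """Conditional mean E[x | t] under the fitted joint mixture."""
    mu_t, mu_x = gmm.means_[:, 0], gmm.means_[:, 1]
    s_tt = gmm.covariances_[:, 0, 0]
    s_xt = gmm.covariances_[:, 1, 0]
    # Responsibility of each component for this time stamp.
    w = gmm.weights_ * np.exp(-0.5 * (t_query - mu_t) ** 2 / s_tt) / np.sqrt(s_tt)
    w /= w.sum()
    # Blend the component-wise conditional means.
    return float(w @ (mu_x + s_xt / s_tt * (t_query - mu_t)))

reproduced = [gmr(tq) for tq in t]      # trajectory for the robot to execute
```

Conditioning on time gives a smooth mean trajectory that generalises across the demonstrations, which is the property the abstract's validation with additional human demonstrations is testing.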