Matches in SemOpenAlex for { <https://semopenalex.org/work/W3202284701> ?p ?o ?g. }
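The listing below can be reproduced programmatically. This is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the standard SPARQL 1.1 protocol (adjust the URL if your deployment differs); the GRAPH clause stands in for the ?g slot of the quad pattern shown above.

```python
import requests

# Endpoint URL is an assumption based on the public SemOpenAlex service.
ENDPOINT = "https://semopenalex.org/sparql"

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W3202284701> ?p ?o . }
}
"""

response = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, mirroring the triple listing below.
for row in response.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```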
- W3202284701 endingPage "056061" @default.
- W3202284701 startingPage "056061" @default.
- W3202284701 abstract "Objective. Brain-machine interfaces (BMIs) interpret human intent into machine reactions, and the visual stimulation (VS) paradigm is one of the most widely used of these approaches. Although VS-based BMIs have a relatively high information transfer rate (ITR), it is still difficult for BMIs to control machines in dynamic environments (for example, grabbing a moving object or targeting a walking person). Approach. In this study, we utilized a BMI based on augmented reality (AR) VS (AR-VS). The proposed VS was dynamically generated based on machine vision, and human intent was interpreted by a dynamic decision time interval approach. A robot coordinating a task system and a self-motion system was controlled by the proposed paradigm in a fast and flexible manner. Methods. Objects in scenes were first recognized by machine vision and tracked by optical flow. The AR-VS was generated from the objects' parameters, with the number and distribution of the VS determined by the recognized objects. Electroencephalogram (EEG) features corresponding to the VS and human intent were collected by a dry-electrode EEG cap and decoded by the filter bank canonical correlation analysis method. Key parameters of the AR-VS, including the effect of VS size, frequency, dynamic object moving speed, the ITR, and the performance of the BMI-controlled robot, were analyzed. Conclusion and significance. The ITR of the proposed AR-VS paradigm for nine healthy subjects was 36.3 ± 20.1 bits min⁻¹. In the online robot control experiment, brain-controlled hybrid tasks including self-moving and grabbing objects were completed 64% faster than with the traditional steady-state visual evoked potential paradigm. The proposed AR-VS paradigm could be optimized and adopted in other kinds of VS-based BMIs, such as P300, omitted stimulus potential, and miniature event-related potential paradigms, for better results in dynamic environments." @default.
- W3202284701 created "2021-10-11" @default.
- W3202284701 creator A5005431252 @default.
- W3202284701 creator A5009234498 @default.
- W3202284701 creator A5021104158 @default.
- W3202284701 creator A5024243438 @default.
- W3202284701 creator A5028056556 @default.
- W3202284701 creator A5035772408 @default.
- W3202284701 creator A5039635164 @default.
- W3202284701 creator A5051689229 @default.
- W3202284701 creator A5057209439 @default.
- W3202284701 creator A5068826437 @default.
- W3202284701 creator A5072316779 @default.
- W3202284701 date "2021-10-01" @default.
- W3202284701 modified "2023-10-16" @default.
- W3202284701 title "Machine-vision fused brain machine interface based on dynamic augmented reality visual stimulation" @default.
- W3202284701 cites W1799066572 @default.
- W3202284701 cites W1991369595 @default.
- W3202284701 cites W2003486042 @default.
- W3202284701 cites W2004959199 @default.
- W3202284701 cites W2070674634 @default.
- W3202284701 cites W2105478324 @default.
- W3202284701 cites W2143183535 @default.
- W3202284701 cites W2162295492 @default.
- W3202284701 cites W2169918686 @default.
- W3202284701 cites W2327723377 @default.
- W3202284701 cites W2496639200 @default.
- W3202284701 cites W2610679545 @default.
- W3202284701 cites W2767977875 @default.
- W3202284701 cites W2778092082 @default.
- W3202284701 cites W2792687613 @default.
- W3202284701 doi "https://doi.org/10.1088/1741-2552/ac2c9e" @default.
- W3202284701 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/34607320" @default.
- W3202284701 hasPublicationYear "2021" @default.
- W3202284701 type Work @default.
- W3202284701 sameAs 3202284701 @default.
- W3202284701 citedByCount "4" @default.
- W3202284701 countsByYear W32022847012022 @default.
- W3202284701 countsByYear W32022847012023 @default.
- W3202284701 crossrefType "journal-article" @default.
- W3202284701 hasAuthorship W3202284701A5005431252 @default.
- W3202284701 hasAuthorship W3202284701A5009234498 @default.
- W3202284701 hasAuthorship W3202284701A5021104158 @default.
- W3202284701 hasAuthorship W3202284701A5024243438 @default.
- W3202284701 hasAuthorship W3202284701A5028056556 @default.
- W3202284701 hasAuthorship W3202284701A5035772408 @default.
- W3202284701 hasAuthorship W3202284701A5039635164 @default.
- W3202284701 hasAuthorship W3202284701A5051689229 @default.
- W3202284701 hasAuthorship W3202284701A5057209439 @default.
- W3202284701 hasAuthorship W3202284701A5068826437 @default.
- W3202284701 hasAuthorship W3202284701A5072316779 @default.
- W3202284701 hasBestOaLocation W32022847011 @default.
- W3202284701 hasConcept C113843644 @default.
- W3202284701 hasConcept C118552586 @default.
- W3202284701 hasConcept C129307140 @default.
- W3202284701 hasConcept C153715457 @default.
- W3202284701 hasConcept C154945302 @default.
- W3202284701 hasConcept C15744967 @default.
- W3202284701 hasConcept C157915830 @default.
- W3202284701 hasConcept C173201364 @default.
- W3202284701 hasConcept C173608175 @default.
- W3202284701 hasConcept C2781238097 @default.
- W3202284701 hasConcept C31972630 @default.
- W3202284701 hasConcept C41008148 @default.
- W3202284701 hasConcept C522805319 @default.
- W3202284701 hasConcept C5339829 @default.
- W3202284701 hasConcept C90509273 @default.
- W3202284701 hasConceptScore W3202284701C113843644 @default.
- W3202284701 hasConceptScore W3202284701C118552586 @default.
- W3202284701 hasConceptScore W3202284701C129307140 @default.
- W3202284701 hasConceptScore W3202284701C153715457 @default.
- W3202284701 hasConceptScore W3202284701C154945302 @default.
- W3202284701 hasConceptScore W3202284701C15744967 @default.
- W3202284701 hasConceptScore W3202284701C157915830 @default.
- W3202284701 hasConceptScore W3202284701C173201364 @default.
- W3202284701 hasConceptScore W3202284701C173608175 @default.
- W3202284701 hasConceptScore W3202284701C2781238097 @default.
- W3202284701 hasConceptScore W3202284701C31972630 @default.
- W3202284701 hasConceptScore W3202284701C41008148 @default.
- W3202284701 hasConceptScore W3202284701C522805319 @default.
- W3202284701 hasConceptScore W3202284701C5339829 @default.
- W3202284701 hasConceptScore W3202284701C90509273 @default.
- W3202284701 hasFunder F4320321001 @default.
- W3202284701 hasFunder F4320325902 @default.
- W3202284701 hasIssue "5" @default.
- W3202284701 hasLocation W32022847011 @default.
- W3202284701 hasLocation W32022847012 @default.
- W3202284701 hasOpenAccess W3202284701 @default.
- W3202284701 hasPrimaryLocation W32022847011 @default.
- W3202284701 hasRelatedWork W1913385466 @default.
- W3202284701 hasRelatedWork W1994410349 @default.
- W3202284701 hasRelatedWork W2015048155 @default.
- W3202284701 hasRelatedWork W2015326241 @default.
- W3202284701 hasRelatedWork W2889342546 @default.
- W3202284701 hasRelatedWork W3004117467 @default.
- W3202284701 hasRelatedWork W3177028067 @default.
- W3202284701 hasRelatedWork W3202969339 @default.
- W3202284701 hasRelatedWork W4237513258 @default.
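For context on the ITR of 36.3 ± 20.1 bits min⁻¹ reported in the abstract above: VS-based BMI studies conventionally report the Wolpaw information transfer rate. The sketch below implements that standard formula; the target count, accuracy, and selection time in the example are hypothetical placeholders, not values taken from the paper.

```python
import math

def itr_bits_per_min(n_targets: int, accuracy: float, seconds_per_selection: float) -> float:
    """Wolpaw information transfer rate (bits/min) for an N-target BMI selection task."""
    if n_targets < 2 or not (0.0 < accuracy <= 1.0) or seconds_per_selection <= 0:
        raise ValueError("need n_targets >= 2, accuracy in (0, 1], positive selection time")
    bits = math.log2(n_targets)
    if accuracy < 1.0:
        bits += accuracy * math.log2(accuracy)
        bits += (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_targets - 1))
    return bits * (60.0 / seconds_per_selection)

# Hypothetical example: 8 stimulation targets, 85% accuracy, 2.5 s per selection.
print(round(itr_bits_per_min(8, 0.85, 2.5), 1))  # ≈ 47.3 bits/min
```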