Matches in SemOpenAlex for { <https://semopenalex.org/work/W2765760237> ?p ?o ?g. }
Showing items 1 to 73 of 73, with 100 items per page.
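The listing above is the result of matching every triple whose subject is the work `W2765760237`. A minimal sketch of how such a lookup could be issued as a SPARQL query follows; the endpoint URL (`https://semopenalex.org/sparql`) and the `format=json` parameter are assumptions about the SemOpenAlex service, and the `GRAPH ?g` form is one way to express the quad pattern `?p ?o ?g` shown in the header — verify both against the service's documentation before relying on them.

```python
# Sketch: reproduce the SemOpenAlex lookup shown above as a SPARQL query.
# Assumptions (not confirmed by this page): the SPARQL endpoint lives at
# https://semopenalex.org/sparql and accepts a "format=json" GET parameter.
from urllib.parse import urlencode

WORK_URI = "https://semopenalex.org/work/W2765760237"

def build_query(work_uri: str) -> str:
    """Return a SPARQL query listing every triple (and its named graph)
    whose subject is work_uri."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_uri}> ?p ?o . }} "
        "}"
    )

def build_request_url(endpoint: str, query: str) -> str:
    """Encode the query as a GET request URL against a SPARQL endpoint."""
    return endpoint + "?" + urlencode({"query": query, "format": "json"})

if __name__ == "__main__":
    # Print the request URL one would fetch to retrieve the 73 matches below.
    q = build_query(WORK_URI)
    print(build_request_url("https://semopenalex.org/sparql", q))
```

Fetching that URL (for example with `urllib.request.urlopen`) should return the same 73 predicate/object pairs enumerated below, one row per bullet.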
- W2765760237 abstract "Human-computer interfaces are changing to meet the evolving needs of users and overcome limitations of previous generations of computer systems. The current state of computers consists largely of graphical user interfaces (GUI) that incorporate windows, icons, menus, and pointers (WIMPs) as visual representations of computer interactions controlled via user input on a mouse and keyboard. Although this model of interface has dominated human-computer interaction for decades, WIMPs require an extra step between the user’s intent and the computer action, imposing both limitations on the interaction and introducing cognitive demands (van Dam, 1997). Alternatively, natural user interfaces (NUI) employ input methods such as speech, touch, and gesture commands. With NUIs, users can interact directly with the computer without using an intermediary device (e.g., mouse, keyboard). Using the body as an input device may be more “natural” because it allows the user to apply existing knowledge of how to interact with the world (Roupé, Bosch-Sijtsema, & Johansson, 2014). To utilize the potential of natural interfaces, research must first determine what interactions can be considered natural. For the purpose of this paper, we focus on the naturalness of gesture-based interfaces. The purpose of this study was to determine how people perform natural gesture-based computer actions. To answer this question, we first narrowed down potential gestures that would be considered natural for an action. In a previous study, participants (n=17) were asked how they would gesture to interact with a computer to complete a series of actions. After narrowing down the potential natural gestures by calculating the most frequently performed gestures for each action, we asked participants (n=188) to rate the naturalness of the gestures in the current study. Participants each watched 26 videos of gestures (3-5 seconds each) and were asked how natural or arbitrary they interpreted each gesture for the series of computer commands (e.g., move object left, shrink object, select object, etc.). The gestures in these videos included the 17 gestures that were most often performed in the previous study in which participants were asked what gesture they would naturally use to complete the computer actions. Nine gestures were also included that were created arbitrarily to act as a comparison to the natural gestures. By analyzing the ratings on a continuum from “Completely Arbitrary” to “Completely Natural,” we found that the natural gestures people produced in the first study were also interpreted as the intended action by this separate sample of participants. All the gestures that were rated as either “Mostly Natural” or “Completely Natural” by participants corresponded to how the object manipulation would be performed physically. For example, the gesture video that depicts a fist closing was rated as “natural” by participants for the action of “selecting an object.” All of the gestures that were created arbitrarily were interpreted as “arbitrary” when they did not correspond to the physical action. Determining how people naturally gesture computer commands and how people interpret those gestures is useful because it can inform the development of NUIs and contributes to the literature on what makes gestures seem “natural.”" @default.
- W2765760237 created "2017-11-10" @default.
- W2765760237 creator A5022443048 @default.
- W2765760237 creator A5031163052 @default.
- W2765760237 creator A5048165783 @default.
- W2765760237 creator A5053724089 @default.
- W2765760237 date "2017-09-01" @default.
- W2765760237 modified "2023-09-27" @default.
- W2765760237 title "Development of Gesture-based Commands for Natural User Interfaces" @default.
- W2765760237 cites W1995226545 @default.
- W2765760237 cites W2053208276 @default.
- W2765760237 doi "https://doi.org/10.1177/1541931213601851" @default.
- W2765760237 hasPublicationYear "2017" @default.
- W2765760237 type Work @default.
- W2765760237 sameAs 2765760237 @default.
- W2765760237 citedByCount "1" @default.
- W2765760237 countsByYear W27657602372020 @default.
- W2765760237 crossrefType "journal-article" @default.
- W2765760237 hasAuthorship W2765760237A5022443048 @default.
- W2765760237 hasAuthorship W2765760237A5031163052 @default.
- W2765760237 hasAuthorship W2765760237A5048165783 @default.
- W2765760237 hasAuthorship W2765760237A5053724089 @default.
- W2765760237 hasConcept C107457646 @default.
- W2765760237 hasConcept C149229913 @default.
- W2765760237 hasConcept C154945302 @default.
- W2765760237 hasConcept C166957645 @default.
- W2765760237 hasConcept C187482481 @default.
- W2765760237 hasConcept C199360897 @default.
- W2765760237 hasConcept C201025465 @default.
- W2765760237 hasConcept C207347870 @default.
- W2765760237 hasConcept C2776608160 @default.
- W2765760237 hasConcept C41008148 @default.
- W2765760237 hasConcept C89505385 @default.
- W2765760237 hasConcept C95457728 @default.
- W2765760237 hasConceptScore W2765760237C107457646 @default.
- W2765760237 hasConceptScore W2765760237C149229913 @default.
- W2765760237 hasConceptScore W2765760237C154945302 @default.
- W2765760237 hasConceptScore W2765760237C166957645 @default.
- W2765760237 hasConceptScore W2765760237C187482481 @default.
- W2765760237 hasConceptScore W2765760237C199360897 @default.
- W2765760237 hasConceptScore W2765760237C201025465 @default.
- W2765760237 hasConceptScore W2765760237C207347870 @default.
- W2765760237 hasConceptScore W2765760237C2776608160 @default.
- W2765760237 hasConceptScore W2765760237C41008148 @default.
- W2765760237 hasConceptScore W2765760237C89505385 @default.
- W2765760237 hasConceptScore W2765760237C95457728 @default.
- W2765760237 hasLocation W27657602371 @default.
- W2765760237 hasOpenAccess W2765760237 @default.
- W2765760237 hasPrimaryLocation W27657602371 @default.
- W2765760237 hasRelatedWork W1984077343 @default.
- W2765760237 hasRelatedWork W2018410487 @default.
- W2765760237 hasRelatedWork W2019163493 @default.
- W2765760237 hasRelatedWork W2058712906 @default.
- W2765760237 hasRelatedWork W2066157335 @default.
- W2765760237 hasRelatedWork W2098348725 @default.
- W2765760237 hasRelatedWork W2118329615 @default.
- W2765760237 hasRelatedWork W2129819277 @default.
- W2765760237 hasRelatedWork W2135345119 @default.
- W2765760237 hasRelatedWork W2156610639 @default.
- W2765760237 hasRelatedWork W2168162864 @default.
- W2765760237 hasRelatedWork W2171899831 @default.
- W2765760237 hasRelatedWork W2545525543 @default.
- W2765760237 hasRelatedWork W2740088922 @default.
- W2765760237 hasRelatedWork W2945410996 @default.
- W2765760237 hasRelatedWork W3004086852 @default.
- W2765760237 hasRelatedWork W3046789205 @default.
- W2765760237 hasRelatedWork W3164923505 @default.
- W2765760237 hasRelatedWork W5724778 @default.
- W2765760237 hasRelatedWork W78770037 @default.
- W2765760237 isParatext "false" @default.
- W2765760237 isRetracted "false" @default.
- W2765760237 magId "2765760237" @default.
- W2765760237 workType "article" @default.