Matches in SemOpenAlex for { <https://semopenalex.org/work/W2530535663> ?p ?o ?g. }
- W2530535663 abstract "Intelligent devices are quickly becoming necessities to support our activities during both work and play. We are already bound in a symbiotic relationship with these devices. An unfortunate effect of the pervasiveness of intelligent devices is the substantial investment of our time and effort to communicate intent. Even though our increasing reliance on these intelligent devices is inevitable, the limits of conventional methods for devices to perceive human expression hinders communication efficiency. These constraints restrict the usefulness of intelligent devices to support our activities. Our communication time and effort must be minimized to leverage the benefits of intelligent devices and seamlessly integrate them into society. Minimizing the time and effort needed to communicate our intent will allow us to concentrate on tasks in which we excel, including creative thought and problem solving. An intuitive method to minimize human communication effort with intelligent devices is to take advantage of our existing interpersonal communication experience. Recent advances in speech, hand gesture, and facial expression recognition provide alternate viable modes of communication that are more natural than conventional tactile interfaces. Use of natural human communication eliminates the need to adapt and invest time and effort using less intuitive techniques required for traditional keyboard and mouse based interfaces. Although the state of the art in natural but isolated modes of communication achieves impressive results, significant hurdles must be conquered before communication with devices in our daily lives will feel natural and effortless. Research has shown that combining information between multiple noise-prone modalities improves accuracy. Leveraging this complementary and redundant content will improve communication robustness and relax current unimodal limitations. This research presents and evaluates a novel multimodal framework to help reduce the total human effort and time required to communicate with intelligent devices. This reduction is realized by determining human intent using a knowledge-based architecture that combines and leverages conflicting information available across multiple natural communication modes and modalities. The effectiveness of this approach is demonstrated using dynamic hand gestures and simple facial expressions characterizing basic emotions. It is important to note that the framework is not restricted to these two forms of communication. The framework presented in this research provides the flexibility necessary to include additional or alternate modalities and channels of information in future research, including improving the robustness of speech understanding. The primary contributions of this research include the leveraging of conflicts in a closed-loop multimodal framework, explicit use of uncertainty in knowledge representation and reasoning across multiple modalities, and a flexible approach for leveraging domain specific knowledge to help understand multimodal human expression. Experiments using a manually defined knowledge base demonstrate an improved average accuracy of individual concepts and an improved average accuracy of overall intents when leveraging conflicts as compared to an open-loop approach." @default.
- W2530535663 created "2016-10-21" @default.
- W2530535663 creator A5004183139 @default.
- W2530535663 date "2006-01-01" @default.
- W2530535663 modified "2023-09-24" @default.
- W2530535663 title "Toward understanding human expression in human-robot interaction" @default.
- W2530535663 cites W1012370774 @default.
- W2530535663 cites W1481080315 @default.
- W2530535663 cites W1493596129 @default.
- W2530535663 cites W1535226948 @default.
- W2530535663 cites W1573827055 @default.
- W2530535663 cites W1584239336 @default.
- W2530535663 cites W1592355794 @default.
- W2530535663 cites W1815942593 @default.
- W2530535663 cites W1832505903 @default.
- W2530535663 cites W1847116105 @default.
- W2530535663 cites W1893343590 @default.
- W2530535663 cites W1912053598 @default.
- W2530535663 cites W193854645 @default.
- W2530535663 cites W1992825118 @default.
- W2530535663 cites W2000885980 @default.
- W2530535663 cites W2011039300 @default.
- W2530535663 cites W2015394094 @default.
- W2530535663 cites W2025460523 @default.
- W2530535663 cites W203421571 @default.
- W2530535663 cites W2039678845 @default.
- W2530535663 cites W2064084932 @default.
- W2530535663 cites W2083325968 @default.
- W2530535663 cites W2085207288 @default.
- W2530535663 cites W2096098613 @default.
- W2530535663 cites W2097501508 @default.
- W2530535663 cites W2098519971 @default.
- W2530535663 cites W2098808520 @default.
- W2530535663 cites W2098947662 @default.
- W2530535663 cites W2099019320 @default.
- W2530535663 cites W2102156576 @default.
- W2530535663 cites W2102770307 @default.
- W2530535663 cites W2105594594 @default.
- W2530535663 cites W2107658650 @default.
- W2530535663 cites W2108308566 @default.
- W2530535663 cites W2108658789 @default.
- W2530535663 cites W2109513571 @default.
- W2530535663 cites W2110186807 @default.
- W2530535663 cites W2110640136 @default.
- W2530535663 cites W2110730726 @default.
- W2530535663 cites W2112351417 @default.
- W2530535663 cites W2114589608 @default.
- W2530535663 cites W2115903268 @default.
- W2530535663 cites W2118163921 @default.
- W2530535663 cites W2120725061 @default.
- W2530535663 cites W2120954940 @default.
- W2530535663 cites W2121836097 @default.
- W2530535663 cites W2123466031 @default.
- W2530535663 cites W2124144842 @default.
- W2530535663 cites W2125452380 @default.
- W2530535663 cites W2125625906 @default.
- W2530535663 cites W2125834953 @default.
- W2530535663 cites W2125838338 @default.
- W2530535663 cites W2125850087 @default.
- W2530535663 cites W2128027639 @default.
- W2530535663 cites W2128276874 @default.
- W2530535663 cites W2128732469 @default.
- W2530535663 cites W2132103241 @default.
- W2530535663 cites W2132944690 @default.
- W2530535663 cites W2134072810 @default.
- W2530535663 cites W2135643110 @default.
- W2530535663 cites W2136461127 @default.
- W2530535663 cites W2136746027 @default.
- W2530535663 cites W2141437475 @default.
- W2530535663 cites W2141661257 @default.
- W2530535663 cites W2145946022 @default.
- W2530535663 cites W2148877716 @default.
- W2530535663 cites W2152239535 @default.
- W2530535663 cites W2152988638 @default.
- W2530535663 cites W2154337510 @default.
- W2530535663 cites W2154739180 @default.
- W2530535663 cites W2155927480 @default.
- W2530535663 cites W2157548127 @default.
- W2530535663 cites W2159017231 @default.
- W2530535663 cites W2159173611 @default.
- W2530535663 cites W2159620287 @default.
- W2530535663 cites W2160063258 @default.
- W2530535663 cites W2163026336 @default.
- W2530535663 cites W2163529501 @default.
- W2530535663 cites W2167072892 @default.
- W2530535663 cites W2170532760 @default.
- W2530535663 cites W2171053814 @default.
- W2530535663 cites W2171060431 @default.
- W2530535663 cites W2250911494 @default.
- W2530535663 cites W2263409983 @default.
- W2530535663 cites W2795510964 @default.
- W2530535663 cites W2912565176 @default.
- W2530535663 cites W2914885528 @default.
- W2530535663 cites W2950850581 @default.
- W2530535663 cites W3034751874 @default.
- W2530535663 cites W3141774725 @default.
- W2530535663 cites W3141873736 @default.
- W2530535663 cites W34143692 @default.
- W2530535663 cites W594965177 @default.
- W2530535663 cites W71602804 @default.
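
The listing above is the result of evaluating the quad pattern in the header against the SemOpenAlex dataset. As a minimal sketch, the same rows can be retrieved programmatically by sending the equivalent SELECT query to SemOpenAlex's public SPARQL endpoint. The endpoint URL (https://semopenalex.org/sparql) and its support for JSON results via the Accept header are assumptions here, not taken from the listing; since every row above carries the @default graph marker, the graph variable ?g is dropped and a plain triple pattern is used.

```python
# Sketch: reproduce the listing for W2530535663 over SPARQL.
# Assumption: SemOpenAlex exposes a SPARQL endpoint at the URL below
# that returns application/sparql-results+json.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W2530535663> ?p ?o .
}
"""

def fetch_bindings():
    # Request SPARQL JSON results; most public endpoints honour this Accept header.
    response = requests.get(
        ENDPOINT,
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]["bindings"]

if __name__ == "__main__":
    # Print one "predicate object" pair per row, mirroring the listing above.
    for binding in fetch_bindings():
        print(binding["p"]["value"], binding["o"]["value"])
```

The objects returned for the cites predicate are themselves work IRIs, so the same query template can be reused with each cited work substituted for W2530535663 to walk one step further along the citation graph.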