Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387086893> ?p ?o ?g. }
Showing items 1 to 93 of 93, with 100 items per page.
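The triple pattern in the header above can be reproduced programmatically. Below is a minimal sketch, assuming SemOpenAlex's public SPARQL endpoint at `https://semopenalex.org/sparql` and the third-party SPARQLWrapper package (`pip install sparqlwrapper`); the named-graph variable `?g` (rendered as `@default` in the listing) is dropped for simplicity. This is an illustrative query, not part of the page itself.

```python
# Minimal sketch: fetch all triples for work W4387086893 from SemOpenAlex.
# The endpoint URL and the SPARQLWrapper dependency are assumptions; check
# the SemOpenAlex service documentation before relying on them.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W4387086893"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setReturnFormat(JSON)
sparql.setQuery(f"""
    SELECT ?p ?o WHERE {{
        <{WORK}> ?p ?o .
    }}
""")

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each binding mirrors one "- W4387086893 <predicate> <object>" row below.
    print(row["p"]["value"], row["o"]["value"])
```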
- W4387086893 endingPage "25" @default.
- W4387086893 startingPage "1" @default.
- W4387086893 abstract "Learning the human-mobility interaction (HMI) on interactive scenes (e.g., how a vehicle turns at an intersection in response to traffic lights and other oncoming vehicles) can enhance the safety, efficiency, and resilience of smart mobility systems (e.g., autonomous vehicles) and many other ubiquitous computing applications. Towards ubiquitous and understandable HMI learning, this paper considers both spoken language (e.g., human textual annotations) and unspoken language (e.g., visual and sensor-based behavioral mobility information related to the HMI scenes) in terms of information modalities from the real-world HMI scenarios. We aim to extract the important but possibly implicit HMI concepts (as the named entities) from the textual annotations (provided by human annotators) through a novel human language and sensor data co-learning design. To this end, we propose CG-HMI, a novel Cross-modality Graph fusion approach for extracting important Human-Mobility Interaction concepts from co-learning of textual annotations as well as the visual and behavioral sensor data. In order to fuse both unspoken and spoken languages, we have designed a unified representation called the human-mobility interaction graph (HMIG) for each modality related to the HMI scenes, i.e., textual annotations, visual video frames, and behavioral sensor time-series (e.g., from the on-board or smartphone inertial measurement units). The nodes of the HMIG in these modalities correspond to the textual words (tokenized for ease of processing) related to HMI concepts, the detected traffic participant/environment categories, and the vehicle maneuver behavior types determined from the behavioral sensor time-series. To extract the inter- and intra-modality semantic correspondences and interactions in the HMIG, we have designed a novel graph interaction fusion approach with differentiable pooling-based graph attention. The resulting graph embeddings are then processed to identify and retrieve the HMI concepts within the annotations, which can benefit the downstream human-computer interaction and ubiquitous computing applications. We have developed and implemented CG-HMI into a system prototype, and performed extensive studies upon three real-world HMI datasets (two on car driving and the third one on e-scooter riding). We have corroborated the excellent performance (on average 13.11% higher accuracy than the other baselines in terms of precision, recall, and F1 measure) and effectiveness of CG-HMI in recognizing and extracting the important HMI concepts through cross-modality learning. Our CG-HMI studies also provide real-world implications (e.g., road safety and driving behaviors) about the interactions between the drivers and other traffic participants." @default.
- W4387086893 created "2023-09-28" @default.
- W4387086893 creator A5035112973 @default.
- W4387086893 creator A5053541912 @default.
- W4387086893 creator A5070251006 @default.
- W4387086893 date "2023-09-27" @default.
- W4387086893 modified "2023-10-02" @default.
- W4387086893 title "Cross-Modality Graph-based Language and Sensor Data Co-Learning of Human-Mobility Interaction" @default.
- W4387086893 cites W2016654760 @default.
- W4387086893 cites W2470673105 @default.
- W4387086893 cites W2885138528 @default.
- W4387086893 cites W2963037989 @default.
- W4387086893 cites W2963625095 @default.
- W4387086893 cites W2972422169 @default.
- W4387086893 cites W2982419388 @default.
- W4387086893 cites W2998456908 @default.
- W4387086893 cites W3005144376 @default.
- W4387086893 cites W3009130550 @default.
- W4387086893 cites W3032990727 @default.
- W4387086893 cites W3035337382 @default.
- W4387086893 cites W3035448883 @default.
- W4387086893 cites W3035564946 @default.
- W4387086893 cites W3102848065 @default.
- W4387086893 cites W3103179390 @default.
- W4387086893 cites W3119906317 @default.
- W4387086893 cites W3127151332 @default.
- W4387086893 cites W3155625103 @default.
- W4387086893 cites W3167214600 @default.
- W4387086893 cites W3173396651 @default.
- W4387086893 cites W3176858586 @default.
- W4387086893 cites W3187263517 @default.
- W4387086893 cites W3197340445 @default.
- W4387086893 cites W3211566356 @default.
- W4387086893 cites W3213373486 @default.
- W4387086893 cites W4221150352 @default.
- W4387086893 cites W4291034555 @default.
- W4387086893 doi "https://doi.org/10.1145/3610904" @default.
- W4387086893 hasPublicationYear "2023" @default.
- W4387086893 type Work @default.
- W4387086893 citedByCount "0" @default.
- W4387086893 crossrefType "journal-article" @default.
- W4387086893 hasAuthorship W4387086893A5035112973 @default.
- W4387086893 hasAuthorship W4387086893A5053541912 @default.
- W4387086893 hasAuthorship W4387086893A5070251006 @default.
- W4387086893 hasBestOaLocation W43870868931 @default.
- W4387086893 hasConcept C107457646 @default.
- W4387086893 hasConcept C127413603 @default.
- W4387086893 hasConcept C132525143 @default.
- W4387086893 hasConcept C144024400 @default.
- W4387086893 hasConcept C146978453 @default.
- W4387086893 hasConcept C154945302 @default.
- W4387086893 hasConcept C2776230583 @default.
- W4387086893 hasConcept C2779903281 @default.
- W4387086893 hasConcept C2780226545 @default.
- W4387086893 hasConcept C33954974 @default.
- W4387086893 hasConcept C36289849 @default.
- W4387086893 hasConcept C41008148 @default.
- W4387086893 hasConcept C64543145 @default.
- W4387086893 hasConcept C80444323 @default.
- W4387086893 hasConceptScore W4387086893C107457646 @default.
- W4387086893 hasConceptScore W4387086893C127413603 @default.
- W4387086893 hasConceptScore W4387086893C132525143 @default.
- W4387086893 hasConceptScore W4387086893C144024400 @default.
- W4387086893 hasConceptScore W4387086893C146978453 @default.
- W4387086893 hasConceptScore W4387086893C154945302 @default.
- W4387086893 hasConceptScore W4387086893C2776230583 @default.
- W4387086893 hasConceptScore W4387086893C2779903281 @default.
- W4387086893 hasConceptScore W4387086893C2780226545 @default.
- W4387086893 hasConceptScore W4387086893C33954974 @default.
- W4387086893 hasConceptScore W4387086893C36289849 @default.
- W4387086893 hasConceptScore W4387086893C41008148 @default.
- W4387086893 hasConceptScore W4387086893C64543145 @default.
- W4387086893 hasConceptScore W4387086893C80444323 @default.
- W4387086893 hasIssue "3" @default.
- W4387086893 hasLocation W43870868931 @default.
- W4387086893 hasOpenAccess W4387086893 @default.
- W4387086893 hasPrimaryLocation W43870868931 @default.
- W4387086893 hasRelatedWork W127837312 @default.
- W4387086893 hasRelatedWork W1555038932 @default.
- W4387086893 hasRelatedWork W2017261127 @default.
- W4387086893 hasRelatedWork W2026983969 @default.
- W4387086893 hasRelatedWork W2147938865 @default.
- W4387086893 hasRelatedWork W2805558008 @default.
- W4387086893 hasRelatedWork W2898887571 @default.
- W4387086893 hasRelatedWork W2980881520 @default.
- W4387086893 hasRelatedWork W4362570962 @default.
- W4387086893 hasRelatedWork W71023136 @default.
- W4387086893 hasVolume "7" @default.
- W4387086893 isParatext "false" @default.
- W4387086893 isRetracted "false" @default.
- W4387086893 workType "article" @default.
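The abstract above sketches CG-HMI's fusion of per-modality human-mobility interaction graphs (HMIGs) via graph attention with differentiable pooling. The authors' implementation is not reproduced here; the following is a generic PyTorch sketch of those two building blocks (a single-head graph-attention layer followed by a DiffPool-style soft cluster assignment), with all layer sizes and the toy graph being illustrative assumptions.

```python
# Illustrative sketch only -- NOT the CG-HMI implementation from the paper.
# Shows the two ingredients the abstract names: graph attention and
# differentiable pooling. All dimensions are arbitrary assumptions.
import torch
import torch.nn.functional as F

class GraphAttentionLayer(torch.nn.Module):
    """Single-head graph attention over an explicit adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = torch.nn.Linear(in_dim, out_dim, bias=False)
        self.attn = torch.nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency.
        h = self.proj(x)                                   # (N, out_dim)
        n = h.size(0)
        # Attention logits from concatenated endpoint features.
        pairs = torch.cat([h.unsqueeze(1).expand(n, n, -1),
                           h.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.attn(pairs).squeeze(-1))     # (N, N)
        e = e.masked_fill(adj == 0, float("-inf"))         # attend along edges only
        alpha = torch.softmax(e, dim=-1)                   # (N, N) edge weights
        return F.elu(alpha @ h)                            # (N, out_dim)

class DiffPool(torch.nn.Module):
    """Softly assign N nodes to K clusters; pool features and adjacency."""
    def __init__(self, in_dim, n_clusters):
        super().__init__()
        self.assign = torch.nn.Linear(in_dim, n_clusters)

    def forward(self, h, adj):
        s = torch.softmax(self.assign(h), dim=-1)          # (N, K) soft assignment
        h_pooled = s.t() @ h                               # (K, in_dim)
        adj_pooled = s.t() @ adj @ s                       # (K, K)
        return h_pooled, adj_pooled

# Toy usage: a random 5-node graph pooled to 2 clusters.
x = torch.randn(5, 16)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float()  # symmetrize the random adjacency
adj.fill_diagonal_(1.0)              # self-loops so every node attends to itself
h = GraphAttentionLayer(16, 32)(x, adj)
h2, adj2 = DiffPool(32, 2)(h, adj)
print(h2.shape, adj2.shape)          # torch.Size([2, 32]) torch.Size([2, 2])
```

The soft assignment matrix `s` is what makes the pooling differentiable: cluster membership is learned end-to-end together with the attention weights, rather than being fixed by a clustering heuristic.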