Matches in SemOpenAlex for { <https://semopenalex.org/work/W4361762496> ?p ?o ?g. }
- W4361762496 endingPage "28" @default.
- W4361762496 startingPage "1" @default.
- W4361762496 abstract "End-to-end deep learning models are increasingly applied to safety-critical human activity recognition (HAR) applications, e.g., healthcare monitoring and smart home control, to reduce developer burden and increase the performance and robustness of prediction models. However, integrating HAR models in safety-critical applications requires trust, and recent approaches have aimed to balance the performance of deep learning models with explainable decision-making for complex activity recognition. Prior works have exploited the compositionality of complex HAR (i.e., higher-level activities composed of lower-level activities) to form models with symbolic interfaces, such as concept-bottleneck architectures, that facilitate inherently interpretable models. However, feature engineering for symbolic concepts, as well as for the relationships between the concepts, requires precise annotation of lower-level activities by domain experts, usually with fixed time windows, all of which induces a heavy and error-prone workload on the domain expert. In this paper, we introduce X-CHAR, an eXplainable Complex Human Activity Recognition model that does not require precise annotation of low-level activities, offers explanations in the form of human-understandable, high-level concepts, and maintains the robust performance of end-to-end deep learning models for time-series data. X-CHAR learns to model complex activity recognition as a sequence of concepts. For each classification, X-CHAR outputs a sequence of concepts and a counterfactual example as the explanation. We show that the sequence information of the concepts can be modeled using Connectionist Temporal Classification (CTC) loss without accurate start and end times of low-level annotations in the training dataset, significantly reducing developer burden.
We evaluate our model on several complex activity datasets and demonstrate that our model offers explanations without compromising prediction accuracy in comparison to baseline models. Finally, we conducted a Mechanical Turk study to show that the explanations provided by our model are more understandable than those from existing methods for complex activity recognition." @default.
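The abstract's key claim is that CTC loss lets the model learn a concept sequence without frame-level start/end annotations, by summing over all possible alignments between time steps and concept labels. The sketch below is a minimal pure-Python version of the standard CTC forward (dynamic-programming) computation, not X-CHAR's actual implementation; all names and the toy probabilities are illustrative assumptions.

```python
import math

def ctc_forward_logprob(log_probs, target, blank=0):
    """Log-probability of `target` under CTC: the sum over every
    time-step alignment that collapses to `target`.

    log_probs: list of T lists, log_probs[t][c] = log P(label c at step t)
    target:    list of concept labels (no blanks), e.g. [1, 2, 1]
    """
    NEG = float("-inf")

    def logaddexp(a, b):  # numerically stable log(e^a + e^b)
        if a == NEG:
            return b
        if b == NEG:
            return a
        m = max(a, b)
        return m + math.log(math.exp(a - m) + math.exp(b - m))

    # Extended label sequence with blanks interleaved: b, y1, b, y2, ..., b
    ext = [blank]
    for y in target:
        ext.extend([y, blank])
    S = len(ext)

    # alpha[s] = log-prob of all alignment prefixes ending at ext[s]
    alpha = [NEG] * S
    alpha[0] = log_probs[0][ext[0]]
    if S > 1:
        alpha[1] = log_probs[0][ext[1]]

    for t in range(1, len(log_probs)):
        new = [NEG] * S
        for s in range(S):
            a = alpha[s]                       # stay on same extended label
            if s >= 1:
                a = logaddexp(a, alpha[s - 1])  # advance by one
            # skip a blank, allowed only between distinct non-blank labels
            if s >= 2 and ext[s] != blank and ext[s] != ext[s - 2]:
                a = logaddexp(a, alpha[s - 2])
            new[s] = a + log_probs[t][ext[s]]
        alpha = new

    # Valid alignments end on the last label or the trailing blank.
    return logaddexp(alpha[S - 1], alpha[S - 2]) if S > 1 else alpha[S - 1]
```

For example, with two time steps, uniform probability 0.5 over {blank, concept 1}, and target [1], the alignments (blank,1), (1,blank), and (1,1) all collapse to [1], so the total probability is 3 × 0.25 = 0.75. In a full training setup this quantity (negated) is the loss; frameworks such as PyTorch provide it directly as `torch.nn.CTCLoss`.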
- W4361762496 created "2023-04-04" @default.
- W4361762496 creator A5022681108 @default.
- W4361762496 creator A5040505438 @default.
- W4361762496 creator A5050029555 @default.
- W4361762496 creator A5065874576 @default.
- W4361762496 date "2022-03-27" @default.
- W4361762496 modified "2023-09-29" @default.
- W4361762496 title "X-CHAR" @default.
- W4361762496 cites W1849277567 @default.
- W4361762496 cites W2003792585 @default.
- W4361762496 cites W2052666245 @default.
- W4361762496 cites W2066792529 @default.
- W4361762496 cites W2099336098 @default.
- W4361762496 cites W2134067070 @default.
- W4361762496 cites W2161381512 @default.
- W4361762496 cites W2270470215 @default.
- W4361762496 cites W2282821441 @default.
- W4361762496 cites W2402069821 @default.
- W4361762496 cites W2553915786 @default.
- W4361762496 cites W2737725206 @default.
- W4361762496 cites W2764024122 @default.
- W4361762496 cites W2765813195 @default.
- W4361762496 cites W2773003563 @default.
- W4361762496 cites W2777460464 @default.
- W4361762496 cites W2794833149 @default.
- W4361762496 cites W2851629429 @default.
- W4361762496 cites W2886281300 @default.
- W4361762496 cites W2927224805 @default.
- W4361762496 cites W2945976633 @default.
- W4361762496 cites W2954709787 @default.
- W4361762496 cites W2962858109 @default.
- W4361762496 cites W2971670291 @default.
- W4361762496 cites W2973102973 @default.
- W4361762496 cites W2973204573 @default.
- W4361762496 cites W2980701909 @default.
- W4361762496 cites W2984911178 @default.
- W4361762496 cites W3023497337 @default.
- W4361762496 cites W3083281679 @default.
- W4361762496 cites W3107280982 @default.
- W4361762496 cites W3108887041 @default.
- W4361762496 cites W3123012394 @default.
- W4361762496 cites W4206647573 @default.
- W4361762496 cites W4220663024 @default.
- W4361762496 cites W4234531549 @default.
- W4361762496 doi "https://doi.org/10.1145/3580804" @default.
- W4361762496 hasPublicationYear "2022" @default.
- W4361762496 type Work @default.
- W4361762496 citedByCount "0" @default.
- W4361762496 crossrefType "journal-article" @default.
- W4361762496 hasAuthorship W4361762496A5022681108 @default.
- W4361762496 hasAuthorship W4361762496A5040505438 @default.
- W4361762496 hasAuthorship W4361762496A5050029555 @default.
- W4361762496 hasAuthorship W4361762496A5065874576 @default.
- W4361762496 hasBestOaLocation W43617624961 @default.
- W4361762496 hasConcept C104317684 @default.
- W4361762496 hasConcept C108583219 @default.
- W4361762496 hasConcept C119857082 @default.
- W4361762496 hasConcept C121687571 @default.
- W4361762496 hasConcept C149635348 @default.
- W4361762496 hasConcept C154945302 @default.
- W4361762496 hasConcept C185592680 @default.
- W4361762496 hasConcept C207685749 @default.
- W4361762496 hasConcept C2778827112 @default.
- W4361762496 hasConcept C2780513914 @default.
- W4361762496 hasConcept C41008148 @default.
- W4361762496 hasConcept C50644808 @default.
- W4361762496 hasConcept C55493867 @default.
- W4361762496 hasConcept C63479239 @default.
- W4361762496 hasConcept C8521452 @default.
- W4361762496 hasConceptScore W4361762496C104317684 @default.
- W4361762496 hasConceptScore W4361762496C108583219 @default.
- W4361762496 hasConceptScore W4361762496C119857082 @default.
- W4361762496 hasConceptScore W4361762496C121687571 @default.
- W4361762496 hasConceptScore W4361762496C149635348 @default.
- W4361762496 hasConceptScore W4361762496C154945302 @default.
- W4361762496 hasConceptScore W4361762496C185592680 @default.
- W4361762496 hasConceptScore W4361762496C207685749 @default.
- W4361762496 hasConceptScore W4361762496C2778827112 @default.
- W4361762496 hasConceptScore W4361762496C2780513914 @default.
- W4361762496 hasConceptScore W4361762496C41008148 @default.
- W4361762496 hasConceptScore W4361762496C50644808 @default.
- W4361762496 hasConceptScore W4361762496C55493867 @default.
- W4361762496 hasConceptScore W4361762496C63479239 @default.
- W4361762496 hasConceptScore W4361762496C8521452 @default.
- W4361762496 hasFunder F4320306076 @default.
- W4361762496 hasFunder F4320338279 @default.
- W4361762496 hasFunder F4320338295 @default.
- W4361762496 hasIssue "1" @default.
- W4361762496 hasLocation W43617624961 @default.
- W4361762496 hasOpenAccess W4361762496 @default.
- W4361762496 hasPrimaryLocation W43617624961 @default.
- W4361762496 hasRelatedWork W2968586400 @default.
- W4361762496 hasRelatedWork W3017600792 @default.
- W4361762496 hasRelatedWork W3034267371 @default.
- W4361762496 hasRelatedWork W3135542633 @default.
- W4361762496 hasRelatedWork W3148119887 @default.
- W4361762496 hasRelatedWork W3189515467 @default.