Matches in SemOpenAlex for { <https://semopenalex.org/work/W4308347920> ?p ?o ?g. }
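The listing below enumerates every (predicate, object, graph) match for this work; the "@default" column simply marks the default graph. As a rough, non-authoritative sketch of how the same matches could be pulled programmatically, the snippet below assumes SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (an assumption, not stated on this page) and queries the triple pattern directly.

```python
# Minimal sketch: fetch all predicate/object pairs for this work from the
# SemOpenAlex SPARQL endpoint. The endpoint URL is an assumption; the graph
# column shown on this page is the default graph, so no GRAPH clause is used.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4308347920> ?p ?o .
}
"""

response = requests.post(
    ENDPOINT,
    data={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object match, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```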
- W4308347920 endingPage "119139" @default.
- W4308347920 startingPage "119139" @default.
- W4308347920 abstract "Recently, in-bed human pose estimation has attracted the interest of researchers due to its relevance to a wide range of healthcare applications. Compared to the general problem of human pose estimation, in-bed pose estimation poses several inherent challenges, the most prominent being frequent and severe occlusions caused by bedding. In this paper, we explore the effective use of images from multiple non-visual and privacy-preserving modalities, such as depth, long-wave infrared (LWIR) and pressure maps, for in-bed pose estimation in two settings. First, we explore the effective fusion of information from different imaging modalities for better pose estimation. Second, we propose a framework that can estimate in-bed pose when visible images are unavailable, and demonstrate the applicability of fusion methods to scenarios where only LWIR images are available. We analyze and demonstrate the effect of fusing features from multiple modalities. For this purpose, we consider four different techniques: (1) addition, (2) concatenation, (3) fusion via learned modal weights, and (4) an end-to-end fully trainable approach, each combined with a state-of-the-art pose estimation model. We also evaluate the effect of reconstructing a data-rich modality (i.e., the visible modality) from a privacy-preserving modality with data scarcity (i.e., LWIR) for in-bed human pose estimation. For reconstruction, we use a conditional generative adversarial network. We conduct experiments on a publicly available dataset for both feature fusion and visible-image reconstruction, and perform ablation studies across different design decisions of our framework, including selecting features at different levels of granularity, using different fusion techniques, and varying model parameters. Through extensive evaluations, we demonstrate that our method produces results on par with or better than the state of the art. The insights from this research offer stepping stones towards robust, automated, privacy-preserving systems that utilize multimodal feature fusion to support the assessment and diagnosis of medical conditions." @default.
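The abstract names four feature-fusion strategies without spelling out their form. As a rough, non-authoritative sketch of the first three (assuming per-modality feature maps of equal shape, and PyTorch as an arbitrary implementation choice), they could look like the module below; all names are hypothetical and this is not the authors' implementation.

```python
# Minimal sketch of three fusion strategies named in the abstract:
# element-wise addition, channel concatenation, and fusion via learned
# per-modality weights. Assumes each modality (e.g. depth, LWIR, pressure)
# has already been encoded into a feature map of identical shape
# (batch, channels, height, width).
import torch
import torch.nn as nn


class FeatureFusion(nn.Module):
    def __init__(self, num_modalities: int, channels: int, mode: str = "add"):
        super().__init__()
        self.mode = mode
        if mode == "weighted":
            # One learnable scalar weight per modality, normalised by softmax.
            self.modal_weights = nn.Parameter(torch.zeros(num_modalities))
        elif mode == "concat":
            # Project the concatenated channels back to the original width so a
            # downstream pose-estimation head sees a fixed channel count.
            self.project = nn.Conv2d(num_modalities * channels, channels, kernel_size=1)

    def forward(self, features: list[torch.Tensor]) -> torch.Tensor:
        if self.mode == "add":
            return torch.stack(features, dim=0).sum(dim=0)
        if self.mode == "concat":
            return self.project(torch.cat(features, dim=1))
        if self.mode == "weighted":
            weights = torch.softmax(self.modal_weights, dim=0)
            return sum(w * f for w, f in zip(weights, features))
        raise ValueError(f"unknown fusion mode: {self.mode}")


# Example: fuse depth, LWIR and pressure feature maps of shape (1, 256, 64, 48).
feats = [torch.randn(1, 256, 64, 48) for _ in range(3)]
fused = FeatureFusion(num_modalities=3, channels=256, mode="weighted")(feats)
print(fused.shape)  # torch.Size([1, 256, 64, 48])
```

The fourth strategy, the end-to-end fully trainable approach, would presumably train the per-modality encoders jointly with the fusion layer and the pose-estimation head rather than fusing pre-extracted features; the LWIR-to-visible reconstruction described in the abstract uses a conditional generative adversarial network and is not sketched here.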
- W4308347920 created "2022-11-11" @default.
- W4308347920 creator A5004998571 @default.
- W4308347920 creator A5007030707 @default.
- W4308347920 creator A5021573162 @default.
- W4308347920 creator A5043062569 @default.
- W4308347920 creator A5045736336 @default.
- W4308347920 creator A5083626840 @default.
- W4308347920 creator A5088315770 @default.
- W4308347920 date "2023-03-01" @default.
- W4308347920 modified "2023-10-06" @default.
- W4308347920 title "Privacy-Preserving in-bed pose monitoring: A fusion and reconstruction study" @default.
- W4308347920 cites W1936750108 @default.
- W4308347920 cites W1973525485 @default.
- W4308347920 cites W2097117768 @default.
- W4308347920 cites W2194775991 @default.
- W4308347920 cites W2302255633 @default.
- W4308347920 cites W2395611524 @default.
- W4308347920 cites W2559085405 @default.
- W4308347920 cites W2565639579 @default.
- W4308347920 cites W2603777577 @default.
- W4308347920 cites W2614504311 @default.
- W4308347920 cites W2770827753 @default.
- W4308347920 cites W2900753655 @default.
- W4308347920 cites W2916798096 @default.
- W4308347920 cites W2927069527 @default.
- W4308347920 cites W2960720104 @default.
- W4308347920 cites W2962793481 @default.
- W4308347920 cites W2963073614 @default.
- W4308347920 cites W2963402313 @default.
- W4308347920 cites W2963446712 @default.
- W4308347920 cites W2963800363 @default.
- W4308347920 cites W2964304707 @default.
- W4308347920 cites W3000322757 @default.
- W4308347920 cites W3021998000 @default.
- W4308347920 cites W3036324892 @default.
- W4308347920 cites W3083707308 @default.
- W4308347920 cites W3101855769 @default.
- W4308347920 cites W3115488635 @default.
- W4308347920 cites W3203064792 @default.
- W4308347920 cites W3207918547 @default.
- W4308347920 cites W3215884252 @default.
- W4308347920 cites W4205387210 @default.
- W4308347920 cites W602397586 @default.
- W4308347920 doi "https://doi.org/10.1016/j.eswa.2022.119139" @default.
- W4308347920 hasPublicationYear "2023" @default.
- W4308347920 type Work @default.
- W4308347920 citedByCount "2" @default.
- W4308347920 countsByYear W43083479202023 @default.
- W4308347920 crossrefType "journal-article" @default.
- W4308347920 hasAuthorship W4308347920A5004998571 @default.
- W4308347920 hasAuthorship W4308347920A5007030707 @default.
- W4308347920 hasAuthorship W4308347920A5021573162 @default.
- W4308347920 hasAuthorship W4308347920A5043062569 @default.
- W4308347920 hasAuthorship W4308347920A5045736336 @default.
- W4308347920 hasAuthorship W4308347920A5083626840 @default.
- W4308347920 hasAuthorship W4308347920A5088315770 @default.
- W4308347920 hasBestOaLocation W43083479202 @default.
- W4308347920 hasConcept C114614502 @default.
- W4308347920 hasConcept C119857082 @default.
- W4308347920 hasConcept C144024400 @default.
- W4308347920 hasConcept C153180895 @default.
- W4308347920 hasConcept C154945302 @default.
- W4308347920 hasConcept C2779903281 @default.
- W4308347920 hasConcept C2780226545 @default.
- W4308347920 hasConcept C31972630 @default.
- W4308347920 hasConcept C33923547 @default.
- W4308347920 hasConcept C33954974 @default.
- W4308347920 hasConcept C36289849 @default.
- W4308347920 hasConcept C41008148 @default.
- W4308347920 hasConcept C52102323 @default.
- W4308347920 hasConcept C87619178 @default.
- W4308347920 hasConcept C97931131 @default.
- W4308347920 hasConceptScore W4308347920C114614502 @default.
- W4308347920 hasConceptScore W4308347920C119857082 @default.
- W4308347920 hasConceptScore W4308347920C144024400 @default.
- W4308347920 hasConceptScore W4308347920C153180895 @default.
- W4308347920 hasConceptScore W4308347920C154945302 @default.
- W4308347920 hasConceptScore W4308347920C2779903281 @default.
- W4308347920 hasConceptScore W4308347920C2780226545 @default.
- W4308347920 hasConceptScore W4308347920C31972630 @default.
- W4308347920 hasConceptScore W4308347920C33923547 @default.
- W4308347920 hasConceptScore W4308347920C33954974 @default.
- W4308347920 hasConceptScore W4308347920C36289849 @default.
- W4308347920 hasConceptScore W4308347920C41008148 @default.
- W4308347920 hasConceptScore W4308347920C52102323 @default.
- W4308347920 hasConceptScore W4308347920C87619178 @default.
- W4308347920 hasConceptScore W4308347920C97931131 @default.
- W4308347920 hasLocation W43083479201 @default.
- W4308347920 hasLocation W43083479202 @default.
- W4308347920 hasOpenAccess W4308347920 @default.
- W4308347920 hasPrimaryLocation W43083479201 @default.
- W4308347920 hasRelatedWork W1461621550 @default.
- W4308347920 hasRelatedWork W1968716783 @default.
- W4308347920 hasRelatedWork W2020350089 @default.
- W4308347920 hasRelatedWork W2565829216 @default.
- W4308347920 hasRelatedWork W2588330143 @default.
- W4308347920 hasRelatedWork W2729514902 @default.