Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312947958> ?p ?o ?g. }
- W4312947958 endingPage "116539" @default.
- W4312947958 startingPage "116527" @default.
- W4312947958 abstract "An unhealthy lifestyle causes several chronic diseases in humans. Many products have been introduced to prevent such illnesses and to provide e-learning-based healthcare services, yet comfortable and reliable solutions remain the main challenge. Inertial measurement units (IMUs) are considered the most independent and non-intrusive way to monitor and analyze human health through motion pattern detection, and deep learning is an excellent tool for detecting motion patterns from IMU data. In this paper, a deep-learning-based human motion detection approach for a smart healthcare learning tool is proposed. A novel hybrid-descriptor-based pre-classification and multi-feature analysis algorithm classifies human motion for healthcare e-learning. For pre-processing, a quaternion-based filter is applied to the IMU signals. The acceleration signals are processed with both minimum and average gravity removal techniques. Next, the signals are segmented over multiple time intervals, and the results are compared to determine which interval length performs best. Pre-classification then identifies motion patterns as either active or passive. In the feature analysis phase, features are extracted from both active and passive motion patterns, and orthogonal fuzzy neighborhood discriminant analysis is used to reduce the dimensionality of the extracted feature vector. Finally, a long short-term memory (LSTM) deep learner classifies the actions of both active and passive motion features for healthcare e-learning systems. Two datasets are used for evaluation: REALDISP and wearable computing. The experimental results show that the proposed system for smart healthcare learning outperforms other state-of-the-art systems, achieving 87.35% accuracy on REALDISP and 85.18% accuracy on the wearable computing dataset. Furthermore, the classified motion patterns are passed to a smart healthcare advisor that provides live feedback on human health for immediate action." @default.
- W4312947958 created "2023-01-05" @default.
- W4312947958 creator A5015952754 @default.
- W4312947958 creator A5039552087 @default.
- W4312947958 creator A5057153004 @default.
- W4312947958 creator A5059829609 @default.
- W4312947958 creator A5072951124 @default.
- W4312947958 creator A5078943371 @default.
- W4312947958 creator A5089511340 @default.
- W4312947958 creator A5089971928 @default.
- W4312947958 creator A5091256982 @default.
- W4312947958 date "2022-01-01" @default.
- W4312947958 modified "2023-09-30" @default.
- W4312947958 title "Deep Human Motion Detection and Multi-Features Analysis for Smart Healthcare Learning Tools" @default.
- W4312947958 cites W1716980162 @default.
- W4312947958 cites W1968786811 @default.
- W4312947958 cites W1995705140 @default.
- W4312947958 cites W1996967927 @default.
- W4312947958 cites W2019539135 @default.
- W4312947958 cites W2037034232 @default.
- W4312947958 cites W2054040518 @default.
- W4312947958 cites W2106195076 @default.
- W4312947958 cites W2131775935 @default.
- W4312947958 cites W2152084636 @default.
- W4312947958 cites W2160146756 @default.
- W4312947958 cites W2201675152 @default.
- W4312947958 cites W2506886870 @default.
- W4312947958 cites W2600842288 @default.
- W4312947958 cites W2608147960 @default.
- W4312947958 cites W2785506286 @default.
- W4312947958 cites W2792140610 @default.
- W4312947958 cites W2809202603 @default.
- W4312947958 cites W2891454519 @default.
- W4312947958 cites W2907173324 @default.
- W4312947958 cites W2910361607 @default.
- W4312947958 cites W2971000371 @default.
- W4312947958 cites W2988212233 @default.
- W4312947958 cites W2995110559 @default.
- W4312947958 cites W2995628968 @default.
- W4312947958 cites W3005390993 @default.
- W4312947958 cites W3010153341 @default.
- W4312947958 cites W3016117922 @default.
- W4312947958 cites W3025043103 @default.
- W4312947958 cites W3027425326 @default.
- W4312947958 cites W3033847194 @default.
- W4312947958 cites W3034132178 @default.
- W4312947958 cites W3044651858 @default.
- W4312947958 cites W3044806259 @default.
- W4312947958 cites W3048952742 @default.
- W4312947958 cites W3066888720 @default.
- W4312947958 cites W3091356402 @default.
- W4312947958 cites W3097787953 @default.
- W4312947958 cites W3110336329 @default.
- W4312947958 cites W3112713116 @default.
- W4312947958 cites W3113327043 @default.
- W4312947958 cites W3122806818 @default.
- W4312947958 cites W3126445317 @default.
- W4312947958 cites W3127063527 @default.
- W4312947958 cites W3136648828 @default.
- W4312947958 cites W3153753207 @default.
- W4312947958 cites W3156791706 @default.
- W4312947958 cites W3161132553 @default.
- W4312947958 cites W3173506890 @default.
- W4312947958 cites W3199905074 @default.
- W4312947958 cites W3200796012 @default.
- W4312947958 cites W3211522713 @default.
- W4312947958 cites W3215734071 @default.
- W4312947958 cites W4206703131 @default.
- W4312947958 cites W4210352466 @default.
- W4312947958 cites W4214767594 @default.
- W4312947958 cites W4220941909 @default.
- W4312947958 cites W4220999568 @default.
- W4312947958 cites W4281781975 @default.
- W4312947958 cites W4282043382 @default.
- W4312947958 cites W4285169103 @default.
- W4312947958 cites W4285305255 @default.
- W4312947958 cites W4285495163 @default.
- W4312947958 doi "https://doi.org/10.1109/access.2022.3214986" @default.
- W4312947958 hasPublicationYear "2022" @default.
- W4312947958 type Work @default.
- W4312947958 citedByCount "3" @default.
- W4312947958 countsByYear W43129479582023 @default.
- W4312947958 crossrefType "journal-article" @default.
- W4312947958 hasAuthorship W4312947958A5015952754 @default.
- W4312947958 hasAuthorship W4312947958A5039552087 @default.
- W4312947958 hasAuthorship W4312947958A5057153004 @default.
- W4312947958 hasAuthorship W4312947958A5059829609 @default.
- W4312947958 hasAuthorship W4312947958A5072951124 @default.
- W4312947958 hasAuthorship W4312947958A5078943371 @default.
- W4312947958 hasAuthorship W4312947958A5089511340 @default.
- W4312947958 hasAuthorship W4312947958A5089971928 @default.
- W4312947958 hasAuthorship W4312947958A5091256982 @default.
- W4312947958 hasBestOaLocation W43129479581 @default.
- W4312947958 hasConcept C108583219 @default.
- W4312947958 hasConcept C119857082 @default.
- W4312947958 hasConcept C153180895 @default.
- W4312947958 hasConcept C154945302 @default.
- W4312947958 hasConcept C31972630 @default.
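The abstract above outlines a windowed IMU pipeline: gravity removal, segmentation over time intervals, feature analysis, and LSTM classification of active/passive motion. The following is a minimal sketch of that general idea, not the authors' code: it assumes synthetic tri-axial accelerometer data in place of the REALDISP and wearable computing datasets, approximates "average gravity removal" by per-window mean subtraction, and omits the quaternion filter, hybrid descriptors, and OFNDA reduction described in the paper. All names and parameters are illustrative.

```python
# Hypothetical sketch of a windowed IMU -> LSTM activity classifier
# (stand-in for the pipeline summarized in the abstract; not the paper's code).
import numpy as np
import tensorflow as tf

WINDOW = 128        # samples per segment (illustrative time interval)
CHANNELS = 3        # tri-axial accelerometer
N_CLASSES = 4       # e.g. a few active/passive motion classes

def segment(signal: np.ndarray, window: int = WINDOW) -> np.ndarray:
    """Split a (T, C) signal into non-overlapping (N, window, C) segments."""
    n = signal.shape[0] // window
    return signal[: n * window].reshape(n, window, signal.shape[1])

def remove_gravity_avg(segments: np.ndarray) -> np.ndarray:
    """Approximate 'average gravity removal' by subtracting each window's mean."""
    return segments - segments.mean(axis=1, keepdims=True)

# Synthetic stand-in data: 200 windows of acceleration with random labels.
rng = np.random.default_rng(0)
raw = rng.normal(size=(200 * WINDOW, CHANNELS)) + 9.81 * np.array([0.0, 0.0, 1.0])
X = remove_gravity_avg(segment(raw))
y = rng.integers(0, N_CLASSES, size=X.shape[0])

# LSTM classifier over the motion segments, as in the abstract's final stage.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, CHANNELS)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))
```

In the paper, the LSTM input would be the OFNDA-reduced feature vectors per active/passive pattern rather than raw windows; the sketch feeds segments directly only to keep the example self-contained.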