Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313016342> ?p ?o ?g. }
- W4313016342 abstract "Human pose estimation from single images is a challenging problem that is typically solved by supervised learning. Unfortunately, labeled training data does not yet exist for many human activities since 3D annotation requires dedicated motion capture systems. Therefore, we propose an unsupervised approach that learns to predict a 3D human pose from a single image while only being trained with 2D pose data, which can be crowd-sourced and is already widely available. To this end, we estimate the 3D pose that is most likely over random projections, with the likelihood estimated using normalizing flows on 2D poses. While previous work requires strong priors on camera rotations in the training data set, we learn the distribution of camera angles which significantly improves the performance. Another part of our contribution is to stabilize training with normalizing flows on high-dimensional 3D pose data by first projecting the 2D poses to a linear subspace. We outperform the state-of-the-art unsupervised human pose estimation methods on the benchmark datasets Human3.6M and MPI-INF-3DHP in many metrics." @default.
- W4313016342 created "2023-01-05" @default.
- W4313016342 creator A5007777373 @default.
- W4313016342 creator A5032112040 @default.
- W4313016342 creator A5058949470 @default.
- W4313016342 date "2022-06-01" @default.
- W4313016342 modified "2023-10-06" @default.
- W4313016342 title "ElePose: Unsupervised 3D Human Pose Estimation by Predicting Camera Elevation and Learning Normalizing Flows on 2D Poses" @default.
- W4313016342 cites W1905368000 @default.
- W4313016342 cites W2072706645 @default.
- W4313016342 cites W2101032778 @default.
- W4313016342 cites W2404595106 @default.
- W4313016342 cites W2502928967 @default.
- W4313016342 cites W2519469348 @default.
- W4313016342 cites W2554247908 @default.
- W4313016342 cites W2557698284 @default.
- W4313016342 cites W2583372902 @default.
- W4313016342 cites W2583585015 @default.
- W4313016342 cites W2604375920 @default.
- W4313016342 cites W2605947573 @default.
- W4313016342 cites W2611932403 @default.
- W4313016342 cites W2612706635 @default.
- W4313016342 cites W2797184202 @default.
- W4313016342 cites W2798646183 @default.
- W4313016342 cites W2934361577 @default.
- W4313016342 cites W2962806941 @default.
- W4313016342 cites W2962833508 @default.
- W4313016342 cites W2962896489 @default.
- W4313016342 cites W2963441822 @default.
- W4313016342 cites W2963590054 @default.
- W4313016342 cites W2964016027 @default.
- W4313016342 cites W2964179555 @default.
- W4313016342 cites W2964221239 @default.
- W4313016342 cites W2968940310 @default.
- W4313016342 cites W2970285700 @default.
- W4313016342 cites W2981691949 @default.
- W4313016342 cites W2989465897 @default.
- W4313016342 cites W2997288107 @default.
- W4313016342 cites W3034482680 @default.
- W4313016342 cites W3034581612 @default.
- W4313016342 cites W3034884701 @default.
- W4313016342 cites W3035072447 @default.
- W4313016342 cites W3035416506 @default.
- W4313016342 cites W3035551320 @default.
- W4313016342 cites W3035581100 @default.
- W4313016342 cites W3095608836 @default.
- W4313016342 cites W3098473649 @default.
- W4313016342 cites W3103184573 @default.
- W4313016342 cites W3106165820 @default.
- W4313016342 cites W3167491448 @default.
- W4313016342 cites W3203211072 @default.
- W4313016342 cites W3203617912 @default.
- W4313016342 cites W4214517305 @default.
- W4313016342 cites W4214586188 @default.
- W4313016342 cites W4214684804 @default.
- W4313016342 cites W4214770715 @default.
- W4313016342 doi "https://doi.org/10.1109/cvpr52688.2022.00652" @default.
- W4313016342 hasPublicationYear "2022" @default.
- W4313016342 type Work @default.
- W4313016342 citedByCount "6" @default.
- W4313016342 countsByYear W43130163422022 @default.
- W4313016342 countsByYear W43130163422023 @default.
- W4313016342 crossrefType "proceedings-article" @default.
- W4313016342 hasAuthorship W4313016342A5007777373 @default.
- W4313016342 hasAuthorship W4313016342A5032112040 @default.
- W4313016342 hasAuthorship W4313016342A5058949470 @default.
- W4313016342 hasBestOaLocation W43130163422 @default.
- W4313016342 hasConcept C107673813 @default.
- W4313016342 hasConcept C119857082 @default.
- W4313016342 hasConcept C13280743 @default.
- W4313016342 hasConcept C153180895 @default.
- W4313016342 hasConcept C154945302 @default.
- W4313016342 hasConcept C177264268 @default.
- W4313016342 hasConcept C177769412 @default.
- W4313016342 hasConcept C185798385 @default.
- W4313016342 hasConcept C199360897 @default.
- W4313016342 hasConcept C205649164 @default.
- W4313016342 hasConcept C2776321320 @default.
- W4313016342 hasConcept C31972630 @default.
- W4313016342 hasConcept C32834561 @default.
- W4313016342 hasConcept C41008148 @default.
- W4313016342 hasConcept C51632099 @default.
- W4313016342 hasConcept C52102323 @default.
- W4313016342 hasConcept C8038995 @default.
- W4313016342 hasConceptScore W4313016342C107673813 @default.
- W4313016342 hasConceptScore W4313016342C119857082 @default.
- W4313016342 hasConceptScore W4313016342C13280743 @default.
- W4313016342 hasConceptScore W4313016342C153180895 @default.
- W4313016342 hasConceptScore W4313016342C154945302 @default.
- W4313016342 hasConceptScore W4313016342C177264268 @default.
- W4313016342 hasConceptScore W4313016342C177769412 @default.
- W4313016342 hasConceptScore W4313016342C185798385 @default.
- W4313016342 hasConceptScore W4313016342C199360897 @default.
- W4313016342 hasConceptScore W4313016342C205649164 @default.
- W4313016342 hasConceptScore W4313016342C2776321320 @default.
- W4313016342 hasConceptScore W4313016342C31972630 @default.
- W4313016342 hasConceptScore W4313016342C32834561 @default.
- W4313016342 hasConceptScore W4313016342C41008148 @default.
- W4313016342 hasConceptScore W4313016342C51632099 @default.
- W4313016342 hasConceptScore W4313016342C52102323 @default.
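
The listing above is the flattened { &lt;work&gt; ?p ?o ?g. } match set for W4313016342. As a minimal sketch (not part of the original listing), the same predicate/object pairs could be retrieved programmatically over SPARQL; the endpoint URL https://semopenalex.org/sparql is assumed from the SemOpenAlex project and should be verified before use, and the graph variable ?g from the pattern is simply dropped here.

```python
# Sketch: reproduce the predicate/object listing for one SemOpenAlex work.
# Assumption: the public SPARQL endpoint lives at https://semopenalex.org/sparql.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed SemOpenAlex SPARQL endpoint
WORK = "https://semopenalex.org/work/W4313016342"

# Every predicate/object pair attached to the work, mirroring { <work> ?p ?o ?g . }
# from the listing header (without the named-graph variable).
query = f"SELECT ?p ?o WHERE {{ <{WORK}> ?p ?o . }}"

response = requests.get(
    ENDPOINT,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Standard SPARQL 1.1 JSON results: rows live under results.bindings.
for binding in response.json()["results"]["bindings"]:
    predicate = binding["p"]["value"]
    obj = binding["o"]["value"]
    print(f"{predicate}  {obj}")
```

Each printed row corresponds to one bullet above (abstract, creators, cited works, concepts, concept scores, and so on), with full URIs in place of the shortened identifiers shown in the listing.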