Matches in SemOpenAlex for { <https://semopenalex.org/work/W2916401830> ?p ?o ?g. }
- W2916401830 abstract "Character animation plays an essential role in feature film and computer games. Manually creating character animation is both tedious and inefficient, which is why motion capture (MoCap) techniques have been developed and have become the most popular method for producing realistic character animation. Commercial MoCap systems are expensive, and the capturing process itself usually requires an indoor studio environment. Procedural animation often lacks extensive user control during the generation process. Therefore, efficiently and effectively reusing MoCap data can bring significant benefits, which has motivated wide research into machine-learning-based MoCap data processing. A typical MoCap data reuse workflow can be divided into three stages: data capture, data management and data reuse. There are still many challenges at each stage. For instance, data capture and management often suffer from data quality problems, and efficient, effective retrieval methods are in demand due to the large amount of data involved. In addition, classification and understanding of actions are the fundamental basis of data reuse. This thesis proposes to apply machine learning to MoCap data for reuse purposes and designs a framework for motion capture data processing. The modular design of this framework enables motion data refinement, retrieval and recognition. The first part of this thesis reviews methods used in existing motion capture processing approaches in the literature and briefly introduces the relevant machine learning methods used in this framework. In general, frameworks related to refinement, retrieval and recognition are discussed. A motion refinement algorithm based on dictionary learning is then presented, which exploits kinematic structural and temporal information. The designed optimization method and data preprocessing technique ensure smoothness in the recovered result. After that, a motion refinement algorithm based on matrix completion is presented, where the low-rank property and spatio-temporal information are exploited. This model does not require preparing training data. The designed optimization method outperforms existing approaches in terms of both effectiveness and efficiency. A motion retrieval method based on multi-view feature selection is also proposed, where the intrinsic relations between visual words in each motion feature subspace are discovered as a means of improving retrieval performance. A provisional trace-ratio objective function and an iterative optimization method are also included. A non-negative matrix factorization based motion data clustering method is proposed for recognition purposes, aimed at large-scale unsupervised/semi-supervised problems. In addition, deep learning models are used for motion data recognition, e.g. 2D gait recognition and 3D MoCap recognition. To sum up, research on motion data refinement, retrieval and recognition is presented in this thesis with the aim of tackling the major challenges in motion reuse. The proposed motion refinement methods aim to provide high-quality, clean motion data for downstream applications. The designed multi-view feature selection algorithm aims to improve motion retrieval performance. The proposed motion recognition methods are equally essential for motion understanding.
A collection of publications by the author of this thesis is noted in the publications section." @default.
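The abstract describes a matrix-completion refinement step that exploits the low-rank structure of motion data without requiring training data. As a minimal sketch of that general idea only, the snippet below uses a plain singular value thresholding (SVT) iteration to fill in missing entries of a frames-by-channels motion matrix; the function name, hyperparameters, and the omission of the thesis's spatio-temporal term are all assumptions, not the author's actual method.

```python
# Hypothetical sketch: low-rank matrix completion for MoCap refinement via
# singular value thresholding. Not the thesis's algorithm; it ignores the
# spatio-temporal regularization the abstract mentions.
import numpy as np

def complete_motion(M, observed_mask, tau=5.0, step=1.2, iters=200):
    """Recover missing entries of a frames x channels motion matrix.

    M             : motion matrix with zeros at missing entries
    observed_mask : boolean array, True where M is reliably observed
    tau, step     : shrinkage threshold and step size (illustrative values)
    """
    Y = np.zeros_like(M, dtype=float)
    X = np.zeros_like(M, dtype=float)
    for _ in range(iters):
        # Shrink the singular values of the dual iterate to promote low rank.
        U, s, Vt = np.linalg.svd(Y, full_matrices=False)
        X = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
        # Only the observed entries constrain the reconstruction.
        residual = observed_mask * (M - X)
        Y += step * residual
        if np.linalg.norm(residual) <= 1e-6 * np.linalg.norm(observed_mask * M):
            break
    return X

# Toy usage: a rank-5 "motion" matrix with 20% of its entries missing.
rng = np.random.default_rng(0)
true_motion = rng.standard_normal((120, 5)) @ rng.standard_normal((5, 60))
mask = rng.random(true_motion.shape) > 0.2
recovered = complete_motion(np.where(mask, true_motion, 0.0), mask)
print(np.linalg.norm(recovered - true_motion) / np.linalg.norm(true_motion))
```

This sketch only illustrates why no training data is needed: the low-rank prior alone, enforced on the observed entries, is enough to reconstruct the gaps.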
- W2916401830 created "2019-03-02" @default.
- W2916401830 creator A5072097629 @default.
- W2916401830 date "2018-07-10" @default.
- W2916401830 modified "2023-09-27" @default.
- W2916401830 title "Motion capture data processing, retrieval and recognition" @default.
- W2916401830 cites W108943329 @default.
- W2916401830 cites W109378472 @default.
- W2916401830 cites W1479822238 @default.
- W2916401830 cites W1489548117 @default.
- W2916401830 cites W1492807824 @default.
- W2916401830 cites W1531492092 @default.
- W2916401830 cites W1533861849 @default.
- W2916401830 cites W1540377506 @default.
- W2916401830 cites W1551558534 @default.
- W2916401830 cites W1587285176 @default.
- W2916401830 cites W1596216457 @default.
- W2916401830 cites W1607637737 @default.
- W2916401830 cites W1686810756 @default.
- W2916401830 cites W173050375 @default.
- W2916401830 cites W1735317348 @default.
- W2916401830 cites W1736339626 @default.
- W2916401830 cites W1902027874 @default.
- W2916401830 cites W1904365287 @default.
- W2916401830 cites W1950788856 @default.
- W2916401830 cites W1952664243 @default.
- W2916401830 cites W1964792465 @default.
- W2916401830 cites W1965274140 @default.
- W2916401830 cites W1966555152 @default.
- W2916401830 cites W1967554269 @default.
- W2916401830 cites W1971358173 @default.
- W2916401830 cites W1971756996 @default.
- W2916401830 cites W1972508834 @default.
- W2916401830 cites W1976709621 @default.
- W2916401830 cites W1977273805 @default.
- W2916401830 cites W1977871568 @default.
- W2916401830 cites W1979378996 @default.
- W2916401830 cites W1979545636 @default.
- W2916401830 cites W1981271712 @default.
- W2916401830 cites W1983592444 @default.
- W2916401830 cites W1984219317 @default.
- W2916401830 cites W1984776140 @default.
- W2916401830 cites W1985469025 @default.
- W2916401830 cites W1986931325 @default.
- W2916401830 cites W1988608780 @default.
- W2916401830 cites W1989665047 @default.
- W2916401830 cites W1989696410 @default.
- W2916401830 cites W1991825408 @default.
- W2916401830 cites W1994454889 @default.
- W2916401830 cites W1995971015 @default.
- W2916401830 cites W1997695329 @default.
- W2916401830 cites W1998808035 @default.
- W2916401830 cites W1999789440 @default.
- W2916401830 cites W2001958562 @default.
- W2916401830 cites W2006761437 @default.
- W2916401830 cites W2008765809 @default.
- W2916401830 cites W2008824967 @default.
- W2916401830 cites W2009501510 @default.
- W2916401830 cites W2009702064 @default.
- W2916401830 cites W2010399676 @default.
- W2916401830 cites W2019669448 @default.
- W2916401830 cites W2019863495 @default.
- W2916401830 cites W2020836902 @default.
- W2916401830 cites W2021150171 @default.
- W2916401830 cites W2022013167 @default.
- W2916401830 cites W2022508996 @default.
- W2916401830 cites W2023866509 @default.
- W2916401830 cites W2024082504 @default.
- W2916401830 cites W2027387938 @default.
- W2916401830 cites W2028931980 @default.
- W2916401830 cites W2029343251 @default.
- W2916401830 cites W2029903115 @default.
- W2916401830 cites W2030061193 @default.
- W2916401830 cites W2032561514 @default.
- W2916401830 cites W2032893275 @default.
- W2916401830 cites W2033310064 @default.
- W2916401830 cites W203345490 @default.
- W2916401830 cites W2033777009 @default.
- W2916401830 cites W2034328688 @default.
- W2916401830 cites W2040270931 @default.
- W2916401830 cites W2042701174 @default.
- W2916401830 cites W2043216674 @default.
- W2916401830 cites W2043545458 @default.
- W2916401830 cites W2045405869 @default.
- W2916401830 cites W2045677161 @default.
- W2916401830 cites W2046658845 @default.
- W2916401830 cites W2048821851 @default.
- W2916401830 cites W2049077434 @default.
- W2916401830 cites W2052648922 @default.
- W2916401830 cites W2053186076 @default.
- W2916401830 cites W2056488622 @default.
- W2916401830 cites W2058001207 @default.
- W2916401830 cites W2059021619 @default.
- W2916401830 cites W2067502459 @default.
- W2916401830 cites W2072356364 @default.
- W2916401830 cites W2075506710 @default.
- W2916401830 cites W2076837173 @default.
- W2916401830 cites W2080592425 @default.
- W2916401830 cites W2082106313 @default.
- W2916401830 cites W2083382040 @default.