Interactive video-realistic character animation from 4D performance capture

Recent advances in surface performance capture have demonstrated highly realistic reconstruction of human motion sequences acquired in a multi-camera studio. Free-viewpoint rendering techniques allow video-realistic replay of captured motions with interactive viewpoint control. However, current approaches do not provide methods for real-time motion and appearance manipulation, which hinders the reuse of captured data. Non-sequential temporal alignment techniques enable the conversion of an unstructured set of 3D videos into temporally consistent mesh sequences with shared topology and vertex correspondence, known as 4D videos.

The work presented in this thesis aims to develop methods to exploit 4D video datasets of human motion. In particular, a framework is investigated for real-time interactive control of a 4D video character created by combining multiple captured sequences. Our goal is to provide methods for video-realistic character animation and rendering with the flexibility and real-time interactive control associated with conventional skeleton-driven animation.

An approach for parametric motion control of mesh sequences is introduced by blending multiple semantically related 4D video motions. This enables real-time interactive control using high-level parameters, such as speed and direction for a walk motion. A hybrid non-linear mesh blending method is introduced that provides both realistic deformations and real-time performance, allowing parametric mesh spaces to be built at run time. A novel graph representation, referred to as the 4D Parametric Motion Graph (4DPMG), is presented to encapsulate multiple independent parametric motion spaces and to transition between them whilst maintaining natural non-rigid surface motion. 4DPMGs provide low-latency online path optimisation for transitions between parametric spaces of 4D video motions, enabling responsive interactive character control with a large range of motions.
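The core idea of parametric motion control — mapping a high-level parameter such as walking speed to a blend between temporally aligned mesh sequences — can be sketched as follows. This is a minimal illustration using simple linear vertex interpolation; the function names, signatures, and the linear scheme are assumptions for exposition (the thesis itself uses a hybrid non-linear blending method for realistic deformation).

```python
import numpy as np

def blend_aligned_meshes(verts_a, verts_b, w):
    """Interpolate per-vertex positions of two meshes that share topology
    and vertex correspondence, as in a pair of 4D video frames.
    verts_a, verts_b: (V, 3) vertex arrays; w: blend weight in [0, 1]."""
    return (1.0 - w) * verts_a + w * verts_b

def parametric_frame(seq_a, seq_b, t, speed, speed_a, speed_b):
    """Synthesise frame t of a parametric motion: map a high-level speed
    parameter to a blend weight between a slow walk (seq_a, captured at
    speed_a) and a fast walk (seq_b, captured at speed_b)."""
    w = np.clip((speed - speed_a) / (speed_b - speed_a), 0.0, 1.0)
    return blend_aligned_meshes(seq_a[t], seq_b[t], w)
```

Because the sequences share vertex correspondence after non-sequential temporal alignment, the blend is a cheap per-vertex operation, which is what makes real-time construction of parametric mesh spaces feasible.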
The final piece of the puzzle, enabling video-realistic animation, is provided by 4D Video Textures (4DVT), a new approach for free-viewpoint appearance manipulation that maintains the visual realism of the source video data. 4DVT enables video-realistic rendering of novel motions by combining multiple 4D video textures to synthesise plausible dynamic surface appearance. View-dependent optical flow is used for online alignment of parametric appearance from multiple views. The research presented in this thesis demonstrates, for the first time, video-realistic interactive character animation from 4D actor performance capture. This represents a new approach to animated character production that achieves video-quality rendering without requiring highly skilled manual authoring.
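The view-dependent combination of source appearance can be illustrated with a simple angular-proximity weighting of per-camera textures. The names and the cosine weighting below are assumptions for illustration only; 4DVT additionally aligns the textures with view-dependent optical flow before blending, which this sketch omits.

```python
import numpy as np

def view_dependent_weights(view_dir, cam_dirs):
    """Weight each source camera by how closely its viewing direction
    matches the requested render viewpoint (cosine weighting).
    view_dir: (3,) render direction; cam_dirs: (C, 3) camera directions."""
    v = view_dir / np.linalg.norm(view_dir)
    c = cam_dirs / np.linalg.norm(cam_dirs, axis=1, keepdims=True)
    w = np.maximum(c @ v, 0.0)  # ignore cameras facing away from the view
    s = w.sum()
    return w / s if s > 0 else np.full(len(w), 1.0 / len(w))

def blend_textures(textures, weights):
    """Weighted average of per-camera texture maps.
    textures: (C, H, W, 3) stack; weights: (C,) normalised weights."""
    return np.tensordot(weights, textures, axes=1)
```

As the viewpoint moves, the weights shift smoothly between cameras, so the rendered appearance stays anchored to the nearest source video views.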

Identifier oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:616893
Date January 2014
Creators Guix, Dan Casas
Publisher University of Surrey
Source Sets Ethos UK
Detected Language English
Type Electronic Thesis or Dissertation