
Intuitive Generation of Realistic Motions for Articulated Human Characters

A long-standing goal in computer graphics is to create and control realistic motion for virtual human characters. Despite the progress made over the last decade, it remains challenging to design a system that allows novice users to intuitively create and control lifelike human motions. This dissertation explores theory, algorithms, and applications that enable novice users to quickly and easily create and control natural-looking motions, including both full-body movement and hand articulation, for human characters.

More specifically, the goals of this research are: (1) to investigate generative statistical models and physics-based dynamic models that accurately predict how humans move, and (2) to demonstrate the utility of our motion models in a wide range of applications, including motion analysis, synthesis, editing, and acquisition.

We have developed two novel generative statistical models from prerecorded motion data and demonstrated their use in real-time motion editing, online motion control, offline animation design, and motion data processing. In addition, we have explored how to model subtle contact phenomena in dexterous hand grasping and manipulation using physics-based dynamic models. We show, for the first time, how to capture physically realistic hand manipulation data from ambiguous image data obtained with video cameras.

Identifier: oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/149245
Date: 02 October 2013
Creators: Min, Jianyuan
Contributors: Chai, Jinxiang; Keyser, John; Schaefer, Scott; Hurtado, John E.
Source Sets: Texas A and M University
Detected Language: English
Type: Thesis, text
Format: application/pdf