The goal of this thesis is to synthesize believable motions of a
character interacting with its surroundings and manipulating objects through physical contacts and forces. Human-like autonomous avatars are in increasing demand in areas such as entertainment, education, and health care. Yet modeling the basic human motor skills of locomotion and manipulation remains a long-standing challenge in animation research. Seemingly simple tasks such as navigating uneven terrain or grasping cups of different shapes involve planning with complex kinematic and physical constraints, as well as adaptation to unexpected perturbations. Moreover, natural movements exhibit unique personal characteristics that are difficult to model. Although motion capture technologies allow virtual actors to reuse recorded human motions in many applications, the recorded motions are not directly applicable to tasks involving interaction, for two reasons. First, the acquired data cannot be easily adapted to new environments or different task goals. Second, acquiring accurate data for fine-scale object manipulation remains a challenge. In this work, we use captured data to create natural-looking animations and compensate for its deficiencies with physics-based simulation and numerical optimization.
We develop algorithms based on a single reference motion for three types of control problems. The first problem focuses on motions without contact constraints. We use joint torque patterns identified from the captured motion to simulate responses to unexpected pushes and recoveries in the same style. The second problem focuses on locomotion with foot contacts. We use contact forces to control an abstract dynamic model of the center of mass, which sufficiently describes the locomotion task in the input motion. Simulating the abstract model under unexpected pushes or anticipated changes in the environment produces responses consistent with both the laws of physics and the style of the input. The third problem focuses on fine-scale object manipulation tasks, in which accurate finger motions and contact information are not available. We propose a sampling method that discovers contact relations between the hand and the object from only the gross motion of the wrists and the object. We then use the resulting abundance of contact constraints to synthesize detailed finger motions. The algorithm creates finger motions in various styles for a diverse set of object shapes and tasks, including ones not present at capture time.
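To give a concrete sense of the abstract dynamic model used in the second problem, the sketch below simulates a point mass standing in for the character's center of mass, driven by gravity, foot contact forces, and an optional push. It is only a minimal illustration under simplifying assumptions, not the controller developed in the thesis; the function name step_com, the mass, the time step, and the force values are all invented for the example.

```python
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])  # gravitational acceleration (m/s^2)

def step_com(position, velocity, contact_forces, mass=60.0, dt=1.0 / 120.0):
    """Advance the COM state by one time step with explicit Euler integration.

    contact_forces: list of 3D force vectors currently acting on the COM,
    i.e. the foot contact forces plus any external perturbation such as a push.
    """
    net_force = mass * GRAVITY + np.sum(contact_forces, axis=0)
    acceleration = net_force / mass
    velocity = velocity + dt * acceleration
    position = position + dt * velocity
    return position, velocity

# Example: a standing character receives an unexpected horizontal push.
pos = np.array([0.0, 1.0, 0.0])                # COM roughly at hip height (m)
vel = np.zeros(3)
support = [np.array([0.0, 60.0 * 9.81, 0.0])]  # foot forces supporting body weight
push = [np.array([50.0, 0.0, 0.0])]            # external perturbation (N)

pos, vel = step_com(pos, vel, support + push)
```

In the thesis, the contact forces serve as the control inputs to this kind of abstract model, so that simulating it under pushes or environment changes yields center-of-mass trajectories consistent with both physics and the style of the input motion.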
The three algorithms together control an autonomous character with dexterous hands that interacts naturally with a virtual world. Our methods are general and robust across character structures and motion content, as demonstrated on a wide variety of motion capture sequences and environments. The work in this thesis brings the motor skills of a virtual character closer to those of its human counterpart. It provides computational tools for the analysis of human biomechanics and can potentially inspire the design of novel control algorithms for humanoid robots.
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/47540
Date | 23 February 2012
Creators | Ye, Yuting |
Publisher | Georgia Institute of Technology |
Type | Dissertation |