
Relationship descriptors for interactive motion adaptation

In this thesis we present an interactive motion adaptation scheme for close interactions between skeletal characters and mesh structures, such as navigating restricted environments and manipulating tools. We propose a new spatial-relationship-based representation for encoding character-object interactions: the kinematics of the body parts are described by weighted sums of vectors relative to descriptor points selectively sampled over the scene. In contrast to previous discrete representations, which either handle only static spatial relationships or require costly offline optimization, our continuous framework smoothly adapts the motion of a character to deformations in the objects and to changes in character morphology in real time, whilst preserving the original context and style of the scene. We demonstrate the strength of working in our relationship-descriptor space for motion editing under large environment deformations by integrating procedural animation techniques, such as repositioning the contacts of an interaction whilst preserving the context and style of the original animation. Furthermore, we propose a method for adapting animations from template objects to novel ones by solving for mappings between the two in our relationship-descriptor space. This effectively transfers an entire motion from one object to another of different geometry whilst ensuring continuity across all frames of the animation, as opposed to mapping static poses only, as is traditionally done. The experimental results show that our method supports a wide range of applications, including motion retargeting for dynamically changing scenes, multi-character interactions, interactive character control, and deformation transfer for scenes that involve close interactions. We further demonstrate a key use case in convincingly retargeting locomotion to uneven terrains and curving paths for both bipeds and quadrupeds.
Our framework is useful for artists who need to design animated scenes interactively, and for modern computer games that let users design their own virtual characters, objects, and environments: existing motion data can be recycled across a large variety of configurations without manually reconfiguring motion from scratch or storing expensive combinations of animations in memory. Most importantly, this is achieved in real time.
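The thesis does not give implementation details here, but the core idea of the representation can be sketched as follows. In this illustrative example (the function names, NumPy usage, and the choice of normalised inverse-distance weights are my assumptions, not the thesis's actual formulation), a joint position is encoded as weights and offset vectors relative to descriptor points sampled on the scene, and is then reconstructed as the weighted sum of the deformed descriptors plus their stored relative vectors:

```python
import numpy as np

def encode(joint, descriptors, eps=1e-8):
    """Encode a joint position as normalised weights plus offset
    vectors relative to descriptor points (inverse-distance weighting
    is an assumed choice for this sketch)."""
    offsets = joint - descriptors              # vectors from descriptors to joint
    dists = np.linalg.norm(offsets, axis=1)
    weights = 1.0 / (dists + eps)
    weights /= weights.sum()                   # weights sum to one
    return weights, offsets

def adapt(weights, offsets, deformed_descriptors):
    """Reconstruct the joint as the weighted sum of the (possibly
    deformed) descriptor points plus their stored relative vectors."""
    return (weights[:, None] * (deformed_descriptors + offsets)).sum(axis=0)

# Descriptor points sampled on a surface, with a joint above them.
descriptors = np.array([[0.0, 0.0, 0.0],
                        [1.0, 0.0, 0.0],
                        [0.0, 1.0, 0.0]])
joint = np.array([0.3, 0.3, 0.5])
w, off = encode(joint, descriptors)

# With the scene undeformed, the joint is reproduced exactly.
assert np.allclose(adapt(w, off, descriptors), joint)

# Translating the scene carries the joint along with it.
shift = np.array([0.0, 0.0, 0.2])
assert np.allclose(adapt(w, off, descriptors + shift), joint + shift)
```

Because the reconstruction is a smooth function of the descriptor positions, the encoded pose follows deformations of the scene continuously, which is the property the abstract highlights for real-time adaptation.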

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:738967
Date: January 2017
Creators: Al-Ashqar, Rami
Contributors: Komura, Taku; Vijayakumar, Sethu
Publisher: University of Edinburgh
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://hdl.handle.net/1842/29008