1

Modeling and animating realistic faces from images /

Pighin, Frédéric. January 1999 (has links)
Thesis (Ph. D.)--University of Washington, 1999. / Vita. Includes bibliographical references (p. 93-102).
2

Expressive facial animation transfer for virtual actors /

Zhao, Hui. January 2007 (has links)
Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2007. / Includes bibliographical references (leaves 37-41). Also available in electronic version.
3

Making FACES: the Facial Animation, Construction and Editing System

Patel, Manjula January 1991 (has links)
The human face is a fascinating but extremely complex object; the research project described is concerned with the computer generation and animation of faces. However, the age-old captivation with the face transforms into a major obstacle when creating synthetic faces. The face and head are the most visible attributes of a person. We master the skills of recognising faces and interpreting facial movement at a very early age. As a result, we are likely to notice the smallest deviation from our concept of how a face should appear and behave. Computer animation, in general, is often perceived to be "wooden" and very "rigid"; the aim is therefore to provide facilities for the generation of believable faces and convincing facial movement. The major issues addressed within the project concern the modelling of a large variety of faces and their animation. Computer modelling of arbitrary faces is an area that has received relatively little attention in comparison with the animation of faces. Another problem that has been considered is that of providing the user with adequate and effective control over the modelling and animation of the face. The Facial Animation, Construction and Editing System, or FACES, was conceived as a system for investigating these issues. A promising approach is to look a little deeper than the surface of the skin. A three-layer anatomical model of the head, which incorporates bone, muscle, skin and surface features, has been developed. As well as serving as a foundation which integrates all the facilities available within FACES, the advantage of the model is that it allows differing strategies to be used for modelling and animation. FACES is an interactive system which helps with both the generation and animation of faces while hiding the structural complexities of the face from the user. The software consists of four sub-systems: CONSTRUCT and MODIFY cater for modelling functionality, while ANIMATE allows animation sequences to be generated and RENDER provides for shading and motion evaluation.
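The layered bone/muscle/skin structure described in this abstract lends itself to a simple data-structure sketch. The Python below is a minimal, hypothetical illustration of one way such a model with named muscles might be organised; the class names, fields, and the toy deformation rule are assumptions made for illustration, not the FACES implementation.

```python
# Hypothetical sketch of a three-layer anatomical head model (bone, muscle,
# skin, plus surface features). Not the FACES source code.
from dataclasses import dataclass, field

import numpy as np


@dataclass
class Muscle:
    name: str                      # e.g. "zygomaticus major" (hypothetical)
    bone_attachment: np.ndarray    # (3,) fixed end on the bone layer
    skin_attachment: np.ndarray    # (3,) mobile end on the skin layer
    contraction: float = 0.0       # 0.0 = relaxed


@dataclass
class HeadModel:
    """Bone, muscle and skin layers plus surface features (eyes, lips, ...)."""
    bone_vertices: np.ndarray                  # rigid reference geometry
    skin_vertices: np.ndarray                  # deformable surface mesh
    muscles: list[Muscle] = field(default_factory=list)
    features: dict[str, np.ndarray] = field(default_factory=dict)

    def animate(self, activations: dict[str, float]) -> np.ndarray:
        """Return skin vertices deformed by the given muscle activations.

        The rule below (pull the nearest skin vertex toward the bone
        attachment in proportion to contraction) is a placeholder, not the
        deformation scheme used in FACES.
        """
        skin = self.skin_vertices.copy()
        for m in self.muscles:
            m.contraction = activations.get(m.name, 0.0)
            pull = m.contraction * (m.bone_attachment - m.skin_attachment)
            idx = np.linalg.norm(skin - m.skin_attachment, axis=1).argmin()
            skin[idx] += pull
        return skin
```

Separating the layers this way is what lets differing strategies be plugged in for modelling versus animation, as the abstract notes.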
4

Morphable 3D Facial Animation Based On Thin Plate Splines

Erdogdu, Aysu 01 May 2010 (has links) (PDF)
The aim of this study is to present a novel three-dimensional (3D) facial animation method for morphing emotions and facial expressions from one face model to another. For this purpose, smooth and realistic face models were animated with thin plate splines (TPS). Neutral face models were animated and compared with the actual expressive face models. Neutral and expressive face models were obtained from subjects via a 3D face scanner. The face models were preprocessed for pose and size normalization. Then muscle and wrinkle control points were placed on the source face with a neutral expression according to human anatomy. The Facial Action Coding System (FACS) was used to determine the control points and the face regions in the underlying model. The final positions of the control points after a facial expression were obtained from the expressive scan data of the source face. Afterwards, the control points were transferred to the target face using the facial landmarks and TPS as the morphing function. Finally, the neutral target face was animated with the control points by TPS. In order to visualize the method, face scans with expressions composed of a selected subset of action units found in the Bosphorus Database were used. Five lower-face and three upper-face action units were simulated in this study. For the experimental results, the facial expressions were created on the 3D neutral face scan data of a human subject and the synthetic faces were compared to the subject's actual 3D scan data with the same facial expressions taken from the dataset.
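The control-point transfer and TPS warping steps described in this abstract can be illustrated with a short sketch. TPS is the interpolant with minimal bending energy that passes exactly through the control points, which is why it is a common choice for landmark-driven warps. The code below uses SciPy's thin-plate-spline radial basis interpolator to warp a neutral mesh from control-point displacements; the function name, array shapes, and the use of SciPy are assumptions for illustration, not the thesis's implementation.

```python
# Minimal sketch of TPS-based expression transfer, assuming SciPy >= 1.7.
import numpy as np
from scipy.interpolate import RBFInterpolator


def tps_expression_transfer(ctrl_neutral, ctrl_expressive, target_vertices):
    """Warp a neutral target mesh using control-point displacements.

    ctrl_neutral    : (K, 3) control points on the neutral source face
    ctrl_expressive : (K, 3) the same points in the expressive scan
    target_vertices : (N, 3) vertices of the neutral target mesh
    """
    # Fit a TPS mapping neutral control points to their expressive positions.
    warp = RBFInterpolator(ctrl_neutral, ctrl_expressive,
                           kernel="thin_plate_spline")
    # Apply the smooth warp to every vertex of the target face.
    return warp(target_vertices)


# Hypothetical usage with random data standing in for scan data:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ctrl_n = rng.random((20, 3))
    ctrl_e = ctrl_n + 0.01 * rng.standard_normal((20, 3))
    verts = rng.random((500, 3))
    print(tps_expression_transfer(ctrl_n, ctrl_e, verts).shape)  # (500, 3)
```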
