1. Intensity-based methodologies for facial expression recognition. January 2001
by Hok Chun Lo.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2001.
Includes bibliographical references (leaves 136-143).
Abstracts in English and Chinese.

Contents:
LIST OF FIGURES
LIST OF TABLES
Chapter 1. INTRODUCTION
Chapter 2. PREVIOUS WORK ON FACIAL EXPRESSION RECOGNITION
  2.1. Active Deformable Contour
  2.2. Facial Feature Points and B-spline Curve
  2.3. Optical Flow Approach
  2.4. Facial Action Coding System
  2.5. Neural Network
Chapter 3. EIGEN-ANALYSIS BASED METHOD FOR FACIAL EXPRESSION RECOGNITION
  3.1. Related Topics on the Eigen-Analysis Based Method
    3.1.1. Terminologies
    3.1.2. Principal Component Analysis
    3.1.3. Significance of Principal Component Analysis
    3.1.4. Graphical Presentation of the Idea of Principal Component Analysis
  3.2. EigenFace Method for Face Recognition
  3.3. Eigen-Analysis Based Method for Facial Expression Recognition
    3.3.1. Person-Dependent Database
    3.3.2. Direct Adoption of the EigenFace Method
    3.3.3. Multiple Subspaces Method
  3.4. Detailed Description of Our Approaches
    3.4.1. Database Formation
      a. Conversion of Image to Column Vector
      b. Preprocessing: Scale Regulation, Orientation Regulation and Cropping
      c. Scale Regulation
      d. Orientation Regulation
      e. Cropping of Images
      f. Calculation of Expression Subspace for the Direct Adoption Method
      g. Calculation of Expression Subspace for the Multiple Subspaces Method
    3.4.2. Recognition Process for the Direct Adoption Method
    3.4.3. Recognition Process for the Multiple Subspaces Method
      a. Intensity Normalization Algorithm
      b. Matching
  3.5. Experimental Results and Analysis
Chapter 4. DEFORMABLE TEMPLATE MATCHING SCHEME FOR FACIAL EXPRESSION RECOGNITION
  4.1. Background Knowledge
    4.1.1. Camera Model
      a. Pinhole Camera Model and Perspective Projection
      b. Orthographic Camera Model
      c. Affine Camera Model
    4.1.2. View Synthesis
      a. Technical Issues of View Synthesis
  4.2. View Synthesis Technique for Facial Expression Recognition
    4.2.1. From View Synthesis Technique to Template Deformation
  4.3. Database Formation
    4.3.1. Person-Dependent Database
    4.3.2. Model Image Acquisition
    4.3.3. Templates' Structure and Formation Process
    4.3.4. Selection of Warping Points and Template Anchor Points
      a. Selection of Warping Points
      b. Selection of Template Anchor Points
  4.4. Recognition Process
    4.4.1. Solving the Warping Equation
    4.4.2. Template Deformation
    4.4.3. Template from Input Images
    4.4.4. Matching
  4.5. Implementation of the Automation System
    4.5.1. Kalman Filter
    4.5.2. Using the Kalman Filter for Tracking in Our System
    4.5.3. Limitations
  4.6. Experimental Results and Analysis
Chapter 5. CONCLUSION AND FUTURE WORK
APPENDIX
  I. Image Sample 1
  II. Image Sample 2
  III. Image Sample 3
  IV. Image Sample 4
BIBLIOGRAPHY
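Chapters 3.2-3.4 of this record revolve around projecting face images onto a PCA "expression subspace" and matching within that subspace. As a rough, hypothetical illustration of that general idea only (not code from the thesis; the function names and toy data below are invented), a minimal eigen-analysis sketch in Python/NumPy might look like this:

    import numpy as np

    def build_expression_subspace(train_images, n_components):
        # train_images: (n_samples, n_pixels) array of flattened, preprocessed faces.
        mean_face = train_images.mean(axis=0)
        centered = train_images - mean_face
        # Rows of vt are the principal components ("eigenfaces") of the training set.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        components = vt[:n_components]       # (n_components, n_pixels)
        coords = centered @ components.T     # training coordinates in the subspace
        return mean_face, components, coords

    def classify_expression(image, mean_face, components, coords, labels):
        # Project a flattened face into the subspace and return the label of the
        # nearest training sample (plain Euclidean distance).
        projection = (image - mean_face) @ components.T
        distances = np.linalg.norm(coords - projection, axis=1)
        return labels[int(np.argmin(distances))]

    # Toy usage: random vectors stand in for cropped, intensity-normalized faces.
    rng = np.random.default_rng(0)
    train = rng.random((12, 64 * 64))
    labels = ["neutral", "happy", "sad", "surprise"] * 3
    mean_face, components, coords = build_expression_subspace(train, n_components=6)
    print(classify_expression(train[5], mean_face, components, coords, labels))

As the outline indicates, a real system would first scale, orient, crop and intensity-normalize the faces, use a person-dependent database, and (in the multiple-subspaces variant) keep more than one subspace rather than the single toy subspace above.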
2. Physics-based facial modeling and animation. January 2002
by Leung Hoi-Chau.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2002.
Includes bibliographical references (leaves 70-71).
Abstracts in English and Chinese.

Contents:
Chapter 1. Introduction
Chapter 2. Previous Work
  2.1. Facial animations and facial surgery simulations
  2.2. Facial Action Coding System (FACS)
  2.3. The Boundary Element Method (BEM) in Computer Graphics
Chapter 3. The Facial Expression System
  3.1. Input to the system
    3.1.1. Orientation requirements for the input mesh
    3.1.2. Topology requirements for the input mesh
    3.1.3. Type of the polygons of the facial mesh
  3.2. Facial Modeling and Feature Recognition
  3.3. User Control
  3.4. Output of the system
Chapter 4. Boundary Element Method (BEM)
  4.1. Numerical integration of the kernels
    4.1.1. P and Q are different
    4.1.2. P and Q are identical
      4.1.2.1. Evaluation of the Singular Traction Kernel
      4.1.2.2. Evaluation of the Singular Displacement Kernel
  4.2. Assembling the stiffness matrix
Chapter 5. Facial Modeling
  5.1. Offset of the facial mesh
  5.2. Thickening of the Face Contour
Chapter 6. Facial Feature Recognition
  6.1. Extracting all contour edges from the facial mesh
  6.2. Separating different holes from the contour edges
  6.3. Locating the bounding boxes of different holes
  6.4. Determining the facial features
    6.4.1. Eye positions
    6.4.2. Mouth position and Face
    6.4.3. Nose position
    6.4.4. Skull position
Chapter 7. Boundary Conditions in the system
  7.1. Facial Muscles
  7.2. Skull Bone
  7.3. Facial Muscle Recognition
    7.3.1. Locating muscle-definers
    7.3.2. Locating muscles
  7.4. Skull Bone Recognition
  7.5. Refining the bounding regions of the facial features
  7.6. Adding/Removing facial muscles
Chapter 8. Muscle Movement
  8.1. Muscle contraction
  8.2. Muscle relaxation
  8.3. The muscle sliders
Chapter 9. Pre-computation
  9.1. Changing the Boundary Values
Chapter 10. Implementation
  10.1. Data structure for the facial mesh
  10.2. Implementation of the BEM engine
  10.3. Facial modeling and facial recognition
Chapter 11. Results
  11.1. Example 1 (low-polygon man face)
  11.2. Example 2 (girl face)
  11.3. Example 3 (man face)
  11.4. System evaluation
Chapter 12. Conclusions
References
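Chapters 4, 8 and 9 of this record describe assembling a BEM stiffness matrix, driving it with muscle and skull boundary conditions, and pre-computing work so that moving a muscle slider stays cheap. As a loose, hypothetical illustration of that precomputation pattern only (not the thesis's formulation; the matrix below is a random symmetric positive-definite stand-in), one might factor the system once and re-solve for each new set of boundary values:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 50

    # Random symmetric positive-definite matrix standing in for an assembled
    # BEM stiffness matrix relating boundary values to surface displacements.
    a = rng.random((n, n))
    stiffness = a @ a.T + n * np.eye(n)

    # Pre-computation: factor the matrix once (Cholesky), before any interaction.
    lower = np.linalg.cholesky(stiffness)

    def solve_displacements(boundary_values):
        # Solve K x = b by forward and back substitution against the cached factor.
        y = np.linalg.solve(lower, boundary_values)
        return np.linalg.solve(lower.T, y)

    # Each muscle-slider change only builds a new right-hand side; the expensive
    # factorization above is reused unchanged.
    for contraction in (0.0, 0.5, 1.0):
        b = contraction * rng.random(n)
        x = solve_displacements(b)
        print(f"contraction={contraction:.1f}  |x| = {np.linalg.norm(x):.4f}")

The point of the sketch is the division of labour the outline suggests: the expensive factorization happens once up front, and each slider change only alters the right-hand side of the linear system.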