21

Orienting to emotion : a psychophysical approach /

Bannerman, Rachel L. January 2009 (has links)
Thesis (Ph.D.)--Aberdeen University, 2009. / Title from web page (viewed on Feb. 22, 2010). Includes bibliographical references.
22

Quantifying facial expression recognition across viewing conditions /

Goren, Deborah, January 2004 (has links)
Thesis (M.Sc.)--York University, 2004. Graduate Programme in Biology. / Typescript. Includes bibliographical references (leaves 59-66). Also available on the Internet. MODE OF ACCESS via web browser by entering the following URL: http://wwwlib.umi.com/cr/yorku/fullcit?pMQ99314
23

Facial attraction: do emotional expressions really capture attention? /

Irons, Jessica. January 2005 (has links) (PDF)
Thesis (B.A. (Hons.)) - University of Queensland, 2005. / Includes bibliography.
24

Expressive facial animation transfer for virtual actors /

Zhao, Hui. January 2007 (has links)
Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2007. / Includes bibliographical references (leaves 37-41). Also available in electronic version.
25

Design of a methodology for rigging biodata

Slonková, Hana January 2011 (has links)
No description available.
26

Judgment of feeling states from facial behavior: a bottom-up approach

Snodgrass, Jacalyn D. 05 1900 (has links)
A series of studies was conducted to examine the feasibility of a bottom-up approach to the study of judgment of affective feeling states from facial behavior. Previous work on the judgment of emotion from facial expressions has taken a more top-down approach; observers judged the emotional meaning of a holistic facial expression. Individual facial movements have sometimes then been identified within that complex expression, but the meaning of those individual movements has not been studied. A bottom-up approach begins by exploring the meaning of individual facial movements instead of complex facial expressions. In this approach the relationship between the emotional meaning of individual facial movements and complex facial expressions can be explored. It is argued that such an approach has the potential to explain judgment of not only a limited set of basic emotional expressions, but the full range of emotionally tinged feelings that individuals both experience in themselves and judge in others. Individual action units, as scored by Ekman and Friesen's (1978) Facial Action Coding System (FACS), and selected combinations of action units were presented to observers in three pairs of studies. Filmstrip sequences were used in the first pair of studies, and still photographs in the other two pairs. In the first study of each pair, observers judged the degree of pleasure and arousal expressed by the face. In the second study of each pair, observers rated how well each of a set of emotion terms described the feeling expressed by the face. Observers were found to reliably attribute meaning to individual action units on both scales. Additionally, pleasure and arousal judgments predicted emotion term ratings. The meaning attributed to combinations of action units was found to be related to the meanings of the individual action units occurring alone. Resultant ratings were shown to be meaningful within a dimensional model of emotion space.
/ Arts, Faculty of / Psychology, Department of / Graduate
27

Facial action determinants of pain judgment

Lee, Douglas Spencer January 1985 (has links)
Nonverbal indices of pain are some of the least researched sources of data for assessing pain. The extensive literature on the communicative functions of nonverbal facial expressions suggests that there is potentially much information to be gained in studying facial expressions associated with pain. Results from two studies support the position that facial expressions related to pain may indeed be a source of information for pain assessment. A review of the literature found several studies indicating that judges could make discriminations amongst levels of discomfort from viewing a person's facial expressions. Other studies found that the occurrence of a small set of facial movements could be used to discriminate amongst several levels of self-reported discomfort. However, there was no research directly addressing the question of whether judges' ratings would vary in response to different patterns of the identified facial movements. Issues regarding the facial cues used by naive judges in making ratings of another person's discomfort were investigated. Four hypotheses were developed. From prior research using the Facial Action Coding System (FACS) (Ekman & Friesen, 1978), a small set of facial muscle movements, termed Action Units (AUs), were found to be the best facial movements for discriminating amongst different levels of pain. The first hypothesis was that increasing the number of AUs per expression would lead to increased ratings of discomfort. The second hypothesis was that video segments with the AUs portrayed simultaneously would be rated higher than segments with the same AUs portrayed in a sequential configuration. Four encoders portrayed all configurations. The configurations were randomly edited onto videotape and presented to the judges. The judges used the scale of affective discomfort developed by Gracely, McGrath, and Dubner (1978). Twenty-five male and 25 female university students volunteered as judges. The results supported both hypotheses.
Increasing the number of AUs per expression led to a sharp rise in judges' ratings. Video segments of overlapping AU configurations were rated higher than segments with non-overlapping configurations. Female judges always rated higher than male judges. The second study was methodologically similar to the first study. The major hypothesis was that expressions with only upper face AUs would be rated as more often indicating attempts to hide an expression than lower face expressions. This study contained a subset of expressions that were identical to ones used in the first study. This allowed for testing of the fourth hypothesis, which stated that the ratings of this subset of expressions would differ between the studies due to the differences in the judgment conditions. Both hypotheses were again supported. Upper face expressions were more often judged as portraying attempts by the encoders to hide their expressions. Analysis of the fourth hypothesis revealed that the expressions were rated higher in study 2 than study 1. A sex of judge × judgment condition interaction indicated that females rated higher in study 1 but males rated higher in study 2. The results from these studies indicated that the nonverbal communication of facial expressions of pain was defined by a number of parameters which led judges to alter their ratings depending on the parameters of the facial expressions being viewed. While studies of the micro-behavioral aspects of facial expressions are new, the present studies suggest that such research is integral to understanding the complex communication functions of nonverbal facial expressions. / Arts, Faculty of / Psychology, Department of / Graduate
28

Nonverbal encoding and decoding of emotion in children : data and theories.

Philippot, Pierre Rene 01 January 1987 (has links) (PDF)
No description available.
29

Biases in the decoding of others' facial expressions.

Donovan, Sean 01 January 1993 (has links) (PDF)
No description available.
30

Live Performance and Emotional Analysis of MathSpring Intelligent Tutor System Students

Gupta, Ankit 12 May 2020 (has links)
An important goal of Educational Data Mining is to provide data and visualizations about students' state of knowledge and affective states. The combination of these provides an understanding of how easy or hard the concepts being taught are and how comfortable students are with them. While various studies have been conducted on estimating students' knowledge and affect, little research has been done to transform this collected raw data into meaningful information that is relatable to teachers, parents, and other stakeholders, i.e., non-researchers. This research seeks to enhance the existing Teacher Tools (an application designed within MathSpring, an intelligent tutoring system) to generate a live dashboard for teachers to use in the classroom as their students are using MathSpring. The system captures student performance and detects students' facial expressions, which highlight their emotion and engagement, using a deep learning model for facial expression recognition. The live dashboard enables teachers to understand and juxtapose the state of knowledge and corresponding affect of students as they practice math problem solving. This should help teachers understand students' state of mind better and feed this information back to alter their instruction or interaction with each student in a personalized way. We present results on teachers' perceptions of the usefulness of the live dashboard, through a qualitative and quantitative survey.
