About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

The effect of stress on the decoder's communication channels

Keeley, Maureen Patricia, 1961- January 1987 (has links)
This thesis investigated the interaction of stress and decoding accuracy in the vocalic and facial kinesic channels, with regard to gender. Stress (high and low) was induced in 372 undergraduate students using the Stroop Color-Word Test. Overall, results did not show that an increase in stress led to a decrease in decoding accuracy. However, the findings did suggest that stress was affecting the decoding process. The researcher uncovered a main effect for channel, such that the facial kinesic channel was the most accurate for decoding emotions. In addition, an ordinal interaction found during the first time period showed that stress affected the four groups differently (kinesic, high and low stress; vocalic, high and low stress). Males and females were affected by stress in a similar manner, with females being consistently more accurate decoders than males regardless of the amount of stress or the channel used.
22

Human Emotion Recognition from Body Language of the Head using Soft Computing Techniques

Zhao, Yisu 31 October 2012 (has links)
When people interact with each other, they not only listen to what the other says, they also react to facial expressions, gaze direction, and head movement. Human-computer interaction would be enhanced in a friendly and non-intrusive way if computers could understand and respond to users' body language in the same way. This thesis investigates new methods for human-computer interaction that combine information from the body language of the head to recognize emotional and cognitive states. We concentrated on the integration of facial expression, eye gaze and head movement using soft computing techniques. The whole procedure is done in two stages. The first stage focuses on the extraction of explicit information from the modalities of facial expression, head movement, and eye gaze. In the second stage, all this information is fused by soft computing techniques to infer the implicit emotional states. In this thesis, the frequency of head movement (high or low) is taken into consideration, as well as head nods and head shakes. Very high-frequency head movement may indicate much more arousal and activity than low-frequency movement, placing it in a different region of the emotion dimensional space. The head movement frequency is obtained by analyzing the tracked coordinates of the detected nostril points. Eye gaze also plays an important role in emotion detection. An eye gaze detector was proposed to analyze whether the subject's gaze direction is direct or averted. We proposed a geometrical relationship between the nostrils and the two pupils to achieve this task. Four parameters, defined from the changes in angles and in the length proportions among the four feature points, distinguish averted gaze from direct gaze. The sum of these parameters serves as an evaluation parameter that can be analyzed to quantify gaze level. The multimodal fusion is done by hybridizing decision-level fusion with soft computing techniques for classification. This avoids the disadvantages of decision-level fusion while retaining its advantages of adaptation and flexibility. We introduced fuzzification strategies that quantify the extracted parameters of each modality into fuzzified values between 0 and 1. These fuzzified values are the inputs to fuzzy inference systems, which map them into emotional states.
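The fuzzification-and-inference pattern the abstract describes can be sketched compactly. The Python snippet below is a minimal illustration of that general pattern, not code from the thesis: every cut point, rule, membership shape, and function name is a hypothetical placeholder.

    # Minimal sketch (hypothetical, not from the thesis): fuzzify raw
    # modality measurements into [0, 1], then apply a tiny Mamdani-style
    # rule base to estimate arousal. All thresholds and rules are made up.

    def fuzzify(value, low, high):
        """Piecewise-linear membership: 0 at/below `low`, 1 at/above `high`."""
        if value <= low:
            return 0.0
        if value >= high:
            return 1.0
        return (value - low) / (high - low)

    def infer_arousal(head_freq_hz, gaze_aversion):
        """Fuse two fuzzified inputs with min (AND) / max (OR) rules."""
        fast_head = fuzzify(head_freq_hz, low=0.5, high=3.0)  # movement frequency
        averted = fuzzify(gaze_aversion, low=0.2, high=0.8)   # summed gaze parameters

        # Rule 1: fast head movement AND averted gaze -> high arousal.
        high_arousal = min(fast_head, averted)
        # Rule 2: slow head movement OR direct gaze -> low arousal.
        low_arousal = max(1.0 - fast_head, 1.0 - averted)

        # Weighted-average defuzzification over singleton outputs 1.0 and 0.0.
        return high_arousal / (high_arousal + low_arousal)

    print(infer_arousal(head_freq_hz=2.5, gaze_aversion=0.7))  # ~0.8, higher arousal
    print(infer_arousal(head_freq_hz=0.6, gaze_aversion=0.1))  # 0.0, low arousal

In a full system of the kind described, each modality (head-movement frequency, gaze level, facial expression) would contribute its own fuzzified input, and the rule base would map combinations of those inputs onto a richer set of emotional states.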
24

Situational differences in rater's nonverbal cue utilization in the formation of leader perceptions

Redmond, Matthew R. 12 1900 (has links)
No description available.
25

Learners' perceptions of teachers' non-verbal behaviours in the foreign language class

Sime, Daniela January 2003 (has links)
This study explores the meanings that participants in a British ELT setting give to teachers' non-verbal behaviours. It is a qualitative, descriptive study of the perceived functions that gestures and other non-verbal behaviours perform in the foreign language classroom, viewed mainly from the language learners' perspective. The thesis presents the stages of the research process, from the initial development of the research questions to the discussion of the research findings that summarise the participants' views. There are two distinct research phases presented in the thesis. The pilot study explores the perceptions of 18 experienced language learners of teachers' non-verbal behaviours. The data were collected in interviews based on videotaped extracts of classroom interaction, presented to the participants in two experimental conditions, with and without sound. The findings of this initial study justify the later change of method from the experimental design to a more exploratory framework. In the main study, 22 learners explain, in interviews based on stimulated recall, their perceptions of their teachers' verbal and non-verbal behaviours as occurring within the immediate classroom context. Finally, learners' views are complemented by 20 trainee teachers' written reports of classroom observation and their opinions expressed in focus group interviews. The data for the main study were thus collected through a combination of methods, ranging from direct classroom observations and videotaped recordings to semi-structured interviews with language learners. The research findings indicate that participants generally believe that gestures and other non-verbal behaviours play a key role in the language learning and teaching process. Learners identify three types of functions that non-verbal behaviours play in classroom interaction: (i) cognitive, i.e. non-verbal behaviours which work as enhancers of the learning processes; (ii) emotional, i.e. non-verbal behaviours that function as reliable communicative devices of teachers' emotions and attitudes; and (iii) organisational, i.e. non-verbal behaviours which serve as tools of classroom management and control. The findings suggest that learners interpret teachers' non-verbal behaviours in a functional manner and use these messages and cues in their learning and social interaction with the teacher. The trainee teachers value in a similar manner the roles that non-verbal behaviours play in language teaching and learning. However, they seem to prioritise the cognitive and managerial functions of teachers' non-verbal behaviours over the emotional ones, and do not consider the latter as important as the learners did.
This study is original in relation to previous studies of language classroom interaction in that it:
• describes the kinds of teachers' behaviours which all teachers and learners are familiar with, but which have seldom been foregrounded in classroom-based research;
• unlike previous studies of non-verbal behaviour, investigates the perceiver's view of the others' non-verbal behaviour rather than its production;
• documents these processes of perception through an innovative methodology of data collection and analysis;
• explores the teachers' non-verbal behaviours as perceived by the learners themselves, suggesting that their viewpoint can be one window on the reality of language classrooms;
• provides explanations and functional interpretations for the many spontaneous and apparently unimportant actions that teachers use on a routine basis;
• identifies a new area which needs consideration in any future research and pedagogy of language teaching and learning.
26

Theorizing the translation of body language: a study of nonverbal behaviors in literature

Yung, Hiu-yu. January 2010 (has links)
Thesis (M.Phil.)--University of Hong Kong, 2010. Includes bibliographical references (leaves 125-132). Also available in print.
27

Racial discrimination in the personnel setting: strategies for change

Rothwell, Judith Gordon. January 1900 (has links)
Thesis (Ph.D.)--University of California, Santa Cruz, 1987. Typescript. Includes bibliographical references (leaves 107-116).
28

An investigation of a model of social competence

White, Sara. January 2005 (has links)
Thesis (Ph.D.)--State University of New York at Binghamton, Department of Psychology. Includes bibliographical references.
30

The effects of motivation on the interpretation of nonverbal behaviors.

Hrubes, Daniel 01 January 1998 (has links) (PDF)
No description available.
