11. The effects of the individual, the interaction and the measurement partner on the measurement of conversational distances in young adult women / Hanchett, Effie S. January 1974
Thesis (Ph. D.)--New York University, 1974. / Includes bibliographical references (leaves 57-62).
12. Nonverbal communication : race, gender, social class, world view and the PONS test ; implications for the therapeutic dyad / Stokes, DeVon Renard. January 1984
No description available.
13. Locus of control and nonverbal decoding accuracy among women with bulimia / Leclair, Norma J. January 1984
No description available.
14. The effect of stress on the decoder's communication channels / Keeley, Maureen Patricia, 1961- January 1987
This thesis investigated the interaction of stress and decoding accuracy through the vocalic and facial kinesic channels and with regard to gender. Stress (high and low) was induced in 372 undergraduate students using the Stroop Color-Word Test. Overall, results did not show that an increase in stress led to a decrease in decoding accuracy. However, the findings did suggest that stress affected the decoding process. A main effect for channel emerged, such that the facial kinesic channel was the most accurate for decoding emotions. In addition, an ordinal interaction was found during the first time period, showing that stress affected the four groups (kinesic, high and low stress; vocalic, high and low stress) differently. Males and females were affected by stress in a similar manner, with females being consistently more accurate decoders than males regardless of the amount of stress or the channel used.
15. Human Emotion Recognition from Body Language of the Head using Soft Computing Techniques / Zhao, Yisu. 31 October 2012
When people interact with each other, they not only listen to what the other says but also react to facial expressions, gaze direction, and head movement. Human-computer interaction would be enhanced in a friendly and non-intrusive way if computers could understand and respond to users’ body language in the same way.
This thesis aims to investigate new methods for human-computer interaction by combining information from the body language of the head to recognize emotional and cognitive states. We concentrated on the integration of facial expression, eye gaze and head movement using soft computing techniques. The whole procedure is done in two stages. The first stage focuses on the extraction of explicit information from the modalities of facial expression, head movement, and eye gaze. In the second stage, all of this information is fused by soft computing techniques to infer the implicit emotional states.
In this thesis, the frequency of head movement (high-frequency or low-frequency movement) is taken into consideration as well as head nods and head shakes. Very high-frequency head movement may indicate much more arousal and activity than low-frequency movement, placing it in a different region of the dimensional emotion space. The head movement frequency is obtained by analyzing the tracked coordinates of the detected nostril points.
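A minimal sketch of how such a frequency estimate might be computed from the tracked nostril coordinates is given below; the thesis does not spell out the implementation, so the function name, the 30 fps default and the FFT-based approach are illustrative assumptions rather than the author's method.

```python
import numpy as np

def head_movement_frequency(nostril_xy, fps=30.0):
    """Estimate the dominant head-movement frequency (Hz) from a tracked
    nostril-point trajectory of shape (num_frames, 2). Illustrative only."""
    traj = np.asarray(nostril_xy, dtype=float)
    # Remove the mean position so only the oscillatory motion remains.
    centred = traj - traj.mean(axis=0)
    # Collapse x/y displacement into a single motion signal.
    motion = np.linalg.norm(centred, axis=1)
    # Dominant frequency from the magnitude spectrum (skip the DC bin).
    spectrum = np.abs(np.fft.rfft(motion))
    freqs = np.fft.rfftfreq(motion.size, d=1.0 / fps)
    return freqs[1:][np.argmax(spectrum[1:])]

# Example: a 2 Hz nod tracked for 3 seconds at 30 fps.
t = np.arange(0, 3, 1 / 30.0)
nostril = np.stack([np.full_like(t, 100.0),
                    120.0 + 5.0 * np.sin(2 * np.pi * 2.0 * t)], axis=1)
print(head_movement_frequency(nostril))  # approximately 2.0 -> high-frequency movement
```

A trajectory whose dominant frequency exceeds some chosen cutoff would then be treated as high-frequency (more aroused/active) movement.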
Eye gaze also plays an important role in emotion detection. An eye gaze detector was proposed to analyze whether the subject's gaze direction was direct or averted. To achieve this, we proposed a geometrical relationship between the nostrils and the two pupils. Four parameters are defined from the changes in angles and in the length proportions of these four feature points to distinguish averted gaze from direct gaze. The sum of these parameters serves as an evaluation measure that can be analyzed to quantify gaze level.
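The four parameters themselves are not reproduced here, so the sketch below only illustrates the idea: two angles and two length ratios are computed from the pupil and nostril points, and their summed deviation from a direct-gaze baseline is used as the score. The specific parameters, baseline values and threshold are hypothetical placeholders.

```python
import numpy as np

def gaze_evaluation_score(pupils, nostrils, baseline):
    """Illustrative gaze score from four facial feature points
    (left/right pupil, left/right nostril), each an (x, y) pair.
    `baseline` holds the same four parameters measured during direct gaze;
    larger scores suggest a more averted gaze."""
    lp, rp = map(np.asarray, pupils)
    ln, rn = map(np.asarray, nostrils)

    def angle(a, b):
        v = b - a
        return np.arctan2(v[1], v[0])

    inter_pupil = np.linalg.norm(rp - lp)
    inter_nostril = np.linalg.norm(rn - ln)
    # Four illustrative parameters: two angles and two length proportions.
    params = np.array([
        angle(lp, rp),                                       # tilt of the pupil line
        angle(ln, rn),                                       # tilt of the nostril line
        inter_pupil / inter_nostril,                         # pupil/nostril width ratio
        np.linalg.norm(lp - ln) / np.linalg.norm(rp - rn),   # left/right asymmetry
    ])
    # Evaluation score: summed deviation from the direct-gaze baseline.
    return float(np.sum(np.abs(params - np.asarray(baseline))))

baseline = [0.0, 0.0, 1.6, 1.0]   # hypothetical direct-gaze parameter values
score = gaze_evaluation_score(pupils=[(80, 60), (120, 62)],
                              nostrils=[(92, 95), (108, 95)],
                              baseline=baseline)
print("averted" if score > 0.5 else "direct", score)   # 0.5 is a placeholder threshold
```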
The multimodal fusion is done by hybridizing decision-level fusion with soft computing techniques for classification. This avoids the disadvantages of the decision-level fusion technique while retaining its advantages of adaptation and flexibility. We introduced fuzzification strategies that quantify the extracted parameters of each modality into a fuzzified value between 0 and 1. These fuzzified values are the inputs to the fuzzy inference systems, which map them into emotional states.
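As a rough illustration of this fusion step, the sketch below fuzzifies three modality parameters into the [0, 1] range and fires a few hand-written fuzzy rules (min for conjunction, with the strongest firing rule selected). The membership ranges, rule set and emotion labels are placeholders and not the ones defined in the thesis.

```python
import numpy as np

def fuzzify(value, low, high):
    """Map a raw modality parameter onto [0, 1] with a simple linear
    membership function; the thesis's actual membership functions may differ."""
    return float(np.clip((value - low) / (high - low), 0.0, 1.0))

def fuse_emotion(face_positivity, gaze_direct, head_freq_hz):
    """Toy fusion: fuzzify each modality, fire hand-written fuzzy rules
    (min as AND), and return the emotional state with the strongest firing."""
    face = fuzzify(face_positivity, 0.0, 1.0)    # 0 = negative, 1 = positive expression
    gaze = fuzzify(gaze_direct, 0.0, 1.0)        # 0 = averted, 1 = direct
    arousal = fuzzify(head_freq_hz, 0.5, 3.0)    # slow vs. fast head movement

    rules = {
        "happy":      min(face, gaze),            # positive face AND direct gaze
        "interested": min(gaze, arousal),         # direct gaze AND active head
        "bored":      min(1 - arousal, 1 - face), # passive head AND non-positive face
        "distressed": min(1 - face, arousal),     # negative face AND active head
    }
    return max(rules, key=rules.get), rules

state, strengths = fuse_emotion(face_positivity=0.8, gaze_direct=0.9, head_freq_hz=2.2)
print(state, strengths)
```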
17. Situational differences in rater's nonverbal cue utilization in the formation of leader perceptions / Redmond, Matthew R. 12 1900
No description available.
18. Learners' perceptions of teachers' non-verbal behaviours in the foreign language class / Sime, Daniela. January 2003
This study explores the meanings that participants in a British ELT setting give to teachers' non-verbal behaviours. It is a qualitative, descriptive study of the perceived functions that gestures and other non-verbal behaviours perform in the foreign language classroom, viewed mainly from the language learners' perspective. The thesis presents the stages of the research process, from the initial development of the research questions to the discussion of the research findings that summarise and discuss the participants' views. There are two distinct research phases presented in the thesis. The pilot study explores the perceptions of 18 experienced language learners of teachers' non-verbal behaviours. The data were collected in interviews based on videotaped extracts of classroom interaction, presented to the participants in two experimental conditions, with and without sound. The findings of this initial study justify the later change of method from the experimental design to a more exploratory framework. In the main study, 22 learners explain, in interviews based on stimulated recall, their perceptions of their teachers' verbal and non-verbal behaviours as they occur within the immediate classroom context. Finally, learners' views are complemented by 20 trainee teachers' written reports of classroom observation and their opinions expressed in focus group interviews. The data for the main study were thus collected through a combination of methods, ranging from direct classroom observations and videotaped recordings to semi-structured interviews with language learners. The research findings indicate that participants generally believe that gestures and other non-verbal behaviours play a key role in the language learning and teaching process. Learners identify three types of functions that non-verbal behaviours play in classroom interaction: (i) cognitive, i.e. non-verbal behaviours which work as enhancers of the learning processes, (ii) emotional, i.e. non-verbal behaviours that function as reliable communicative devices of teachers' emotions and attitudes, and (iii) organisational, i.e. non-verbal behaviours which serve as tools of classroom management and control. The findings suggest that learners interpret teachers' non-verbal behaviours in a functional manner and use these messages and cues in their learning and social interaction with the teacher. The trainee teachers value in a similar manner the roles that non-verbal behaviours play in language teaching and learning. However, they seem to prioritise the cognitive and managerial functions of teachers' non-verbal behaviours over the emotional ones and do not consider the latter as important as the learners did.
This study is original in relation to previous studies of language classroom interaction in that it:
• describes the kinds of teachers' behaviours which all teachers and learners are familiar with, but which have seldom been foregrounded in classroom-based research;
• unlike previous studies of non-verbal behaviour, investigates the perceiver's view of the others' non-verbal behaviour rather than its production;
• documents these processes of perception through an innovative methodology of data collection and analysis;
• explores the teachers' non-verbal behaviours as perceived by the learners themselves, suggesting that their viewpoint can be one window on the reality of language classrooms;
• provides explanations and functional interpretations for the many spontaneous and apparently unimportant actions that teachers use on a routine basis;
• identifies a new area which needs consideration in any future research and pedagogy of language teaching and learning.
19. Racial discrimination in the personnel setting : strategies for change / Rothwell, Judith Gordon. January 1900
Thesis (Ph. D.)--University of California, Santa Cruz, 1987. / Typescript. Includes bibliographical references (leaves 107-116).
20. An investigation of a model of social competence / White, Sara. January 2005
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Psychology. / Includes bibliographical references.