  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Evaluating Appropriateness of EMG and Flex Sensors for Classifying Hand Gestures

Akumalla, Sarath Chandra 05 1900 (has links)
Hand and arm gestures are an effective way to communicate silently: quieter and often more reliable than whispering into a radio mic. In recent years, hand gesture identification has become a major active area of research due to its use in various applications. The objective of this work is to develop an integrated sensor system that will enable tactical squads and SWAT teams to communicate in the absence of a line of sight or in the presence of obstacles. The gesture set used in this work is the standardized set of hand signals for close-range engagement operations used by military and SWAT teams, broadly divided into finger movements and arm movements. The core components of the integrated sensor system are surface EMG sensors, flex sensors, and accelerometers. Surface EMG is the electrical activity produced by muscle contractions, measured by sensors attached directly to the skin. Bend sensors use a piezoresistive material to detect bending; the sensor output is determined by both the angle between the ends of the sensor and the flex radius. Accelerometers sense dynamic acceleration and inclination in three directions simultaneously. The EMG sensors are placed on the upper and lower forearm and assist in the classification of finger and wrist movements. The bend sensors are mounted on a glove worn on the hand, located over the first knuckle of each finger, and can determine whether the finger is bent. An accelerometer attached to the glove at the base of the wrist determines the speed and direction of arm movement. A support vector machine (SVM) is used to classify the gestures.
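The pipeline described above, per-channel sensor readings concatenated into a feature vector and fed to a classifier, can be sketched as follows. The thesis uses an SVM; a dependency-free nearest-centroid classifier stands in here, and all function names and sensor layouts are illustrative assumptions, not the author's actual implementation.

```python
# Sketch: build one feature vector per gesture sample from the three
# sensor streams (EMG RMS per channel, binary flex state per knuckle,
# 3-axis accelerometer), then classify by nearest class centroid.

def feature_vector(emg_rms, flex_bent, accel_xyz):
    """Concatenate EMG RMS values, flex-sensor states, and accelerometer axes."""
    return list(emg_rms) + [float(b) for b in flex_bent] + list(accel_xyz)

def train_centroids(labelled_samples):
    """labelled_samples: dict mapping gesture name -> list of feature vectors."""
    centroids = {}
    for gesture, vecs in labelled_samples.items():
        n = len(vecs)
        centroids[gesture] = [sum(col) / n for col in zip(*vecs)]
    return centroids

def classify(centroids, vec):
    """Return the gesture whose centroid is closest in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda g: dist2(centroids[g], vec))
```

In a full system, an SVM (e.g. scikit-learn's `svm.SVC`) would replace the nearest-centroid step, trained on the same feature vectors.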
2

Gestų atpažinimas / Gesture recognition

Bertašius, Mindaugas 13 June 2005 (has links)
This Master's project in electronics engineering addresses a topical problem in human-computer interaction (HCI). Hand gestures provide an attractive alternative to cumbersome interface devices for HCI. In particular, visual interpretation of hand gestures can help achieve the ease and naturalness desired for HCI. Working with computers has become an integral feature of our society. The keyboard and mouse are currently the main interfaces between man and computer. In areas where 3D information is required, such as computer games, robotics, and design, other mechanical devices such as roller-balls, joysticks, and data gloves are used. Humans communicate mainly by vision and sound; therefore, a man-machine interface would be more intuitive if it made greater use of vision and audio recognition. Another advantage is that the user can not only communicate from a distance but also needs no physical contact with the computer. Unlike audio commands, a visual system is preferable in noisy environments or in situations where sound would cause a disturbance. The visual system chosen was the recognition of hand gestures. There are many studies of object recognition in images and in video streams. Yearly improvements in computer technology let researchers model and build increasingly complex recognition systems, but many problems remain unsolved and it is hard to decide which system is best. Our aim is to create a program... [to full text]
3

Meaningless movement or essential expression : A study about gestures

Bohlin, Stina January 2021 (has links)
The thesis investigates how body movements influence a musical performance, with the aim of reaching a more expressive performance through an increased awareness of gestures. In the study, three versions of the same clarinet piece were recorded on video: one with me, one with my clarinet teacher, and one with a fellow clarinet student. The study addresses the following research questions: How do body movements correspond to musical intentions? How are my gestures formed and influenced by my teacher's gestures? In what ways can a raised awareness of gestures affect my musical performance? The videos were coded and analysed using open coding. As a reference, each clarinettist notated their intended phrasing in the score. This was marked as phrases (slurs) and Goal Points, and was also annotated in ELAN. To answer the first research question, the body movements were compared with each performer's intended phrasing. To answer the second, coded sequences from each performance were compared with one another to find similarities and differences, using both quantitative and qualitative methods. Finally, I recorded a second performance of the same piece to investigate whether awareness of gestures affected my performance. Results align with previous research and indicate that body gestures are unique to each performer and connected to musical intentions. Results also indicate a resemblance in movement patterns between my teacher's performance and my own, suggesting that gestures can be transferred from teacher to student.
4

A Software Development Kit for Camera-Based Gesture Interaction

Cronin, Devlin 01 December 2013 (has links) (PDF)
Human-Computer Interaction is a rapidly expanding field, in which new implementations of ideas are consistently being released. In recent years, much of the concentration in this field has been on gesture-based control, either touch-based or camera-based. Even though camera-based gesture recognition was previously seen more in science fiction than in reality, this method of interaction is rising in popularity. A number of devices designed to support this type of input are readily available to the average consumer, including the popular Microsoft Kinect and Leap Motion devices. Despite this rise in availability and popularity, development for these devices is currently an arduous task unless only the simplest of gestures is required. The goal of this thesis is to develop a Software Development Kit (SDK) with which developers can more easily build interfaces that use gesture-based control. If successful, this SDK could significantly reduce the amount of work (both in effort and in lines of code) necessary for a programmer to implement gesture control in an application. This, in turn, could help lower the intellectual barrier many face when attempting to implement a new interface. The developed SDK has three main goals: it places an emphasis on simplicity of code for developers using it; it allows for a variety of gestures, including gestures made by single or multiple trackable objects (e.g., hands and fingers), gestures performed in stages, and continuously updating gestures; and it is device-agnostic, in that it is not written exclusively for a single device. The thesis presents the results of a system validation study that suggests all of these goals have been met.
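The device-agnostic, low-effort registration style the abstract describes might look something like the following. This is a hypothetical sketch, not the SDK's actual API: every class and method name here is an assumption for illustration, and any device backend (Kinect, Leap Motion, etc.) is imagined simply as a source of per-frame tracking data.

```python
# Sketch: an application binds a gesture predicate to a callback once;
# any device backend then feeds frames of tracked-object data into the
# recognizer, keeping application code independent of the device.

class GestureRecognizer:
    def __init__(self):
        self._bindings = []  # list of (predicate, callback) pairs

    def bind(self, predicate, callback):
        """predicate: frame -> bool; callback fires on each matching frame."""
        self._bindings.append((predicate, callback))

    def feed(self, frame):
        """Called once per frame by whatever device backend is in use."""
        for predicate, callback in self._bindings:
            if predicate(frame):
                callback(frame)

def swipe_right(frame):
    # Illustrative single-frame predicate: one tracked hand moving with
    # x-velocity above a threshold. A staged gesture would keep state
    # across frames instead.
    return len(frame["hands"]) == 1 and frame["hands"][0]["vx"] > 0.5
```

The point of the design is that `swipe_right` and the callback never mention a device, so swapping Kinect for Leap Motion only changes which backend calls `feed`.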
5

Exploring Measurement Estimation Through Learners' Actions, Language, and Gestures

Harrison, Avery 09 April 2019 (has links)
This thesis intends to advance educational research by providing exploratory insights about the roles of, and relationships between, the actions, language, and gestures of college and elementary-aged students during measurement estimation. To the best of my knowledge, prior research has examined the role of speech and gestures as they relate to areas of mathematics such as algebra and geometry; however, this work has not been extended to measurement. Similarly, language and gesture have been explored, but the three-way interplay between actions during problem solving and the language and gestures observed during explanations after problem solving has not been investigated in mathematics. To actualize the findings from this research in practice, this thesis uses the findings from two studies of behavior during measurement tasks to propose text and image support for an elementary-aged measurement game, EstimateIT!, to help students practice measuring objects and develop conceptual skills through embodied game play. Specifically, this thesis intends to provide 1) a synthesis of the work on gestures in mathematics, as well as the research methods used to study gestures, 2) a coding guide to analyze the gestures of mathematics learners, along with their actions and language, 3) an application of the coding guide to explore the behavior of college and elementary students during measurement estimation tasks, and 4) proposals for action-guiding support for EstimateIT! to help elementary students develop and reinforce an understanding of measurement during gameplay, based on the more mature strategies demonstrated by college students completing similar tasks.
6

Social Dominance and Conciliatory Gestures as Determinants of Reconciliation and Forgiveness

Cohen, Adam Daniel 01 January 2008 (has links)
In this project I evaluated the effect of social dominance on reconciliation and forgiveness. Based on studies of nonhuman primates, it was hypothesized that humans would be more likely to accept and reciprocate conciliatory gestures when made by more socially dominant people. It was also hypothesized that the moderating effect of relative dominance on a victim's decision to forgive would not be as strong as relative dominance's effect on a victim's decision to reconcile. This hypothesis was based on the expectation that reconciliation is most essential for gaining access to transgressor-controlled resources. However, conciliatory gestures by less dominant transgressors more effectively elicited forgiveness and reconciliation, as these gestures were evidently more successful at making victims feel safe. Also, relative dominance did not have a greater effect on victims' conciliatory behaviors than on forgiveness.
7

Using Video Modeling to Teach Children with Autism to Give Verbal Compliments and Gesture Appropriately During Competitive Play

Macpherson, Kevin H. 01 January 2012 (has links)
The effects of a video-modeling intervention, given to five children with autism while playing kickball, were evaluated through a multiple-baseline design across subjects. The researcher targeted two social skills, verbal compliments and appropriate gestures, using an iPad as a portable video device to model the desired behaviors in situ, on second base mid-game. Children were required to verbally and non-verbally compliment their peers during the kickball games. After being presented with the video clip, children showed rapid mastery of the verbal complimenting skill and displayed an increased, though less pronounced, number of gestures in the intervention phase.
8

Metaphors and Gestures for Abstract Concepts in Academic English Writing

Zhao, Jun January 2007 (has links)
Gestures and metaphors are important mediational tools to materialize abstract conventions in the conceptual development process (Lantolf and Thorne, 2006): metaphors are used in the educational setting to simplify abstract knowledge for learners (Ungerer and Schmidt, 1996; Wee, 2005); gestures, through visual representation, can "provide additional insights into how humans conceptualize abstract concepts via metaphors" (Mittelberg, in press, p. 23). This study observed and videotaped four composition instructors and 54 ESL students at an American university to probe how their metaphorical expressions and gestures in a variety of naturally occurring settings, such as classroom teaching, student-teacher conferencing, peer reviewing, and student presentations, represent the abstract rhetorical conventions of academic writing in English. By associating students' gestures with the instructors' metaphors and gestures, this study found evidence for the assistive roles of metaphors and gestures in the learning process. The final interviews elicited students' metaphors of academic writing in English and in their first languages. The interviewees were also asked to reflect upon the effectiveness of the metaphors and gestures they were exposed to. This study confirmed the roles of gestures in reflecting the abstract mental representation of academic writing. Twelve patterns were extracted from the instructors' data, including the linearity, container, building, and journey metaphors, among others. Of these twelve patterns, six were materialized in the students' gestural usage. The similarity of gestures found in the instructors' and students' data provided evidence that learning had occurred. In the elicited data, students created pyramid, book, and banquet metaphors to highlight features of academic writing in English and in their first languages.
These new metaphors demonstrate students' ability to synthesize the simple metaphors they encountered into a more complex one, which is more significant in the learning process. The interviews suggest that metaphors are better perceived and more effective in relating abstract knowledge to the students. Gestures were not judged by the students to be helpful. This could be because gestures, other than emblems, are often understood unconsciously and are naturally used to supplement the verbal utterance rather than replace speech, which is more prominent perceptually and conceptually.
9

Distracting the imagination: does visuospatial or auditory interference influence gesture and speech during narrative production?

Smithson, Lisa Unknown Date
No description available.
10

Design and Realization of the Gesture-Interaction System Based on Kinect

Xu, Jie January 2014 (has links)
In the past 20 years, humans have mostly used a mouse to interact with computers. However, with the rapidly growing use of computers, a need for alternative means of interaction has emerged. With the advent of Kinect, a brand-new way of human-computer interaction has been introduced. It allows the use of gestures, the most natural body language, to communicate with computers, freeing us from traditional constraints and providing an intuitive method for executing operations. This thesis presents the design and implementation of a program that helps people interact with computers without a traditional mouse, supported by a Kinect device (an XNA Game framework with the Microsoft Kinect SDK v1.7). For dynamic gesture recognition, the Hidden Markov Model (HMM) and Dynamic Time Warping (DTW) are considered; the choice of DTW is motivated by experimental analysis. A dynamic-gesture-recognition program based on DTW is developed to help computers recognize gestures customized by users, and the experiments show that DTW achieves good performance. For further development, the XNA Game 4.0 framework is introduced, integrating Kinect body tracking with DTW gesture recognition. Finally, a functional test is conducted on the interaction system. In addition to summarizing the results, the thesis also discusses what could be improved in the future.
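DTW, the template-matching technique this thesis settles on, can be sketched in a few lines: it scores a live gesture trajectory against a recorded template while tolerating differences in speed. This is a minimal textbook sketch, not the thesis's implementation; the 1-D `dist` function and the gesture names are illustrative assumptions (Kinect skeletal data would be multi-dimensional joint positions per frame).

```python
# Sketch: dynamic time warping distance between two sequences, then
# nearest-template classification of a gesture trajectory.

def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Minimum cumulative alignment cost between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = cost of best alignment of a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            # step from a match, an insertion, or a deletion
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify_gesture(templates, trajectory):
    """templates: dict mapping gesture name -> recorded reference trajectory."""
    return min(templates, key=lambda g: dtw_distance(templates[g], trajectory))
```

Because DTW aligns sequences elastically, a gesture performed at half speed still matches its template with low cost, which is what makes it attractive for user-recorded custom gestures.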
