1 |
Prosodic Noun Incorporation and Verb-Initial Syntax. Clemens, Lauren Eby. 21 October 2014
To date, no real consensus has emerged among syntacticians about how to derive verb-initial order (V1), but two main approaches, \(V^0\)-raising and VP-raising, receive particularly widespread support in the literature. The syntax of Niuean pseudo noun incorporation (PNI) has played an important role in the propagation of the VP-raising analysis (Massam 2001), especially for VSO languages and languages with a VSO option.
In this thesis, I present an analysis of the prosody of Niuean PNI and show that the PNI verb and incorporated argument form a prosodic constituent. While this result is consistent with the syntactic analysis of Massam (2001), it is also consistent with a prosodic restructuring analysis that explains the VOS order of PNI by appealing to prosodic well-formedness. I take the second approach. Specifically, the principle behind Selkirk's (1984) Sense Unit Condition requires that the verb and its internal argument(s) form a unique phonological phrase. In order to satisfy this requirement, the incorporated argument moves into a position adjacent to the verb at PF. Positionally motivated categorical feature sharing (Adger and Svenonius 2011; Pesetsky and Torrego 2007) allows PF to reference the head-argument relationship between the verb and its internal argument, even though they are not sent to PF in structurally adjacent positions.
The main result for the syntactic analysis of Niuean is that \(V^0\)-raising replaces VP-raising. The benefits of the \(V^0\)-raising approach include i) less phonologically vacuous structure in places where Niuean has overt morphology, e.g., a perpetually null \(T^0\) in the face of overt tense markers; and ii) observance of the idea that thematic roles are correlated with structural positions. Thus, the prosodic analysis of Niuean PNI has a number of positive outcomes for Niuean syntax, as well as the potential to simplify the derivation of VSO order cross-linguistically. / Linguistics
|
2 |
Utilisation de l'eye-tracking pour l'interaction mobile dans un environnement réel augmenté / Robust gaze tracking for advanced mobile interaction. Ju, Qinjie. 09 April 2019
Eye-tracking has very strong potential as an input modality in human-computer interaction (HCI), particularly in mobile situations. In this thesis, we concentrate on demonstrating this potential by highlighting the scenarios in which eye-tracking possesses obvious advantages compared with other interaction modalities. During our research, we found that this technology lacks convenient action-triggering methods, which limits the performance of gaze-based interaction. Given this, we investigate the combination of eye-tracking and fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the scene images captured by the scene camera mounted on the head-worn eye-tracker, in order to save computation time. To test the performance of our fixed-gaze head movement detection algorithm, and the acceptance of triggering commands by these movements when the user's hands are occupied by another task, we conducted systematic experiments with the EyeMusic application that we designed and developed.
The EyeMusic system is a music reading system that can play the notes of a measure in a music score that the user does not understand. By making a voluntary head movement while fixing his/her gaze on a particular measure of a music score, the user obtains the desired audio feedback. The design, development and usability testing of the first prototype of this application are presented in this thesis. The usability of our EyeMusic application is confirmed by the experimental results: 85% of participants were able to use all the fixed-gaze head movements we implemented in the prototype. The average success rate of this application is 70%, which is partly influenced by the intrinsic performance of the eye-tracker we use. The performance of our fixed-gaze head movement detection algorithm is 85%, and there were no significant differences in performance among the head movements tested. Beyond the EyeMusic application, we have explored two other scenarios based on the same control principles, EyeRecipe and EyePay; the details of these two applications are also presented in this thesis.
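The detection algorithm itself is not reproduced in this listing, but the core idea, estimating global image motion from the head-mounted scene camera while gaze stays fixed, can be sketched. The sketch below is a hypothetical simplification (FFT phase correlation on consecutive grayscale frames); the direction labels and threshold are assumed conventions, not the thesis' actual parameters:

```python
import numpy as np

def global_shift(prev, curr):
    """Estimate the dominant (dy, dx) pixel translation between two
    grayscale frames via FFT phase correlation. A large global shift of
    the scene image suggests a head movement while gaze stays fixed."""
    F1 = np.fft.fft2(prev)
    F2 = np.fft.fft2(curr)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-9          # normalize to pure phase
    corr = np.real(np.fft.ifft2(cross))    # peak lies at the shift
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    if dy > h // 2:                        # wrap to signed shifts
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

def classify_head_move(prev, curr, thresh=5):
    """Map the image shift to a coarse head movement label. The sign
    convention (scene shifting right => head turning left) and the
    threshold are illustrative assumptions."""
    dy, dx = global_shift(prev, curr)
    if abs(dx) >= abs(dy) and abs(dx) > thresh:
        return "left" if dx > 0 else "right"
    if abs(dy) > thresh:
        return "up" if dy > 0 else "down"
    return "still"
```

A real system would add temporal smoothing and reject frames where the eye-tracker reports a gaze shift, so that only fixed-gaze head movements trigger commands.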
|
3 |
Head Movement During Locomotion in Patients with Bilateral Vestibular Loss. Akin, Faith W., Ashmead, D. A. 01 January 1998
No description available.
|
4 |
Visual Contributions to the Vestibulo-Ocular Reflex during Balance Recovery Tasks. Diehl, Mark D. 01 January 2007
Introduction: The vestibulo-ocular reflex (VOR) is conventionally quantified by computing the ratio of eye angular velocity to head angular velocity (VOR gain). This measure only includes angular head movements; linear translations are not accounted for. These investigations postulate an alternative method of VOR quantification, one that assesses retinal image stability during angular and linear head movements (foveal fixation, FF). This method was used to assess the role of vision in balance reactions. Methods: Experiment 1: Ten Young subjects were fitted with an eye tracker linked to an electromagnetic (EM) kinematic recording system, allowing head, trunk and eye kinematics to be recorded during the performance of gaze stabilization tasks. Subjects fixated an LED target while performing head flexion and extension exercises at four frequencies. Point-of-gaze analysis was performed by transforming eye-in-head and head-in-space data into eye-in-space data, which were compared to the known location of the targets. The distance between the intersection of the eye vector with the target plane and the target location provided an error from which the estimated image location on the retina could be calculated. FF and VOR gain were compared with head angular velocities to determine correlations. Experiment 2: Balance was assessed in Young and Elderly groups following a series of perturbations. Dependent variables included step latency, head and trunk angular velocity, VOR gain, and FF. Results: Correlations between head angular velocity and FF showed that retinal image stability degraded as head angular velocity increased. The Elderly showed a more rapid degradation of FF, with higher overall head angular velocities. Comparisons between the rates of change of VOR gain and FF over the velocity spectrum indicated a greater change in the FF response. A negative linear correlation between FF and step latency was observed; there was no relationship between VOR gain and step latency.
Conclusion: FF is a more sensitive measure of the VOR than gain, as it accounts for both angular and translational head movements. Its correlation with step latency suggests the importance of image stability in formulating responses following perturbation.
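As an illustrative sketch (not the thesis' analysis pipeline), the conventional VOR gain computation mentioned above, the ratio of eye to head angular velocity derived from sampled angular position, might look like:

```python
import numpy as np

def angular_velocity(angles_deg, fs):
    """Numerically differentiate sampled angular position (degrees)
    at sampling rate fs (Hz) to obtain angular velocity (deg/s)."""
    return np.gradient(angles_deg) * fs

def vor_gain(head_deg, eye_deg, fs):
    """Classic VOR gain: magnitude of eye angular velocity over head
    angular velocity. A perfectly compensatory VOR (eye rotation equal
    and opposite to head rotation) gives a gain of 1."""
    hv = angular_velocity(head_deg, fs)
    ev = angular_velocity(eye_deg, fs)
    # Use RMS velocities to avoid dividing by near-zero samples.
    return np.sqrt(np.mean(ev ** 2)) / np.sqrt(np.mean(hv ** 2))
```

The FF measure proposed in the thesis goes further by projecting gaze onto the target plane, which this angular-only sketch deliberately omits.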
|
5 |
Spatial perception and progressive addition lenses. Hendicott, Peter Leslie. January 2007
Progressive addition lenses (PALs) are an increasingly preferred mode for the correction of presbyopia, gaining an increasing share of the prescription lens market. Sales volumes are likely to rise over the next few years, given the growing cohort of presbyopic patients in the population. This research investigated adaptation to PAL wear, examining head movement parameters with and without progressive lenses in everyday visual tasks, and examined symptoms of spatial distortion and illusory movement in a crossover wearing trial of three PAL designs. Minimum displacement thresholds in the presence and absence of head movement were also investigated across the lens designs. Experiment 1 investigated head movements in two common visual tasks: a word-processing copy task, and a visual search task designed to replicate a natural-environment task such as looking for products on supermarket shelving. Head movement parameters derived from this experiment were used to set head movement amplitude and velocity in the third experiment, which investigated minimum displacement thresholds across three PAL designs. Head movements were recorded with a Polhemus Inside Track head movement monitoring system, which allows real-time six-degrees-of-freedom measurement of head position. Head position in azimuth, elevation and roll was extracted from the head movement recorder output, and data for head movement angular extent, average velocity (amplitude/duration) and peak velocity were calculated for horizontal head movements. Results of the first experiment indicate a task-dependent effect on head movement peak and average velocity, with both median average and median peak head movement velocity being faster in the copy task. Visual task and visual processing demands were also shown to affect the slope of the main sequence of head movement velocity on head movement amplitude, with a steeper slope in the copy task.
A steeper slope, indicating a faster head movement velocity for a given head movement amplitude, was found for head movements during the copy task than in the search task. Processing demands within the copy task were also shown to affect the main-sequence slopes of velocity on amplitude, with flatter slopes associated with the need for head movement to bring gaze to a specific point. These findings indicate selective control over head movement velocity in response to differing visual processing demands. In Experiment 2, parameters of head movement amplitude and velocity were assessed in a group of first-time PAL wearers. Head movement amplitude, average velocity and peak velocity were calculated from head movement recordings using the search task, as in Experiment 1. Head movements were recorded without PALs, on first wearing a PAL, and after one month of PAL wear to assess adaptation effects. In contrast to the existing literature, PAL wear did not alter parameters of head movement amplitude and velocity in first-time wearers, either on first wearing the lenses or after one month of wear; this is likely due to task-related differences between this experiment and previous work. Task demand in this experiment may not have required wearers to use the progressive power corridor to identify visual search targets, in contrast to previous studies where experimental conditions were designed to force subjects to use the progressive corridor. In Experiment 3, minimum displacement thresholds for random-dot stimuli were measured in a repeated-measures design for a single vision lens as control and for three PAL designs. Thresholds were measured in central vision and at two locations in the temporal peripheral field, 30° temporal to fixation and 10° above and below the horizontal midline. Thresholds were determined with and without the subjects' head moving horizontally in an approximately sinusoidal movement at a frequency of about 0.7 Hz.
Minimum displacement thresholds were not significantly affected by PAL design, although thresholds with PALs were higher than with the single vision lens control. Head movement significantly increased minimum displacement thresholds across lens designs, by a factor of approximately 1.5. Results indicate that the local measures of minimum displacement threshold determined in this experiment are not sensitive to lens design differences; sensitivity to motion with PALs may be more a global than a localized response. For Experiment 4, symptoms of spatial distortion and illusory movement were investigated in a crossover wearing trial of three PAL designs, and related to optical characteristics of the lenses. Peripheral back vertex powers of the PALs were measured at two locations in the right temporal zone of the lenses, 15.6 mm temporal to the fitting cross and 2.7 mm above and below the horizontal through the fitting cross. These locations corresponded to the zones of the lenses through which minimum displacement thresholds were measured in the previous experiment. The effect of subjects' self-movement on symptoms was able to discriminate between PAL designs, although subjective symptoms alone were not related to the lens design parameters studied. Subjects' preference for one PAL design over the others studied in this experiment was inversely related to the effect of self-movement on their symptoms of distortion. An optical parameter, blur strength, derived from the power vector components of the peripheral powers, may indicate preference for particular PAL designs, as higher blur strength values were associated with lower lens preference scores. Head movement amplitude and velocity are task specific, and are also influenced by visual processing demands within tasks. PALs do not affect head movement amplitude and velocity unless tasks are made demanding or are performed in less natural situations designed to influence head movement behaviour.
Both head movement and PALs have large effects on minimum displacement thresholds; these effects may be due in part to the complexity of the subjects' task within the experiment. Minimum displacement thresholds, however, were not influenced by PAL design. The most sensitive indicator of subjects' preference among PALs was the effect of self-movement on their perception of symptoms, rather than the presence of actual symptoms. Blur strength should be further investigated for its role in PAL acceptance.
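Blur strength here is presumably the length of the power vector in the sense of Thibos-style (M, J0, J45) analysis; under that assumption, a minimal sketch of the computation from a spherocylindrical power (sphere, cylinder, axis) is:

```python
import math

def power_vector(sphere, cyl, axis_deg):
    """Convert a spherocylindrical power (S, C, axis) into Thibos
    power-vector components (M, J0, J45), all in dioptres."""
    theta = math.radians(axis_deg)
    M = sphere + cyl / 2.0                  # spherical equivalent
    J0 = -(cyl / 2.0) * math.cos(2 * theta) # 0/90-degree astigmatism
    J45 = -(cyl / 2.0) * math.sin(2 * theta)  # oblique astigmatism
    return M, J0, J45

def blur_strength(sphere, cyl, axis_deg):
    """Blur strength B = sqrt(M^2 + J0^2 + J45^2): the overall dioptric
    distance of the measured power from plano."""
    M, J0, J45 = power_vector(sphere, cyl, axis_deg)
    return math.sqrt(M * M + J0 * J0 + J45 * J45)
```

On this reading, a peripheral lens zone with a larger combined spherical and astigmatic error yields a higher blur strength, consistent with the reported association between higher blur strength and lower preference scores.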
|
6 |
AUTOMATED DECLARATIVE GESTURE GENERATION FOR NON-EMOTIONAL HUMAN-HUMANOID CONVERSATION. Singh, Aditi. 06 December 2017
No description available.
|
7 |
Binaural Hearing Effects of Mapping Microphone Array's Responses to a Listener's Head-Related Transfer Functions. Hughet, James. 31 October 2011
This thesis focuses on mapping a microphone array's response to match the characteristics of a human subject's head-related transfer function (HRTF). The mapping of the response is first explored with a 'monaural HRTF matching' that filters the response independently of the arrival angles. For arbitrary array geometry with the listener external to the acoustic, the monaural HRTF matching did not provide listeners with enough spatial information to precisely localize sound sources. To correct this, a preprocessor control algorithm was added to the HRTF matching, yielding a 'binaural HRTF matching' process. The binaural HRTF matching increased the listeners' performance in perceiving the location of a sound source. With the addition of simulated head movement, listeners' localization performance increased by 20%. An issue with this approach is the use of HRTFs other than the listener's own measured HRTF, which creates a psychoacoustic error in localization, e.g., front/back confusion. / Master of Science
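The core operation behind any HRTF-based rendering, filtering a signal through left- and right-ear head-related impulse responses (HRIRs), can be sketched as follows. This is a generic illustration, not the thesis' array-mapping algorithm:

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Spatialize a mono signal by convolving it with the left- and
    right-ear head-related impulse responses. The interaural time and
    level differences embedded in the HRIRs carry the spatial cues
    that the binaural matching process aims to preserve."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return left, right
```

Using a non-individual HRIR pair in this step is exactly what produces the front/back confusions noted above, since the listener's own pinna filtering is not reproduced.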
|
8 |
Biomechanical assessment of head and neck movements in neck pain using 3D movement analysis. Grip, Helena. January 2008
Three-dimensional movement analysis was used to evaluate head and neck movement in patients with neck pain and in matched controls. The aims were to further develop biomechanical models of head and neck kinematics, to investigate differences between subjects with non-specific neck pain and whiplash-associated disorders (WAD), and to evaluate the potential of objective movement analysis as decision support during diagnosis and follow-up of patients with neck pain. Fast, repetitive head movements (flexion, extension, rotation to the side) were studied in a group of 59 subjects with WAD and 56 controls. A back-propagation artificial neural network classified vectors of collected movement variables from each individual according to group membership with a predictive accuracy of 89%. The helical axis for head movement was analyzed in two groups of neck pain patients (21 with non-specific neck pain and 22 with WAD) and 24 matched controls. A moving time window with a cut-off angle of 4° was used to calculate finite helical axes. The centre of rotation (CR) was derived as the 3D intersection point of the finite axes. A downward migration of the axis during flexion/extension and a change of axis direction towards the end of the movements were observed. CR was at its most superior position during side rotations and at its most inferior during ball catching. This may reflect side rotation being performed mainly in the upper spine, while all cervical vertebrae were recruited to stabilize the head in the more complex catching task. Changes in movement strategy were observed in the neck pain groups: neck pain subjects had lower mean velocities and ranges of movement compared with controls during ball catching, which may reflect a stiffer body position adopted to stabilize the neck.
In addition, the WAD group had a displaced axis position during head repositioning after flexion, while CR was displaced during fast side rotations in the non-specific neck pain group. Pain intensity correlated with axis and CR position, and may be one reason for the changes in movement strategy. An increased amount of irregularity in the trajectory of the axis was found in the WAD group during head repositioning, fast repetitive head movements and catching. This, together with an increased constant repositioning error after flexion, indicated motor control disturbances. A higher group standard deviation in the neck pain groups indicated heterogeneity among subjects in this disturbance. Wireless motion sensors and electro-oculography were used simultaneously, as an initial step towards a portable system and towards a method to quantify head-eye co-ordination deficits in individuals with WAD. Twenty asymptomatic control subjects and six WAD subjects with eye disturbances (e.g. dizziness and double vision) were studied. The trial-to-trial repeatability was moderate to high for all evaluated variables (single intraclass correlation coefficients >0.4 in 28 of 32 variables). The WAD subjects demonstrated decreased head velocity, decreased range of head movement during gaze fixation and lowered head stability during head-eye co-ordination as possible deficits. In conclusion, kinematic analysis has the potential to be used as support for physicians and physiotherapists in the diagnosis and follow-up of neck pain patients. Specifically, the helical axis method gives information about how the movement is performed. However, a flexible motion capture system (for example, based on wireless motion sensors) is needed. Combined analysis of several variables is preferable, as patients with different neck pain disorders seem to be a heterogeneous group.
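The finite helical axis computation referred to above can be illustrated in simplified form. The sketch below extracts only the rotation angle and axis direction from a rotation matrix between two head orientations, omitting the axis position and translation components that a full helical axis analysis (and the CR intersection step) would include:

```python
import numpy as np

def finite_helical_axis(R):
    """Extract the rotation angle (rad) and unit axis direction from a
    3x3 rotation matrix describing the head orientation change between
    two samples. A moving-window analysis would only evaluate the axis
    once this angle exceeds the cut-off (4 degrees in the study)."""
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    # Axis direction from the skew-symmetric part of R.
    v = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]])
    n = v / (2.0 * np.sin(angle))
    return angle, n

def rot_x(a):
    """Rotation of angle a (rad) about the x axis, e.g. head flexion."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, c, -s],
                     [0.0, s, c]])
```

Tracking how this axis direction and position drift over a movement is what reveals the downward migration and end-of-movement direction changes described in the abstract.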
|
9 |
Detect and Analyze the 3-D Head Movement Patterns in Marmoset Monkeys using Wireless Tracking System. January 2015
Head movement is a natural orienting behavior for sensing environmental events around us. It is particularly important for identifying, through the sense of hearing, the location of an out-of-sight, rear-approaching target, in order to avoid danger or threat. This research aims to design a portable device for detecting the head movement patterns of common marmoset monkeys in laboratory environments. The marmoset is a New World primate species that has become increasingly popular for neuroscience research. Understanding the unique patterns of their head movements will improve their value as a primate model for uncovering the neurobiology of natural orienting behavior. Due to their relatively small head size (5 cm in diameter) and body weight (300-500 g), the device has to meet several unique design requirements with respect to accuracy and wearability. A head-mounted wireless tracking system was implemented based on inertial sensors capable of detecting motion in the yaw, pitch and roll axes. The sensors were connected to an encoding station, which wirelessly transmits the 3-axis movement data to a decoding station at a sampling rate of ~175 Hz. The decoding station relays this information to a computer for real-time display and analysis. Different tracking systems, based on an accelerometer and an inertial measurement unit (IMU), were implemented to track the head movement patterns of the marmoset. Using these systems, translational and rotational head movement information was collected, and the data analysis focuses on rotational head movement in body-constrained marmosets. Three stimulus conditions were tested: 1) alert, 2) idle, and 3) sound only. The head movement patterns were examined with the house light turned on and off for each stimulus. Angular velocity, angular displacement and angular acceleration were analyzed in all three axes.
Fast and large head turns were observed in the yaw axis in response to the alert stimuli, and far less in the idle and sound-only conditions. Contrasting changes in the speed and range of head movement were found between light-on and light-off situations. In response to the alert stimuli, the mean peak angular displacement was 95 degrees (light on) and 55 degrees (light off), and the mean peak angular velocity was 650 degrees/second (light on) and 400 degrees/second (light off), respectively. These results suggest that marmoset monkeys may engage in different modes of orienting behavior depending on the availability of visual cues and thus the necessity of head movement. This study provides a useful tool for future studies of the interplay among the visual, auditory and vestibular systems during natural behavior. / Dissertation/Thesis / Masters Thesis Bioengineering 2015
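As a rough illustration of the analysis described (not the study's actual code), angular velocity, acceleration and peak displacement can be derived from sampled yaw angles by numerical differentiation; the ~175 Hz sampling rate is taken from the abstract:

```python
import numpy as np

FS = 175.0  # approximate sampling rate (Hz) of the wireless link

def kinematics(yaw_deg, fs=FS):
    """From sampled yaw angle (degrees), derive angular velocity
    (deg/s) and angular acceleration (deg/s^2) by numerical
    differentiation, plus the peak angular displacement relative to
    the starting orientation."""
    vel = np.gradient(yaw_deg) * fs
    acc = np.gradient(vel) * fs
    peak_disp = np.max(np.abs(yaw_deg - yaw_deg[0]))
    return vel, acc, peak_disp
```

The same computation applies per axis (yaw, pitch, roll); in practice the raw sensor traces would be low-pass filtered before differentiation to keep noise from dominating the acceleration estimate.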
|
10 |
Humorous implications and meanings : a multi-modal approach to sarcasm in interactional humor / Implications humoristiques : une étude multi-modale du sarcasme en interaction. Tabacaru, Sabina. 05 December 2014
This dissertation examines the different techniques used to achieve humor in interaction in two contemporary American television series, House M.D. and The Big Bang Theory. Through different writing techniques, we observe the elements that are used in the process of humorous meaning construction. The dialogue between interlocutors plays a central role, since it centers on intersubjectivity and hence the common ground between speakers. This original study also investigates the different gestures used by interlocutors in the two series to create humorous effects.
These 'gestural triggers', as well as the different humor types, have been annotated in ELAN, which allows a more holistic view of the processes involved in humor. The results show an evident preference for sarcasm, as well as a preference for certain facial expressions (raising eyebrows and frowning) and head movements (head tilts and head nods). These elements are explained in relation to the given context and the speakers' attitude, for a better understanding of humor in interaction.
|