1

Spacing Out: Distal Attribution in Sensory Substitution

Pence, David Evan 04 June 2013 (has links)
No description available.
2

Sensorimotor transformations during grasping movements

Säfström, Daniel January 2006 (has links)
‘Sensorimotor transformations’ are processes whereby sensory information is used to generate motor commands. One example is the ‘visuomotor map’ that transforms visual information about objects into the motor commands that activate various muscles during grasping movements. In the first study we quantified the relative impact (or ‘weighting’) of visual and haptic information on the sensorimotor transformation and investigated the principles that regulate the weighting process. To do this, we let subjects perform a task in which the object seen (visual object) and the object grasped (haptic object) were never physically the same. When the haptic object became larger or smaller than the visual object, subjects automatically adapted their maximum grip aperture (MGA) in the following trials when reaching for the object. The adaptation process was quicker and relied more on haptic information when the haptic objects increased in size than when they decreased in size. As such, sensory weighting is molded to avoid prehension errors. In the second study we investigated the degree to which the visuomotor map could be modified. Normally, the relationship between the visual size of the object (VO) and the MGA can be expressed as a linear relationship, MGA = a + b * VO. Our results demonstrate that subjects inter- and extrapolate in the visuomotor map (that is, they are reluctant to abandon the linear relationship) and that the offset (a) but not the slope (b) can be modified. In the third study, we investigated how a ‘new’ sensorimotor transformation can be established and modified. We therefore replaced the normal input of visual information about object size with auditory information, where the size of the object was log-linearly related to the frequency of a tone. Learning of an audiomotor map consisted of three distinct phases: during the first stage (~10-15 trials) there were no overt signs of learning; during the second stage there was a period of fast learning in which the MGA became scaled to the size of the object; and in the third stage the slope was constant. The purpose of the fourth study was to investigate the sensory basis for the aperture adaptation process. To do that, the forces acting between the fingertips and the object were measured as the subjects adapted. Our results indicate that information about when the fingers contact the object, that is, the ‘timing’ of contact, is likely used by the CNS to encode an unexpected object size. Since injuries and disease can affect the sensorimotor transformations that control the hand, knowledge about how these processes are established and modified may be used to develop techniques for sensory substitution and other rehabilitation strategies.
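To make the linear visuomotor map and the offset-only adaptation concrete, here is a minimal sketch in Python. All numerical values (offset, slope, safety margin, adaptation rate) are illustrative assumptions, not estimates from the thesis, and the error-driven update is a generic sketch rather than the authors' model.

```python
# Hypothetical linear visuomotor map: MGA = a + b * VO.
# All numbers below are illustrative assumptions, not values from the thesis.
a, b = 20.0, 0.8              # offset (mm) and slope of the map
safety_margin = 15.0          # assumed extra aperture beyond the felt object size
learning_rate = 0.4           # assumed trial-by-trial adaptation rate

def planned_mga(visual_size_mm: float) -> float:
    """Maximum grip aperture predicted from the visually specified object size."""
    return a + b * visual_size_mm

visual_size = 40.0            # object the subject sees (mm)
haptic_size = 50.0            # object the subject actually grasps (mm)

for trial in range(12):
    mga = planned_mga(visual_size)
    # Aperture that would comfortably fit the object actually felt.
    required = haptic_size + safety_margin
    # Error-driven update of the offset only; the slope stays fixed,
    # mirroring the finding that (a) but not (b) is modified.
    a += learning_rate * (required - mga)
    print(f"trial {trial + 1:2d}: planned MGA = {mga:.1f} mm")
```

Run over a dozen simulated trials, the planned aperture drifts toward a value appropriate for the larger haptic object while the slope of the map is left untouched.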
3

Modern Sensory Substitution for Vision in Dynamic Environments

January 2020 (has links)
abstract: Societal infrastructure is built with vision at the forefront of daily life. For those with severe visual impairments, this creates countless barriers to the participation in and enjoyment of life’s opportunities. Technological progress has been both a blessing and a curse in this regard. Digital text together with screen readers and refreshable Braille displays has made whole libraries readily accessible, and rideshare technology has made independent mobility more attainable. Simultaneously, screen-based interactions and experiences have only grown in pervasiveness and importance, excluding many of those with visual impairments. Sensory Substitution, the process of substituting an unavailable modality with another one, has shown promise as an alternative to accommodation, but in recent years meaningful strides in Sensory Substitution for vision have declined in frequency. Given recent advances in Computer Vision, this stagnation is especially disconcerting. Designing Sensory Substitution Devices (SSDs) for vision for use in interactive settings that leverage modern Computer Vision techniques presents a variety of challenges, including perceptual bandwidth, human-computer interaction, and person-centered machine learning considerations. To surmount these barriers, an approach called Personal Foveated Haptic Gaze (PFHG) is introduced. PFHG consists of two primary components: Foveated Haptic Gaze (FHG), a human-visual-system-inspired interaction paradigm that is intuitive and flexible enough to generalize to a variety of applications, and a person-centered learning component to address the expressivity limitations of most SSDs. This component is called One-Shot Object Detection by Data Augmentation (1SODDA), a one-shot object detection approach that allows a user to specify the objects they are interested in locating visually and, with minimal effort, realize an object detection model that does so effectively. The Personal Foveated Haptic Gaze framework was realized in virtual and real-world applications: playing a 3D, interactive, first-person video game (DOOM) and finding user-specified real-world objects. User study results found Foveated Haptic Gaze to be an effective and intuitive interface for interacting with a dynamic visual world using solely haptics. Additionally, 1SODDA achieves competitive performance among few-shot object detection methods and high-framerate many-shot object detectors. The combination of the two paves the way for modern Sensory Substitution Devices for vision. / Dissertation/Thesis / Doctoral Dissertation Computer Engineering 2020
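As a rough illustration of the one-shot-by-augmentation idea described above, here is a minimal sketch that expands a single user-annotated image into a small synthetic training set which could then be used to fine-tune an off-the-shelf detector. The function names, the specific augmentations (flip, brightness jitter), and the dataset size are assumptions for illustration only; they are not the 1SODDA implementation.

```python
import random
import numpy as np

def augment_once(image: np.ndarray, box: tuple) -> tuple:
    """Produce one synthetic training example from a single annotated image.

    `box` is (x_min, y_min, x_max, y_max) around the user-specified object.
    The augmentations below are assumed for illustration, not those of 1SODDA.
    """
    h, w = image.shape[:2]
    x0, y0, x1, y1 = box
    out = image.astype(np.float32)

    # Horizontal flip, mirroring the bounding box accordingly.
    if random.random() < 0.5:
        out = out[:, ::-1]
        x0, x1 = w - x1, w - x0

    # Brightness jitter.
    out = np.clip(out * random.uniform(0.7, 1.3), 0, 255).astype(np.uint8)
    return out, (x0, y0, x1, y1)

def build_one_shot_dataset(image: np.ndarray, box: tuple, n: int = 200) -> list:
    """Expand one annotated example into n augmented (image, box) pairs."""
    return [augment_once(image, box) for _ in range(n)]

# Example usage with a synthetic image and a hypothetical annotation.
dummy = np.random.randint(0, 255, size=(240, 320, 3), dtype=np.uint8)
dataset = build_one_shot_dataset(dummy, box=(100, 60, 180, 150))
print(len(dataset), "augmented training examples")
```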
4

Affordances In The Design Of Virtual Environments

Gross, David Charles 01 January 2004 (has links)
Human-computer interaction design principles largely focus on static representations and have yet to fully incorporate theories of perception appropriate for the dynamic multimodal interactions inherent to virtual environment (VE) interaction. Theories of direct perception, in particular affordance theory, may prove particularly relevant to enhancing VE interaction design. The present research constructs a conceptual model of how affordances are realized in the natural world and how lack of sensory stimuli may lead to realization failures in virtual environments. Implications of the model were empirically investigated by examining three affordances: passability, catchability, and flyability. The experimental design involved four factors for each of the three affordances and was implemented as a 2^(4-1) fractional factorial design of resolution IV. The results demonstrated that providing affording cues led to behavior closely in line with real-world behavior. More specifically, when given affording cues participants tended to rotate their virtual bodies when entering narrow passageways, accurately judge balls as catchable, and fly when conditions warranted it. The results support the conceptual model and demonstrate 1) that substituting designed cues via sensory stimuli in available sensory modalities for absent or impoverished modalities may enable the perception of affordances in VEs; 2) that sensory stimuli substitutions provide potential approaches for enabling the perception of affordances in a VE which in the real world are cross-modal; and 3) that affordances relating to specific action capabilities may be enabled by designed sensory stimuli. This research lays an empirical foundation for a science of VE design based on choosing and implementing design properties so as to evoke targeted user behavior.
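For readers unfamiliar with the design notation, the sketch below enumerates a 2^(4-1) fractional factorial of resolution IV using the conventional generator D = ABC; the factor labels are placeholders, not the specific cues manipulated in the study.

```python
from itertools import product

# Enumerate a 2^(4-1) resolution-IV fractional factorial design with coded
# levels -1/+1, using the generator D = A*B*C (defining relation I = ABCD).
# Factor names A-D are placeholders, not the study's actual factors.
runs = []
for a, b, c in product((-1, 1), repeat=3):
    d = a * b * c
    runs.append({"A": a, "B": b, "C": c, "D": d})

for i, run in enumerate(runs, start=1):
    print(f"run {i}: " + "  ".join(f"{k}={v:+d}" for k, v in run.items()))
print(len(runs), "runs instead of 16 for the full 2^4 design")
```

Half-fraction designs like this trade some aliasing of higher-order interactions for a halved number of experimental runs, which is why they are common in multi-factor usability studies.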
5

MusiKeys: Exploring Auditory-Physical Feedback Replacement for Mid-Air Text-Entry

Krasner, Alexander Laurence 07 August 2023 (has links)
Extended reality (XR) technology is positioned to become more ubiquitous in life and the workplace in the coming decades, but the problem of how best to perform precision text-entry in XR remains unsolved. Physical QWERTY keyboards are the current standard for these kinds of tasks, but if they are recreated virtually, the feedback information from the sense of touch is lost. We designed and ran a study with 24 participants to explore the effects of using auditory feedback to communicate this missing information that typists normally get from touching a physical keyboard. The study encompassed four VR mid-air keyboards with increasing levels of auditory information, along with a fifth physical keyboard for reference. We evaluated the auditory augmentations in terms of performance, usability, and workload, while additionally assessing the ability of our technique to communicate the touch-feedback information. Results showed that providing clicking feedback on key-press and key-release improves typing compared to not providing auditory feedback, which is consistent with the literature on the topic. However, we also found that using audio to substitute the information contained in physical-touch feedback, in place of actual physical-touch feedback, yielded no statistically significant difference in performance. The information may still be useful, but it would potentially take a long time to develop the muscle-memory reflexes that typists already have when using physical keyboards. Nonetheless, we recommend that others consider incorporating auditory feedback of key-touch into their mid-air keyboards, since it received the highest levels of user preference among the keyboards tested. / Master of Science / Extended reality (XR) refers to technology that allows users to either immerse themselves in virtual worlds or incorporate virtual objects into the real world. XR is positioned to become more ubiquitous in life and the workplace in the coming decades, but the problem of how best to perform precision text-entry in XR remains unsolved. Physical QWERTY keyboards are the current standard for these kinds of tasks, but if they are recreated virtually, the information inherent to the sense of touch is lost. We designed and ran a study with 24 participants to explore the effects of using auditory feedback to communicate this missing information that typists normally get from touching a physical keyboard. The study encompassed four virtual reality (VR) mid-air keyboards with increasing levels of auditory information, along with a fifth physical keyboard for reference. We evaluated the auditory augmentations in terms of performance, usability, and workload, while additionally assessing the ability of our technique to communicate the touch-feedback information. Results showed that providing clicking feedback on key-press and key-release improves typing compared to not providing auditory feedback, which is consistent with the literature on the topic. However, we also found that using audio to substitute the information contained in physical-touch feedback, in place of actual physical-touch feedback, yielded no statistically significant difference in performance. The information may still be useful, but it would potentially take a long time to develop the muscle-memory reflexes that typists already have when using physical keyboards. Nonetheless, we recommend that others consider incorporating auditory feedback of key-touch into their mid-air keyboards, since it received the highest levels of user preference among the keyboards tested.
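To illustrate the kind of event-to-sound mapping the abstract describes (clicks on press and release, plus an audio cue standing in for touch information), here is a minimal sketch. The event names, cue files, and playback callback are hypothetical and are not taken from the MusiKeys implementation.

```python
from enum import Enum, auto

class KeyEvent(Enum):
    """Hypothetical mid-air keyboard events; names are illustrative only."""
    FINGER_TOUCH = auto()    # finger rests on the virtual keycap
    KEY_PRESS = auto()       # keycap pushed past its activation point
    KEY_RELEASE = auto()     # finger lifts off the keycap

# One possible mapping from events to audio cues, with the touch cue standing
# in for the tactile information a physical keyboard would normally provide.
AUDIO_CUES = {
    KeyEvent.FINGER_TOUCH: "soft_tick.wav",
    KeyEvent.KEY_PRESS: "click_down.wav",
    KeyEvent.KEY_RELEASE: "click_up.wav",
}

def on_key_event(event: KeyEvent, play) -> None:
    """Route a keyboard event to its audio cue via the supplied playback callback."""
    cue = AUDIO_CUES.get(event)
    if cue is not None:
        play(cue)

# Example usage with a stand-in playback function.
on_key_event(KeyEvent.KEY_PRESS, play=lambda f: print("playing", f))
```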
6

Performance and Usability of Force Feedback and Auditory Substitutions in a Virtual Environment Manipulation Task

Edwards, Gregory W. 27 December 2000 (has links)
Recent technology developments have made possible the creation of several commercial devices and a select number of development platforms for the inclusion of haptics (the sense of touch) in virtual environments (VE). This thesis sought to investigate and develop a better understanding of whether haptic or sound substitutions improve manipulation performance or usability in VE applications. Twenty-four volunteers (12 males and 12 females) participated in a 2 (haptics) x 2 (sound) x 2 (gender) mixed factorial experiment in which they completed a VE manipulation task involving the assembly and disassembly of 5 interconnecting parts. Performance for the manipulation task was measured through completion time and the number of collisions made, as well as subjective measures of usability. Results indicated that completion times were slower and collision counts were higher for males with the addition of haptics (p_time = 0.03; p_collisions < 0.05), while females exhibited a smaller increase in collision counts and no increase in completion time with the addition of haptics. Nonetheless, there were improved usability attributes when haptics were incorporated, more specifically an increased sense of realism, perceived helpfulness, and perceived utility in a design task (p < 0.05 for all). Sound was found to be an effective substitute for haptics in most measures taken, while the combination of sound and haptics, versus either alone, did not demonstrate any signs of improving performance or any usability attributes. It is therefore recommended that sound substitution be used in VE manipulation tasks where the extra haptic information is desired and minimizing completion time or collisions is the overall goal. Finally, for the utility of the feedback towards a design task, users ranked haptics as being more useful than sound, but ranked the combination of sound and haptics as the best feedback condition (p < 0.05). Further research is required to determine whether this belief is consistent with objective measures. / Master of Science
7

Complexity, the auditory system, and perceptual learning in naïve users of a visual-to-auditory sensory substitution device

Brown, David J. January 2015 (has links)
Sensory substitution devices are non-invasive visual prostheses that use sound or touch to aid functioning in the blind. Algorithms informed by natural crossmodal correspondences convert sensory information attributed to an impaired modality and transmit it back to the user via an unimpaired modality, utilising multisensory networks to activate visual areas of cortex. While behavioural success has been demonstrated in non-visual tasks using SSDs, how they utilise a metamodal brain organised for function is still an open research question. While imaging studies have shown activation of visual cortex in trained users, it is likely that naïve users rely on auditory characteristics of the output signal for functionality and that it is perceptual learning that facilitates crossmodal plasticity. In this thesis I investigated visual-to-auditory sensory substitution in naïve sighted users to assess whether signal complexity and processing in the auditory system facilitate and limit simple recognition tasks. In four experiments evaluating signal complexity, object resolution, harmonic interference and information load, I demonstrate above-chance performance by naïve users in all tasks, an increase in generalised learning, limitations in recognition due to principles of auditory scene analysis, and capacity limits that hinder performance. Results are considered from both theoretical and applied perspectives, with solutions designed to further inform theory on a multisensory perceptual brain and to provide effective training to aid visual rehabilitation.
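To make the kind of conversion algorithm mentioned above concrete, here is a minimal sketch of a generic vOICe-style visual-to-auditory mapping based on common crossmodal correspondences: columns are swept left to right over time, image row maps to pitch, and pixel brightness maps to loudness. The parameter values are assumptions for illustration, and this is not necessarily the device evaluated in the thesis.

```python
import numpy as np

def image_to_sound(image: np.ndarray, sweep_s: float = 1.0,
                   f_low: float = 500.0, f_high: float = 5000.0,
                   sample_rate: int = 22050) -> np.ndarray:
    """Convert a grayscale image (rows x cols, values 0-1) into a mono waveform.

    Columns are swept left to right over sweep_s seconds; each row is assigned
    a sinusoid whose frequency rises with elevation and whose amplitude follows
    pixel brightness. All parameter values are illustrative assumptions.
    """
    rows, cols = image.shape
    samples_per_col = int(sample_rate * sweep_s / cols)
    freqs = np.linspace(f_high, f_low, rows)          # top rows get higher pitch
    t = np.arange(samples_per_col) / sample_rate
    segments = []
    for c in range(cols):
        brightness = image[:, c][:, None]             # amplitude per row
        tones = np.sin(2 * np.pi * freqs[:, None] * t)  # one sinusoid per row
        # Mix the active rows of this column into one short audio segment.
        segments.append((brightness * tones).sum(axis=0))
    audio = np.concatenate(segments)
    return audio / (np.abs(audio).max() + 1e-9)       # normalize to [-1, 1]

# Example: a diagonal bright line on a dark background becomes a rising sweep.
img = np.eye(32)
audio = image_to_sound(img)
print(audio.shape, "samples at 22.05 kHz")
```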
8

Distinguishing the senses : individuation and classification / Distinguer les sens : individuation et classification

Le Corre, Francois 11 December 2014 (has links)
This dissertation is concerned with two theoretical issues about the senses. The first issue focuses on how the senses are to be individuated (regardless of the way one commonly conceives of them). The strategy is to test the ability of the criteria of individuation available in the literature to withstand objections. I argue that the senses are to be individuated in terms of the environmental properties they give access to, and show that this criterion can withstand all of the objections it has received. The second issue is the question of why we believe in exactly five senses. On the basis of observations from anthropology and comparative linguistics, I argue that this belief results from what we have been taught. In addition, this dissertation contains two supplementary studies. The first study focuses on how people ordinarily distinguish among the senses. I argue that people are sensitive both to the types of environmental properties the senses give access to and to the body parts they are attached to. The second study is concerned with the phenomenon of sensory substitution, which has long been considered a challenge for the individuation of the senses. I argue that sensory substitution is not a challenge of this kind because the type of environmental information processed by a sensory substitution device is metamodal, i.e. accessible through any sensory modality.
9

En forskningssammanställning och prototypframtagning för taktilt hörhjälpmedel : Ett konceptvalideringsprojekt / A research compilation and prototyping for a tactile hearing aid : A concept validation project

Fransson, Hilda January 2021 (has links)
This is a bachelor thesis in innovation and design engineering. The focus of the thesis is to validate the continued work of an ongoing product development project for a tactile hearing aid. The purpose is to investigate research and technical solutions regarding tactile stimulation for sound perception and to use this as a basis for decisions about the continued direction of the development project. The goals of the thesis are to deliver a compilation of current research; a works-like prototype for testing the latest concept; and the results and analysis of tests of that concept performed with the prototype. Relevant, current research and technical solutions regarding sensory substitution for sound-to-touch systems were compiled. The compilation shows that sensory substitution for transferring sound information via the skin's sense of touch is possible if the technology works and is adapted for the purpose. Based on the compilation, the decision was made to continue by testing the development project's latest concept. The concept is based on solenoid actuators in which the coils act as the moving parts, with the actuators arranged in a matrix formation. A works-like prototype of the actuators was built to test how play between the actuators is affected by having the coil as the moving part, and whether the actuators can move without being affected by neighbouring actuators. The test results show that the moving coil solves the earlier problems with play between the actuators. The analysis of the test results has been compiled in an FMEA and shows that even though the basic premise of the concept works, there are many risks that must be addressed in the continued product development work.
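As a rough illustration of how a sound-to-touch mapping of this kind could work, here is a minimal sketch that splits the spectrum of a short audio frame across a hypothetical 4x4 actuator matrix. The band layout, matrix size, and drive scheme are assumptions made for illustration; they are not the concept tested in the project.

```python
import numpy as np

def sound_frame_to_actuator_matrix(frame: np.ndarray, rows: int = 4,
                                   cols: int = 4) -> np.ndarray:
    """Map one short audio frame onto drive levels for a rows x cols actuator matrix.

    The spectrum is split into rows*cols frequency bands and the mean energy of
    each band drives one actuator. The 4x4 layout and band split are assumptions.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    bands = np.array_split(spectrum, rows * cols)
    energies = np.array([band.mean() for band in bands])
    levels = energies / (energies.max() + 1e-9)       # normalize to 0-1 drive levels
    return levels.reshape(rows, cols)

# Example usage with a synthetic 440 Hz tone in a 32 ms frame at 16 kHz.
sample_rate = 16000
t = np.arange(int(0.032 * sample_rate)) / sample_rate
frame = np.sin(2 * np.pi * 440.0 * t)
print(sound_frame_to_actuator_matrix(frame))
```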
10

Constance spatiale haptique et attribution distale / Haptic spatial constancy and distal attribution

Dupin, Lucile 30 November 2015 (has links)
The world appears stable to us. It seems to exist independently of our own movements and of the sense we use to perceive it. Our movements can, however, modify the sensory information we receive: the retinal image is constantly modified when we move, as is the tactile stimulation when we move the hand to identify an object. External objects can also move in the meantime, further modifying the stimulation. Knowledge of our own movement in the environment is therefore necessary to distinguish sensory modifications linked to our own actions from those linked to external movement. In most cases, the sensory receptor itself is moved in order to perceive, and it registers the sensory modification linked to its own movement. Touch is fundamentally an active modality: explored objects are generally larger than the skin surface used to perceive them, so the spatial characteristics of objects are reconstructed through movement, by combining successive local tactile information. The many degrees of freedom of movement in haptic perception make this operation more complex, which raises the question of how the movement is represented. It has been observed that sensorimotor association in haptics can go beyond the way it is used on a daily basis. With some visuo-tactile sensory substitution displays, participants, blind or blindfolded, move a camera whose image is transmitted tactilely to the back using a matrix of vibrators. After a period of training, participants report that they perceive the filmed object, localized in space. There are two main differences between this situation and usual tactile perception. First, the modification of the stimulation linked to the movement corresponds to a distal modality, like vision, and not a proximal modality, like touch. Second, the movement is made with one part of the body, the arm, while the tactile consequence is perceived on another part, the back. In order to study sensorimotor association in spatial haptic perception, we established a method that separates the location of the movement on the body from the location where the resulting tactile stimulation is perceived. We first examined the case in which the tactile consequence of one hand's movement is delivered to the index finger of the other hand. This experiment showed that a representation of the movement, abstracted from its origin on the body, could be associated with the tactile stimulation on the other hand without training. Because bimanual coordination may make the association between the two arms a special case, we then studied this association when the movement was made by the foot or the eyes, or when a point moved visually without any movement of the participant. The results showed that there are two complementary representations of the movement. The first, more abstract, corresponds to the coding of a direction in space and is used even when there is no action but simply the viewing of a movement. The other corresponds to the characteristics of the movement (amplitude/velocity) and was observed only when there was an action. 
A further experiment focused on the reference frame in which the direction of the movement is represented. The results showed that the reference frame was largely centered on the participant (egocentric), but when the arms were positioned in front of and near the participant, the reference frame became more unstable and even allocentric for some participants.
