191

Assessing and Addressing the Assistive Technology Needs of Students with Learning Disabilities

Marks, Lori J. 19 November 1999 (has links)
No description available.
192

Spatial Reading System for Individuals with Blindness

Elglaly, Yasmine Nader Mohamed 06 May 2013 (has links)
In this research we introduce a novel reading system that enables Individuals with Blindness or Severe Visual Impairment (IBSVI) to have a spatial reading experience equivalent to that of their sighted counterparts, in terms of being able to engage in different reading strategies, e.g., scanning, skimming, and active reading. IBSVI are enabled to read in a self-paced manner with spatial access to the original layout of any electronic text document. The system renders text on iPad-type devices and reads aloud each word touched by the user's finger. The user can move her finger smoothly along the lines to read continuously with the support of tactile landmarks. A tactile overlay on the iPad screen helps IBSVI navigate a page, furnishing a framework of tactile landmarks that gives IBSVI a sense of place on the page. As the user moves her finger along the tangible pattern of the overlay, the text touched on the screen is rendered audibly as speech. The system supports IBSVI in developing and maintaining a cognitive map of the structure and layout of the page. IBSVI are enabled to fuse audio, tactile landmarks, and spatial information in order to read. The system's initial design is founded on a theoretical hypothesis. A participatory design approach with IBSVI consultants was then applied to refine the initial design. The refined design was tested in a usability study, which revealed two major issues: a lack of instant feedback from the system (a psychomotor problem) and a failure to convey the semantic level of the page structure. We adapted the reader design to solve these usability problems. The improved design was tested in an experience sampling study, and the results showed a leap in system usability: IBSVI participants successfully read spatial text at their own pace. Further reading support was then added to the system to improve the user experience while reading and interacting with the system. We tested the latest design of the reader system with respect to its featured functions of enabling self-paced reading and re-finding information. A decomposition study was conducted to evaluate the main components of the system: the tactile overlay and the intelligent active reading support. The results showed that both components are required to achieve the best performance in terms of efficiency, effectiveness, and spatial perception. We conducted an evaluation study to compare our reader system to the state-of-the-art iBook with VoiceOver. The results show that our reader system is more effective than iBook with VoiceOver in finding previously read information and in estimating the layout of the page, implying that IBSVI were able to construct a cognitive map of the pages they read and perform advanced reading strategies. Our goal is to enable IBSVI to access digital reading materials effectively, so that they may have the same learning opportunities as their sighted counterparts. / Ph. D.
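
The core interaction described above (sliding a finger over text kept in its original layout and hearing each touched word) can be sketched roughly as below. This is a minimal sketch under assumed data structures, not the dissertation's implementation; Word, Page.word_at, and speak are hypothetical stand-ins for the actual iPad touch-handling and text-to-speech calls.

    # Minimal sketch of the touch-to-speech reading loop; all names are hypothetical.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Word:
        text: str
        x: float   # bounding box in page coordinates, preserving the original layout
        y: float
        w: float
        h: float

    class Page:
        def __init__(self, words: List[Word]):
            self.words = words

        def word_at(self, x: float, y: float) -> Optional[Word]:
            """Return the word whose bounding box contains the touch point, if any."""
            for word in self.words:
                if word.x <= x <= word.x + word.w and word.y <= y <= word.y + word.h:
                    return word
            return None

    def speak(text: str) -> None:
        print(f"[TTS] {text}")   # placeholder for a platform text-to-speech call

    class SpatialReader:
        def __init__(self, page: Page):
            self.page = page
            self.last_word: Optional[Word] = None

        def on_touch_moved(self, x: float, y: float) -> None:
            """Called as the finger slides along a line under the tactile overlay."""
            word = self.page.word_at(x, y)
            if word is not None and word is not self.last_word:
                speak(word.text)      # announce only when a new word is entered
                self.last_word = word

    page = Page([Word("Hello", 0, 0, 60, 20), Word("world", 70, 0, 60, 20)])
    SpatialReader(page).on_touch_moved(75, 10)   # -> [TTS] world

Keeping words at their original page coordinates, rather than reflowing the text, is what preserves the spatial layout that the tactile overlay indexes.
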
193

Haptic Vision: Augmenting Non-visual Travel Tools, Techniques, and Methods by Increasing Spatial Knowledge Through Dynamic Haptic Interactions

January 2020 (has links)
Access to real-time situational information, including the relative position and motion of surrounding objects, is critical for safe and independent travel. Object or obstacle (OO) detection at a distance is primarily a task of the visual system due to the high-resolution information the eyes are able to receive from afar. As a sensory organ, the eyes have an unparalleled ability to adjust to varying degrees of light, color, and distance. Therefore, in the case of a non-visual traveler, someone who is blind or has low vision, visual information is unattainable if the object is positioned beyond the reach of the preferred mobility device or outside the path of travel. Although assistive technology in the form of electronic travel aids (ETAs) has received considerable attention over the last two decades, surprisingly little work in the field has focused on augmenting rather than replacing current non-visual travel techniques, methods, and tools. Consequently, this work describes the design of an intuitive tactile language and a series of wearable tactile interfaces (the Haptic Chair, HaptWrap, and HapBack) to deliver real-time spatiotemporal data. The overall intuitiveness of the haptic mappings conveyed through the tactile interfaces is evaluated using a combination of absolute identification accuracy on a series of patterns and subjective feedback through post-experiment surveys. Two types of spatiotemporal representations are considered: static patterns, representing object location at a single time instance, and dynamic patterns, added in the HaptWrap, which represent object movement over a time interval. Results support the viability of multi-dimensional haptics applied to the body to yield an intuitive understanding of dynamic interactions occurring around the navigator during travel. Lastly, it is important to point out that the guiding principle of this work centered on providing the navigator with spatial knowledge otherwise unattainable through current mobility techniques, methods, and tools, thus providing the navigator with the information necessary to make informed navigation decisions independently, at a distance. / Dissertation/Thesis / Doctoral Dissertation Computer Science 2020
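
As a rough illustration of the kind of spatiotemporal-to-tactile mapping described above, the sketch below encodes an object's bearing and distance as a vibration on one of several body-worn motors, and a sequence of such encodings as a dynamic movement pattern. The motor layout, intensity law, and function names are assumptions for illustration, not the actual Haptic Chair, HaptWrap, or HapBack encoding.

    # Toy encoding of object position/motion into vibrotactile pulses; parameters are assumed.
    NUM_MOTORS = 8  # assumed ring of vibration motors worn around the torso

    def static_pattern(bearing_deg: float, distance_m: float, max_range_m: float = 10.0):
        """Encode an object's position at one instant as (motor index, intensity).

        bearing_deg: direction of the object relative to the wearer, 0 = straight ahead.
        distance_m:  distance to the object; closer objects vibrate more strongly.
        """
        motor = int(round((bearing_deg % 360) / (360 / NUM_MOTORS))) % NUM_MOTORS
        intensity = max(0.0, 1.0 - min(distance_m, max_range_m) / max_range_m)
        return motor, intensity

    def dynamic_pattern(samples):
        """Encode object movement over a time interval as a sequence of pulses.

        samples: (bearing_deg, distance_m) pairs taken over the interval; the resulting
        sequence sweeps across motors, conveying the direction of motion.
        """
        return [static_pattern(b, d) for b, d in samples]

    # Example: an object approaching from the wearer's right toward straight ahead.
    for motor, intensity in dynamic_pattern([(90, 6.0), (60, 4.5), (30, 3.0), (0, 2.0)]):
        print(f"motor {motor}: intensity {intensity:.2f}")
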
194

The Impact of Smart Home Technology on Independence for Individuals Who Use Augmentative and Alternative Communication

Corso, Christina L. 10 September 2021 (has links)
No description available.
195

Design elektrohandbiku / Design of Electric Handbike

Korejz, Jiří January 2020 (has links)
This master's thesis deals with the design of a handbike with electric propulsion. The final design is created with regard to knowledge gained from design and technical analysis, as well as to the deficiencies of contemporary products. The purpose of this work is to create a design for an electric handbike that respects the user and his or her needs from an ergonomic and aesthetic point of view.
196

MULTIMODAL DIGITAL IMAGE EXPLORATION WITH SYNCHRONOUS INTELLIGENT ASSISTANCE FOR THE BLIND

Ting Zhang (8636196) 16 April 2020 (has links)
Emerging haptic devices have granted individuals who are blind the capability to explore images in real time, which has always been a challenge for them. However, when only haptic-based interaction is available and no visual feedback is given, image comprehension demands time and major cognitive resources. This research developed an approach to improve blind people's exploration performance by providing assisting strategies in various sensory modalities when certain exploratory behaviors are performed. There are three fundamental components in this approach: the user model, the assistance model, and the user interface. The user model recognizes users' image exploration procedures; a learning framework utilizing a spike-timing neural network is developed to classify the frequently applied exploration procedures. The assistance model provides different assisting strategies when a certain exploration procedure is performed. User studies were conducted to understand the goals of each exploration procedure, and assisting strategies were designed based on the discovered goals. These strategies give users hints about objects' locations and relationships. The user interface then determines the optimal sensory modality to deliver each assisting strategy. Within-participants experiments were performed to compare three sensory modalities for each assisting strategy: vibration, sound, and virtual magnetic force. A complete computer-aided system was developed by integrating all the validated assisting strategies. Experiments were conducted to evaluate the complete system with each assisting strategy expressed through its optimal modality. Performance metrics including task performance and workload assessment were applied for the evaluation.
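
The three-component pipeline outlined above (user model, assistance model, user interface) can be sketched as a simple dispatch, shown below. The procedure names, strategies, and modality table are invented placeholders; in the actual system the user model is a learned spike-timing neural network rather than the stub used here.

    # Hypothetical sketch of the user-model / assistance-model / interface pipeline.
    def classify_procedure(touch_trace):
        """Stub user model: the real system classifies exploration procedures with a
        learned spike-timing neural network; here we just guess from trace length."""
        return "contour_following" if len(touch_trace) > 50 else "local_scrubbing"

    # Assistance model: which hint to give for each recognized procedure (assumed names).
    ASSISTING_STRATEGY = {
        "contour_following": "announce_object_boundary",
        "local_scrubbing": "hint_nearby_object_locations",
    }

    # User interface: modality chosen per strategy, as the within-participants
    # comparisons would determine (the values here are placeholders).
    OPTIMAL_MODALITY = {
        "announce_object_boundary": "sound",
        "hint_nearby_object_locations": "virtual_magnetic_force",
    }

    def assist(touch_trace):
        procedure = classify_procedure(touch_trace)
        strategy = ASSISTING_STRATEGY[procedure]
        return procedure, strategy, OPTIMAL_MODALITY[strategy]

    print(assist([(0, 0)] * 80))  # ('contour_following', 'announce_object_boundary', 'sound')
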
197

Exploration of Mandibular Inputs for Human-Machine Interfaces

Yaslam, Abdulaziz 05 1900 (has links)
The direct connection of the jaw to the brain allows it to retain its motor and sensory capabilities even after severe spinal cord injuries. As such, it can be an accessible means of providing input for people with paralysis to manipulate their environment. This paper explores the potential of using the jaw, specifically the mandible, as an alternative input to human-machine interface (HMI) systems. Two tests were developed to assess the mandible's ability to respond to visual stimuli: first, a visual response time test to measure the precision and accuracy of user input through a mandible-actuated button; second, a choice response test to observe coordination between the mandible and a finger. Study results show that the mean response time of mandible inputs is 8.3% slower than the corresponding mean response time of performing the same task with a thumb. The delay in response after making a decision is statistically insignificant between the mandible- and finger-actuated inputs, with the mandible being 2.67% slower. Based on these results, the increase in response time while using the mandibular input is minimal for new users, and coordination is feasible in tasks involving both the mandible and thumb. Extensive training with a made-to-fit device has the potential to enable a visual response time equivalent to that of the fingers in more complex tasks. The mandible is a viable option as an accessible HMI for discrete inputs; further testing of continuous input is needed to explore the mandible's potential as an input for body augments.
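
To make the reported comparison concrete, the snippet below shows how a relative slowdown such as the 8.3% figure is computed from mean response times; the sample values are invented for illustration (chosen so the output matches the reported figure) and are not the study's data. Statistical significance of the decision delay would additionally be assessed with a paired test.

    from statistics import mean

    # Invented example response times in milliseconds (not the study's data).
    thumb_ms    = [295, 302, 298, 305, 300, 300]
    mandible_ms = [322, 328, 324, 330, 319, 327]

    def relative_slowdown(baseline, candidate):
        """Percent increase of the candidate's mean response time over the baseline's."""
        return 100.0 * (mean(candidate) - mean(baseline)) / mean(baseline)

    print(f"mandible vs. thumb: {relative_slowdown(thumb_ms, mandible_ms):.1f}% slower")  # 8.3%
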
198

Pepe: an adaptive robot that helps children with autism to plan and self-manage their day.

Cañete Yaque, Raquel January 2021 (has links)
Covid-19 has brought physical and mental challenges for all of us. However, these are even more pronounced for those with psychological pathologies such as Autism Spectrum Disorder (ASD). One of the main challenges that parents of children with ASD faced during the pandemic was planning and structuring a daily routine for their children. The disruption of the routine, together with the difficulty of combining work and the care of the child, has resulted in behavioral problems as well as stress and anxiety for both parents and children. This project focused on developing an adaptive robot that helps children with autism to plan and self-manage their day, with the end goal of becoming more independent. With adaptability, agency, senses, and playfulness at the core of the design, Pepe is meant as a support tool for these children to use along the way. By collecting information about the child's performance, it is able to adapt its behavior to the child's (and parents') needs and desires, and therefore progress with the child. It builds upon the principles of Positive Behavioral Support to prevent emotional crises by embracing a long-run negotiation process through which the child gets gradually closer to the end goal of self-autonomy. Intended to be adapted to the accentuated needs of these children, it combines traditional and computational elements to make the most of the experience. This project included in-depth user research together with parents and experts, an interdisciplinary design approach, and a prototyping phase in which a prototype was tested with children with ASD.
199

Anhörigas upplevelser av stöd för att vårda en närstående med demenssjukdom i hemmet / Relatives' experiences of support in caring for a close relative with dementia at home

Bohte, Ethel, Jörholt, Maja January 2021 (has links)
Background: Every three seconds, someone in the world develops dementia. Most of these people remain at home for several years and are cared for by a relative. Under Swedish law, relatives must be offered carer support by their home municipality. Aim: The aim was to shed light on relatives' experiences of support in caring for a close relative with dementia at home. Method: A literature review based on qualitative scientific articles. Results: The compilation of 10 qualitative scientific articles yielded three categories: experience of personal information, experience of everyday support, and experience of reflection. Relatives experienced a lack of information about what support was available for caring for a close relative with dementia at home and about how to gain access to that support. Relatives who did receive information about support appreciated it, as it provided relief in everyday life and a better quality of life. In addition, relatives felt that the support helped them cope with everyday life. Many relatives had a positive experience of receiving support in written, oral, and group form, support that also gave rise to reflection. Conclusion: Caring for a close relative with dementia can be both physically and mentally taxing, which makes support important for the relative. Support for relatives needs to be introduced at an early stage, but since the need for support changes over time, continuous follow-up is also required in order to maintain adequate care.
200

Enhancing emotional communication between autistic and non-autistic individuals through assistive Information Technology

Abouei, Mina January 2021 (has links)
Recognising people's emotions is a promising research area in human-computer interaction, as emotional communication plays a crucial role in human life. One of the main reasons for ineffective emotional communication is a deficit in understanding emotional signals such as facial expressions and body posture. There is a bidirectional challenge between autistic and non-autistic individuals, since they display their emotional signals differently. This thesis explores differences in emotional signals, in particular facial expressions, body posture, and physiological signals. Based on the interviews and questionnaires conducted in this thesis, the need for an aid tool to assist autistic and non-autistic participants during their emotional communication is identified. Therefore, Emognition, a smartwatch and its accompanying mobile application, is designed to blur these differences and enhance emotional communication between the two groups. Emognition's user evaluation indicates that the smartwatch could successfully detect non-autistic participants' sadness and happiness; participants also found the mobile application useful and aesthetically motivating to interact with. Even though we could not evaluate how well Emognition recognises autistic participants' sadness and happiness, it is promising that their emotions could be measured successfully with accurate sensors and, more importantly, by identifying their autonomic response patterns to different emotions, thereby enhancing emotional communication between autistic and non-autistic people through Emognition.
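
As a very rough illustration of the kind of physiological-signal classification a wearable like Emognition performs, the toy rule below labels a window of heart-rate and electrodermal-activity samples as happiness or sadness. The features, thresholds, and arousal-to-emotion mapping are assumptions for illustration only and do not reflect the thesis's actual sensing pipeline, which would rely on per-user autonomic response patterns.

    from statistics import mean

    def classify_window(heart_rate_bpm, eda_microsiemens):
        """Toy rule-based classifier over one window of wearable sensor samples.

        A real system would use richer features and per-user calibrated models;
        the thresholds here are illustrative only.
        """
        hr = mean(heart_rate_bpm)
        eda = mean(eda_microsiemens)
        if hr > 85 and eda > 4.0:
            return "happiness"        # high arousal, assumed positive valence
        if hr < 70 and eda < 2.0:
            return "sadness"          # low arousal, assumed negative valence
        return "neutral/unknown"

    print(classify_window([88, 92, 90], [4.5, 4.8, 4.2]))  # -> happiness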
