  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Improving web usability for the visually impaired

Kullman, Christoffer January 2009 (has links)
The Web has opened up many possibilities for disabled people to interact with society, but there is unfortunately a lack of parity between the user interfaces presented to different users. This dissertation presents a proof of concept for designing a spatial layout presentation for blind users of screen readers. This is done in three steps: first, a survey to determine the current practices of web developers; then, the implementation of an instant spatial feedback and comparison function that presents the spatial layout; and finally, an evaluation of the spatial layout presentation by way of user testing. The survey yielded a set of guidelines for the realistic development of web technologies for disabled persons, based on the participants' answers. From the implementation, a concept for spatial feedback functions that are portable and expandable is presented. The evaluation shows that the created spatial presentation method passes both objectively and subjectively.
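The abstract's "spatial feedback function" could take many forms; as a hedged sketch of the general idea (not the thesis's actual implementation), the following hypothetical function maps an element's bounding box to a coarse spatial phrase that a screen reader could announce. All names and the 3x3 grid granularity are illustrative assumptions.

```python
# Hypothetical sketch of a spatial feedback function: given an element's
# bounding box and the page size, produce a short phrase a screen reader
# could speak ("top center", "bottom right", ...). Not the thesis's code.

def spatial_label(x, y, width, height, page_w, page_h):
    """Map an element's bounding box to a coarse 3x3 spatial region."""
    cx = (x + width / 2) / page_w   # horizontal center, 0..1
    cy = (y + height / 2) / page_h  # vertical center, 0..1
    cols = ["left", "center", "right"]
    rows = ["top", "middle", "bottom"]
    col = cols[min(int(cx * 3), 2)]
    row = rows[min(int(cy * 3), 2)]
    return "center" if (row, col) == ("middle", "center") else f"{row} {col}"

# A navigation bar spanning the top of a 1024x768 page:
print(spatial_label(0, 0, 1024, 60, 1024, 768))   # "top center"
```

A real implementation would read the boxes from the browser DOM; the point here is only that a layout position can be reduced to a brief, speakable cue.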
3

Sound as a Project Tool in Design

MARCELO PEDRUZZI FERRANTI 12 November 2018 (has links)
This research proposes a set of methods and principles for exploring sound in a design process. Issues related to sound seem to be neglected during the design process. How could a designer, even without formal sound or musical training, use sound in projects in a constructive, intentional, and objective way? To answer this question, we investigated the impact of the changes that soundscapes have undergone in recent decades, caused by the industrialization and mechanization of society, which made it possible to dissociate a sound from its emitter, allowing the former to be manipulated, examined, and dissected. The impact and perception of these sounds by individuals are also discussed, drawing on knowledge from acoustics, psychoacoustics, and attentional mechanisms. We also mapped the state of the art, evolution, and application of these sounds in different fields of knowledge, such as human-computer interaction, sonic interaction design, cinema, and musicology, in order to obtain a theoretical framing of sound that is relevant to the design field. As a result of this exploration, we propose a set of guiding principles and exploratory methods for prototyping sound in design. Finally, future applications and developments of these principles and methods are suggested.
4

Data Density and Trend Reversals in Auditory Graphs: Effects on Point Estimation and Trend Identification Tasks

Nees, Michael A. 28 February 2007 (has links)
Auditory graphs (displays that represent graphical, quantitative information with sound) have the potential to make graphical representations of data more accessible to blind students and researchers as well as sighted people. No research to date, however, has systematically addressed the attributes of data that contribute to the complexity (the ease or difficulty of comprehension) of auditory graphs. A pair of studies examined the role of both data density (i.e., the number of discrete data points presented per second) and the number of trend reversals for both point estimation and trend identification tasks with auditory graphs. For the point estimation task, results showed main effects of both variables, with a larger effect attributable to performance decrements for graphs with more trend reversals. For the trend identification task, a large main effect was again observed for trend reversals, but an interaction suggested that the effect of the number of trend reversals differed across lower data densities (i.e., as density increased from 1 to 2 data points per second). Results are discussed in terms of data sonification applications and rhythmic theories of auditory pattern perception.
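The two complexity variables this abstract names are easy to make concrete. Below is a minimal sketch (not the study's code) of how one might compute them for a data series before sonifying it: trend reversals are sign changes in the slope, and data density fixes playback duration. The function names are illustrative.

```python
# Sketch of the two auditory-graph complexity variables the studies
# manipulated: trend reversals and data density (points per second).

def trend_reversals(series):
    """Count direction changes (rising <-> falling) in a data series."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    signs = [d for d in diffs if d != 0]        # ignore flat segments
    return sum(1 for s1, s2 in zip(signs, signs[1:]) if (s1 > 0) != (s2 > 0))

def playback_seconds(series, density):
    """Duration of the auditory graph at `density` data points per second."""
    return len(series) / density

data = [1, 3, 5, 4, 2, 3, 6]          # rises, falls, rises again: 2 reversals
print(trend_reversals(data))           # 2
print(playback_seconds(data, 2))       # 3.5 (seconds at 2 points/second)
```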
5

The Effect of Directional Auditory Cues on Driver Performance in a Simulated Truck Cab Environment

Powell, Jared Alan 09 January 2000 (has links)
A human factors experiment was conducted to investigate the potential benefits of using directional auditory cues in intelligent transportation system technologies in commercial vehicles. Twelve licensed commercial vehicle operators drove a commercial truck-driving simulator and were prompted to select highway numbers on a side-task display. Prompts were presented visually or aurally. Auditory prompts were presented either diotically (both ears simultaneously) or directionally (to either the left or right ear). The search task varied in map density and timing of the prompts in relation to speed limit changes. All experimental conditions were compared to a control condition containing no secondary task. Both driving performance (lane deviation, steering wheel angle, road heading angle error, accidents, and adherence to the speed limit) and secondary task performance (accuracy and response time) measures were collected. Results showed that drivers were able to respond more quickly and accurately to the search task when directional auditory cues were used. Results also showed that driving performance degrades when display density increases and that the use of directional auditory prompts lessens this deterioration of performance for high-density conditions. / Master of Science
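The diotic/directional distinction described above reduces to how a cue is distributed across the two ears. As a hedged sketch under that reading (the experiment's actual audio hardware and levels are not described here), per-channel gains might look like:

```python
# Minimal sketch of diotic vs. directional auditory prompt presentation:
# a diotic cue goes to both ears equally; a directional cue goes only to
# the ear on the side of the target. Values are (left, right) gains, 0..1.

def channel_gains(mode, side=None):
    """Return (left, right) gains for an auditory prompt."""
    if mode == "diotic":
        return (1.0, 1.0)                       # both ears simultaneously
    if mode == "directional":
        return (1.0, 0.0) if side == "left" else (0.0, 1.0)
    raise ValueError(f"unknown mode: {mode}")

print(channel_gains("diotic"))               # (1.0, 1.0)
print(channel_gains("directional", "left"))  # (1.0, 0.0)
```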
6

Contextualizing Accessibility: Interaction for Blind Computer Users

Winberg, Fredrik January 2008 (has links)
Computer usage today is predominantly based on graphical interaction, where the visual presentation of information is essential both for input (hand-eye coordination when using a computer mouse) and output (seeing the information on a computer screen). This can create difficulties for blind computer users, both at an individual level when interacting with a computer and when collaborating with other computer users. The work presented in this thesis investigated interaction for blind computer users in three stages. First, it investigated access to information through studies of an interactive audio-only game, drawing conclusions about auditory direct manipulation and auditory interface design. Second, it studied collaboration between blind and sighted computer users in two different contexts, leading to a questioning of commonly expressed design principles regarding access to collaboration. Finally, it studied accessibility in a working environment, examining how technology, the assistive device used by the blind person, communication with others, and professional knowledge interplayed to create an accessible work environment. Based on these empirical studies, the main conclusion from this work is the proposal of a research perspective, Assistive interfaces as cooperative interfaces. Here, the context where the interface is going to be used is in focus, and the cooperative and social dimensions of interaction are acknowledged and highlighted. The design and analysis of assistive devices should be highly sensitive to the socio-interactional environment, not focused solely on the single individual using an assistive device.
7

Internal representations of auditory frequency: behavioral studies of format and malleability by instructions

Nees, Michael A. 16 November 2009 (has links)
Research has suggested that representational and perceptual systems draw upon some of the same processing structures, and evidence also has accumulated to suggest that representational formats are malleable by instructions. Very little research, however, has considered how nonspeech sounds are internally represented, and the use of audio in systems will often proceed under the assumption that separation of information by modality is sufficient for eliminating information processing conflicts. Three studies examined the representation of nonspeech sounds in working memory. In Experiment 1, a mental scanning paradigm suggested that nonspeech sounds can be flexibly represented in working memory, but also that a universal per-item scanning cost persisted across encoding strategies. Experiment 2 modified the sentence-picture verification task to include nonspeech sounds (i.e., a sound-sentence-picture verification task) and found evidence generally supporting three distinct formats of representation as well as a lingering effect of auditory stimuli for verification times across representational formats. Experiment 3 manipulated three formats of internal representation (verbal, visuospatial imagery, and auditory imagery) for a point estimation sonification task in the presence of three types of interference tasks (verbal, visuospatial, and auditory) in an effort to induce selective processing code (i.e., domain-specific working memory) interference. Results showed no selective interference but instead suggested a general performance decline (i.e., a general representational resource) for the sonification task in the presence of an interference task, regardless of the sonification encoding strategy or the qualitative interference task demands. Results suggested a distinct role of internal representations for nonspeech sounds with respect to cognitive theory. 
The predictions of the processing codes dimension of the multiple resources construct were not confirmed; possible explanations are explored. The practical implications for the use of nonspeech sounds in applications include a possible response time advantage when an external stimulus and the format of internal representation match.
8

An Investigation of Auditory Icons and Brake Response Times in a Commercial Truck-Cab Environment

Winters, John 11 June 1998 (has links)
In the driving task, vision, hearing, and the haptic senses are all used by the driver to gather required information. Future Intelligent Transportation Systems components are likely to further increase the volume of information available to or required by the driver, particularly in the case of commercial vehicle operators. The use of alternate modalities to present in-vehicle information is a possible solution to the potential overload of the visual channel. Auditory icons have been shown to improve operator performance and decrease learning and response times, not only in industrial applications, but also as emergency braking warnings. The use of auditory icons in commercial truck cabs has the potential to increase the number of auditory displays that can be distinguished and understood by commercial vehicle operators, and this experiment sought to determine the utility of auditory icons in that situation. Nine auditory icons were evaluated by commercial vehicle operators as they drove an experimental vehicle over public roads. A comparison of the data collected in the truck-cab environment to data collected in a laboratory study on the same auditory icons revealed some differences in the perceived meaning, perceived urgency, and association with the auditory icons' intended meanings between the two conditions. The presence of these differences indicates that driver evaluations of auditory icons can be affected by the environment, and testing should therefore be conducted in a situation that approximates the end-user environment as closely as possible. A comparison of the drivers' brake response times across the three warning conditions (no warning, auditory icon, and soft braking) was also conducted on a closed, secure handling course. Dependent measures included overall brake reaction time and its components, steering response time, time to initial driver action, and categorical measures of driver responses (steering, swerving, braking, and stopping). 
The results indicated numerically shorter mean response times (on the order of 0.5 seconds for Total Brake Response Time) for the two conditions with warnings, but the differences were not statistically significant. The most likely reason for this lack of significance is the extreme between-subject variability in response times in the no warning condition. An analysis of the response time variance across the three conditions did indicate significantly less variability in operator responses in the two warning conditions. Two of the five dependent measures (Brake Pedal Contact Time and Total Brake Response Time) exhibited significantly reduced variance in the auditory icon warning condition compared to the no warning condition. The soft braking warning condition exhibited significantly reduced variance for four of the dependent measures (Accelerator Reaction Time, Brake Pedal Contact Time, Total Brake Response Time, and First Reaction Time). These results indicate that a soft braking stimulus like that used in this study could potentially prove to be a more effective emergency braking warning than simple auditory warnings alone. / Master of Science
9

A Systematic Investigation into Induction and Mitigation Methods of Motion Sickness in Passengers of Automated Vehicles

Dam, Abhraneil 13 March 2025 (has links)
Automated vehicle technology can not only transform vehicle behavior on roadways, but also transform users from active drivers into passengers as automation levels increase, such as going from SAE Levels 0 through 2 to Levels 3 through 5. As passengers engage in non-driving related tasks (NDRTs) inside a moving vehicle, they experience limited vehicle control and external awareness. Such conditions can lead to passengers becoming motion sick. Since two out of three passengers are prone to motion sickness, even mild symptoms can severely influence users' experience in automated vehicles. This dissertation includes four studies investigating the human factors challenge of motion sickness in passengers of automated vehicles. The first study is a systematic literature review following the PRISMA framework; forty-one papers were selected and qualitatively analyzed, on the basis of which an overarching research framework was proposed. The second study focused on verifying whether driving styles simulated on a motion-based driving simulator could be used to artificially induce motion sickness in a safe, controlled manner. The third study investigated two driving styles, with and without an NDRT, to corroborate the findings of the previous study. In the fourth and final study, the focus shifted to mitigating motion sickness: a novel auditory display was developed based on existing literature to reduce motion sickness. Findings from the second and third studies confirmed that strong lateral accelerations could indeed induce motion sickness, and that engagement in a cognitively demanding task could lower motion sickness. Based on these findings, the Cognitive Distraction Effect was proposed in the third study.
The fourth study, which utilized the verified motion-sickness-inducing condition from the second and third studies, found that the presence of repeated spatialized anticipatory auditory cues increased motion sickness due to the added sense of vection from the auditory stimuli. This was a unique observation that aligned with recent literature. Furthermore, the fourth study also found evidence in support of the Cognitive Distraction Effect. In summary, this dissertation provides a comprehensive investigation into developing our understanding of motion sickness in passengers of automated vehicles. Three unique contributions are proposed. First, it is possible to induce motion sickness in a safe, replicable manner in a laboratory, without the need for real-world driving. Second, cognitive engagement in a demanding task can suppress physiological symptoms of motion sickness, suggesting that NDRT engagement could help mitigate motion sickness. Finally, the dissertation sheds new light on the senses that contribute to the development of motion sickness, showing that the hearing system has a role to play in maintaining balance and orientation, in addition to the visual and vestibular systems. / Doctor of Philosophy / The rise of automated vehicle technologies, such as Advanced Driver Assistance Systems (ADAS), has the potential to transform drivers into passengers, with increased automation levels requiring less and less user input. These features can benefit users by allowing them to utilize their transportation time in a manner of their choosing, while also improving safety. However, when engaging in tasks that do not allow vehicle occupants to maintain control of their vehicle, or that disconnect them from the external environment, passengers can become motion sick, affecting their overall wellbeing. Such conditions can cause users not to want to utilize these advanced vehicle automation technologies.
To improve users' comfort and experience, the current dissertation undertook research on the topic of motion sickness in passengers of automated vehicles. To that end, four studies were conducted. The first study reviewed the existing literature, identifying 41 scientific papers. These papers were analyzed to reveal an overarching research framework that could guide future researchers and students. The second study developed aggressive and soft driving scenarios on a motion-based driving simulator to artificially induce motion sickness in a safe, controlled manner. It was verified that the aggressive driving scenario, producing sufficiently large lateral accelerations, could induce motion sickness. It was also tested whether performing a phone task would increase motion sickness, but this did not turn out to be the case. The third study built upon the second and tested the effect of being engaged or not engaged in a phone task in both the aggressive and soft driving scenarios. The results showed the effectiveness of the aggressive driving scenario in inducing motion sickness, and that engagement in the phone task actually had a mitigating effect on motion sickness, explained by the mental distraction presented by the task. In the last study, the same aggressive driving scenario was used to induce motion sickness, but participants also received spatial auditory alerts before turning; these were expected to lower motion sickness by informing participants about the turn. However, the alerts produced an increase in motion sickness, because participants felt an increased sense of motion from the auditory alerts before the turns, which aligned with previous findings. In addition, the effect of mental distraction on lowering motion sickness was also observed here, confirming findings from the previous studies.
Overall, the studies in this dissertation found a way to safely induce motion sickness without the dangers of real-world driving, identified how being occupied in a task inside the vehicle may have a mitigating effect on motion sickness, and showed that auditory alerts intended to inform passengers about upcoming motion should be designed with care.
10

"Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flicking

Jeon, Myounghoon 11 November 2010 (has links)
In a large number of electronic devices, users interact with the system by navigating through various menus. Auditory menus can complement or even replace visual menus, so research on auditory menus has recently increased for mobile devices as well as desktop computers. Despite the potential importance of auditory displays on touch screen devices, little research has attempted to enhance the effectiveness of auditory menus for those devices. In the present study, I investigated how advanced auditory cues enhance auditory menu navigation on a touch screen smartphone, especially for new input gestures such as tapping, wheeling, and flicking methods for navigating a one-dimensional menu. Moreover, I examined whether advanced auditory cues improve user experience, not only in visuals-off situations, but also in visuals-on contexts. To this end, I used a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform users of where they are in a long menu. In this study, each item in a menu was preceded by a sound based on the item's initial letter. One hundred and twenty-two undergraduates navigated through an alphabetized list of 150 song titles. The study was a split-plot design with manipulated auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual mode (on vs. off), and input gesture style (tapping, wheeling, and flicking). Target search time and subjective workload for TTS + spindex were lower than those for TTS alone across all input gesture types, regardless of visual mode. Also, on subjective rating scales, participants rated the TTS + spindex condition higher than plain TTS on being 'effective' and 'functionally helpful'. The interaction between input methods and output modes (i.e., auditory cue types) and its effects on navigation behaviors was also analyzed based on the two-stage navigation strategy model used in auditory menus.
Results were discussed in analogy with visual search theory and in terms of practical applications of spindex cues.
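The spindex mechanism described above (a brief cue keyed to each item's initial letter) can be sketched in a few lines. This is an illustrative reconstruction from the abstract, not the study's implementation; the cue keys here are just the letters themselves, standing in for whatever sounds were actually used.

```python
# Sketch of the spindex idea: each menu item is preceded by a brief cue
# derived from its initial letter, so a user flicking through a long
# alphabetized list hears roughly where they are. Illustrative only.

def spindex_cue(item):
    """Return the cue key (initial letter) played before a menu item."""
    first = item.strip()[0].upper()
    return first if first.isalpha() else "#"   # lump non-letter items together

songs = ["Across the Universe", "Blackbird", "Back in Black", "Yesterday"]
print([spindex_cue(s) for s in songs])   # ['A', 'B', 'B', 'Y']
```

Because consecutive items often share an initial letter, repeated identical cues signal "still in the same letter group," which is what makes rapid flicking through 150 titles navigable by ear.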
