1

Improving web usability for the visually impaired

Kullman, Christoffer January 2009
The Web has opened up many possibilities for disabled people to interact with society, but there is unfortunately a lack of parity between the user interfaces presented to different users. This dissertation presents a proof of concept for designing a spatial layout presentation for blind users of a screen reader. This is done in three steps: first, a survey to determine the current practices of web developers; then, the implementation of an instant spatial feedback and comparison function that presents the spatial layout; and finally, an evaluation of the spatial layout presentation by way of user testing. The survey yielded a set of guidelines, based on the participants' answers, for the realistic development of web technologies for disabled persons. From the implementation, a concept for portable and expandable spatial feedback functions is presented. The evaluation shows that the created spatial presentation method passes both objectively and subjectively.
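The sketch below (not from the dissertation) illustrates one way such instant spatial feedback could be computed: element bounding boxes taken from the rendered page are mapped to cells of a coarse grid that a screen reader could announce. The element names, page size, and 3x3 grid granularity are illustrative assumptions.

```python
# Hypothetical sketch: announce the coarse spatial position of page elements,
# assuming bounding boxes (x, y, width, height) have already been extracted
# from the rendered page. Grid granularity and labels are illustrative.

COLUMNS = ["left", "center", "right"]
ROWS = ["top", "middle", "bottom"]

def grid_position(box, page_width, page_height):
    """Map an element's bounding box to a cell on a 3x3 grid."""
    x, y, w, h = box
    cx, cy = x + w / 2, y + h / 2            # element midpoint
    col = min(int(cx / page_width * 3), 2)   # clamp to the last column/row
    row = min(int(cy / page_height * 3), 2)
    return f"{ROWS[row]} {COLUMNS[col]}"

def describe_layout(elements, page_width, page_height):
    """Produce short spatial descriptions a screen reader could speak."""
    return [f"{name}: {grid_position(box, page_width, page_height)}"
            for name, box in elements]

# Example: a navigation bar, main article, and sidebar on a 1024x768 page.
layout = [("navigation", (0, 0, 1024, 80)),
          ("main article", (0, 100, 700, 600)),
          ("sidebar", (720, 100, 300, 600))]
for line in describe_layout(layout, 1024, 768):
    print(line)
```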
2

Sound as a Project Tool in Design

Marcelo Pedruzzi Ferranti 12 November 2018
This research proposes a set of methods and principles for exploring sound in the design process. Issues related to sound seem to be neglected during design projects: how could a designer, even without formal sound or musical training, use sound in a project in a constructive, intentional and objective way? To answer this question, we investigated the impact of the changes that soundscapes have undergone in recent decades, caused by the industrialization and mechanization of society, which made it possible to dissociate a sound from its source and thus to manipulate, examine and dissect it. The impact of these sounds and their perception by the individual are also discussed, drawing on knowledge from acoustics, psychoacoustics and attentional mechanisms. We also mapped the state of the art, evolution and application of these sounds in different fields of knowledge, such as human-computer interaction, sonic interaction design, cinema and musicology, in order to obtain a theoretical framing of sound that is relevant to the design field. As a result of this exploration, we propose a set of guiding principles and exploratory methods for prototyping sound in design. Finally, future applications and developments of these principles and methods are suggested.
3

Data Density and Trend Reversals in Auditory Graphs: Effects on Point Estimation and Trend Identification Tasks

Nees, Michael A. 28 February 2007
Auditory graphs (displays that represent graphical, quantitative information with sound) have the potential to make graphical representations of data more accessible to blind students and researchers as well as sighted people. No research to date, however, has systematically addressed the attributes of data that contribute to the complexity (the ease or difficulty of comprehension) of auditory graphs. A pair of studies examined the role of both data density (i.e., the number of discrete data points presented per second) and the number of trend reversals for both point estimation and trend identification tasks with auditory graphs. For the point estimation task, results showed main effects of both variables, with a larger effect attributable to performance decrements for graphs with more trend reversals. For the trend identification task, a large main effect was again observed for trend reversals, but an interaction suggested that the effect of the number of trend reversals was different across lower data densities (i.e., as density increased from 1 to 2 data points per second). Results are discussed in terms of data sonification applications and rhythmic theories of auditory pattern perception.
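As a hedged illustration of the two manipulated variables, the sketch below maps a data series to pitch for a simple auditory graph, with data density controlling how many points sound per second, and counts the series' trend reversals. The frequency range and linear pitch mapping are assumptions, not the stimuli used in the study.

```python
# Hypothetical sketch: render a data series as an auditory graph by mapping
# values to pitch, with data density controlling how many points sound per
# second. Frequency range and mapping are assumptions, not the study's stimuli.

def to_frequency(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map a data value to a frequency in Hz."""
    span = (value - vmin) / (vmax - vmin) if vmax != vmin else 0.0
    return fmin + span * (fmax - fmin)

def auditory_graph(series, density):
    """Return (onset_seconds, frequency_hz) pairs; density = points per second."""
    vmin, vmax = min(series), max(series)
    return [(i / density, to_frequency(v, vmin, vmax))
            for i, v in enumerate(series)]

def count_trend_reversals(series):
    """Count points where the series switches between rising and falling."""
    diffs = [b - a for a, b in zip(series, series[1:]) if b != a]
    return sum(1 for a, b in zip(diffs, diffs[1:]) if (a > 0) != (b > 0))

data = [3, 5, 8, 6, 4, 7, 9, 2]
print(count_trend_reversals(data))      # 3 trend reversals
print(auditory_graph(data, density=2))  # 2 data points per second
```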
4

The Effect of Directional Auditory Cues on Driver Performance in a Simulated Truck Cab Environment

Powell, Jared Alan 09 January 2000
A human factors experiment was conducted to investigate the potential benefits of using directional auditory cues in intelligent transportation system technologies in commercial vehicles. Twelve licensed commercial vehicle operators drove a commercial truck-driving simulator and were prompted to select highway numbers on a side-task display. Prompts were presented visually or aurally. Auditory prompts were presented either diotically (both ears simultaneously) or directionally (to either the left or right ear). The search task varied in map density and timing of the prompts in relation to speed limit changes. All experimental conditions were compared to a control condition containing no secondary task. Both driving performance (lane deviation, steering wheel angle, road heading angle error, accidents, and adherence to the speed limit) and secondary task performance (accuracy and response time) measures were collected. Results showed that drivers were able to respond more quickly and accurately to the search task when directional auditory cues were used. Results also showed that driving performance degrades when display density increases and that the use of directional auditory prompts lessens this deterioration of performance for high-density conditions. / Master of Science
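The sketch below is a minimal illustration (not the study's implementation) of the prompt presentations described: a mono cue routed to both ears (diotic) or to the left or right ear only (directional). The sine-tone placeholder and sample rate are assumptions.

```python
# Hypothetical sketch: route a mono auditory prompt to both ears (diotic) or
# to one ear (directional) by building a two-channel buffer. The tone and
# sample rate are illustrative placeholders, not the study's actual stimuli.
import numpy as np

SAMPLE_RATE = 44100

def make_prompt(duration=0.3, freq=1000.0):
    """A plain sine-tone placeholder for the spoken highway-number prompt."""
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    return 0.5 * np.sin(2 * np.pi * freq * t)

def present(mono, mode):
    """Return a (samples, 2) stereo buffer for 'diotic', 'left', or 'right'."""
    stereo = np.zeros((mono.size, 2))
    if mode in ("diotic", "left"):
        stereo[:, 0] = mono          # left channel
    if mode in ("diotic", "right"):
        stereo[:, 1] = mono          # right channel
    return stereo

cue = make_prompt()
left_cue = present(cue, "left")      # directional: left ear only
both_ears = present(cue, "diotic")   # both ears simultaneously
```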
5

Contextualizing Accessibility : Interaction for Blind Computer Users

Winberg, Fredrik January 2008
Computer usage today is predominantly based on graphical interaction, where the visual presentation of information is essential both for input (hand-eye coordination when using a computer mouse) and output (seeing the information on a computer screen). This can create difficulties for blind computer users, both at an individual level when interacting with a computer and when collaborating with other computer users. The work presented in this thesis has investigated interaction for blind computer users in three stages: first, investigating access to information through studies of an interactive audio-only game, drawing conclusions about auditory direct manipulation and auditory interface design; second, studying collaboration between blind and sighted computer users in two different contexts, leading to the questioning of commonly expressed design principles regarding access to collaboration; and finally, studying accessibility in a working environment, examining how technology, the assistive device used by the blind person, communication with others, and professional knowledge interplay to create an accessible work environment. Based on these empirical studies, the main conclusion of this work is the proposal of a research perspective, assistive interfaces as cooperative interfaces. Here, the context in which the interface will be used is in focus, and the cooperative and social dimensions of interaction are acknowledged and highlighted. The design and analysis of assistive devices should be highly sensitive to the socio-interactional environment, not focused solely on the single individual using an assistive device.
6

Internal representations of auditory frequency: behavioral studies of format and malleability by instructions

Nees, Michael A. 16 November 2009
Research has suggested that representational and perceptual systems draw upon some of the same processing structures, and evidence also has accumulated to suggest that representational formats are malleable by instructions. Very little research, however, has considered how nonspeech sounds are internally represented, and the use of audio in systems will often proceed under the assumption that separation of information by modality is sufficient for eliminating information processing conflicts. Three studies examined the representation of nonspeech sounds in working memory. In Experiment 1, a mental scanning paradigm suggested that nonspeech sounds can be flexibly represented in working memory, but also that a universal per-item scanning cost persisted across encoding strategies. Experiment 2 modified the sentence-picture verification task to include nonspeech sounds (i.e., a sound-sentence-picture verification task) and found evidence generally supporting three distinct formats of representation as well as a lingering effect of auditory stimuli for verification times across representational formats. Experiment 3 manipulated three formats of internal representation (verbal, visuospatial imagery, and auditory imagery) for a point estimation sonification task in the presence of three types of interference tasks (verbal, visuospatial, and auditory) in an effort to induce selective processing code (i.e., domain-specific working memory) interference. Results showed no selective interference but instead suggested a general performance decline (i.e., a general representational resource) for the sonification task in the presence of an interference task, regardless of the sonification encoding strategy or the qualitative interference task demands. Results suggested a distinct role of internal representations for nonspeech sounds with respect to cognitive theory. The predictions of the processing codes dimension of the multiple resources construct were not confirmed; possible explanations are explored. The practical implications for the use of nonspeech sounds in applications include a possible response time advantage when an external stimulus and the format of internal representation match.
7

An Investigation of Auditory Icons and Brake Response Times in a Commercial Truck-Cab Environment

Winters, John 11 June 1998
In the driving task, vision, hearing, and the haptic senses are all used by the driver to gather required information. Future Intelligent Transportation Systems components are likely to further increase the volume of information available to or required by the driver, particularly in the case of commercial vehicle operators. The use of alternate modalities to present in-vehicle information is a possible solution to the potential overload of the visual channel. Auditory icons have been shown to improve operator performance and decrease learning and response times, not only in industrial applications, but also as emergency braking warnings. The use of auditory icons in commercial truck cabs has the potential to increase the number of auditory displays that can be distinguished and understood by commercial vehicle operators, and this experiment sought to determine the utility of auditory icons in that situation. Nine auditory icons were evaluated by commercial vehicle operators as they drove an experimental vehicle over public roads. A comparison of the data collected in the truck-cab environment to data collected in a laboratory study on the same auditory icons revealed some differences in the perceived meaning, perceived urgency, and association with the auditory icons' intended meanings between the two conditions. The presence of these differences indicates that driver evaluations of auditory icons can be affected by the environment, and testing should therefore be conducted in a situation that approximates the end-user environment as closely as possible. A comparison of the drivers' brake response times across the three warning conditions (no warning, auditory icon, and soft braking) was also conducted on a closed, secure handling course. Dependent measures included overall brake reaction time and its components, steering response time, time to initial driver action, and categorical measures of driver responses (steering, swerving, braking, and stopping). The results indicated numerically shorter mean response times (on the order of 0.5 seconds for Total Brake Response Time) for the two conditions with warnings, but the differences were not statistically significant. The most likely reason for this lack of significance is the extreme between-subject variability in response times in the no warning condition. An analysis of the response time variance across the three conditions did indicate significantly less variability in operator responses in the two warning conditions. Two of the five dependent measures (Brake Pedal Contact Time and Total Brake Response Time) exhibited significantly reduced variance in the auditory icon warning condition compared to the no warning condition. The soft braking warning condition exhibited significantly reduced variance for four of the dependent measures (Accelerator Reaction Time, Brake Pedal Contact Time, Total Brake Response Time, and First Reaction Time). These results indicate that a soft braking stimulus like that used in this study could potentially prove to be a more effective emergency braking warning than simple auditory warnings alone. / Master of Science
8

"Spindex" (speech index) enhances menu navigation user experience of touch screen devices in various input gestures: tapping, wheeling, and flicking

Jeon, Myounghoon 11 November 2010
In a large number of electronic devices, users interact with the system by navigating through various menus. Auditory menus can complement or even replace visual menus, so research on auditory menus has recently increased for mobile devices as well as desktop computers. Despite the potential importance of auditory displays on touch screen devices, little research has attempted to enhance the effectiveness of auditory menus for those devices. In the present study, I investigated how advanced auditory cues enhance auditory menu navigation on a touch screen smartphone, especially for new input gestures such as tapping, wheeling, and flicking methods for navigating a one-dimensional menu. Moreover, I examined whether advanced auditory cues improve user experience, not only for visuals-off situations, but also for visuals-on contexts. To this end, I used a novel auditory menu enhancement called a "spindex" (i.e., speech index), in which brief audio cues inform the users of where they are in a long menu. In this study, each item in a menu was preceded by a sound based on the item's initial letter. One hundred and twenty-two undergraduates navigated through an alphabetized list of 150 song titles. The study was a split-plot design with manipulated auditory cue type (text-to-speech (TTS) alone vs. TTS plus spindex), visual mode (on vs. off), and input gesture style (tapping, wheeling, and flicking). Target search time and subjective workload for the TTS + spindex condition were lower than those of the TTS-alone condition for all input gesture types, regardless of visual mode. Also, on subjective rating scales, participants rated the TTS + spindex condition higher than plain TTS on being 'effective' and 'functionally helpful'. The interaction between input methods and output modes (i.e., auditory cue types) and its effects on navigation behaviors was also analyzed based on the two-stage navigation strategy model used in auditory menus. Results are discussed in analogy with visual search theory and in terms of practical applications of spindex cues.
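A minimal sketch of the spindex idea as described above: each menu item is preceded by a brief cue keyed to its initial letter before the full TTS is spoken. The cue lookup and playback placeholders are illustrative assumptions, not the study's implementation.

```python
# Hypothetical sketch of a spindex-style cue: before speaking an item with
# TTS, play a brief sound keyed to the item's initial letter so the listener
# can track position in a long alphabetical list. The cue keying and playback
# calls are placeholders, not the implementation used in the study.

def spindex_cue(item):
    """Return the key used to pick the brief pre-speech cue for an item."""
    first = item.strip()[:1].upper()
    return first if first.isalpha() else "#"   # lump digits/symbols together

def navigate(song_titles, speak, play_cue):
    """Announce each item as 'cue, then TTS', as in the TTS + spindex condition."""
    for title in sorted(song_titles, key=str.lower):
        play_cue(spindex_cue(title))   # brief audio cue ("A", "B", ...)
        speak(title)                   # full text-to-speech of the item

# Example with console placeholders standing in for audio output.
navigate(["Yesterday", "abc", "Brown Eyed Girl"],
         speak=lambda text: print(f"TTS: {text}"),
         play_cue=lambda key: print(f"[cue {key}]"))
```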
9

Auditory displays : A study in effectiveness between binaural and stereo audio to support interface navigation

Bergqvist, Emil January 2014
This thesis analyses whether a change of auditory feedback can improve the effectiveness of interaction with a non-visual system, or with a system used by individuals with visual impairment. Two prototypes were developed, one with binaural audio and the other with stereo audio. The interaction was evaluated in an experiment in which 22 participants, divided into two groups, performed a number of interaction tasks. A post-interview was conducted together with the experiment. The results of the experiment showed no great difference between binaural and stereo audio regarding the speed and accuracy of the interaction. The post-interviews, however, revealed interesting differences in the way participants visualized the virtual environment, which affected the interaction. This opens up interesting questions for future studies.
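As a rough illustration of the contrast being tested, the sketch below renders a mono cue with plain stereo panning (level difference only) and with a crude binaural-style approximation that adds an interaural time difference. Real binaural rendering uses head-related transfer functions; the constants here are assumptions, not the prototypes' implementation.

```python
# Hypothetical sketch contrasting the two renderings compared in the study:
# plain stereo panning (level difference only) versus a crude binaural-style
# cue that also adds an interaural time difference. Real binaural rendering
# uses HRTFs; the constants here are rough illustrative approximations.
import numpy as np

SAMPLE_RATE = 44100
HEAD_DELAY_MAX = 0.0007  # ~0.7 ms maximum interaural time difference

def stereo_pan(mono, azimuth):
    """Constant-power pan; azimuth in [-1 (left), +1 (right)]."""
    angle = (azimuth + 1) * np.pi / 4
    return np.column_stack([np.cos(angle) * mono, np.sin(angle) * mono])

def crude_binaural(mono, azimuth):
    """Stereo pan plus an interaural delay applied to the far ear."""
    out = stereo_pan(mono, azimuth)
    delay = int(abs(azimuth) * HEAD_DELAY_MAX * SAMPLE_RATE)
    far_ear = 0 if azimuth > 0 else 1          # ear away from the source
    out[:, far_ear] = np.roll(out[:, far_ear], delay)
    if delay:
        out[:delay, far_ear] = 0.0             # silence the wrapped samples
    return out

t = np.arange(SAMPLE_RATE // 2) / SAMPLE_RATE
beep = 0.4 * np.sin(2 * np.pi * 660 * t)
right_stereo = stereo_pan(beep, 0.8)       # level difference only
right_binaural = crude_binaural(beep, 0.8) # level + time difference
```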
