
Auditory display for internet-based E-healthcare robotic system. / CUHK electronic theses & dissertations collection

A psychological experiment based on a MIDI (Musical Instrument Digital Interface) sequence auditory interface was conducted initially to examine the rationale for using acoustic information in teleoperation. The experiment was designed to test subjects' perception of obstacle location and obstacle proximity separately. The results revealed the potential of audio stimuli in teleoperation tasks, as well as several drawbacks of this interface: because it translates information into a single audio stream, it fails to exploit the spatial ability of the ear. Therefore, the information acquired from the robot's communication sensors (a microphone pair and one camera) was instead represented by means of spatial audio in an ecological way. First, a monitoring method based on the two microphones was developed to supplement the narrow view of the camera, so that a better understanding of the environment could be formed. The developed bio-mimetic algorithm, based on a new model of the Aibo's head, is able to locate a sound event with 10° resolution. Afterwards, a new strategy for vision-to-audio sensory substitution was proposed, in which the task concentrates on spatial motion perception for mobile robot operation. After a moving target is tracked in a monocular image sequence by an active contour model, its spatial position is determined by a pinhole camera model and camera calibration. Accordingly, the corresponding relations between the two modalities, e.g., spatial direction and scaled depth, were built for translation.

A scientific way of using auditory feedback as a substitute for visual feedback is proposed in the thesis to guarantee that the E-healthcare robotic system still functions under conditions of image loss, visual failure, and low-bandwidth communication links. This study is an experimental exploration of a relatively new topic in real-time robotic control.
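The abstract does not detail the bio-mimetic localization algorithm built on the Aibo head model, but the general idea behind two-microphone sound localization can be sketched: estimate the interaural time difference (ITD) between the two channels by brute-force cross-correlation over candidate lags, then convert that delay to an azimuth under a far-field assumption. The following is a minimal illustrative sketch, not the author's implementation; all function names and parameters (`fs`, `mic_dist`) are assumptions:

```python
import math

def estimate_lag(left, right):
    """Lag (in samples) of `right` relative to `left` that maximizes
    the cross-correlation -- a brute-force ITD estimate."""
    n = len(left)
    best_lag, best_corr = 0, float("-inf")
    for lag in range(-(n - 1), n):
        corr = sum(left[i] * right[i + lag]
                   for i in range(n) if 0 <= i + lag < n)
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return best_lag

def azimuth_from_itd(lag, fs, mic_dist, c=343.0):
    """Azimuth (degrees) from an ITD of `lag` samples at sampling rate `fs`,
    for microphones `mic_dist` metres apart (far-field, speed of sound c)."""
    tau = lag / fs                      # delay in seconds
    s = max(-1.0, min(1.0, c * tau / mic_dist))  # clamp for asin
    return math.degrees(math.asin(s))
```

With a 0.2 m microphone baseline and a 44.1 kHz sampling rate, one ITD sample step corresponds to roughly 2° of azimuth near the median plane, which is consistent in spirit with the 10° resolution reported above, though the thesis's actual head-model algorithm will differ.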
Conclusions and recommendations for further research on the successful and extended use of auditory display in teleoperation are also included.

Finally, an experimental e-healthcare robotic system has been developed with which high-frequency interactive contact between patients and physicians and/or family members can be realized. Specifically, a new network protocol, the Trinomial Protocol, has been implemented to facilitate data communication between client and server. Using two protocols, TCP and the Trinomial Protocol, we conducted experiments over a local network and over the trans-Pacific Internet. The experimental results for round-trip time (RTT) and sending rate showed large spikes, corresponding to severe delay jitter, when TCP was used, and much less variance in RTTs when the Trinomial Protocol was used. In sum, the Trinomial Protocol achieves better performance than TCP. With this system, we also carried out psychological experiments to compare teleoperation performance under different sensory feedback conditions. For every experiment we recorded the time taken to finish the task and the distance from the target at which the robot was commanded to stop. In addition, subjective workload assessments based on the NASA Task Load Index were collected. For task completion time, the difference between the modalities was not large; even for vision-only feedback, the average completion time was only slightly larger than for auditory feedback, and a paired t-test found no significant difference. Results on distance perception showed that the target was perceived more accurately with bimodal audiovisual integration than under the vision-only condition, but less precisely than under the auditory-only condition. As to the workload assessments, the average workload was 9.5973 for the auditory condition and 8.6147 for the visual one, with no significant difference between them.
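The paired t-test used above compares two matched samples, such as per-subject completion times under two feedback conditions. A minimal sketch of the statistic it computes, with toy numbers that are purely illustrative and not taken from the thesis:

```python
import math

def paired_t(x, y):
    """Paired t-statistic for two matched samples (df = n - 1):
    t = mean(d) / (sd(d) / sqrt(n)), where d_i = x_i - y_i."""
    assert len(x) == len(y) and len(x) > 1
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((v - mean_d) ** 2 for v in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Hypothetical per-subject completion times (seconds), two conditions:
t = paired_t([10, 12, 11, 13], [9, 11, 12, 12])  # → 1.0
```

The resulting t value is then compared against the Student-t critical value for n − 1 degrees of freedom; in practice a library routine such as `scipy.stats.ttest_rel` also returns the p-value directly.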
The experimental results demonstrate the effectiveness of our proposed auditory display approaches in navigating a robot remotely.

Liu Rong.
"September 2006."
Adviser: Max O. H. Meng.
Source: Dissertation Abstracts International, Volume: 68-03, Section: B, page: 1765.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2006.
Includes bibliographical references (p. 128-140).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012]. System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-]. System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts in English and Chinese.
School code: 1307.

Identifier: oai:union.ndltd.org:cuhk.edu.hk/oai:cuhk-dr:cuhk_343923
Date: January 2006
Contributors: Liu, Rong; Chinese University of Hong Kong Graduate School, Division of Electronic Engineering
Source Sets: The Chinese University of Hong Kong
Language: English, Chinese
Detected Language: English
Type: Text, theses
Format: electronic resource, microform, microfiche, 1 online resource (xvi, 140 p. : ill.)
Rights: Use of this resource is governed by the terms and conditions of the Creative Commons “Attribution-NonCommercial-NoDerivatives 4.0 International” License (http://creativecommons.org/licenses/by-nc-nd/4.0/)
