
Gaze control for detail and overview in image exploration

Eye tracking technology has made it possible to accurately and consistently track a user's gaze position on a screen. The human eye's center of focus, where it perceives the most detailed information, covers only a small region at any given moment; peripheral vision has a much lower level of detail than the center of gaze. Knowing this, it is possible to display a view that increases the resolution at the position of the user's gaze point on the screen, while the rest of the screen keeps a lower resolution. An implementation of such a system can generate a representation of data with both detail and overview. The results indicate that even with simple gaze data processing it is possible to use gaze control to help explore details of a high-resolution image. Gaze data processing often involves a compromise between stability, responsiveness, and latency. A low-latency, highly responsive gaze data filter would increase the risk of lens oscillation and demand a higher level of concentration from the viewer than a slower filter would. Applying a gaze data filter that allowed smooth and stable lens movement for small saccades and responsive movement for large saccades proved successful. With gaze control, a user might be able to operate a gaze-aware application more efficiently, since gaze precedes action. Gaze control would also reduce the need for hand motions, which could provide an improved work environment for people interacting with computers.
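The abstract does not specify the exact form of the filter; one common way to get stable lens movement for small saccades and responsive movement for large ones is a distance-thresholded exponential smoother. The sketch below is a hypothetical illustration of that idea, not the thesis's implementation; the threshold and smoothing factor are made-up example values.

```python
# Hypothetical saccade-aware gaze filter (not from the thesis):
# heavy smoothing for small movements (fixation jitter), immediate
# jumps for large movements (saccades).

SACCADE_THRESHOLD = 80.0  # pixels; larger jumps are treated as saccades (illustrative)
ALPHA_SMOOTH = 0.1        # low alpha -> stable lens during fixations (illustrative)

def make_gaze_filter(threshold=SACCADE_THRESHOLD, alpha=ALPHA_SMOOTH):
    """Return a stateful filter mapping raw gaze samples to lens positions."""
    state = {"pos": None}

    def step(x, y):
        if state["pos"] is None:
            state["pos"] = (x, y)
            return state["pos"]
        px, py = state["pos"]
        dist = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
        if dist > threshold:
            # Large saccade: move the lens immediately (responsiveness).
            state["pos"] = (x, y)
        else:
            # Small movement: exponential smoothing toward the new sample
            # (stability, reduced lens oscillation).
            state["pos"] = (px + alpha * (x - px), py + alpha * (y - py))
        return state["pos"]

    return step
```

A filter like this trades a little latency during fixations for a stable lens, while still letting the lens jump without delay when the gaze moves far, which matches the compromise between stability, responsiveness, and latency described above.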

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-121686
Date January 2015
Creators Rauhala, Sebastian
Publisher Linköpings universitet, Medie- och Informationsteknik, Linköpings universitet, Tekniska högskolan
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess
