1.
Participatory gesture design: an investigation of user-defined gestures for conducting an informational search using a tablet device. Rakubutu, Tsele. 06 March 2014.
Multi-touch technology, used in consumer products such as the iPad, enables users to register multiple points
of contact at the same time; this enables a user to interact with a touch screen interface using several fingers
on one hand, or even both hands. This affords interface designers the opportunity to define gestural
interactions based on what is most natural for users and not on merely what can be recognised and processed
by technology. In light of this, the research question that this study aimed to address was: what is the most
intuitive user-defined gesture set for conducting an informational search on a multi-touch tablet web
browser?
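The multi-point contact tracking described above can be illustrated with a minimal sketch (not taken from the thesis; class and method names are hypothetical). Each finger is reported with its own id, so several contacts can be active at the same time, which is what makes two-finger and two-handed gestures possible:

```python
# Hypothetical sketch of the multi-point tracking a multi-touch screen
# exposes: each contact carries its own id, so several fingers can be
# registered simultaneously.

class TouchTracker:
    """Tracks simultaneously active contact points keyed by touch id."""

    def __init__(self):
        self.active = {}  # touch id -> (x, y) in normalised screen coords

    def touch_down(self, touch_id, x, y):
        self.active[touch_id] = (x, y)

    def touch_move(self, touch_id, x, y):
        if touch_id in self.active:
            self.active[touch_id] = (x, y)

    def touch_up(self, touch_id):
        self.active.pop(touch_id, None)

    def finger_count(self):
        return len(self.active)

tracker = TouchTracker()
tracker.touch_down(0, 0.2, 0.5)   # first finger makes contact
tracker.touch_down(1, 0.4, 0.5)   # second finger joins: a two-finger gesture
print(tracker.finger_count())     # → 2
tracker.touch_up(1)
print(tracker.finger_count())     # → 1
```

A gesture recogniser built on top of such a tracker would inspect `finger_count()` and the motion of each active point to distinguish, say, a one-finger tap from a two-finger spread.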
In addressing this research question, the aim of this study was to create a user-defined gesture set for
conducting an informational search on a multi-touch tablet web browser, based on gestures elicited from
participants with little or no experience with touch screen devices. It was necessary to use these participants
as users who are familiar with touch screen interfaces would draw upon the gestures they have learnt or used
before, and would therefore be biased in the gestures they proposed. Inexperienced or naïve users would
simply provide gestures that came naturally to them, providing a more accurate reflection of what a typical,
unbiased user would do. A set of hypotheses relating to the gestures that would be elicited from this
participant group was drawn up and investigated. These investigations yielded the following key findings:
• The use of two-handed gestures should be limited.
• If two-handed gestures are developed for a specific function, an alternative one-handed gesture
should be made available.
• It is not advisable to create completely novel gestures for tablet web browsing that do not
correspond to any of the ways in which desktop web browsing is performed.
• Should novel gestures be developed for tablet web browsing, gestures that are desktop computing
adaptations, including those that require menu access, should be made available as alternatives to
users.
• Tasks should be designed in such a way that they may be completed with a variety of gestures.
• Complex tasks should be designed in such a way that they may be achieved through varying
combinations of gestures.
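One way to read the findings above is as a design rule for a gesture registry: every function should accept several alternative gestures, and no function should be reachable only through a two-handed gesture. A hypothetical sketch (the gesture names are illustrative, not taken from the thesis gesture set):

```python
# Hypothetical gesture registry illustrating the findings: each function maps
# to several alternative gestures, and every two-handed gesture has a
# one-handed (or desktop-style menu) fallback.

GESTURES = {
    "zoom_in": [
        {"name": "two_hand_spread", "hands": 2},
        {"name": "one_hand_pinch_out", "hands": 1},  # one-handed alternative
        {"name": "double_tap", "hands": 1},
    ],
    "open_link": [
        {"name": "single_tap", "hands": 1},
        {"name": "menu_select", "hands": 1},  # desktop-computing adaptation
    ],
}

def has_one_handed_alternative(function):
    """A two-handed gesture should never be the only way to invoke a function."""
    return any(g["hands"] == 1 for g in GESTURES[function])

# Design-time check: every function remains usable with one hand.
assert all(has_one_handed_alternative(f) for f in GESTURES)
```

A check like this could run as a test in an application's build, enforcing the one-handed-fallback guideline whenever a new gesture mapping is added.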
These findings may assist interface designers and developers in the gestures they design or develop for
their applications. In addition to these findings, the study presents a coherent, user-defined gesture set that
may be used in practice by designers or developers.
2.
Μελέτη και αξιολόγηση τεχνολογιών πολλαπλής αφής / Study and evaluation of multi-touch technologies. Πανόπουλος, Βασίλειος. 07 June 2013.
Multi-touch technologies have attracted great research and technological interest in the field of human-computer interaction since previous decades. The cost of such systems, however, made their use prohibitive outside research laboratories. Developments in this field have been rapid, so these systems have now become affordable and particularly attractive, opening the way to their wider adoption. This diploma thesis, carried out at the laboratory of the Human-Computer Interaction Research Group of the Department of Electrical and Computer Engineering of the University of Patras under the supervision of Prof. Nikolaos Avouris, first presents a study and evaluation of touch technologies, particularly optical ones, as these permit the construction of large-scale systems. After the most suitable technology was selected on the basis of the required specifications, the construction of a working multi-touch system using the 'Diffused Illumination' technique is presented. Touch-point detection was achieved with the open-source program CommunityCoreVision (CCV). The goal of the construction is the evaluation of the system itself, as well as the investigation and evaluation of new ways of interacting with it. The evaluation was carried out through an experimental procedure that demonstrated the effectiveness of the system in object manipulation compared with a desktop computer. The experiment application was developed in the Python language with Kivy. This process led to the acquisition of practical and theoretical knowledge across a large part of the field of building complete systems. / When interacting with a regular desktop computer, indirect devices such as a mouse or keyboard are used to control the computer. Results of the interaction are displayed on a monitor.
Current operating systems are restricted to one pointing device. With the introduction of multi-touch, a new form of human computer interaction is introduced. Multi-touch combines display technology with sensors which are capable of tracking multiple points of input. The idea is that this would allow users to interact with the computer in a natural way. Furthermore, unlike interaction on a desktop computer, multi-touch allows multiple users to interact with the same devices at the same time.
Due to recent innovations, multi-touch technology has become affordable. For this project, an optical-based multi-touch device was designed and constructed at the Human Computer Interaction group of the Electrical and Computer Engineering department of the University of Patras. To perform multi-touch point tracking we used CommunityCoreVision (CCV), a free, open-source multi-touch framework. To demonstrate the possibilities of multi-touch input technology, the system was tested with existing applications that are controlled by gestures. Multi-touch systems are often stand-alone systems without external input devices attached; however, such devices can be used to support difficult tasks such as writing. In order to simplify common tasks, a gesture recognition engine was evaluated along with a multi-touch mouse driver compatible with Windows 7 that supports TUIO messages. Through an experiment built with the Python-based Kivy framework, we evaluate how multi-touch input performs on a specific object manipulation task compared to conventional mouse input.
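The TUIO messages mentioned above carry cursor updates from the tracker (CCV) to client applications. A hedged sketch of the client-side bookkeeping, with the messages reduced to plain tuples so the logic stands alone (real TUIO rides on OSC over UDP, and the function name here is hypothetical): each "set" message updates a cursor's position, and each "alive" message lists the session ids still touching, so any id missing from it corresponds to a lifted finger.

```python
# Hedged sketch of consuming TUIO-style cursor updates such as CCV emits.
# Messages are modelled as plain tuples rather than OSC packets:
#   ("set", session_id, x, y)  -> cursor position update
#   ("alive", [ids])           -> session ids still in contact

def apply_tuio_frame(cursors, messages):
    """Apply one frame of /tuio/2Dcur-style messages to the cursor table."""
    for msg in messages:
        if msg[0] == "set":
            _, sid, x, y = msg
            cursors[sid] = (x, y)          # finger placed or moved
        elif msg[0] == "alive":
            _, alive_ids = msg
            for sid in list(cursors):
                if sid not in alive_ids:
                    del cursors[sid]       # finger lifted
    return cursors

cursors = {}
apply_tuio_frame(cursors, [("set", 1, 0.3, 0.7), ("set", 2, 0.6, 0.7),
                           ("alive", [1, 2])])
print(len(cursors))   # two fingers down → 2
apply_tuio_frame(cursors, [("alive", [2])])
print(len(cursors))   # one finger lifted → 1
```

A gesture recogniser or a multi-touch mouse driver of the kind evaluated in the thesis would sit on top of exactly this kind of cursor table, turning changes in it into pointer or gesture events.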