1 |
Natural Gesture Based Interaction for Handheld Augmented Reality. Gao, Lei, January 2013 (has links)
The goal of this research thesis is to explore and evaluate a novel interaction interface for performing canonical manipulations in 3D space for Augmented Reality (AR) on handheld devices. Unlike current handheld AR applications, which usually rely on touch-screen interaction, we developed a 3D gesture-based interaction approach for handheld AR that uses an attached RGB-Depth camera to provide an intuitive interaction experience in 3D space. By identifying fingertips and mapping their 3D positions into the coordinate system of the AR virtual scene, our proposed method allows users to operate on virtual objects with their fingers in midair with six degrees of freedom (6DOF). We applied our method in two systems: (1) a client-server handheld AR system, and (2) a standalone handheld tablet AR system. To evaluate the usability of our gesture-based interface we conducted a user study comparing its performance to a 2D touch-based interface. From the results, we concluded that the traditional 2D touch-based interface performed faster than our proposed 3D gesture-based interface. However, our method showed high entertainment value, suggesting strong potential for leisure applications.
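The fingertip-to-scene mapping described in this abstract can be sketched as a standard pinhole back-projection (pixel plus depth to a 3D camera-frame point) followed by a camera-pose transform into the scene frame. The intrinsics and the identity pose below are illustrative assumptions, not the thesis's actual calibration or pipeline.

```python
import numpy as np

def fingertip_to_scene(u, v, depth_m, fx, fy, cx, cy, cam_to_scene):
    """Back-project a fingertip pixel (u, v) with depth (metres) into
    camera coordinates, then transform into the AR scene frame."""
    # Pinhole back-projection: pixel + depth -> 3D point in the camera frame.
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    p_cam = np.array([x, y, depth_m, 1.0])  # homogeneous coordinates
    # 4x4 camera-to-scene pose, e.g. obtained from an AR marker tracker.
    return (cam_to_scene @ p_cam)[:3]

# Illustrative intrinsics (assumed values) and an identity camera pose.
fx = fy = 525.0
cx, cy = 319.5, 239.5
pose = np.eye(4)
point = fingertip_to_scene(320, 240, 0.5, fx, fy, cx, cy, pose)
```

With an identity pose, the scene point equals the camera-frame point; in a real system the pose would come from the AR tracking component each frame.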
|
2 |
A manual alphabet for touchless gesture-controlled writing input with a myoelectric device: Design, evaluation and user experience. Bieber Bardt, Raphaela, January 2015 (has links)
The research community around gesture-based interaction has so far paid little attention to the possibility of replacing the keyboard with natural gestures for writing purposes, and insight into the actual user experience of such an interaction style remains insufficient. This work presents a novel approach for text input based on a manual alphabet, MATImyo. The hand alphabet was developed in a user-centered design process involving potential users in pre-studies, the design process, and the evaluation procedure. In a Wizard-of-Oz style experiment with accompanying interviews, the alphabet's quality as an input language for composing electronic texts was evaluated and the user experience of such an interaction style assessed. MATImyo was found to be very suitable as a gestural input language, with a positive user experience. The whole process of designing MATImyo and evaluating its suitability and user experience was based on the principles of Embodied Interaction, which was chosen as the theoretical framework. This work contributes to understanding the bigger picture of the user experience of gesture-based interaction and presents a novel, more natural text input method.
|
3 |
Increasing the expressive power of gesture-based interaction on mobile devices / Augmenter le pouvoir d'expression de l'interaction gestuelle sur les appareils mobiles. Alvina, Jessalyn, 13 December 2017 (has links)
/ Current mobile interfaces let users directly manipulate the objects displayed on the screen with simple stroke gestures, e.g. tapping soft buttons or menus or pinching to zoom. To access a larger command space, users are often forced to go through long sequences of steps, making the interaction cumbersome and inefficient. More complex gestures offer a powerful way to access information quickly and to perform commands more efficiently [5]. However, they are more difficult to learn and control. Gesture typing [78] is an interesting alternative for text input: it lets users draw a gesture on a soft keyboard to enter text, passing from the first to the last letter of a word. In this thesis, I increase the expressive power of mobile interaction by leveraging the gesture's shape and dynamics and the screen space to produce rich output, to invoke commands, and to facilitate appropriation in different contexts of use. I design "Expressive Keyboard", which transforms gesture variations into rich output, and demonstrate several applications in the context of text-based communication. I also propose "CommandBoard", a gesture keyboard that lets users efficiently select commands from a large command space while supporting the transition from novice to expert. I demonstrate different applications of "CommandBoard", each of which offers users a choice based on their cognitive and motor skills, as well as on the size and organization of the current command set. Altogether, these techniques give users more expressive power by leveraging their motor control and their ability to learn, control, and appropriate.
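Gesture typing as described here is commonly decoded by comparing the drawn path against an "ideal" path through each candidate word's key centres, a simplified version of the shape channel in SHARK2-style decoders. The sketch below illustrates that idea only; the keyboard layout, vocabulary, and distance measure are illustrative assumptions, not this thesis's implementation.

```python
import numpy as np

# Approximate QWERTY key centres on a unit grid (illustrative layout).
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEYS = {ch: (x + 0.5 * r, r) for r, row in enumerate(ROWS)
        for x, ch in enumerate(row)}

def resample(points, n=32):
    """Resample a polyline to n points evenly spaced by arc length."""
    pts = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    if cum[-1] == 0:
        return np.repeat(pts[:1], n, axis=0)
    t = np.linspace(0, cum[-1], n)
    return np.column_stack([np.interp(t, cum, pts[:, i]) for i in range(2)])

def ideal_path(word):
    """Polyline through the key centres of the word's letters."""
    return [KEYS[ch] for ch in word]

def decode(drawn, vocabulary):
    """Return the vocabulary word whose ideal path is closest in shape."""
    g = resample(drawn)
    def dist(word):
        return np.mean(np.linalg.norm(g - resample(ideal_path(word)), axis=1))
    return min(vocabulary, key=dist)

# A stroke passing exactly through the keys h-e-l-l-o should match "hello".
stroke = ideal_path("hello")
best = decode(stroke, ["hello", "help", "gallon", "hula"])
```

Real decoders additionally weight a location channel and a language model; this sketch keeps only the shape comparison to show the core matching step.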
|
4 |
Gesture Mapping for Interaction Design: An Investigative Process for Developing Interactive Gesture Libraries. Kuhlman, Lane M., 03 September 2009 (has links)
No description available.
|
5 |
Gesture-based interaction for Centralized Traffic Control / Gestbaserad interaktion för centraliserad fjärrstyrning av tågtrafik. Milivojevic, Mladen, January 2016 (has links)
Ever wondered how trains arrive and depart on time? Traffic control systems help manage the traffic flow, with many operators monitoring it and taking action when necessary. Operators of these systems currently interact using a keyboard and a mouse. The current user interface and work setup have many usability issues that could be addressed to increase operators' efficiency and productivity in controlling the traffic. Interviews with users of the system and research on related topics led to a newly proposed interaction and user-interface design, along with suggestions for increasing productivity. Gesture-based interaction is introduced and simulated for the traffic control system with the aim of improving its operation. Various gestures, such as panning, zooming, and hovering over the map, are designed using the Leap Motion controller, which enables intuitive interaction. These gestures aim to solve the usability issues identified during the interviews with users. The project aims to answer the following question: Can gesture-based interaction solve usability issues and establish intuitive use of the CTC system? The exploratory research on this topic involved designing, implementing, and testing hand gestures with users. From an ergonomic perspective, the operator's body posture and hand position were examined, and sit-to-stand workstations are suggested to reduce pain and discomfort while working. Gesture-based interaction eliminates searching for the mouse cursor on large screens, enables fast requests for detailed information, and provides a better overview of the map surroundings. Laboratory tests confirm that gesture-based interaction brings a more natural and intuitive use of traffic control systems. There is great potential for gesture-based interaction to increase usability and bring efficient control for operators, which would reduce train delays and help maintain a safe traffic flow.
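The pan and zoom gestures described above amount to mapping tracked hand data onto map transforms each frame. The following is a minimal sketch of that mapping under stated assumptions: the `HandFrame` format and the gain values are generic stand-ins, not the actual Leap Motion API or the thesis's tuned parameters.

```python
from dataclasses import dataclass

@dataclass
class HandFrame:
    """One frame of tracked-hand data (generic stand-in for a tracker SDK)."""
    palm_xy: tuple        # palm position in normalized screen units
    pinch_distance: float # thumb-index distance in millimetres

class MapController:
    """Maps per-frame hand motion onto pan and zoom of a traffic map."""
    def __init__(self):
        self.center = [0.0, 0.0]  # map centre in map units
        self.zoom = 1.0
        self._prev = None

    def update(self, frame, pan_gain=2.0, zoom_gain=0.01):
        if self._prev is not None:
            # Pan: shift the map centre opposite to the palm's displacement,
            # scaled down at higher zoom for finer control.
            dx = frame.palm_xy[0] - self._prev.palm_xy[0]
            dy = frame.palm_xy[1] - self._prev.palm_xy[1]
            self.center[0] -= dx * pan_gain / self.zoom
            self.center[1] -= dy * pan_gain / self.zoom
            # Zoom: widening the pinch zooms in, narrowing zooms out.
            dz = frame.pinch_distance - self._prev.pinch_distance
            self.zoom = max(0.1, self.zoom * (1.0 + dz * zoom_gain))
        self._prev = frame

ctl = MapController()
ctl.update(HandFrame((0.5, 0.5), 60.0))
ctl.update(HandFrame((0.6, 0.5), 80.0))  # hand moved right, pinch widened
```

A real implementation would also gate these updates on a hover or grab gesture so that incidental hand movement does not disturb the map.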
|
6 |
Artefatos e linguagens de interação com sistemas digitais contemporâneos = os anéis interativos ajustáveis para a televisão digital interativa / Artifacts and languages of interaction with contemporary digital systems: the adjustable interactive rings for interactive digital television. Miranda, Leonardo Cunha de, 16 August 2018 (has links)
Advisor: Maria Cecília Calani Baranauskas / Doctoral thesis - Universidade Estadual de Campinas, Instituto de Computação / Abstract: The digitalization of terrestrial television broadcasting in Brazil, and consequently the possibility of offering interactivity on television, establishes a new paradigm of viewer interaction with this medium, one with great potential for social impact, especially for the Brazilian population. However, the existence of digital artifacts commonly used to interact with the current television system does not guarantee that those devices are adequate for the developments proposed with Interactive Digital Television (iDTV). Moreover, the coexistence of an increasing number of devices that make use of a remote control leads to more complex interfaces, considering the problems with remote controls already discussed in the literature by several authors. The objective of this Ph.D. research was therefore to investigate interaction design for this new medium, with the purpose of proposing, developing, and validating new ways for users to interact with iDTV. The results are grounded in the understanding that a more direct interaction with iDTV requires the focus of the interaction to be on the interface of the interactive applications rather than on the physical artifact of interaction itself. The technology resulting from this research progressed from the idea stage through participatory conceptual design to implementation and validation with representatives of the target audience. Some contributions of this research in the context of physical artifacts of interaction with iDTV can be highlighted: i) a taxonomy for physical artifacts of interaction; ii) usage recommendations for physical artifacts of interaction known in the literature; iii) a socio-technical analysis of the domain/context of new physical artifacts of interaction; iv) directives for new physical artifacts of interaction; v) a gesture-based interaction model via a physical artifact of interaction; vi) design guidelines for new physical artifacts of interaction; vii) product design and interaction language specifications for a new digital artifact for iDTV; viii) hardware and software prototype implementations of the new digital artifact for iDTV; and ix) validation of the specifications and prototypes of the new digital artifact for iDTV with the target audience. / Doctorate / Human-Computer Interaction / Doctor of Computer Science
|
7 |
gestUI: a model-driven method for including gesture-based interaction in user interfaces. Parra González, Luis Otto, 13 October 2017 (has links)
The research reported and discussed in this thesis represents a novel approach to defining custom gestures and including gesture-based interaction in the user interfaces of software systems, with the aim of helping to solve the problems identified in the related literature on the development of gesture-based user interfaces.
The research is conducted according to the Design Science methodology, which is based on the design and investigation of artefacts in a context. In this thesis, the new artefact is the model-driven method for including gesture-based interaction in user interfaces. The methodology comprises two kinds of cycle: the main cycle is an engineering cycle, in which we design a model-driven method to include gesture-based interaction. We then define two research cycles: the first corresponds to the validation of the proposed method through an empirical evaluation, and the second to a technical action research study validating the method in an industrial context.
Additionally, Design Science gives us clues on how to conduct the research, be rigorous, and put scientific rules into practice. Design Science has also been key to organising our research, and we acknowledge the value of this framework, since it has helped us report our findings clearly.
The thesis presents a theoretical framework introducing concepts related to the research performed, followed by a state of the art reviewing related work in three areas: Human-Computer Interaction, the model-driven paradigm in Human-Computer Interaction, and Empirical Software Engineering.
The design and implementation of gestUI are presented following the model-driven paradigm and the Model-View-Controller design pattern. We then performed two evaluations of gestUI: (i) an empirical evaluation based on ISO/IEC 25062:2006 to evaluate usability in terms of effectiveness, efficiency, and satisfaction, where satisfaction is measured through perceived ease of use, perceived usefulness, and intention to use; and (ii) a technical action research study to evaluate user experience and usability. We used the Model Evaluation Method, the User Experience Questionnaire, and Microsoft Reaction Cards as guides for these evaluations.
The contributions of our thesis and the limitations of the approach and its tool support are discussed, and further work is presented. / Parra González, LO. (2017). gestUI: a model-driven method for including gesture-based interaction in user interfaces [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/89090
|