  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
271

Green gamification : changing habits through long-term engagement and stories

Kronisch, Devan C. 23 August 2019 (has links)
Gamification offers methods for influencing human behaviour that are not available with other approaches to behaviour-change interventions. Its widespread and successful use in business, education, and health care notwithstanding, it has so far not been extensively used for improving sustainability, nor have its underlying psychological principles been studied in depth. This dissertation investigates gamification with a special focus on the role of perspective taking and emotion. A gamified behaviour-change app was compared with a standard app and a webpage for its effect on participants’ sustainable behaviours. During the one-month period in which participants engaged with the intervention, they kept diaries about their experiences with sustainability and the technology. Furthermore, the influence of dramatic elements was tested through an augmented reality approach. Gamification encourages longer engagement with the intervention, thereby influencing behaviour. Specifically, gamification increases knowledge about and willingness to invest effort into sustainable behaviours. Dramatic elements, using the power of narrative persuasion and immersion, are important aspects to consider in gamification. The theory of behavioural choice can fruitfully serve as a psychological model of how gamification affects behaviour. / Graduate
272

AUGMENTING THE REALITY : Can AR Technology Entice Consumer Engagement? A Quantitative Study

Hellgren, André, von Pongracz, Simon January 2019 (has links)
Today, advances in the technological sector spur invention to new heights. What can be achieved today was science fiction just decades ago. In recent years, augmented reality has emerged, explained by Kipper & Rampolla (2012, p. 1) as a technology that combines the real world with virtual objects, creating a supplement to reality. Its ability to strengthen impressions of reality by weaving the physical and digital worlds together allows it to be used in a variety of settings. The retail industry has been struggling of late, with e-commerce flourishing on one hand while classic brick-and-mortar stores close by the thousands on the other. A technology able to combine these two channels could therefore act as a mitigating force, enabling customers to virtually try on clothes or make furniture appear digitally in their living rooms. The technology opens up numerous possibilities, given that it can be used across industries, with marketing and gaming among the most prominent examples. What is evident is its ability to interact and engage, making it a usable tool for many activities. Through this thesis we therefore study whether augmented reality can affect consumer engagement and, if so, which of its attributes have significant positive relationships with the dimensions of consumer engagement. We first provide a framework for measuring augmented reality quantitatively in general settings through a set of attributes: Interactivity, Playfulness (Escapism & Enjoyment), Service Excellence, Aesthetics, Ease of Use, and Perceived Usefulness. We then hypothesize the attributes' relationships with two dimensions of consumer engagement identified by Hollebeek et al. (2014): Affection and Cognitive Processing. Ease of Use and Service Excellence, however, were not tested in this thesis as a result of unsatisfactory loadings in the factor analysis. Through an online survey, 79 usable responses were collected and used to test the hypotheses. Significant positive relationships were found between all tested attributes and Affection, and further significant positive relationships were found between Aesthetics and Cognitive Processing and between Perceived Usefulness and Cognitive Processing. We believe this thesis further develops and solidifies current quantitative work on consumer engagement by validating the use of a known framework. Further, it adds to the literature by adopting a general definition of the concept of consumer engagement. It also adds to quantitative work on augmented reality by creating and using a framework for studying the attributes of augmented reality in a general setting, which has not been done previously. For practitioners, the thesis provides insight into which attributes of augmented reality systems should be emphasized in order to maximize consumer engagement. The thesis ends with suggestions for future research, calling for further testing of consumer engagement across different contexts using Hollebeek et al.'s (2014) framework. Such work could lead to a universally accepted quantitative scale for measuring consumer engagement. Lastly, adopting the augmented reality framework presented in this thesis and applying it to further contexts could yield valuable results, and further tests of Ease of Use and Service Excellence to validate their importance for consumer engagement would be of great interest.
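The screening step mentioned above, in which attributes with unsatisfactory factor loadings are dropped before hypothesis testing, can be illustrated with a small sketch. The item names, the synthetic data, and the 0.5 cut-off below are assumptions for illustration only, not the thesis' actual analysis.

```python
# A sketch of screening survey attributes by factor loadings before hypothesis
# testing, loosely in the spirit of the procedure described above.
# Item names, the data, and the 0.5 cut-off are illustrative assumptions.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = ["interactivity", "playfulness", "aesthetics", "ease_of_use", "usefulness"]
responses = rng.integers(1, 8, size=(79, len(items))).astype(float)  # 79 Likert answers

fa = FactorAnalysis(n_components=2, random_state=0).fit(responses)
loadings = fa.components_.T          # rows: items, columns: factors

for item, row in zip(items, loadings):
    best = np.abs(row).max()
    status = "keep" if best >= 0.5 else "drop (unsatisfactory loading)"
    print(f"{item:15s} max |loading| = {best:.2f} -> {status}")
```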
273

Ferramenta de áudio conferência espacial implementando conceitos de realidade aumentada. / Spatial audio conference tool implementing augmented reality concepts.

Bulla Junior, Romeo 29 October 2009 (has links)
This work presents a 3D (spatial) audio conferencing tool implementing Augmented Reality (AR) concepts. The main intent of the tool is to enhance the sense of presence and increase interactivity among remote participants by implementing spatial audio techniques in audio avatars. Such techniques make it easier to focus attention on any one specific participant in the conference and have a positive effect on memory retention, resulting in better intelligibility and comprehension. The motivation for this implementation lies in its use as a synchronous communication tool within the Tidia-Ae e-learning environment, supporting collaborative activities and, possibly, distance teaching and learning processes. The tool was integrated into the Tidia-Ae system, and the results of the experiments demonstrated the effectiveness of the spatial audio processing in this environment.
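The core idea of an audio avatar, giving each remote participant a virtual position and mixing their voice accordingly so listeners can tell speakers apart, can be sketched with simple stereo amplitude panning. This is an illustrative approximation under assumed positions and gain laws, not the Tidia-Ae implementation.

```python
# Minimal stereo amplitude-panning sketch of "audio avatars": each remote
# participant is given a virtual azimuth/distance, and their mono stream is
# mixed into a stereo buffer with gains derived from that position.
import numpy as np

def pan_gains(azimuth_deg: float, distance_m: float) -> tuple[float, float]:
    """Constant-power pan plus simple 1/distance attenuation."""
    az = np.radians(np.clip(azimuth_deg, -90.0, 90.0))
    left = np.cos((az + np.pi / 2) / 2)
    right = np.sin((az + np.pi / 2) / 2)
    att = 1.0 / max(distance_m, 1.0)
    return left * att, right * att

sample_rate, seconds = 16_000, 1
t = np.linspace(0, seconds, sample_rate * seconds, endpoint=False)
participants = {                     # mono "voices" at assumed virtual positions
    "alice": (np.sin(2 * np.pi * 220 * t), -60.0, 1.0),   # to the left, close
    "bob":   (np.sin(2 * np.pi * 330 * t),  45.0, 2.0),   # to the right, farther
}

stereo = np.zeros((len(t), 2))
for _, (signal, azimuth, distance) in participants.items():
    gl, gr = pan_gains(azimuth, distance)
    stereo[:, 0] += gl * signal
    stereo[:, 1] += gr * signal
stereo /= np.max(np.abs(stereo))     # normalise the final mix
print(stereo.shape, stereo.min(), stereo.max())
```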
274

Elaboração e aplicação de um novo objeto de aprendizagem para o ensino de preparo cavitário em Dentística - introdução da realidade aumentada / Development and application of a new learning object for teaching tooth preparation for Operative Dentistry - augmented reality as a tool in teaching dentistry

Trung, Luciana Cardoso Espejo 13 December 2012 (has links)
The teaching and learning process is constantly changing due to the development of new information and communication technologies (ICT). These developments make it possible to create new learning objects (LO) that help students understand techniques which are often difficult to grasp through traditional methods. Augmented reality (AR) is a technological resource recently introduced as a teaching tool with great potential. AR aims to supplement the real world with computer-generated 3D virtual objects so that they appear to coexist in real space. The objective of this study was to develop an LO using AR and apply it to teaching tooth preparations for gold onlays in Operative Dentistry. Questionnaires (Q) were designed to assess the participants' computer skills (Q1) and the acceptance of the LO (Q2). The LO was used by 77 participants: undergraduate students (n = 28), professors and postgraduate students in operative dentistry and prosthodontics (n = 30), and students of continuing-education courses in operative dentistry and prosthodontics (n = 19). Analyses of internal consistency (Kappa coefficient of agreement, Cronbach's alpha, and cluster analysis) demonstrated high reliability of the questionnaires. Simple linear regression was performed between the response variable (Q2 score) and the explanatory variables Q1 score, age, gender, and group. The results showed wide acceptance across the whole sample, regardless of computer skills (p = 0.99; R² = 0.00%), gender (p = 0.27; R² = 1.6%), age (p = 0.27; R² = 0.1%), or group (p = 0.53; R² = 1.9%). It was concluded that the methodology applied was able to produce an AR-based LO for teaching cavity preparation that was widely accepted by the sample studied.
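The two analyses reported above, internal-consistency estimation and simple linear regression of the acceptance score on each explanatory variable, might look roughly like the following sketch. The data here are synthetic and only the procedure mirrors the description; none of the study's actual numbers are reproduced.

```python
# Sketch of the two analyses reported above: Cronbach's alpha for the
# questionnaire's internal consistency, and a simple linear regression of the
# acceptance score (Q2) on one explanatory variable. Data are synthetic.
import numpy as np
from scipy import stats

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """item_scores: respondents x items."""
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
q2_items = rng.integers(1, 6, size=(77, 10)).astype(float)   # 77 participants
print(f"Cronbach's alpha (Q2): {cronbach_alpha(q2_items):.2f}")

q1_score = rng.normal(50, 10, size=77)          # computer-skills score
q2_score = q2_items.sum(axis=1)                 # acceptance score
res = stats.linregress(q1_score, q2_score)
print(f"acceptance ~ skills: p = {res.pvalue:.2f}, R^2 = {res.rvalue**2:.3f}")
```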
275

Exploration, Study and Application of Spatially Aware Interactions Supporting Pervasive Augmented Reality

Ke Huo (5929790) 10 June 2019 (has links)
With rapidly increasing mobile computing devices and high-speed networks, large amounts of digital information and intelligence from the surrounding environment have been introduced into our everyday life. However, much of this context and content is textual and 2D. To access digital content spontaneously, augmented reality (AR) has become a promising surrogate to bridge the physical with the digital world. Thanks to vast improvements in personal computing devices, AR technologies are emerging in realistic scenarios. Commercially available software development kits (SDKs) and hardware platforms have started to expose AR applications to a large population.
At a broader level, this thesis focuses on investigating suitable interaction metaphors for the evolving AR. In particular, this work leverages the spatial awareness in the AR environment to enable spatially aware interactions. It explores (i) spatial inputs around AR devices using the local spatial relationship between the AR devices and the scene, and (ii) spatial interactions within the surrounding environment exploiting the global spatial relationship among multiple users as well as between the users and the environment. In this work, I mainly study four spatially aware AR interactions: (i) 3D tangible interactions by directly mapping input to the continuous and discrete volume around the device, (ii) 2D touch input in a 3D context by projecting the screen input into the real world, (iii) location-aware interactions which use the locations of real/virtual objects in the AR scene as spatial references, and (iv) collaborative interactions referring to a commonly shared AR scene. This work further develops the enabling techniques, including magnetic-sensing-based 3D tracking of tangible devices relative to a handheld AR device, a projection-based 3D sketching technique for in-situ AR content creation, a localization method for spatially mapping smart devices into the AR scene, and a registration approach for resolving the transformations between multiple SLAM AR devices. Moreover, I build systems towards allowing pervasive AR experiences. Primarily, I develop applications for increasing the flexibility of AR content manipulation, creation and authoring, intuitively interacting with the smart environment, and spontaneously collaborating within a co-located AR scene.
The main body of this research has contributed to multiple ongoing collaborative projects. I briefly discuss the key results and visions from these projects, including (i) autonomous robotic exploration and mapping of a smart environment, where the spatial relationship between the robot and the smart devices is resolved, and (ii) human-robot interaction in AR, where spatial intelligence can be seamlessly exchanged between the human and the robot. Further, I suggest future research projects leveraging three critical features of AR: situatedness, mobility, and the capability to support spatial collaboration.
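The registration step mentioned above, resolving the transformations between multiple SLAM AR devices, can be illustrated by composing homogeneous transforms through a shared physical anchor observed by both devices. This is a hedged sketch of one common approach and may differ from the thesis' actual technique.

```python
# Illustrative sketch (not the thesis' actual method): if two AR devices each
# track a shared physical anchor in their own SLAM frame, the transform from
# device B's world frame to device A's world frame can be composed as
#   T_A_from_B = T_A_from_anchor @ inverse(T_B_from_anchor)
import numpy as np

def make_pose(yaw_deg: float, translation) -> np.ndarray:
    """4x4 homogeneous transform: rotation about +Z plus a translation."""
    yaw = np.radians(yaw_deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(yaw), -np.sin(yaw)], [np.sin(yaw), np.cos(yaw)]]
    T[:3, 3] = translation
    return T

# Poses of the shared anchor as observed in each device's own world frame.
T_anchor_in_A = make_pose(30.0, [1.0, 0.5, 0.0])   # anchor in device A's frame
T_anchor_in_B = make_pose(-15.0, [2.0, 0.0, 0.3])  # anchor in device B's frame

T_A_from_B = T_anchor_in_A @ np.linalg.inv(T_anchor_in_B)

# A virtual object placed by device B can now be shown at the right spot by A.
point_in_B = np.array([0.5, 0.2, 0.0, 1.0])
print("same point in A's frame:", (T_A_from_B @ point_in_B)[:3])
```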
276

Realidade aumentada musical para reabilitação: estudo de caso em musicoterapia. / Musical augmented reality for rehabilitation: a case study in music therapy.

Corrêa, Ana Grasielle Dionísio 14 October 2011 (has links)
Music therapy is the science of using sound, rhythm, and music in the treatment, re-education, recovery, and rehabilitation of individuals with various pathologies, as well as in preventive care. Patients with severe physical disabilities often need adapted musical instruments to take part in the "music making" of music therapy. Some adaptations are made to order by the music therapist and therefore in small quantities. Sometimes an adaptive device for instrumental practice serves only the needs of a particular disability, while for others the same device can be uncomfortable. It may also be worthwhile for some patients to practise at home the guidance received in the music therapy session. However, the economic situation of some patients, combined with the high price of some adapted musical instruments, hampers or prevents the continuation of treatment at home.
In this study, we investigated whether it would be possible to design an interactive electronic system able to support and expand music therapy intervention strategies. The research followed an exploratory, applied technological strategy, aiming to generate a product with immediate purpose, based on prior knowledge, capable of enabling tests and studies in real situations of use. Music therapists, occupational therapists, and patients undergoing motor rehabilitation collaborated in this research. Building on a survey of the state of the art and on observations of music therapy sessions, a musical augmented reality system for rehabilitation was proposed. Based on this proposal, three versions of the system were implemented and evaluated. The first evaluation was performed with a music therapy specialist to verify the applicability of the system. The second was carried out during a music therapy intervention at the Associação de Assistência à Criança Deficiente (AACD) and, on another occasion, during an occupational therapy intervention at a patient's home. The third was performed in music therapy interventions at the AACD and at the Associação Brasileira de Distrofia Muscular (ABDIM). Analysis of the collected data showed that the system brings the following benefits to motor rehabilitation interventions: increased patient motivation and satisfaction, and easier music making for people with physical disabilities who have difficulty handling conventional musical instruments.
277

A QoE Model to Evaluate Semi-Transparent Augmented-Reality System

Zhang, Longyu 21 February 2019 (has links)
With the development of three-dimensional (3D) technologies, the demand for high-quality 3D content, 3D visualization, and flexible, natural interaction is increasing. As a result, semi-transparent Augmented-Reality (AR) systems are emerging and evolving rapidly. Since there are currently no well-recognized models for evaluating the performance of these systems, we proposed a Quality-of-Experience (QoE) taxonomy for semi-transparent AR systems containing three levels of influential QoE parameters, developed by analyzing existing QoE models in related areas and integrating the feedback received from our user study. We designed a user study to collect training and testing data for our QoE model, and built a Fuzzy-Inference-System (FIS) model to estimate the QoE evaluation and validate the proposed taxonomy. A case study was also conducted to further explore the relationships between QoE parameters and technical QoS parameters through the functional components of the Microsoft HoloLens AR system. In this work, we describe the experiments in detail, thoroughly explain the results obtained, and present conclusions and future work.
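A minimal sketch of how a fuzzy-inference step can map technical parameters to a QoE-like score is given below. The input parameters, membership functions, and rules are illustrative assumptions, not the FIS model trained in the thesis.

```python
# Toy zero-order Sugeno-style fuzzy inference sketch for a QoE-like score.
# Inputs, membership functions, and rules are illustrative assumptions only.
import numpy as np

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def qoe_score(latency_ms: float, registration_err_mm: float) -> float:
    # Degrees of membership for each linguistic term (assumed ranges).
    lat_low, lat_high = tri(latency_ms, -1, 0, 80), tri(latency_ms, 40, 150, 151)
    err_low, err_high = tri(registration_err_mm, -1, 0, 8), tri(registration_err_mm, 4, 15, 16)

    # Rules: good QoE if both inputs are low, poor QoE if either is high.
    w_good = min(lat_low, err_low)
    w_poor = max(lat_high, err_high)

    # Weighted-average defuzzification onto a 1-5 mean-opinion-score scale.
    weights = np.array([w_good, w_poor])
    outputs = np.array([5.0, 1.0])
    return float(np.dot(weights, outputs) / max(weights.sum(), 1e-9))

print(qoe_score(latency_ms=30, registration_err_mm=2))    # closer to 5
print(qoe_score(latency_ms=140, registration_err_mm=12))  # closer to 1
```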
278

Double vision : a practice-based investigation of art and differential perception

Lyons, David January 2017 (has links)
Double Vision: A practice-led investigation of art and differential perception is a series of five interrelated practice-led research studies into artistic expression controlling perceptual experiences between audiences of varying visual acuities. Significant refinements occurred between the first and second, and second and third studies. The last four studies were conducted with the aim of understanding vision's influence on perception. Double Vision's lead methodological approach was artistic practice. Other methods were employed according to the needs of that practice. They included iteration, collaboration, exhibition and testing. The research questions of Double Vision were refined in response to the results of artistic practice. That evolution resulted in two interrelated questions: Can artwork be intentionally created to be experienced differently dependent on one's visual abilities? And if so, can those experiences be shared? A further question, 'Can an analogy to colour deficient vision be created that engages both those with colour vision deficiency and the typically sighted?', concludes the investigations. Artwork was realized through printmaking, animation and multimedia formats. Its context and content derived from many forms, notably the Ishihara Test for Colour Deficiency, writings of William Blake, contemporary music and philosophy. Augmented reality was employed to facilitate the translation of visual perceptions between targeted audiences. A number of exhibitions were held exploring these themes.
279

Supporting Multi-User Interaction in Co-Located and Remote Augmented Reality by Improving Reference Performance and Decreasing Physical Interference

Oda, Ohan January 2016 (has links)
One of the most fundamental components of our daily lives is social interaction, ranging from simple activities, such as purchasing a donut in a bakery on the way to work, to complex ones, such as instructing a remote colleague how to repair a broken automobile. While we interact with others, various challenges may arise, such as miscommunication or physical interference. In a bakery, a clerk may misunderstand the donut at which a customer was pointing due to the uncertainty of their finger direction. In a repair task, a technician may remove the wrong bolt and accidentally hit another user while replacing broken parts due to unclear instructions and lack of attention while communicating with a remote advisor. This dissertation explores techniques for supporting multi-user 3D interaction in augmented reality in a way that addresses these challenges. Augmented Reality (AR) refers to interactively overlaying geometrically registered virtual media on the real world. In particular, we address how an AR system can use overlaid graphics to assist users in referencing local objects accurately and remote objects efficiently, and prevent co-located users from physically interfering with each other. My thesis is that our techniques can provide more accurate referencing for co-located and efficient referencing for remote users and lessen interference among users. First, we present and evaluate an AR referencing technique for shared environments that is designed to improve the accuracy with which one user (the indicator) can point out a real physical object to another user (the recipient). Our technique is intended for use in otherwise unmodeled environments in which objects in the environment, and the hand of the indicator, are interactively observed by a depth camera, and both users wear tracked see-through displays. This technique allows the indicator to bring a copy of a portion of the physical environment closer and indicate a selection in the copy. At the same time, the recipient gets to see the indicator's live interaction represented virtually in another copy that is brought closer to the recipient, and is also shown the mapping between their copy and the actual portion of the physical environment. A formal user study confirms that our technique performs significantly more accurately than comparison techniques in situations in which the participating users have sufficiently different views of the scene. Second, we extend the idea of using a copy (virtual replica) of physical object to help a remote expert assist a local user in performing a task in the local user's environment. We develop an approach that uses Virtual Reality (VR) or AR for the remote expert, and AR for the local user. It allows the expert to create and manipulate virtual replicas of physical objects in the local environment to refer to parts of those physical objects and to indicate actions on them. The expert demonstrates actions in 3D by manipulating virtual replicas, supported by constraints and annotations. We performed a user study of a 6DOF alignment task, a key operation in many physical task domains. We compared our approach with another 3D approach that also uses virtual replicas, in which the remote expert identifies corresponding pairs of points to align on a pair of objects, and a 2D approach in which the expert uses a 2D tablet-based drawing system similar to sketching systems developed for prior work by others on remote assistance. The study shows the 3D demonstration approach to be faster than the others. 
Third, we present an interference avoidance technique (Redirected Motion) intended to lessen the chance of physical interference among users with tracked hand-held displays, while minimizing their awareness that the technique is being applied. This interaction technique warps virtual space by shifting the virtual location of a user's hand-held display. We conducted a formal user study to evaluate Redirected Motion against other approaches that either modify what a user sees or hears, or restrict the interaction capabilities users have. Our study was performed using a game we developed, in which two players moved their hand-held displays rapidly in the space around a shared gameboard. Our analysis showed that Redirected Motion effectively and imperceptibly kept players further apart physically than the other techniques. These interaction techniques were implemented using an extensible programming framework we developed for supporting a broad range of multi-user immersive AR applications. This framework, Goblin XNA, integrates a 3D scene graph with support for 6DOF tracking, rigid body physics simulation, networking, shaders, particle systems, and 2D user interface primitives. In summary, we showed that our referencing approaches can enhance multi-user AR by improving accuracy for co-located users and increasing efficiency for remote users. In addition, we demonstrated that our interference-avoidance approach can lessen the chance of unwanted physical interference between co-located users, without their being aware of its use.
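The Redirected Motion idea described above, warping the virtual location of a user's hand-held display so that co-located users drift apart before they can collide, might be sketched as follows. The safety distance and offset gain are assumed values, not those used in the dissertation or in Goblin XNA.

```python
# Sketch of the redirected-motion idea: when two tracked hand-held displays
# come physically close, each user's *virtual* device position is nudged away
# from the other user, so their interactions diverge before their hands can
# collide. Thresholds and gains are illustrative only.
import numpy as np

def redirect(own_pos: np.ndarray, other_pos: np.ndarray,
             safe_dist: float = 0.4, max_offset: float = 0.15) -> np.ndarray:
    """Return the warped virtual position for a device at own_pos (metres)."""
    delta = own_pos - other_pos
    dist = np.linalg.norm(delta)
    if dist >= safe_dist or dist == 0.0:
        return own_pos                       # far enough apart: no warping
    # Offset grows smoothly as the devices approach each other.
    strength = max_offset * (1.0 - dist / safe_dist)
    return own_pos + strength * delta / dist

a = np.array([0.10, 1.20, 0.30])   # tracked physical positions (metres)
b = np.array([0.35, 1.25, 0.28])
print("virtual A:", redirect(a, b))
print("virtual B:", redirect(b, a))
```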
280

Vídeo-Avatar com detecção de colisão para realidade aumentada e jogos. / Video Avatar with collision detection for augmented reality and games.

Nakamura, Ricardo 03 July 2008 (has links)
This work aims to demonstrate the feasibility of a system for inserting an interactive video avatar into a 3D virtual environment using only a personal computer and consumer cameras. Its contribution, relative to similar work, consists in integrating techniques and algorithms into an innovative solution with low computational cost, aimed mainly at educational and entertainment applications. It extends research previously carried out at the Laboratório de Tecnologias Interativas on video avatars for teleconferencing. The proposed video avatar is correctly positioned in relation to other objects in the virtual environment and is capable of interacting with them, without resorting to geometric reconstruction techniques, which carry high processing costs. The feasibility of the proposal is demonstrated through the implementation of prototypes.
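One low-cost way to let a billboarded video avatar interact with virtual objects without geometric reconstruction is to approximate the avatar with a simple bounding volume placed at its tracked position. The sketch below uses an axis-aligned bounding box with assumed dimensions and is only an illustration of the general idea, not the prototypes' algorithm.

```python
# Sketch of low-cost collision detection for a billboarded video avatar:
# instead of reconstructing the user's geometry, approximate the avatar by an
# axis-aligned bounding box derived from its position, and test it against
# virtual objects. Dimensions and the AABB test are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AABB:
    min_xyz: tuple[float, float, float]
    max_xyz: tuple[float, float, float]

    def intersects(self, other: "AABB") -> bool:
        return all(self.min_xyz[i] <= other.max_xyz[i] and
                   self.max_xyz[i] >= other.min_xyz[i] for i in range(3))

def avatar_box(position, width=0.6, height=1.8, depth=0.3) -> AABB:
    """Bounding box centred on the avatar's billboard position (metres)."""
    x, y, z = position
    return AABB((x - width / 2, y, z - depth / 2),
                (x + width / 2, y + height, z + depth / 2))

avatar = avatar_box(position=(0.0, 0.0, 2.0))
virtual_ball = AABB((0.2, 0.8, 1.9), (0.5, 1.1, 2.2))
print("collision:", avatar.intersects(virtual_ball))
```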
