711

A demonstrator for human-robot collaboration with augmented reality for future evaluations of user experiences

Yattou Belkhrouf, Najwa January 2022 (has links)
Industries are becoming more demanding with new technology, trying to improve the productivity of their lines by adding technological methods and thus gaining greater flexibility and time savings. One such method is to train workers with augmented reality technology, which saves training time since the user can learn independently by following the steps shown on the device. Another is to add robots to the production line to carry out tasks that are repetitive and tiring for humans. To get the best out of both the robot and the human, companies combine their strengths and put them to work hand in hand in a human-robot collaboration. In this project, the demonstrator comprises a car assembly process realized as a human-robot collaboration system, in which the human and the collaborative robot communicate through an augmented reality device, the HoloLens 2. The demonstrator can be used for user experience studies to evaluate whether a person with no previous experience can carry out an assembly process by following the instructions on a head-mounted device while collaborating with a robot. / Program: - (Utbytesstudenter)
712

A comparative study of tracking methods for a guided walking city tour in outdoor spaces for tourists through AR on smartphones. / En jämförande studie av spårningsmetoder för en utomhusapplikation för guidade stadsresor genom AR på mobiltelefoner.

Schmitz, Lisa January 2017 (has links)
Recent advancements in mobile phone technology have allowed mobile augmented reality (MAR) to become feasible. Today's mobile phones have enough computing power to display augmented reality content, and new frameworks make MAR development more accessible. It is no surprise that one of the most popular areas of application is city tours, as this has been a target field since the early days of augmented reality (AR) [8]. Without altering the appearance of the city, virtual content can be placed to bring hidden information, such as the city's history, closer to tourists. The most common choice of tracking method for this type of application is location-based tracking. Relying only on the GPS signal and sensors such as the accelerometer and the gyroscope, the position of the phone is tracked, and the location of the digital content in the real world is given by geospatial coordinates. Unfortunately, the accuracy of the sensors is insufficient for accurate placement. The technology's main advantage over other techniques, such as marker-based tracking, is that the application does not require any change to the city environment. In contrast, the other leading technique, marker-based tracking, is a computer vision technology that requires visual clues to work: marker images would have to be placed in the city for marker-based tracking to function. However, location-based tracking can cause erratic behaviour of the virtual objects, which decreases the quality of the experience. This paper compares location-based and marker-based tracking to show the user experience strengths and weaknesses of both methods and to provide design guidelines for choosing the most suitable tracking technology when developing an outdoor walking application. In order to uncover these strengths and weaknesses, one experimental prototype for each tracking technology has been developed.
The analysis of the results of a controlled user study highlights the comparative strengths and weaknesses of each technology, location-based and marker-based tracking. The measured user experience differences demonstrate that for scenes where AR application designers and city officials are willing to incorporate visual markers, marker-based tracking will outperform location-based tracking.
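The placement problem the abstract attributes to GPS accuracy can be made concrete with a back-of-the-envelope calculation (an illustration, not taken from the thesis): a horizontal GPS error of a few metres translates into a large angular offset for virtual content anchored near the user.

```python
import math

def placement_error_deg(gps_error_m: float, target_distance_m: float) -> float:
    """Angular error (in degrees) when a virtual object at
    target_distance_m is anchored using a GPS fix off by gps_error_m."""
    return math.degrees(math.atan2(gps_error_m, target_distance_m))

# With a typical open-sky smartphone GPS error of roughly 5 m (often
# worse between buildings), a point of interest 20 m away can be
# misplaced by about 14 degrees in the camera view.
for d in (10, 20, 50):
    print(f"POI {d:2d} m away: ~{placement_error_deg(5.0, d):.1f} deg off")
```

This simple geometry explains why location-based placement looks erratic for nearby content while remaining tolerable for distant landmarks.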
713

Did you notice that? A comparison between auditory and vibrotactile feedback in an AR environment

Granlund, Linnéa January 2019 (has links)
There are different ways to interact with different hardware, so it is important to understand which factors affect the experience when designing interactions and interfaces. This study explores how auditory and vibrotactile feedback are perceived by users when they interact in a virtual AR environment. An application with different interactions, both passive and active, was developed for the AR glasses Magic Leap. An experimental study was conducted with 28 participants who interacted in this virtual environment. The study included two parts: first the participants interacted in the virtual environment while thinking aloud, and thereafter they were interviewed. There were three test cases: one with only auditory feedback, one with only vibrotactile feedback, and a third with both auditory and vibrotactile feedback. Seven of the 28 participants acted as a control group that did not receive any feedback on their interactions. The study shows that using only vibrotactile feedback creates different impressions depending on earlier experiences with the same AR environment. Using only auditory feedback created an atmosphere that was close to reality. Having both kinds of feedback active at the same time reduced how much of the feedback was noticed, and some interactions were not noticed at all. Passive interactions were noticed more than active interactions in all cases.
714

Exploring Designs for Enhancing the In-store Customer Experience through Digital Product Information in Fashion Retail / Undersökning av designförslag för att förstärka kundupplevelsen i fysiska butiker genom digital produktinformation i modedetaljhandeln

Jonsson, Martina January 2018 (has links)
The ongoing consumer transition from offline to online shopping in the fashion retail industry requires retailers to take action. Not only do consumers shop more online, they also go online to research retail products. Forecasts suggest that bringing the online experience to offline stores might bridge the gap between the two channels. The online experience provides high-end digital content and places demands on the product information offline, as this was found to be crucial for the customer experience. The marketing possibilities in-store were found to be an advantage for bricks-and-mortar retailers. Thus, this study investigates how the customer experience in bricks-and-mortar retail stores can be enhanced through digital product information. A survey was conducted to identify user requirements in terms of product information. An augmented reality prototype was built to satisfy the identified user requirements and tested in two user studies that evaluated its content, visualization, interaction, and satisfaction; the prototype was iterated between the two studies. The most crucial parameters of fashion retail product information were established, together with implications for the visual representation and interaction. It was found that existing service options left user needs unfulfilled, and that these needs were satisfied by an augmented reality prototype for product information retrieval. The use of AR for this purpose also proved able to contribute to an omnichannel solution for multi-channel retailers. The conclusion is thus that the in-store customer experience can be enhanced by introducing an augmented reality prototype for product information retrieval, taking into account the implications for content, visualization, and interaction provided in this study.
715

Visualizing future buildings : User-centered design process and evaluation of a sensor-based MAR prototype / Visualisering av framtida byggnader : En avändarcentrerad designprocess och utvärdering av en sensorbaserad MAR prototyp

Lindström, Daniel January 2017 (has links)
Mobile Augmented Reality (MAR) is an emerging technology that, when done right, can offer users a rich experience. However, as with most emerging technologies, it is primarily technology-driven, often leaving user needs aside. In this study, a sensor-based MAR prototype was developed and evaluated following a user-centered design process. The prototype visualized a future building in the city of Stockholm, Sweden. However, sensors in today's smartphones are not perfect, which is a problem when they are used as the tracking method for augmenting 3D content. Developing a MAR experience also involves challenges that need to be addressed in order to provide a good experience for the user. One of those challenges is testing and validating ideas at an early stage of development. When following a user-centered design process, prototyping is usually a useful tool for testing and validation; however, even the simplest MAR experience requires an advanced infrastructure, which is expensive and time-consuming to develop. In this study, both the inherent problems of smartphone sensors and the challenges of MAR development were taken into account during development, and cleverly designed functions were implemented to counteract these problems. To evaluate the final version of the prototype, a user test was conducted in an urban environment. The most important lesson learned is that today's smartphones do not provide robust and accurate sensor data in an urban environment, resulting in incorrect placement and flickering of the augmented 3D object. The functions developed to counteract these problems helped to some degree, indicating the potential of overcoming hardware limitations with cleverly designed functions. Lastly, using an existing AR framework to produce an interactive, AR-enabled prototype at low cost and relatively quickly proved successful.
716

Towards Real-time Mixed Reality Matting In Natural Scenes

Beato, Nicholas 01 January 2012 (has links)
In Mixed Reality scenarios, background replacement is a common way to immerse a user in a synthetic environment. Properly identifying the background pixels in an image or video is a difficult problem known as matting. Proper alpha mattes usually come from human guidance, special hardware setups, or color-dependent algorithms. This is a consequence of the under-constrained nature of the per-pixel alpha blending equation. In constant-color matting, research identifies and replaces a background that is a single color, known as the chroma key color. Unfortunately, these algorithms force a controlled physical environment and favor constant, uniform lighting. More generic approaches, such as natural image matting, have made progress finding alpha matte solutions in environments with naturally occurring backgrounds. However, even for the quicker algorithms, the generation of trimaps, indicating regions of known foreground and background pixels, normally requires human interaction or offline computation. This research addresses ways to automatically solve an alpha matte for an image, and by extension a video, in real time using a consumer-level GPU. It does so even in noisy environments that yield less reliable constraints than controlled settings. To attack these challenges, we are particularly interested in automatically generating trimaps from depth buffers for dynamic scenes so that algorithms requiring denser constraints may be used. The resulting computation is parallelizable so that it may run on a GPU, and it should work for natural images as well as chroma key backgrounds. Extra input may be required, but when it is, commodity hardware available in most Mixed Reality setups should be able to provide it. This allows us to provide real-time alpha mattes for Mixed Reality scenarios that take place in relatively controlled environments. As a consequence, while monochromatic backdrops (such as green screens or retro-reflective material) aid the algorithm's accuracy, they are not an explicit requirement. Finally, we explore a sub-image-based approach to parallelize an existing hierarchical approach on high-resolution imagery. We show that locality can be exploited to significantly reduce the memory and compute requirements previously necessary when computing alpha mattes of high-resolution images. We achieve this using a parallelizable scheme that is independent of both the matting algorithm and image features. Combined, these research topics provide a basis for Mixed Reality scenarios using real-time natural image matting on high-definition video sources.
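The per-pixel alpha blending equation the abstract refers to, C = αF + (1−α)B, is under-constrained: one observed color must be explained by an unknown foreground color, background color, and alpha. A toy illustration (not the dissertation's algorithm) is constant-color matting, where a known chroma key color stands in for B and alpha is derived from color distance:

```python
def composite(fg, bg, alpha):
    """Per-pixel alpha blending: C = alpha*F + (1-alpha)*B, per RGB channel."""
    return tuple(alpha * f + (1 - alpha) * b for f, b in zip(fg, bg))

def chroma_key_alpha(pixel, key, threshold=60.0, softness=40.0):
    """Toy constant-color matte: alpha from the Euclidean distance between
    a pixel and the chroma key color, with a soft ramp. The threshold and
    softness values here are arbitrary; real matting must also recover the
    unknown foreground and background colors."""
    dist = sum((p - k) ** 2 for p, k in zip(pixel, key)) ** 0.5
    return max(0.0, min(1.0, (dist - threshold) / softness))
```

Pixels matching the key get alpha 0 (pure background) and distant colors get alpha 1 (pure foreground); the soft ramp is a crude stand-in for the mixed-pixel boundary that trimap-based methods resolve properly.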
717

Affine Region Tracking and Augmentation Using MSER and Adaptive SIFT Model Generation

Marano, Matthew James 01 June 2009 (has links) (PDF)
Relatively complex Augmented Reality (AR) algorithms are becoming widely available due to advancements in affordable mobile computer hardware. To take advantage of this, a new method is developed for tracking 2D regions without prior knowledge of an environment and without building a computationally expensive world model. In the method of this paper, affinely invariant planar regions in a scene are found using the Maximally Stable Extremal Region (MSER) detector. A region is selected by the user to define a search space, and the Scale Invariant Feature Transform (SIFT) is then used to detect affine-invariant keypoints in the region. If three or more keypoint matches across frames are found, the affine transform A of the region is calculated. A 2D image is then transformed by A, causing it to appear stationary on the 2D region being tracked. The search region itself is tracked by transforming the previous search region by A, defining a new location, size, and shape for it. Testing reveals that the method is robust in tracking planar surfaces despite affine changes in the geometry of a scene, and many real-world surfaces provide adequate texture for successful augmentation. Regions found in multiple frames are consistent with one another, with a mean cross-correlation of 0.608 relating augmented regions. The system can handle up to a 45° out-of-plane viewpoint change with respect to the camera. Although rotational changes appear to skew the affine transform slightly, translational and scale changes introduce little distortion and provide convincing augmentations of graphics onto the real world.
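The three-match minimum mentioned in the abstract comes from the 2D affine transform having six unknowns (x' = ax + by + c, y' = dx + ey + f), so three point correspondences give exactly six equations. A minimal sketch of that exact-fit case follows; a practical tracker would instead fit a least-squares solution over many matches with outlier rejection (e.g. RANSAC), along the lines of OpenCV's estimateAffine2D.

```python
def solve3(m, v):
    """Solve a 3x3 linear system m @ x = v by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for i in range(3):
        mi = [row[:] for row in m]
        for r in range(3):
            mi[r][i] = v[r]
        out.append(det(mi) / d)
    return out

def affine_from_matches(src, dst):
    """Fit x' = a*x + b*y + c, y' = d*x + e*y + f from exactly three
    (non-collinear) keypoint matches, the minimum the method requires."""
    m = [[x, y, 1.0] for x, y in src]
    a, b, c = solve3(m, [x for x, _ in dst])
    d, e, f = solve3(m, [y for _, y in dst])
    return (a, b, c, d, e, f)

def apply_affine(p, t):
    """Warp a point by the fitted transform, as done to both the overlay
    image and the search region."""
    a, b, c, d, e, f = t
    x, y = p
    return (a * x + b * y + c, d * x + e * y + f)
```

For example, matching a unit triangle against the same triangle shifted by (2, 3) recovers a pure translation, and applying the transform to any other point shifts it by the same offset.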
718

Real-Time Body Tracking and Projection Mapping in the Interactive Arts

Baroya, Sydney 01 December 2020 (has links) (PDF)
Projection mapping, a subtopic of augmented reality, displays computer-generated light visualizations from projectors onto the real environment. A challenge for projection mapping in the interactive performing arts is dynamic body movement. Accuracy and speed are key components of an immersive body projection mapping application and depend on scanning and processing time. This thesis presents a novel technique for real-time body projection mapping utilizing a state-of-the-art body tracking device, Microsoft's Azure Kinect DK, using an array of trackers for error minimization and movement prediction. The device's Sensor and Body Tracking SDKs allow multiple devices to be synchronized; we combine the tracking results from this feature with motion prediction to provide an accurate approximation for body joint tracking. Using the new joint approximations and the depth information from the Kinect, we create a silhouette, map textures and animations to it, and project it back onto the user. Our implementation of gesture detection provides interaction between the user and the projected images. Our approach decreased the lag introduced by the devices, code, and projector, yielding a realistic real-time body projection mapping. The end goal was to display the work in an art show; this thesis was presented at Burning Man 2019 and Delfines de San Carlos 2020 as interactive art installations.
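The combination of multi-tracker error minimization and motion prediction described above can be sketched in its simplest form (an assumption-laden illustration, not the thesis implementation): average the same joint as reported by several synchronized devices, then extrapolate at constant velocity to hide capture-to-projection latency.

```python
def fuse_joint(readings):
    """Average one joint's 3D position as reported by several trackers;
    a crude stand-in for the error minimisation the thesis describes."""
    n = len(readings)
    return tuple(sum(p[i] for p in readings) / n for i in range(3))

def predict_joint(prev, curr, dt_ratio=1.0):
    """Constant-velocity extrapolation: assume the joint keeps the
    velocity observed between the last two fused frames. dt_ratio is the
    prediction horizon as a fraction of the frame interval, chosen to
    cover the capture-to-projection latency."""
    return tuple(c + (c - p) * dt_ratio for p, c in zip(prev, curr))
```

Projecting onto the predicted rather than the last measured joint positions is what keeps the mapped silhouette from visibly trailing a moving performer.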
719

The Effects of Head-Centric Rest Frames on Egocentric Distance Perception in Virtual Reality

Hmaiti, Yahya 01 January 2023 (has links) (PDF)
It has been shown through several research investigations that users tend to underestimate distances in virtual reality (VR). Virtual objects that appear close to users wearing a Head-mounted display (HMD) might be located at a farther distance in reality. This discrepancy between the actual distance and the distance observed by users in VR was found to hinder users from benefiting from the full in-VR immersive experience, and several efforts have been directed toward finding the causes and developing tools that mitigate this phenomenon. One hypothesis that stands out in the field of spatial perception is the rest frame hypothesis (RFH), which states that visual frames of reference (RFs), defined as fixed reference points of view in a virtual environment (VE), contribute to minimizing sensory mismatch. RFs have been shown to promote better eye-gaze stability and focus, reduce VR sickness, and improve visual search, along with other benefits. However, their effect on distance perception in VEs has not been evaluated. To explore and better understand the potential effects that RFs can have on distance perception in VR, we used a blind walking task to explore the effect of three head-centric RFs (a mesh mask, a nose, and a hat) on egocentric distance estimation. We performed a mixed-design study where we compared the effect of each of our chosen RFs across different environmental conditions and target distances in different 3D environments. We found that at near and mid-field distances, certain RFs can improve the user's distance estimation accuracy and reduce distance underestimation. Additionally, we found that participants judged distance more accurately in cluttered environments compared to uncluttered environments. 
Our findings show that the characteristics of the 3D environment are important in distance estimation-dependent tasks in VR and that the addition of head-centric RFs, a simple avatar augmentation method, can lead to meaningful improvements in distance judgments, user experience, and task performance in VR.
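Blind-walking results like those above are commonly summarised as a signed percentage error relative to the true target distance, so that underestimation shows up as a negative value. A minimal sketch of that metric (a standard convention in the distance-perception literature, not code from this dissertation):

```python
def underestimation_pct(judged_m: float, actual_m: float) -> float:
    """Signed distance-estimation error as a percentage of the actual
    distance; negative values indicate underestimation, the typical
    finding in HMD-based virtual environments."""
    return 100.0 * (judged_m - actual_m) / actual_m

# A participant who walks 4 m toward a target actually 5 m away has
# underestimated the distance by 20%.
print(underestimation_pct(4.0, 5.0))
```

Comparing this metric across rest-frame conditions and environments is what reveals whether an augmentation such as a virtual nose or hat reduces the underestimation.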
720

Impact of Metaverse on Marketing Communication : A case study of the fashion industry

Nabukalu, Resty, Wanjohi, Ambrosena January 2023 (has links)
No description available.
