11

Distance-Scaled Human-Robot Interaction with Hybrid Cameras

Pai, Abhishek 24 October 2019 (has links)
No description available.
12

Fingerbaserad navigering i virtuella 3D-miljöer : En utvärdering av fingerstyrning som alternativ till tangentbordet / Finger-based navigation in virtual 3D environments: An evaluation of finger control as an alternative to the keyboard

Grindebäck, Max January 2023 (has links)
Navigation in virtual 3D environments has been possible for many years and typically occurs in contexts such as games and 3D modelling. On a personal computer, a mouse and a keyboard are almost always used. The mouse has been shown to be easy to handle when rotating the view and is not the focus of this study. The keyboard, however, which controls the position of the view, could possibly be replaced by something better. The keys W, A, S and D are usually used for movement, and two additional keys are needed if it should be possible to "fly" up and down. Six different keys for controlling movement in three dimensions can be hard to learn. Even though experienced users handle it well, a more natural way of steering could be easier for beginners, and perhaps also for the experienced. The limited number of keys used also does not allow fine adjustment of the direction. This study proposes an alternative form of 3D navigation in which the user steers with a finger. A Leap Motion camera lies on the desk beneath the hand to measure the finger's position and translates it into a vector that controls the speed and direction of the view. This is intended to be a more natural way of steering, since people have such good control over their own bodies. In addition, the speed can be adjusted by moving the finger longer or shorter distances. With keyboard control, adjusting the speed is not possible; the exception is if the user can hold down a key to run, which gives a choice of two speeds. The finger control was tested and compared directly against the keyboard in a number of experiments. The tests show that navigation is faster when the keyboard is used, and generally fewer mistakes are made. When the finger control is used, the travelled distances are often shorter, especially when more precision is required, though this may be due to the lower speed the participants had when using the finger. Each input method was tested only seven times. During this period, the finger control improved considerably more between attempts than the keyboard did, so there is reason to believe that the finger control would improve with more practice. To obtain reliable results, a longer study would have to be carried out in which the participants really have time to learn to steer with the finger. During the development of the finger control, the author became faster with it than with the keyboard. This is a further indication that the finger control has potential that the participants never had time to reach in this preliminary study, and that further experiments are required.
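The control mapping described above (fingertip position translated into a velocity vector whose magnitude scales with the finger's displacement) can be pictured with a minimal sketch. The helper read_fingertip(), the rest position, the gain and the dead zone below are assumptions for illustration, not the thesis's actual Leap Motion code.

```python
import numpy as np

# Hypothetical helper standing in for the actual Leap Motion read-out:
# returns the tracked fingertip position (x, y, z) in millimetres.
def read_fingertip() -> np.ndarray:
    return np.array([30.0, 220.0, -15.0])  # placeholder sample, not real sensor data

NEUTRAL = np.array([0.0, 200.0, 0.0])  # assumed rest position above the sensor (mm)
GAIN = 0.02        # camera speed (m/s) gained per mm of finger offset (assumed)
DEAD_ZONE = 10.0   # mm of offset ignored so a resting finger does not drift

def camera_velocity() -> np.ndarray:
    """Map finger displacement from the rest position to a camera velocity vector.

    The direction of travel follows the displacement, and the speed grows with
    how far the finger is moved, the continuous speed control a keyboard lacks.
    """
    offset = read_fingertip() - NEUTRAL
    distance = float(np.linalg.norm(offset))
    if distance < DEAD_ZONE:
        return np.zeros(3)
    direction = offset / distance
    return direction * (distance - DEAD_ZONE) * GAIN

print(camera_velocity())
```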
13

A novel clinical test of pointing acuity with open and closed eyes: a validity study / Ett nytt kliniskt test för pekpositionering med öppna och slutna ögon: en validitetsstudie

Hägglund, Benjamin January 2023 (has links)
Hand proprioception is crucial for daily activities and may be compromised by diseases or injuries, impacting patients' independence. The lack of feasible, accurate, and affordable clinical tools for hand proprioception assessment poses a significant challenge, as such tools are essential for identifying dysfunction and evaluating treatment effects. The purpose of this study was to evaluate the concurrent validity of the Leap Motion controller (LMC) for assessing hand proprioception. We compared the LMC with a 3D camera system for motion analysis (Qualisys Motion Capture, QTM), known for its high measurement accuracy, as the gold standard. Twenty participants (10 men, 10 women), 15 without and 5 with hand injury or pain, took part in this cross-sectional study. Assessments included pointing acuity with open and closed eyes using the right and left hand. There were moderate to good correlations between LMC and QTM performed with closed eyes, with intraclass correlation coefficient (ICC) values of 0.6 and 0.89. In contrast, tests with open eyes showed a poor overall correlation, with ICC between 0.003 and 0.3. Bland-Altman analysis showed median biases of ≤ 1.5 mm between LMC and QTM with eyes open, and ≤ 5.1 mm with eyes closed. Limits of agreement ranged from -0.4 to 3.5 mm with eyes open and -31.6 to 21.5 mm with eyes closed. The results indicate that the LMC could be a cost-effective and feasible tool for quantifying hand proprioception with a clinically acceptable bias. Although the median biases were small for measurements with eyes open, the ICCs were poor. This may be due to a high pointing acuity within the group combined with limited variability between the participants in the eyes-open tests.
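The agreement statistics reported above can be illustrated with a short sketch of a Bland-Altman computation on paired pointing errors from the two systems. The numbers and the use of a mean bias with 1.96 SD limits are illustrative assumptions; the study itself reports median biases.

```python
import numpy as np

def bland_altman(lmc: np.ndarray, qtm: np.ndarray):
    """Bias and 95% limits of agreement between paired measurements (mm)."""
    diff = lmc - qtm                      # per-trial difference between devices
    bias = float(np.mean(diff))           # systematic offset (the study reports medians)
    sd = float(np.std(diff, ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative numbers only, not the study's measurements.
lmc = np.array([12.1, 9.8, 15.3, 11.0, 13.6])
qtm = np.array([11.5, 10.2, 14.1, 10.4, 12.9])
bias, (lower, upper) = bland_altman(lmc, qtm)
print(f"bias = {bias:.2f} mm, limits of agreement = [{lower:.2f}, {upper:.2f}] mm")
```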
14

Zařízení pro interakci v rozšířené realitě / Interaction Device for Augmented Reality

Pavlenko, Peter January 2017 (has links)
This master's thesis explores interactive augmented reality. The aim is to design, build, and test a device that allows interaction between the user and augmented reality. The thesis first analyses augmented reality, the devices for displaying it, and the necessary calibration methods. Then, on the basis of the acquired knowledge, it presents the design and construction of a few prototypes of such a device. Finally, it describes experiments that test the correctness of the concept.
15

Interacting with Hand Gestures in Augmented Reality : A Typing Study

Moberg, William, Pettersson, Joachim January 2017 (has links)
Smartphones are used today to accomplish a variety of different tasks, but they have some issues that might be solved with new technology. Augmented Reality is a developing technology that in the future can be used in our daily lives to solve some of the problems that smartphones have. Before people adopt the new augmented technology, it is important to have an intuitive method to interact with it. Hand gesturing has always been a vital part of human interaction. Using hand gestures to interact with devices has the potential to be a more natural and familiar method than traditional methods such as keyboards, controllers, and computer mice. The aim of this thesis is to explore whether hand gesture recognition in an Augmented Reality head-mounted display can provide the same interaction possibilities as a smartphone touchscreen. This was done by implementing an application in Unity that mimics the interface of a smartphone but uses hand gestures as input in AR. The Leap Motion Controller was the device used to perform hand gesture recognition. To test how practical hand gestures are as an interaction method, text typing was chosen as the measurement task, as it is used in many applications on smartphones; thus, the results can be better generalized to real-world usage. Five different keyboards were designed and tested in a pilot study. A controlled experiment was conducted in which 12 participants tried two hand-gesturing keyboards and a touchscreen keyboard, in order to compare hand gestures with touchscreen interaction. In the experiment, participants wrote words using the keyboards while their completion time and accuracy were recorded. After using a keyboard, the participants completed a questionnaire to measure its usability. The results consist of an implementation of five different keyboards and data collected from the experiment. The data gathered from the experiment consist of completion time, accuracy, and usability derived from questionnaire responses. Statistical tests were used to determine statistical significance between the keyboards used in the experiment, and the results are presented in graphs and tables. The results show that typing with pinch gestures in augmented reality is a slow and tiresome way of typing and affects the user's completion time and accuracy negatively in relation to using a touchscreen. The lower completion time and higher usability of the touchscreen keyboard could be determined with statistical significance. Prediction and auto-completion might help with fatigue, as fewer key presses are needed to create a word. The research concludes that hand gestures are reasonable to use as an input technique for certain tasks that a smartphone performs. These include simple tasks such as scrolling through a website or opening an email. However, tasks that involve typing long sentences, e.g. composing an email, are arduous using pinch gestures. When it comes to typing, the authors advise developers to employ a continuous gesture-typing approach such as Swype for Android and iOS.
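As a rough illustration of the pinch-based typing evaluated above, the sketch below turns the thumb-index fingertip distance into discrete key presses. The thresholds, the hovered-key lookup and the coordinates are assumptions; the thesis's actual Unity/Leap Motion implementation is not reproduced here.

```python
from typing import Optional
import numpy as np

PINCH_ON = 25.0   # mm: thumb-index gap below which a pinch starts (assumed)
PINCH_OFF = 35.0  # mm: release threshold, higher than PINCH_ON for hysteresis

class PinchTyper:
    """Emit one key press per pinch of the key currently hovered by the hand."""

    def __init__(self) -> None:
        self.pinching = False

    def update(self, thumb_tip: np.ndarray, index_tip: np.ndarray,
               hovered_key: Optional[str]) -> Optional[str]:
        gap = float(np.linalg.norm(thumb_tip - index_tip))
        if not self.pinching and gap < PINCH_ON:
            self.pinching = True
            return hovered_key            # press the key under the hand, if any
        if self.pinching and gap > PINCH_OFF:
            self.pinching = False         # pinch released; ready for the next press
        return None

# Illustrative fingertip coordinates in millimetres.
typer = PinchTyper()
print(typer.update(np.array([0.0, 200.0, 0.0]), np.array([15.0, 205.0, 5.0]), "A"))
```

The hysteresis (separate on/off thresholds) avoids repeated presses while the fingers hover near a single threshold, one common way to debounce gesture input.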
16

The Feasibility of Using a Markerless Motion Capture Sensor (Leap Motion™ Controller) for Quantitative Motor Assessment Intended for a Clinical Setting

Kincaid, Clay Jordan 01 December 2016 (has links)
Although upper limb motor impairments are common, the primary tools for assessing and tracking these impairments in a clinical setting are subjective, qualitative rating scales that lack resolution and repeatability. Markerless motion capture technology has the potential to greatly improve clinical assessment by providing quick, low-cost, and accurate tools to objectively quantify motor deficits. Here we lay some of the groundwork necessary to enable markerless motion capture systems to be used in clinical settings. First, we adapted five motor tests common in clinical assessments so they can be administered via markerless motion capture. We implemented these modified tests using a particular motion capture sensor (Leap Motion™ Controller, hereafter referred to as the Leap Motion sensor) and administered the tests to 100 healthy subjects to evaluate the feasibility of administering these tests via markerless motion capture. Second, to determine the ability of the Leap Motion sensor to accurately measure tremor, we characterized the frequency response of the Leap Motion sensor. During the administration of the five modified motor tests on 100 healthy subjects, the subjects had little trouble interfacing with the Leap Motion sensor and graphical user interface, performing the tasks with ease. The Leap Motion sensor maintained an average sampling rate above 106 Hz across all subjects during each of the five tests. The rate of adverse events caused by the Leap Motion sensor (mainly jumps in time or space) was generally below 1%. In characterizing the frequency response of the Leap Motion sensor, we found its bandwidth to vary between 1.7 and 5.5 Hz for actual tremor amplitudes above 1.5 mm, with larger bandwidth for larger amplitudes. To improve the accuracy of tremor measurements, we provide the magnitude ratios that can be used to estimate the actual amplitude of the oscillations from the measurements made by the Leap Motion sensor. These results suggest that markerless motion capture systems are on the verge of becoming suitable for routine clinical use, but more work is necessary to further improve the motor tests before they can be administered via markerless motion capture with sufficient robustness for clinical settings.
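A sketch of how tremor might be quantified from a sampled position trace and corrected with a frequency-dependent magnitude ratio of the kind characterised above. The sampling rate echoes the abstract's ~106 Hz figure, while the tremor band, synthetic signal and example ratio of 0.8 are assumptions; the study's actual magnitude ratios are not reproduced.

```python
import numpy as np

def dominant_tremor(position_mm: np.ndarray, fs: float = 106.0):
    """Dominant tremor frequency (Hz) and its measured amplitude (mm)
    from a 1-D position trace sampled at fs Hz."""
    x = position_mm - np.mean(position_mm)          # drop the static offset
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 1.0) & (freqs <= 12.0)         # assumed tremor band of interest
    k = int(np.argmax(np.abs(spectrum[band])))
    amp = 2.0 * np.abs(spectrum[band][k]) / len(x)  # single-sided peak amplitude
    return float(freqs[band][k]), float(amp)

def corrected_amplitude(measured_amp: float, magnitude_ratio: float) -> float:
    """Undo sensor attenuation at the tremor frequency using the characterised
    magnitude ratio (measured amplitude / actual amplitude)."""
    return measured_amp / magnitude_ratio

# Synthetic 4 Hz, 2 mm tremor with noise; illustrative only.
t = np.arange(0.0, 5.0, 1.0 / 106.0)
trace = 2.0 * np.sin(2 * np.pi * 4.0 * t) + 0.1 * np.random.randn(t.size)
freq, amp = dominant_tremor(trace)
# 0.8 below is an assumed example ratio, not a value from the study.
print(f"peak {freq:.1f} Hz, measured {amp:.2f} mm, "
      f"corrected {corrected_amplitude(amp, 0.8):.2f} mm")
```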
17

Bezdotykové ovládání interaktivních výukových aplikací s využitím technologie Leap Motion / Contactless control of interactive training applications using Leap Motion technology

SVATEK, Tomáš January 2015 (has links)
In its theoretical part, this thesis examines the possibilities of using the touch-free Leap Motion technology in elementary school lessons for operating interactive applications focused on physics education. The thesis investigates the ways in which it is possible to operate already published applications that are not programmed specifically for a touch-free technology but are instead intended for use with an interactive whiteboard or a standard computer. Their advantages and disadvantages are discussed as well. The thesis deals, among other things, with the availability of relevant applications and offers a summary of information about the Leap Motion technology and about the opportunities for replacing interactive whiteboards. It also includes a description of the technology and of the potential for developing one's own applications. The aim of the practical part is to create a new didactic application, which is then tested in lessons. This part also includes a poll that investigates teachers' interest in the Leap Motion technology.
18

Model-based object tracking with an infrared stereo camera

Rivas Diaz, Juan Manuel January 2015 (has links)
Object tracking has become very important in the field of robotics in recent years. Frequently, the goal is to obtain the trajectory of the tracked target over time and space by acquiring and processing information from the sensors. In this thesis we are interested in tracking objects at very short range. The primary application of our approach targets the domain of object tracking during grasp execution with a hand-in-eye sensor setup. To this end, a promising approach investigated in this work is based on the Leap Motion sensor, which is designed for tracking human hands. However, we are interested in tracking grasped objects, so we need to extend its functionality. The main goal of the thesis is to track the 3D position and orientation of an object composed of simple primitives (cubes, cylinders, triangles) over a video sequence. For this reason, we have designed and developed two different approaches for tracking objects with the Leap Motion device as a stereo vision system.
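Because the Leap Motion exposes a pair of infrared images, one way to treat it as a generic stereo sensor is plain rectified-stereo triangulation, sketched below. The baseline, focal length and principal point are placeholder values rather than the device's calibration, and the thesis's model-based tracking of primitives is not reproduced here.

```python
import numpy as np

# Placeholder camera parameters for illustration; real values come from the
# device's calibration, not from this sketch.
BASELINE_MM = 40.0      # distance between the two IR cameras
FOCAL_PX = 400.0        # focal length in pixels
CX, CY = 320.0, 120.0   # principal point in pixels

def triangulate(left_px, right_px) -> np.ndarray:
    """3-D point (mm, camera frame) from one feature matched in the left and
    right rectified infrared images."""
    (ul, vl), (ur, _) = left_px, right_px
    disparity = ul - ur                       # horizontal shift between the views
    if disparity <= 0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    z = FOCAL_PX * BASELINE_MM / disparity    # depth shrinks as disparity grows
    x = (ul - CX) * z / FOCAL_PX
    y = (vl - CY) * z / FOCAL_PX
    return np.array([x, y, z])

# Illustrative matched pixel coordinates in the two images.
print(triangulate((350.0, 130.0), (310.0, 130.0)))
```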
19

Using Leap Motion for the Interactive Analysis of Multivariate Networks

Vendruscolo, Marcello Pietro, Lif, Andreas January 2020 (has links)
This work is an interdisciplinary study involving mainly the fields of information visualisation and human-computer interaction. The advancement of technology has expanded the ways in which humans interact with machines, which has benefited both industry and several fields of science. However, scientists and practitioners in the information visualisation domain still work mostly with classical setups consisting of a keyboard and a standard computer mouse. This project investigates how a shift in the human-computer interaction aspect of visualisation software can affect the accomplishment of tasks and the overall user experience when analysing two-dimensionally displayed multivariate networks. Such an investigation is relevant because complex network structures are increasingly used as essential tools to solve challenges that directly affect individuals and societies, for example in medicine or the social sciences; improving the usability of visualisation software can mean that more such challenges are answered in a shorter time or with more precision. To answer this question, a web application was developed that enables users to analyse multivariate networks through interfaces based both on hand gesture recognition and on a mouse device. In addition, a number of gesture designs were developed for several tasks to be performed when visually analysing networks. An expert in the field of human-computer interaction was then invited to review the proposed hand gestures and report his overall user experience of using them. The results show that the expert had, overall, a similar user experience with the hand gestures and with the mouse device. Moreover, the interpretation of the results indicates that the accuracy offered by gestures has to be carefully taken into account when designing gestures for selection tasks, particularly when the selection targets are small objects. Finally, our analysis points out that the manner in which the software's graphical user interface is presented also affects the usability of gestures, and that both factors have to be designed accordingly.
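One way to picture the gesture designs mentioned above is as a dispatch from recognised gestures to network-analysis tasks. The gesture names, task set and handlers below are illustrative assumptions, not the gestures actually proposed in the thesis.

```python
from typing import Callable, Dict

# Illustrative task handlers for a network view; real handlers would update
# the rendered graph instead of printing.
def pan_view(dx: float, dy: float) -> None:
    print(f"pan by ({dx:.1f}, {dy:.1f})")

def zoom_view(factor: float) -> None:
    print(f"zoom by {factor:.2f}")

def select_node(node_id: str) -> None:
    print(f"select node {node_id}")

# Assumed mapping from recognised gestures to tasks.
GESTURE_TASKS: Dict[str, Callable[..., None]] = {
    "grab_and_move": pan_view,      # closed hand dragged across the sensor
    "two_hand_spread": zoom_view,   # hands moving apart or together
    "index_pinch": select_node,     # pinch while pointing at a node
}

def dispatch(gesture: str, *args) -> None:
    """Route a recognised gesture to its visualisation task, ignoring unknown ones."""
    handler = GESTURE_TASKS.get(gesture)
    if handler is not None:
        handler(*args)

dispatch("index_pinch", "protein_42")   # hypothetical node identifier
dispatch("two_hand_spread", 1.25)
```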
20

Computer Graphics and Visualization based Analysis and Record System for Hand Surgery and Therapy Practice

Gokavarapu, Venkatamanikanta Subrahmanyakartheek 27 May 2016 (has links)
No description available.
