1 |
Local Source Influences Upon the Structure of Dust Plumes in the Channel Country of Western Queensland, Australia. Butler, Harry, n/a, January 2004 (has links)
Most of the early wind erosion research undertaken in Australia concentrated on how wind erosion affects cultivated farm land. However, in the 1990s the focus of wind erosion research in Australia started to shift to include rangeland environments. Initially these rangeland experiments used experimental configurations that had been developed for cultivated fields. This meant that in most cases a sampler was set up in the middle of a field and it was assumed that the data collected were representative of the field as a whole. It was also assumed that temporal changes in dust fluxes and concentrations reflect overall changes in land-type erodibility and wind erosivity. However, recent experiments and field observations within the rangelands of the Channel Country suggest that this assumption is not valid. These experiments and observations suggest that there are substantial spatial and temporal variations in erodibility within individual land types. Such variations complicate the interpretation of temporal and spatial erosion trends; in particular, this variability implies that it is difficult to compare sampler data between different wind erosion events. To begin quantifying and comparing sampler data between events within rangeland environments, the Dust Source Interaction Simulation Model (DSism) was developed to simulate the effect that physical processes and spatial variations in erodibility have upon observed dust concentration profiles. The modelling/simulation approach used is closely linked to experimental data via the extensive use of sensitivity testing. Another key feature of the DSism approach is its flexibility in allowing different dust source areas to have different particle emission characteristics. This combined sensitivity-testing and simulation approach has provided new insights into wind erosion processes. By using DSism, it has been possible to identify several key features of the wind erosion process within rangeland environments.
The first observation is that spatial and temporal changes in erodibility produce distinct changes in both the vertical and crosswind dust concentration profiles. Further investigations indicate that the dispersion processes in operation vary from event to event. In particular, the results presented here indicate that surface heating plays an important role in some wind erosion events. These results also suggest that even small variations in the vertical dust concentration profile can reflect temporal and spatial changes in processes and erodibility. Finally, the simulation results show that the particle size distribution of a vertical dust concentration profile depends on (a) the processes in operation during a given event and (b) the spatial variation in the particle size emission characteristics of the various source areas. These findings have several important implications. In particular, they indicate that both the crosswind and vertical dust concentration profiles can be viewed as an amalgamation of several distinct plumes from different dust source areas, and that dust concentration profiles contain significant information about both the spatial distribution of sources and the processes in operation during any given event. Most field studies have used regression models to describe the variation in dust concentration with height. A problem with this approach is that it assumes that the variation in dust concentration with height always has a given functional form (or shape) and that dust concentration always decreases with height. Field observations indicate that this assumption is only valid for some events within rangeland environments and that dust concentration does not always decrease with height in these environments. In most cases, such variations from the regression fit have been assumed to be the result of experimental 'noise' (error) or spatial variations in erodibility.
This thesis presents modelling and field evidence which suggest that such variations are the result of a combination of spatial variations in erodibility and changes in thermal conditions.
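As a hedged illustration of the regression approach the abstract critiques (not code from the thesis; the function name and all numbers are invented), a power-law profile c(z) = c1·z^(-alpha) can be fitted by least squares in log-log space. The fit necessarily imposes a fixed functional shape in which concentration decreases monotonically with height, which is exactly the assumption that the field observations contradict:

```python
import math

def fit_power_law(heights, concentrations):
    """Least-squares fit of c(z) = c1 * z**(-alpha) in log-log space.
    Returns (c1, alpha); assumes strictly positive inputs."""
    xs = [math.log(z) for z in heights]
    ys = [math.log(c) for c in concentrations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.exp(intercept), -slope

# A synthetic profile that follows the assumed form exactly:
heights = [0.5, 1.0, 2.0, 4.0, 8.0]
conc = [100.0 * z ** -0.7 for z in heights]
c1, alpha = fit_power_law(heights, conc)
# A measured profile that increases with height at some level would force
# this single (c1, alpha) pair to misfit, since the form is monotonic in z.
```

Deviations from such a fit are then indistinguishable, within the regression itself, from noise; the thesis's point is that they can instead carry information about sources and thermal conditions.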
|
2 |
Modeling Of Mogan And Eymir Lakes Aquifer System. Yagbasan, Ozlem, 01 June 2007 (has links) (PDF)
Mogan and Eymir Lakes, located 20 km south of Ankara, are important
aesthetic, recreational, and ecological resources. Dikilitaş and İkizce reservoirs,
constructed on upstream surface waters, are two man-made structures in the basin
encompassing an area of 985 km². The purpose of this study is (1) to quantify
groundwater components in the lakes' budgets, (2) to assess the potential impacts of
upstream reservoirs on lake levels, and (3) to determine effects of potential
climatic change on lakes and groundwater levels in the basin. Available data have
been used to develop a conceptual model of the system. The three dimensional
groundwater model (MODFLOW) has been developed for the system. The model
has been calibrated successfully under transient conditions over a period of six
years using monthly periods. The results show that groundwater inflows and
outflows have the lowest contribution to the overall lakes' budget. A sensitivity
analysis was conducted to determine the limits within which the regional
parameters may vary. Three groundwater management scenarios were
developed. The results show that the upstream reservoirs have a significant effect
on lake stages but not on groundwater levels. A trade-off curve between the
amount of water released and the average stage in Lake Mogan has been
developed. The continuation of the existing average conditions shows that there
would be declines in groundwater elevations in areas upstream from Lake Mogan
and downstream from Lake Eymir. The results also indicate that very small but
long-term changes in precipitation and temperature have the potential to cause
significant declines in groundwater and lake levels.
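The sensitivity of lake levels to small but sustained climatic shifts can be sketched with a toy monthly water balance (all coefficients here are invented for illustration; the thesis's actual budget comes from a calibrated transient MODFLOW model, not from this bookkeeping):

```python
def simulate_lake_stage(months, inflow, outflow, gw_in, gw_out,
                        precip, evap, area_km2, initial_volume):
    """Monthly lake water balance. Stream and groundwater terms are in
    10^6 m^3/month, volumes in 10^6 m^3; precip/evap are depths in
    m/month applied over the lake area. Returns the volume trajectory."""
    area_m2 = area_km2 * 1e6
    volume = initial_volume
    trajectory = [volume]
    for _ in range(months):
        dv = (inflow - outflow + gw_in - gw_out
              + (precip - evap) * area_m2 / 1e6)  # depth * area -> 10^6 m^3
        volume = max(volume + dv, 0.0)
        trajectory.append(volume)
    return trajectory

# A balanced baseline versus a small sustained precipitation deficit,
# run over a six-year (72-month) horizon like the calibration period:
base = simulate_lake_stage(72, 1.0, 1.0, 0.1, 0.1, 0.050, 0.050, 6.0, 20.0)
dry = simulate_lake_stage(72, 1.0, 1.0, 0.1, 0.1, 0.048, 0.050, 6.0, 20.0)
```

Even a 2 mm/month precipitation deficit, held for six years, produces a steady volume decline, which is the qualitative mechanism behind the abstract's final conclusion.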
|
3 |
Utilizing Input Simulation for Video Game Test Automation: A Case Study. Jerlström, Auguste, January 2022 (has links)
In typical software projects, it is common that half of the development time and cost is spent on testing the software. Software test automation is an area that has been expanding rapidly in recent years because of its capacity to test features quickly and efficiently, but in the video game industry this concept is still in its infancy and common practices are still being developed. Having automated tests be as close as possible to the user's experience is desirable to ensure a resilient and bug-free interactive experience. The ability to properly automate a test in this way could potentially save manual Quality Assurance (QA) analysts and developers a lot of time in finding and reporting bugs early on, which in turn can help companies save both time and resources. This thesis explores the use of input simulation, i.e., simulating keyboard and mouse inputs, in the context of video game test automation. For the design and implementation of the input simulation framework, Agile Scrum and human-centered design methodologies were followed. Exploratory interviews were conducted with 2 test automation engineers at Fatshark, desk research was used to explore existing tools and their benefits, and proofs of concept (POCs) were created with tools selected from the desk research on two of the company's games, Vermintide 2 and Darktide. From this, a new framework named TestifyInput, best fitting the needs of the company, was created and implemented as part of Fatshark's in-house test automation framework, Testify, on both the engine and gameplay side, written in C++ and Lua respectively. TestifyInput was then evaluated through an automated test implemented to test weapon interactions in the game, a user observation with 5 QA testers completing 2 tasks, and a questionnaire sent out to 7 QA testers.
Using metrics such as defect detection, speed, and limitations, TestifyInput was evaluated in the existing test automation context and against its human counterparts. The evaluation results showed that input simulation allowed for better test coverage and made it possible to test close to the actual user experience. TestifyInput itself remains relatively easy to implement and use, and the test case written with it remained stable, needing no modification despite changes to the gameplay code. However, to fully compare the capabilities of a traditional test case with one employing input simulation, further research is needed.
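TestifyInput's internals are not given in the abstract, but the core idea of input simulation can be sketched as a layer that injects synthetic device events into the same queue the game polls each frame, so a test exercises the full input path rather than calling gameplay functions directly. All names below are hypothetical (the real framework is written in C++ and Lua, not Python):

```python
from collections import deque

class SimulatedInput:
    """Minimal input-injection layer (hypothetical; not Fatshark's
    TestifyInput). Tests enqueue synthetic events; the game loop drains
    them exactly as it would drain real keyboard/mouse events."""

    def __init__(self):
        self._events = deque()

    def press_key(self, key):
        self._events.append(("key_down", key))
        self._events.append(("key_up", key))

    def move_mouse(self, x, y):
        self._events.append(("mouse_move", (x, y)))

    def click(self, button="left"):
        self._events.append(("mouse_down", button))
        self._events.append(("mouse_up", button))

    def poll(self):
        """Drain all pending events, as a game loop would each frame."""
        events = list(self._events)
        self._events.clear()
        return events

# An automated test scripts a weapon interaction: aim, fire, then the
# assertion layer checks the event stream the gameplay code receives.
sim = SimulatedInput()
sim.move_mouse(640, 360)
sim.click("left")
frame_events = sim.poll()
```

Because the events flow through the normal polling path, a test written this way keeps working when gameplay code changes, which matches the stability result reported above.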
|
4 |
Simulation numérique des interactions fluide-structure dans une fistule artério-veineuse sténosée et des effets de traitements endovasculaires. Decorato, Iolanda, 05 February 2013
An arteriovenous fistula (AVF) is a permanent vascular access created surgically by connecting a vein onto an artery. It enables blood to be circulated extra-corporeally in order to clear it of metabolic waste products and excess water for patients with end-stage renal disease undergoing hemodialysis; in France, about 36,000 patients have reached this most severe, terminal stage of chronic kidney disease. The creation and presence of the AVF significantly alter the hemodynamics in the blood vessels, locally and systemically, in both the short and the longer term, and these alterations can induce vascular pathologies such as aneurysm and stenosis formation. Several studies have been carried out to better understand the consequences of AVF creation, maturation and frequent use, but many clinical questions remain unanswered. The aim of the present study is to better understand the mechanical behavior and the hemodynamics within the AVF when the compliance of the vascular wall is taken into account. We also quantify the effect of a stenosis of the afferent artery, the incidence of which has been underestimated for many years. The fluid-structure interactions (FSI) within a patient-specific radio-cephalic arteriovenous fistula, whose geometry was reconstructed from medical images acquired during a previous doctoral project, are investigated numerically. The considered AVF, created by connecting the patient's cephalic vein to the radial artery, presents an arterial stenosis obstructing 80% of the vessel lumen. The velocity profile measured on the patient is imposed as the inlet boundary condition, and a Windkessel model is set at the arterial and venous outlets. The mechanical properties of the vein and the artery are differentiated, and the non-Newtonian behavior of blood is taken into account. The FSI simulation advantageously provides the time evolution of both the hemodynamic stresses and the internal stresses in the vessel walls, and guarantees the equilibrium of the solution at the interface between the fluid and solid domains. The FSI results show the presence of large zones of blood-flow recirculation within the cephalic vein, which might promote neointima formation. Large internal stresses are also observed at the venous wall, which may lead to wall remodeling. The fully coupled FSI simulation proves costly in computational time, which can so far limit its clinical use; we have therefore investigated whether uncoupled fluid and structure simulations can provide sufficiently accurate results while significantly reducing the computational time, with surgical use in view. The uncoupled simulations run 5 times faster than the fully coupled FSI.
We show that an uncoupled fluid simulation provides informative qualitative maps of the hemodynamic conditions in the AVF; quantitatively, the maximum error on the hemodynamic parameters is 20%. The uncoupled structural simulation with non-uniform wall properties along the vasculature provides an accurate distribution of internal wall stresses, but only at one instant of the cardiac cycle. Although partially inaccurate or incomplete, the results of the uncoupled simulations could still be informative enough to guide clinicians in their decision-making. In the second part of the study, we have investigated the effects of the arterial stenosis on the hemodynamics and simulated its endovascular treatments, starting with balloon angioplasty. Clinically, angioplasty rarely corrects a stenosis fully, and a residual stenosis obstructing less than 30% of the vessel lumen is considered acceptable. We have therefore inflated the angioplasty balloon with different pressures so as to obtain residual stenosis degrees ranging from 0 to 30%. The arterial stenosis has little impact on the blood-flow distribution: the venous flow rate remains unchanged before and after the treatment and thus permits hemodialysis. However, an increase in the pressure difference across the stenosis is observed, which could increase the heart's workload. To guarantee a pressure drop below 5 mmHg, which is clinically considered the threshold stenosis pressure difference, we find that the residual stenosis degree must be at most 20%. Another option is to place a stent after angioplasty; we have therefore also simulated this treatment numerically and solved the FSI problem in the fistula after stent placement, the stent being accounted for by imposing the equivalent post-stenting mechanical properties on a portion of the artery. In the last part of the study, a PIV (Particle Image Velocimetry) measurement set-up was built: a rigid, transparent mould of the geometry was obtained by rapid prototyping, and the experimental results were validated by comparison with the results of the numerical simulations.
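The Windkessel outlet condition mentioned above can be sketched minimally as follows (a two-element RC variant with invented, non-patient-specific parameters; the thesis may use a different Windkessel formulation):

```python
import math

def windkessel_pressure(flow, dt, R, C, p0):
    """Two-element Windkessel outlet model: dP/dt = Q(t)/C - P/(R*C),
    where R is peripheral resistance and C vessel compliance.
    Integrated with explicit Euler; units are illustrative (mmHg, mL/s, s)."""
    p = p0
    pressures = [p]
    for q in flow:
        p += dt * (q / C - p / (R * C))
        pressures.append(p)
    return pressures

# One cardiac cycle: a half-sine systolic flow pulse (0-0.3 s), then diastole.
dt = 0.001
times = [i * dt for i in range(1000)]
flow = [300.0 * math.sin(math.pi * t / 0.3) if t < 0.3 else 0.0 for t in times]
pressures = windkessel_pressure(flow, dt, R=1.0, C=1.2, p0=60.0)
# Pressure rises during systole and decays exponentially in diastole
# with time constant R*C, mimicking the downstream vascular bed.
```

Coupling such a lumped model to each 3D outlet lets the simulation represent the downstream circulation without meshing it.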
|
5 |
Natural Hand Based Interaction Simulation using a Digital Hand. Vipin, J S, January 2013 (has links) (PDF)
The focus of the present work is natural, human-like grasping, for realistic performance simulations in a digital human modelling (DHM) environment.
The performance simulation of grasping in DHM is typically done through high-level commands to the digital human models (DHMs). This calls for a natural and unambiguous scheme to describe a grasp, one which would implicitly accommodate variations due to the hand form, object form and hand kinematics. A novel relational description scheme is developed for this purpose. The grasp is modelled as a spatio-temporal relationship between patches (closed regions on a surface) on the hand and on the object. The task dependency of the grasp affects only the choice of the relevant patches; the present scheme thus makes a human-like grasp description possible. Grasping can be simulated either in an interactive command mode, as discussed above, or in an autonomous mode. In the autonomous mode the patches have to be computed; this is done using the psychological concept of affordance. The scheme is employed to select a tool from a set of tools, and the various types of grasps a user may adopt while grasping a spanner to manipulate a nut are simulated.
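The patch-relation idea might be sketched, very roughly, as pairings between named hand patches and object patches; the same description then applies across hand and object variations, and task dependency reduces to which patches are paired. All names below are hypothetical, not taken from the thesis:

```python
# Hypothetical sketch of a relational grasp description: a grasp is a set
# of (hand patch, object patch) pairings, independent of exact geometry.

def make_grasp(name, patch_pairs):
    """A grasp description: which hand patch contacts which object patch."""
    return {"name": name, "pairs": dict(patch_pairs)}

def compatible(grasp, object_patches):
    """A grasp is applicable only if the object exposes every patch it needs."""
    return all(obj_patch in object_patches
               for obj_patch in grasp["pairs"].values())

# A power grasp of a spanner's shaft versus a precision grasp of its jaw:
power = make_grasp("power", [("palm", "shaft"), ("finger_pads", "shaft")])
precision = make_grasp("precision", [("thumb_tip", "jaw"), ("index_tip", "jaw")])
spanner_patches = {"shaft", "jaw", "ring_end"}
```

Selecting between the two grasps for a given task then amounts to choosing the relevant patches, which is the affordance-driven step described above.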
Grasping of objects by humans evolves through distinct, naturally occurring phases, such as re-orientation, transport and preshape. The hand is brought to the neighbourhood of the object using a novel concept of a virtual object. Before contact is established, the hand assumes a shape similar to the global shape of the object, called preshaping; various hand preshape strategies are simulated using an optimization scheme. Since the focus of the present work is human-like grasping, the mechanism which drives the DHMs should also be anatomically pertinent. A methodology is developed wherein the hand-object contact establishment is based on the anatomical observation of a logarithmic spiral pattern during finger flexion. The effect of slip in the presence of friction has been studied for 2D and 3D object grasping endeavours, and a computational generation of the slip locus is performed. In-grasp slip studies, which simulate the finger and object response to slip, are also carried out.
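The logarithmic spiral r = a·e^(b·θ) mentioned for finger flexion has the defining property that equal angular steps scale the radius by a constant factor. A short sketch (with illustrative parameters, not the thesis's anatomical fit) samples such a fingertip path and exhibits that property:

```python
import math

def spiral_points(a, b, thetas):
    """Points on the logarithmic spiral r = a * exp(b * theta), the curve
    observed for fingertip paths during finger flexion."""
    return [(a * math.exp(b * th) * math.cos(th),
             a * math.exp(b * th) * math.sin(th)) for th in thetas]

def radii(points):
    return [math.hypot(x, y) for x, y in points]

# Illustrative flexion sweep from 0 to 120 degrees in 10-degree steps:
thetas = [math.radians(d) for d in range(0, 121, 10)]
path = spiral_points(a=10.0, b=-0.2, thetas=thetas)  # radius shrinks as the finger curls
rs = radii(path)
# For equal angular steps, consecutive radii keep a constant ratio exp(b * dtheta).
```

Driving the fingertips along such a curve gives contact establishment an anatomically motivated trajectory rather than an arbitrary interpolation.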
It is desirable that the grasping performance simulations be validated for the diverse hands people have. In the absence of an available database of articulated, bio-fidelic digital hands, this work develops a semi-automatic methodology for building subject-specific hand models from a single-pose 3D laser scan of the subject's hand. The methodology is based on the clinical evidence that creases and joint locations on the human hand are strongly correlated. The hand scan is segmented into palm, wrist and phalanges, both manually and computationally. The computational segmentation is based on the crease markings in the hand scan, which are identified by the user explicitly painting them in a mesh-processing software. Joint locations are computed on this segmented hand, and a 24-DOF kinematic structure is automatically embedded into the hand scan. The joint axes are computed using a novel palm-plane-normal concept and then rectified using convergence and intra-finger constraints. The methodology is significantly tolerant to noise in the scan and to the pose of the hand. With the proposed methodology, articulated, realistic, custom hand models can be generated.
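The palm-plane-normal idea might look roughly like the following sketch: span a plane through palm landmarks and take the cross product of two in-plane vectors as the normal used to orient the joint axes. The landmark choice and coordinates here are invented for illustration, not taken from the thesis:

```python
def subtract(p, q):
    return tuple(a - b for a, b in zip(p, q))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def palm_normal(wrist, index_base, little_base):
    """Unit normal of the plane through three palm landmarks."""
    n = cross(subtract(index_base, wrist), subtract(little_base, wrist))
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)

# Landmarks in a hand-local frame (coordinates are made up for illustration):
n = palm_normal((0.0, 0.0, 0.0), (8.0, 0.0, 0.0), (0.0, 9.0, 0.0))
```

Flexion axes of the finger joints can then be initialized perpendicular to finger directions in this plane before the convergence and intra-finger constraints rectify them.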
Thus, the reported work presents a geometric framework for comprehensive simulation of grasping performance in a DHM environment.
|
6 |
Investigation of VR as Visualization and Assembly-Simulation Tool. Götherström, Richard, January 2024 (has links)
Across industries, computer-aided design (CAD) programs are used to create vehicle parts. Simultaneously, virtual reality (VR) has become more common. Can these two co-existing technologies benefit one another? This thesis investigates how CAD modeling can be enhanced with the help of VR. Specifically, it focuses on importing CAD models during the runtime of a VR application and visualizing those models at real size. An additional feature is the ability to interact with the imported CAD models, enabling assembly simulations; the thesis specifically explores how parts of electric autonomous trucks can be assembled in VR. The results are moderately satisfying, with users expressing both satisfaction and dissatisfaction with certain aspects of VR as a tool in this context. However, only two users evaluated the application, indicating the need for a more extensive evaluation.
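The thesis's runtime import pipeline is not detailed in the abstract, but the flavour of loading a mesh while an application is already running can be sketched with a minimal Wavefront OBJ reader, a plain-text format that CAD exchange tools commonly export. This is an illustrative sketch, not the application's actual importer:

```python
def parse_obj(text):
    """Minimal Wavefront OBJ reader: vertices ('v x y z') and faces
    ('f i j k ...', 1-based indices, optionally with /vt/vn suffixes).
    Normals, texture coordinates and materials are ignored."""
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == "f":
            # 'f 1/1/1 2/2/2 3/3/3' -> take the index before the first slash
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:]))
    return vertices, faces

# A unit right triangle as OBJ text, as it might arrive at runtime:
obj_text = """
v 0 0 0
v 1 0 0
v 0 1 0
f 1 2 3
"""
verts, tris = parse_obj(obj_text)
```

A VR application would hand the resulting vertex and face buffers to its renderer; real-size visualization then reduces to applying the model's unit scale when building the transform.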
|