  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Algorithmes et structures de données parallèles pour applications interactives / Parallel algorithms and data structures for interactive data problems

Toss, Julio 26 October 2017 (has links)
The quest for performance has been a constant throughout the history of computing systems. More than a decade ago, the sequential processing model showed its first signs of exhaustion in sustaining performance improvements. The limits of sequential computation forced a paradigm shift and established parallel processing as the standard in modern computing systems. With the widespread adoption of parallel computers, many algorithms and applications have been ported to these new architectures. However, in unconventional applications with interactivity and real-time requirements, efficient parallelization is still a major challenge. The real-time performance requirement shows up, for instance, in user-interactive simulations, where the system must react to the user's input within one computation time step of the simulation loop. The same kind of constraint appears in streaming data monitoring applications: when an external source, such as traffic sensors or social media posts, provides a continuous flow of information to an online analysis system, the consumer system must process the stream quickly while keeping a controlled memory budget. The dynamic nature of the data raises several performance issues, stemming from the decomposition of the problem for parallel processing and from the maintenance of data locality for efficient cache utilization. Common optimizations relying on precomputed models or a static index of the data are not possible in these highly dynamic scenarios. In this thesis, we address data-dependent problems in two different applications: one in physics-based interactive simulation and the other in streaming data analysis. For the simulation problem, we present a parallel GPU algorithm for computing multiple shortest paths and Voronoi diagrams on a grid-like graph. For the streaming data analysis problem, we present a parallelizable data structure, based on Packed Memory Arrays, for indexing dynamic geo-located data while maintaining good memory locality.
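The grid-graph Voronoi computation described above can be illustrated, in serial form, by a multi-source breadth-first search: each seed claims the cells closer to it than to any other seed, yielding both shortest-path distances and a Voronoi partition. This is only a minimal CPU sketch of the problem the thesis solves on the GPU; the function name and the assumption of uniform edge weights are illustrative choices, not the thesis's algorithm.

```python
from collections import deque

def grid_voronoi(rows, cols, seeds):
    """Multi-source BFS on a 4-connected grid: for every cell, compute the
    distance to the nearest seed and which seed owns it (its Voronoi cell)."""
    dist = [[-1] * cols for _ in range(rows)]
    owner = [[-1] * cols for _ in range(rows)]
    q = deque()
    for i, (r, c) in enumerate(seeds):
        dist[r][c] = 0
        owner[r][c] = i
        q.append((r, c))
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and dist[nr][nc] == -1:
                # first seed wavefront to reach a cell owns it
                dist[nr][nc] = dist[r][c] + 1
                owner[nr][nc] = owner[r][c]
                q.append((nr, nc))
    return dist, owner
```

On a GPU, as in the thesis's setting, the same wavefront expansion would be performed for many cells in parallel rather than one queue entry at a time.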
52

The Light Curve Simulation and Its Inversion Problem for Human-Made Space Objects

Siwei Fan (9193685) 03 August 2020 (has links)
The shape and attitude of near-Earth objects directly affect orbit propagation via drag and solar radiation pressure. Obtaining information beyond the object states (position and velocity) is integral to identifying an object; it also enables tracing an object's origin and can improve orbit accuracy. For objects at a significant distance from the observer, only non-resolved imaging is available, which does not show any details of the object. So-called non-resolved light curve measurements, i.e. photometric measurements over time, can be used to determine the shape of space objects using a two-step inversion scheme: first determine the Extended Gaussian Image, then run a shape reconstruction process to retrieve the closed shape, even in the presence of measurement noise. Furthermore, it is also possible to generate high-confidence shape candidates from follow-up observations through a multi-hypothesis process.
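As a rough illustration of the forward (simulation) side of the light-curve problem, the brightness of a convex object can be modeled as a sum of per-facet contributions; the inversion scheme described above would recover facet information (the Extended Gaussian Image) from such measurements. This is a minimal sketch only: the function name, the Lambertian reflectance assumption, and the facet representation are illustrative choices, not the thesis's model.

```python
def light_curve(facets, sun_dir, obs_dirs):
    """Toy light-curve model for a convex faceted object: each facet with
    unit normal n and area A contributes A * max(0, n.sun) * max(0, n.obs)
    (Lambertian reflection; back-facing facets contribute nothing).
    `facets` is a list of (unit_normal, area) pairs; directions are unit
    vectors toward the sun and the observer."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    curve = []
    for obs in obs_dirs:
        flux = sum(area * max(0.0, dot(n, sun_dir)) * max(0.0, dot(n, obs))
                   for n, area in facets)
        curve.append(flux)
    return curve
```

For example, a unit cube observed along one of its face normals with the sun behind the observer is lit by exactly one facet, while a viewing direction perpendicular to the sun direction yields zero flux under this model.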
53

Diagnosis of Evaporative Emissions Control System Using Physics-based and Machine Learning Methods

Yang, Ruochen 24 September 2020 (has links)
No description available.
54

Improving Artistic Workflows For Fluid Simulation Through Adaptive and Editable Fluid Simulation Techniques

Flynn, Sean A 02 April 2021 (has links)
As the fidelity of computer-generated imagery has increased, the need to digitally create convincing natural phenomena like fluids has become fundamental to the entertainment production industry. Because fluids are complex, the underlying physics must be computationally simulated. However, because a strictly physics-based approach is both computationally expensive and difficult to control, it does not lend itself well to the way artists and directors like to work. Directors require control to achieve their specific artistic vision, and artistic workflows rely on quick iteration and the ability to apply changes late in the production process. In this dissertation we present novel techniques in adaptive simulation and fluid post-processing to improve artistic workflows for fluid simulation. Our methods reduce fluid simulation iteration time and provide a new way for artists to intelligently resize a wide range of volumetric data, including fluid simulations. To reduce iteration time, we present a more cache-friendly linear octree structure for adaptive fluid simulation that reduces the overhead of previous octree-based methods. To increase the viability of reusable effects libraries, and to give artists intuitive control over simulations late in the production process, we present a "fluid carving" technique. Fluid carving uses seam carving methods to allow intelligent resizing of a variety of fluid phenomena without the need for costly re-simulation. We present methods that improve upon traditional seam carving approaches, addressing issues with scalability and non-rectangular boundaries, and generalizing to a variety of visual effects data such as particles, polygonal meshes, liquids, smoke, and fire. We achieve these improvements by guiding seams along user-defined lattices that can enclose regions of interest defined as OpenVDB grids with a wide range of shapes.
These techniques significantly improve artist workflows for fluid simulation and allow visual entertainment to be produced in a more intuitive, cost-effective manner.
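The seam carving idea that fluid carving builds on can be sketched in its classic 2D form: a dynamic program finds the connected low-energy path to remove when resizing. The thesis generalizes this to volumetric, time-varying data via user-defined lattices; the sketch below is only the textbook 2D baseline, with an assumed function name.

```python
def min_vertical_seam(energy):
    """Classic seam-carving dynamic program on a 2D energy grid: find the
    connected top-to-bottom path (one column index per row, shifting at
    most one column per step) with minimal total energy."""
    rows, cols = len(energy), len(energy[0])
    cost = [row[:] for row in energy]
    # forward pass: cheapest cumulative cost to reach each cell
    for r in range(1, rows):
        for c in range(cols):
            cost[r][c] += min(cost[r - 1][max(0, c - 1):min(cols, c + 2)])
    # backtrack from the cheapest bottom cell
    seam = [min(range(cols), key=lambda c: cost[rows - 1][c])]
    for r in range(rows - 2, -1, -1):
        c = seam[-1]
        window = range(max(0, c - 1), min(cols, c + 2))
        seam.append(min(window, key=lambda c2: cost[r][c2]))
    seam.reverse()
    return seam
```

Removing the returned seam shrinks the grid by one column while preserving high-energy (visually important) regions; fluid carving applies the same principle to resize simulation data without re-simulating.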
55

Noise Function Turbulence Optical Phase Screens and Physics Based Rendering

Riley, Joseph T. January 2021 (has links)
No description available.
56

Tools for fluid simulation control in computer graphics

Schoentgen, Arnaud 09 1900 (has links)
Physics-based animation can generate dynamic systems with very complex and realistic behaviors. Unfortunately, controlling them is a daunting task, and fluid simulation raises particularly difficult problems for the control process. Although many methods and tools have been developed to convincingly simulate and render fluids, too few provide efficient and intuitive control over a simulation. Since control often adds computation on top of the simulation cost, art-directing a high-resolution simulation leads to long iterations of the creative process. To shorten this process, editing can be performed on a faster, low-resolution model. The process of generating an art-directed fluid can therefore be split into two stages: a control stage, during which an artist modifies the behavior of a low-resolution simulation, and an upresolution stage, during which a final high-resolution version of this simulation is generated. This thesis presents two projects, each improving on the state of the art related to one of these two stages. First, we introduce a new particle-based liquid control system. Using this system, an artist selects patches of precomputed liquid animation from a database and places them in a simulation to modify its behavior. At each simulation time step, our system uses the list of active patches to locally reproduce the artist's vision. An intuitive graphical user interface, inspired by video editing tools, has been developed to let a non-technical user easily edit a liquid animation. Second, a tracking solution for smoke upresolution is described. We propose to add an extra tracking step after the velocity projection step of a classical Eulerian smoke simulation. During this step, we solve for a divergence-free velocity perturbation field that yields a better match of the low-frequency density distribution between the low-resolution guide and the high-resolution simulation. The resulting smoke animation faithfully reproduces the coarse aspect of the low-resolution input while being enhanced with simulated small-scale details.
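The key property of the smoke upresolution step above is that the velocity perturbation must be divergence-free. One standard way to construct such a field in 2D, shown here as an illustrative sketch rather than the thesis's solver, is to differentiate a scalar stream function: the resulting field has exactly zero discrete divergence in the interior.

```python
import numpy as np

def stream_function_velocity(psi):
    """Build a divergence-free 2D velocity field from a scalar stream
    function psi via central differences: u = d(psi)/dy, v = -d(psi)/dx.
    The cross-derivatives cancel, so the discrete divergence
    du/dx + dv/dy vanishes identically at interior grid points.
    Axis 0 is x, axis 1 is y; boundary values are left at zero."""
    u = np.zeros_like(psi)
    v = np.zeros_like(psi)
    u[1:-1, 1:-1] = (psi[1:-1, 2:] - psi[1:-1, :-2]) / 2.0   # d psi / dy
    v[1:-1, 1:-1] = -(psi[2:, 1:-1] - psi[:-2, 1:-1]) / 2.0  # -d psi / dx
    return u, v
```

In a solver, such a perturbation can be added to the high-resolution velocity without reintroducing divergence, which is why the thesis solves for the perturbation in divergence-free form.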
57

A hybrid prognostic methodology and its application to well-controlled engineering systems

Eker, Ömer F. January 2015 (has links)
This thesis presents a novel hybrid prognostic methodology, integrating physics-based and data-driven prognostic models, to enhance prognostic accuracy, robustness, and applicability. The methodology integrates the short-term predictions of a physics-based model with the longer-term projections of a similarity-based data-driven model to obtain remaining useful life estimations. It has been applied to specific components of two different engineering systems, one representing an accelerated and the other a nominal degradation process: filter clogging and fatigue crack propagation were selected as case studies. An experimental rig has been developed to investigate the accelerated clogging phenomenon, whereas the publicly available Virkler fatigue crack propagation dataset was chosen after an extensive literature search and dataset analysis. The filter clogging rig is designed to produce reproducible clogging data under different operational profiles; these data are intended to serve as a benchmark dataset for prognostic models. The performance of the methodology has been evaluated by comparing remaining useful life estimations obtained from the hybrid model and from the individual prognostic models, using the most recent prognostic evaluation metrics. The results show that the presented methodology improves accuracy, robustness, and applicability. The work contained herein is therefore expected to contribute to scientific knowledge as well as industrial technology development.
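The similarity-based, data-driven half of such a hybrid scheme can be sketched as follows: compare the observed degradation history against a library of run-to-failure trajectories and average their remaining lives, weighted by similarity. The function name, the exponential weighting, and the equal-time-step assumption are illustrative choices, not the thesis's method.

```python
import math

def similarity_rul(observed, library):
    """Toy similarity-based remaining-useful-life estimate: weight each
    run-to-failure trajectory in `library` by how closely its early
    segment matches the observed degradation history, then return the
    weighted average of the trajectories' remaining lives (in steps)."""
    n = len(observed)
    weights, ruls = [], []
    for traj in library:
        seg = traj[:n]
        mse = sum((a - b) ** 2 for a, b in zip(observed, seg)) / n
        weights.append(math.exp(-mse))   # closer match -> larger weight
        ruls.append(len(traj) - n)       # steps left before failure
    total = sum(weights)
    return sum(w * r for w, r in zip(weights, ruls)) / total
```

In a hybrid scheme along the lines described above, a physics-based model would supply the near-term degradation trend while an estimate like this projects the longer-term remaining life.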
58

A physics-based maintenance cost methodology for commercial aircraft engines

Stitt, Alice C. January 2014 (has links)
A need has been established in industry and academic publications to link an engine's maintenance costs throughout its operational life to its design, its operations, and its operating conditions. The established correlations between engine operation, design, and maintenance costs highlight the value of a satisfactory measure of the relative damage due to different operating conditions (operational severity). The methodology developed in this research enables the exploration of the causal, physics-based relationships underlying the statistical correlations in the public domain and identifies areas for further investigation. This thesis describes a physics-based approach to exploring the interactions, for commercial aircraft, of engine design, operation, and through-life maintenance costs. Applying the "virtual workshop" workscoping concept to model engine maintenance throughout the operating life captures the maintenance requirements at each shop visit and the impact of a given shop visit on the timing and requirements of subsequent visits. Comparisons can thus be made between the cost implications of alternative operating regimes, flight profiles, and maintenance strategies, taking into account engine design, age, operation, and severity. The workscoping model operates within a physics-based methodology, developed collaboratively within the research group, which encompasses engine performance, lifing, and operational severity modelling. The tool-set of coupled models used in this research additionally includes the workscoping maintenance cost model and implements a simplified 3D turbine blade geometry, new lifing models, and an additional lifing mechanism, thermo-mechanical fatigue (TMF). The case studies presented model the effects of different outside air temperatures, reduced-thrust operations (derate), flight durations, and maintenance decisions.
The use of operational severity and exhaust gas temperature margin deterioration as physics-based cost drivers, while commonly accepted, limits the comparability of the results to other engine-aircraft pairs, as the definition of operational severity, its derivation, and its application vary widely. Moreover, the use of a single operational severity per mission, based on high-pressure turbine blade life, does not allow the maintenance requirements to vary with the prevalent lifing mechanism (cyclic vs. steady-state).
59

Thermomechanical fatigue crack formation in a single crystal Ni-base superalloy

Amaro, Robert L. 11 February 2011 (has links)
This research establishes a physics-based life determination model for the second-generation single crystal superalloy PWA 1484 experiencing out-of-phase thermomechanical fatigue (TMF). The life model was developed from a combination of critical mechanical tests, characterization of the dominant damage mechanisms, and well-established literature. The resulting model improves life prediction over currently employed methods and allows extrapolation into as-yet-unused operating regimes. In particular, the proposed deformation model accounts for the material's coupled fatigue-environment-microstructure response to TMF loading. Because the proposed model is based upon the underlying deformation physics, it is robust enough to be easily modified for other single crystal superalloys with similar microstructures. Future use of this model for turbine life estimation would be based upon the actual deformation experienced by the turbine blade, enabling maintenance scheduling based upon a "retirement for a cause" life management scheme rather than the currently employed "safe-life" calculations. This advancement can greatly reduce maintenance costs for the turbine end-user, since turbine blades would be removed from service for practical and justifiable reasons. Additionally, this work enables a rethinking of the warranty period, thereby decreasing warranty-related replacements. Finally, this research provides a more thorough understanding of the deformation mechanisms present in loading situations that combine fatigue, environment, and microstructure effects.
60

Exploiting contacts for interactive control of animated human characters

Jain, Sumit 30 June 2011 (has links)
One of the common research goals in disciplines such as computer graphics and robotics is to understand the subtleties of human motion and develop tools for recreating natural and meaningful motion. Physical simulation of virtual human characters is a promising approach since it provides a testbed for developing and testing control strategies required to execute various human behaviors. Designing generic control algorithms for simulating a wide range of human activities, which can robustly adapt to varying physical environments, has remained a primary challenge. This dissertation introduces methods for generic and robust control of virtual characters in an interactive physical environment. Our approach is to use the information of the physical contacts between the character and her environment in the control design. We leverage high-level knowledge of the kinematics goals and the interaction with the surroundings to develop active control strategies that robustly adapt to variations in the physical scene. For synthesizing intentional motion requiring long-term planning, we exploit properties of the physical model for creating efficient and robust controllers in an interactive framework. The control design leverages the reference motion capture data and the contact information with the environment for interactive long-term planning. Finally, we propose a compact soft contact model for handling contacts for rigid body virtual characters. This model aims at improving the robustness of existing control methods without adding any complexity to the control design and opens up possibilities for new control algorithms to synthesize agile human motion.
