41

A unifying mathematical definition enables the theoretical study of the algorithmic class of particle methods.

Pahlke, Johannes 05 June 2023 (has links)
Mathematical definitions provide a precise, unambiguous way to formulate concepts. They also provide a common language between disciplines. Thus, they are the basis for a well-founded scientific discussion. In addition, mathematical definitions allow for deeper insights into the defined subject based on mathematical theorems that are incontrovertible under the given definition. Besides their value in mathematics, mathematical definitions are indispensable in other sciences like physics, chemistry, and computer science. In computer science, they help to derive the expected behavior of a computer program and provide guidance for the design and testing of software. Therefore, mathematical definitions can be used to design and implement advanced algorithms. One class of widely used algorithms in computer science is the class of particle-based algorithms, also known as particle methods. Particle methods can solve complex problems in various fields, such as fluid dynamics, plasma physics, or granular flows, using diverse simulation methods, including Discrete Element Methods (DEM), Molecular Dynamics (MD), Reproducing Kernel Particle Methods (RKPM), Particle Strength Exchange (PSE), and Smoothed Particle Hydrodynamics (SPH). Despite the increasing use of particle methods driven by improved computing performance, the relation between these algorithms remains formally unclear. In particular, particle methods lack a unifying mathematical definition and precisely defined terminology. This prevents the determination of whether an algorithm belongs to the class and what distinguishes the class. Here we present a rigorous mathematical definition for determining particle methods and demonstrate its importance by applying it to several canonical algorithms and those not previously recognized as particle methods. Furthermore, we base proofs of theorems about parallelizability and computational power on it and use it to develop scientific computing software. 
Our definition unifies, for the first time, the previously loosely connected notions of particle methods. It thus marks the necessary starting point for a broad range of joint formal investigations and applications across fields.

Table of contents:
1 Introduction
  1.1 The Role of Mathematical Definitions
  1.2 Particle Methods
  1.3 Scope and Contributions of this Thesis
2 Terminology and Notation
3 A Formal Definition of Particle Methods
  3.1 Introduction
  3.2 Definition of Particle Methods
    3.2.1 Particle Method Algorithm
    3.2.2 Particle Method Instance
    3.2.3 Particle State Transition Function
  3.3 Explanation of the Definition of Particle Methods
    3.3.1 Illustrative Example
    3.3.2 Explanation of the Particle Method Algorithm
    3.3.3 Explanation of the Particle Method Instance
    3.3.4 Explanation of the State Transition Function
  3.4 Conclusion
4 Algorithms as Particle Methods
  4.1 Introduction
  4.2 Perfectly Elastic Collision in Arbitrary Dimensions
  4.3 Particle Strength Exchange
  4.4 Smoothed Particle Hydrodynamics
  4.5 Lennard-Jones Molecular Dynamics
  4.6 Triangulation Refinement
  4.7 Conway's Game of Life
  4.8 Gaussian Elimination
  4.9 Conclusion
5 Parallelizability of Particle Methods
  5.1 Introduction
  5.2 Particle Methods on Shared Memory Systems
    5.2.1 Parallelization Scheme
    5.2.2 Lemmata
    5.2.3 Parallelizability
    5.2.4 Time Complexity
    5.2.5 Application
  5.3 Particle Methods on Distributed Memory Systems
    5.3.1 Parallelization Scheme
    5.3.2 Lemmata
    5.3.3 Parallelizability
    5.3.4 Bounds on Time Complexity and Parallel Scalability
  5.4 Conclusion
6 Turing Powerfulness and Halting Decidability
  6.1 Introduction
  6.2 Turing Machine
  6.3 Turing Powerfulness of Particle Methods Under a First Set of Constraints
  6.4 Turing Powerfulness of Particle Methods Under a Second Set of Constraints
  6.5 Halting Decidability of Particle Methods
  6.6 Conclusion
7 Particle Methods as a Basis for Scientific Software Engineering
  7.1 Introduction
  7.2 Design of the Prototype
  7.3 Applications, Comparisons, Convergence Study, and Run-time Evaluations
  7.4 Conclusion
8 Results, Discussion, Outlook, and Conclusion
  8.1 Problem
  8.2 Results
  8.3 Discussion
  8.4 Outlook
  8.5 Conclusion
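The components named in the abstract and table of contents (a particle method algorithm with pairwise interaction, per-particle evolution, and a stopping condition, applied to an instance) can be pictured with a deliberately simplified sketch. The loop structure, the toy interact/evolve functions, and all names below are our own illustrative assumptions, not the thesis's formal definition:

```python
def run_particle_method(g, particles, interact, evolve, stop):
    """Generic state-transition loop of a particle method (illustrative).

    g         -- global variable (here: an iteration counter)
    particles -- list of particle states
    interact  -- pairwise interaction (g, p_i, p_j) -> (p_i', p_j')
    evolve    -- per-particle evolution (g, p) -> p'
    stop      -- stopping condition on (g, particles)
    """
    while not stop(g, particles):
        # interaction step: every ordered pair exchanges information
        n = len(particles)
        for i in range(n):
            for j in range(n):
                if i != j:
                    particles[i], particles[j] = interact(g, particles[i], particles[j])
        # evolution step: each particle updates itself, then g advances
        particles = [evolve(g, p) for p in particles]
        g += 1
    return g, particles

# toy instance: two particles holding scalar "strengths" that relax
# toward their common mean (a PSE-like exchange), stopping after 50 steps
interact = lambda g, a, b: (a + 0.1 * (b - a), b + 0.1 * (a - b))
evolve = lambda g, p: p
stop = lambda g, ps: g >= 50
final_g, final_ps = run_particle_method(0, [0.0, 10.0], interact, evolve, stop)
```

The exchange conserves the particles' total strength while the individual values converge to the mean, which is the sort of invariant the formal definition makes provable.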
42

Computer Technology in the Design Process

Montague, Gregory 01 January 2010 (has links)
This is a study of computer technology's impact on the theatrical design process. The communication tools provided by technology were studied, and an analysis of Digital Rendering, Digital Rendering Videos, and 3D CADD was conducted in the classroom. Afterwards, these tools were applied to an actual production of West Side Story where, with the addition of 3D light-simulation software, they were used to communicate the design ideas from the lighting designer to the director. The goal of this process was to provide a 'real to life' virtual representation of the show to the director with the least amount of confusion. An additional goal was to test the limits and functions of the software, exploring all the benefits it could provide to the process of mounting a theatrical production.
43

Development of Scheduling, Path Planning and Resource Management Algorithms for Robotic Fully-automated and Multi-story Parking Structure

Debnath, Jayanta Kumar January 2016 (has links)
No description available.
44

Computational and accuracy benchmarking of simulation and system-theoretic models for production systems engineering

Ramos Calderón, Antonio José January 2021 (has links)
Modern industry has an increasing demand for simulation software that helps workers and decision-makers visualize the outputs of a specific process in a fast, accurate way. In this report, a comparative study between FACTS (Factory Analyses in ConcepTual phase using Simulation), Plant Simulation, and the PSE (Production System Engineering) Toolbox is carried out regarding their capacity to simulate models of increasing complexity, the accuracy of their outputs under different optimized buffer allocations, and how well they detect the bottlenecks of a process. Benchmarking simulation software requires an experimental approach, and external programs such as MATLAB, C, Excel, and R are used for gathering and organizing all the generated data. A high level of automation is required, as the manual input of data would otherwise take too long to be effective. The results show strong agreement between FACTS and Plant Simulation, the most widely used commercial DES (Discrete Event Simulation) software, and a more mathematical-theoretical approach coming from the PSE Toolbox. The optimization done in the report links to sustainability, with an enhanced throughput (TH) improving the ecological, social, and economic aspects, and to Lean philosophy through lean buffers that smooth and improve the production flow.
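The kind of production-line model such tools benchmark can be caricatured in a few lines: a two-machine line with one intermediate buffer, where each machine is up in a given cycle with a fixed probability. This slot-based toy model, its parameter values, and the function name are illustrative assumptions, not taken from the report:

```python
import random

def simulate_line(p1, p2, buffer_cap, cycles, seed=0):
    """Toy slot-based simulation of a two-machine line with one buffer.

    Each cycle, machine 1 is up with probability p1 and machine 2 with
    probability p2. Machine 1 is blocked when the buffer is full;
    machine 2 is starved when it is empty.
    Returns throughput in finished parts per cycle.
    """
    rng = random.Random(seed)
    buf = 0
    done = 0
    for _ in range(cycles):
        m1_up = rng.random() < p1
        m2_up = rng.random() < p2
        # machine 2 pulls from the buffer first, then machine 1 refills it
        if m2_up and buf > 0:
            buf -= 1
            done += 1
        if m1_up and buf < buffer_cap:
            buf += 1
    return done / cycles

# a larger buffer decouples the machines, so throughput should not drop
tp_small = simulate_line(0.9, 0.9, 1, 100_000)
tp_large = simulate_line(0.9, 0.9, 5, 100_000)
```

Comparing `tp_small` and `tp_large` reproduces, in miniature, the buffer-allocation effect the benchmarked tools quantify: with the same machine reliabilities, a larger buffer reduces blocking and starvation and raises throughput.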
45

Simulátor GateCycle a jeho aplikace / GateCycle simulator and its applications

Svoboda, Adam January 2018 (has links)
The main focus of this thesis is an introduction to the GateCycle software from General Electric and its subsequent application to the simulation of an industrial steam boiler producing 120 t of steam per hour. The thesis is designed as teaching material that can be used in a process-engineering course focused on simulation software. The first part gives a short theoretical overview of simulation software and introduces several of the best-known process-engineering simulation programs. The second part is written as a GateCycle manual: it briefly presents the software's working environment and interface and demonstrates how to build and run a simulation model, enter input data, and create reports. The third part is practical: a selected industrial steam boiler is presented and its simulation model is built. This part takes the form of a step-by-step guide explaining how the boiler model is constructed and what data were entered into the individual process units. The boiler is simulated in three modes, firing natural gas, heavy fuel oil, or a tar-based fuel mixture, under slightly different operating conditions. Finally, the calculated fuel consumptions are compared with real operating data, and the accuracy of the GateCycle calculations in these particular cases is evaluated.
46

Vzdušný hokej - strojové vidění a herní strategie / Air Hockey - Machine Vision and Game Strategy

Sláma, Ondřej January 2020 (has links)
The aim of this diploma thesis is to implement a robotic air hockey table in cooperation with the work of Dominik Jašek [37]. Specifically, the thesis deals with the implementation of simulation software that models the behavior of the robotic air hockey table, the design of a game strategy for the robot, and the detection of the position of the puck on the playing field. In addition, the work was extended by integrating these software modules into one control application and creating a user interface for controlling and managing all functionalities of the robotic table through a capacitive touch screen.
47

Modelem řízený návrh konferenčního systému / Model Based Design of the Conference System

Caha, Matěj January 2013 (has links)
This thesis deals with model-based design and the application of simulation in system design. The introduction discusses the history of the software development process and outlines its current status. The aim is to demonstrate model-driven design on a case study of a conference system. The DEVS and OOPN formalisms are presented together with the experimental tools PNtalk and SmallDEVS that allow working with them. The resulting model of the conference system is deployed as part of a web application using the Seaside framework in the Squeak environment.
48

Computergestützte Simulation und Analyse zufälliger dichter Kugelpackungen

Elsner, Antje 19 November 2009 (has links)
This interdisciplinary work first gives an overview of sphere-based models and the algorithmic approaches to generating random sphere packings. One algorithm from the group of collective-rearrangement algorithms, the force-biased algorithm, is explained and investigated in detail. The displacement functions, which are considered essential to the force-biased algorithm, are examined with respect to their influence on the achievable volume fraction of the packings. Not only functions known from the literature but also newly developed ones are considered, and recommendations for choosing suitable displacement functions are given. Some sphere packings generated with the force-biased algorithm, for example highly dense monodisperse packings, suggest that structural rearrangement processes in particular can be studied very well on such packings. For this reason, the model of hard spheres densely packed with the force-biased algorithm is of great importance in materials science, especially in structure research. A further chapter explains important characteristics of sphere-based models, such as specific surface area, volume fraction, and the contact distribution functions. For several characteristics of particular practical relevance (e.g. the specific surface area), approximation formulas are developed, tested on model systems, and compared with known approximations from the literature. To generate and analyze the sphere packings, the simulation software "SpherePack" was developed as part of this work; its design is considered from a software-engineering perspective. The requirements for this simulation system and its architecture are described, including an explanation of the individual computation modules.
Selected practical examples from materials science demonstrate the variety of possible uses of a simulation system for generating and analyzing random densely packed sphere systems. Above all, the strong explanatory power of these investigations with respect to material properties underlines the importance of the model of randomly densely packed hard spheres in materials research and related fields.
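The displacement idea at the heart of the force-biased algorithm can be sketched under strong simplifying assumptions (2D, equal radii, a linear displacement function, no radius rescaling). The function name and parameters below are our own illustration, not the SpherePack implementation:

```python
import math

def force_biased_step(centers, radius, strength=0.5):
    """One relaxation sweep of a simplified 2D force-biased step.

    For every overlapping pair, both spheres are displaced along the line
    of centers proportionally to the overlap depth -- a linear displacement
    function; the work compares several such functions.
    """
    moves = [[0.0, 0.0] for _ in centers]
    n = len(centers)
    for i in range(n):
        for j in range(i + 1, n):
            dx = centers[j][0] - centers[i][0]
            dy = centers[j][1] - centers[i][1]
            dist = math.hypot(dx, dy)
            overlap = 2.0 * radius - dist
            if dist > 0.0 and overlap > 0.0:
                push = strength * overlap / dist  # displacement per unit distance
                moves[i][0] -= push * dx
                moves[i][1] -= push * dy
                moves[j][0] += push * dx
                moves[j][1] += push * dy
    # apply all accumulated displacements at once (collective rearrangement)
    return [(x + mx, y + my) for (x, y), (mx, my) in zip(centers, moves)]

# two unit circles whose centers are only 1 apart: one symmetric sweep
# pushes them apart until the overlap is removed
separated = force_biased_step([(0.0, 0.0), (1.0, 0.0)], 1.0)
```

Accumulating all pair displacements before moving any sphere is what makes the scheme a collective-rearrangement algorithm rather than a sequential one.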
49

Développement d'un télescope Compton avec un calorimètre imageur 3D pour l'astronomie gamma / Development of a Compton Telescope with 3D Imaging Calorimeter for Gamma-Ray Astronomy

Gostojić, Aleksandar 21 April 2016 (has links)
La thèse porte sur le développement d’un petit prototype de télescope Compton pour l'astronomie gamma spatiale dans la gamme d’énergie du MeV (0.1-100 MeV). Nous avons étudié de nouveaux modules de détection destinés à l'imagerie Compton. Nous avons assemblé et testé deux détecteurs à scintillation, l'un avec un cristal de bromure de lanthane dopé au cérium (LaBr₃:Ce) et l'autre avec un cristal de bromure de cérium (CeBr₃). Les deux cristaux sont couplés à des photomultiplicateurs multi-anodes 64 voies sensibles à la position. Notre objectif est d’optimiser la résolution en énergie en même temps que la résolution en position du premier impact d'un rayon gamma incident dans le détecteur. Les deux informations sont vitales pour la reconstruction d'une image avec le prototype de télescope à partir de l’effet Compton. Nous avons développé un banc de test pour étudier expérimentalement les deux modules, avec une électronique de lecture et un système d'acquisition de données dédiés. Nous avons entrepris un étalonnage précis du dispositif et effectué de nombreuses mesures avec différentes sources radioactives. En outre, nous avons réalisé une simulation numérique détaillée de l'expérience avec le logiciel GEANT4 et effectué une étude paramétrique extensive pour modéliser au mieux la propagation des photons ultraviolet de scintillation et les propriétés optiques des surfaces à l'intérieur du détecteur. Nous avons alors développé une méthode originale de reconstruction de la position d’impact en 3D, en utilisant un réseau de neurones artificiels entrainé avec des données simulées. Nous présentons dans ce travail tous les résultats expérimentaux obtenus avec les deux modules, les résultats de la simulation GEANT4, ainsi que l'algorithme basé sur le réseau de neurones. En plus, nous donnons les premiers résultats sur l'imagerie Compton obtenus avec le prototype de télescope et les comparons avec des performances simulées. 
Enfin, nous concluons en donnant un aperçu des perspectives d'avenir pour l'imagerie gamma Compton et considérons une application possible en discutant d’un concept de télescope spatial semblable à notre prototype. / The thesis aims to develop a small prototype of a Compton telescope for future space instrumentation for gamma-ray astronomy. The telescope's main target is the MeV range (0.1-100 MeV). We studied novel detector modules intended for Compton imaging. We assembled and tested two modules, one with a cerium-doped lanthanum(III) bromide (LaBr₃:Ce) crystal and the other with cerium(III) bromide (CeBr₃). Both modules are coupled to and read out by 64-channel multi-anode PMTs. Our goal is to obtain the best possible energy resolution and 3D position resolution on the first impact of an incident gamma ray within the detector. Both pieces of information are vital for successful reconstruction of a Compton image with the telescope prototype. We developed a test bench to experimentally study both modules, using customized readout electronics and a data acquisition system. We conducted a precise calibration of the system and performed experimental runs with different radioactive sources. Furthermore, we wrote a detailed GEANT4 simulation of the experiment and performed an extensive parametric study on defining the surfaces and types of scintillation propagation within the scintillator. We used the simulated data to train an Artificial Neural Network (ANN) algorithm to create a simplified 3D impact-position reconstruction method and, in addition, developed an approximation routine to estimate the standard deviations of the method. We show all experimental results obtained with both modules, results from the GEANT4 simulation runs, and results from the ANN algorithm. In addition, we give the first results on Compton imaging with the telescope prototype and compare them with the simulated performance.
We analyzed and discussed the performance of the modules, specifically their spectral and position reconstruction capabilities. We conclude by giving an overview of future prospects for gamma-ray imaging and consider possible applications, showing a concept of a space telescope based on our prototype.
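Compton imaging of the kind described above rests on the Compton relation between the two measured energy deposits and the scattering angle, cos θ = 1 − m_e c² (1/E′ − 1/E). As an illustration only (not the authors' code), assuming ideal energy measurements and ignoring Doppler broadening and detector resolution:

```python
import math

M_E_C2_MEV = 0.511  # electron rest energy in MeV

def compton_angle_deg(e_scatter, e_absorb):
    """Scattering angle reconstructed from the two energy deposits (MeV).

    e_scatter -- energy deposited in the first (scattering) detector
    e_absorb  -- scattered-photon energy absorbed in the calorimeter
    Applies cos(theta) = 1 - m_e*c^2 * (1/E' - 1/E), with the incident
    energy E = e_scatter + e_absorb and scattered energy E' = e_absorb.
    """
    e_total = e_scatter + e_absorb
    cos_theta = 1.0 - M_E_C2_MEV * (1.0 / e_absorb - 1.0 / e_total)
    if not -1.0 <= cos_theta <= 1.0:
        raise ValueError("energy deposits are not kinematically consistent")
    return math.degrees(math.acos(cos_theta))
```

The reconstructed angle defines a cone of possible source directions per event; overlapping many such cones is what produces the Compton image, which is why the energy and 3D impact-position resolutions discussed above are both critical.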
50

Virtual Matching : Från fysisk till virtuell sampassning / Virtual Matching : From physical to virtual matching

Skog, Johan, Holm Bergstedt, Charlie January 2018 (has links)
I takt mot en digitaliserad värld går de allra flesta arbetsmetoderna mot datorbaserade verktyg. I dagsläget tillämpar Scania arbetsmetoden fysisk sampassning som utförs för sampassning och utvärdering av artiklars avvikelser. Detta görs för att utvärdera det estetiska utseendet på lastbilens chassi med målet att uppnå högsta kvalitet. I detta arbete använder Scania RD&T (Robust Design & Tolerancing), ett simuleringsprogram utvecklat av företaget RD&T Technology i Göteborg som används till att förutse avvikelser i sammanställningar av artiklar. RD&T erbjuder modulen Virtual Matching som beräknar variationerna på och mellan artiklar. Syftet med detta examensarbete är att undersöka om modulen Virtual Matching kan komplettera eller ersätta den fysiska sampassningsprocessen. En Virtual Matching-analys har utförts mellan två artiklar med mätdata på artiklarna för utvärdering av arbetsprocessen med modulen Virtual Matching. Denna arbetsprocess har jämförts med den fysiska sampassningsprocessen på Scania. Som grund till underlaget har intervjuer och deltagande i övningar genomförts. Modulen Virtual Matching kan bidras som ett komplement i ett tidigt stadie av fysisk sampassning som resulterar i resursbesparingar. En utvärdering av en implementationsprocess har beskrivits tillsammans med förslag till vidare studier på liknande arbetsprocesser. / Towards a digitized world, the vast majority of working processes strive for computer-based tools. Scania applies the working process of physical matching, which is performed to match articles and evaluate their deviations. The purpose of the evaluation is to achieve the highest visual quality of the truck's chassis. In this process, Scania uses RD&T (Robust Design & Tolerancing), a simulation program used to predict variations in compilations of articles. RD&T offers the Virtual Matching module, which analyzes the variations between articles using real measurement data.
The purpose of this thesis is to investigate whether the Virtual Matching module can supplement or replace the physical matching process. A Virtual Matching analysis has been performed between two articles with measurement data on both articles. The working process with the Virtual Matching module has been evaluated and compared with the physical matching process at Scania. In conclusion, the Virtual Matching module can serve as a complement in an early stage of physical matching processes, resulting in resource savings. An evaluation of an implementation process has been described together with proposals for further studies on similar work processes.
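The variation stack-up that a virtual-matching analysis computes can be caricatured with a small Monte Carlo sketch. The normal-deviation model, parameter values, and function name below are illustrative assumptions, not RD&T's actual method, which works with measured article geometry:

```python
import random
import statistics

def gap_variation(nominal_gap, sigma_a, sigma_b, n=100_000, seed=1):
    """Monte Carlo estimate of the gap distribution between two mated articles.

    Each article's mating surface is assumed to deviate normally from its
    nominal position with the given standard deviation (an illustrative
    stand-in for real measurement data). Returns the mean gap and its
    standard deviation.
    """
    rng = random.Random(seed)
    gaps = [nominal_gap + rng.gauss(0.0, sigma_a) - rng.gauss(0.0, sigma_b)
            for _ in range(n)]
    return statistics.mean(gaps), statistics.stdev(gaps)

# with 0.1 mm deviation on each article, the gap spread stacks up to
# sqrt(0.1**2 + 0.1**2) ~ 0.141 mm around the 2.0 mm nominal
mean_gap, sd_gap = gap_variation(2.0, 0.1, 0.1)
```

Predicting such spreads digitally, before any parts are mated, is what allows a virtual matching step to precede or replace the physical one.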
