21

Floquet engineering in periodically driven closed quantum systems: from dynamical localisation to ultracold topological matter

Bukov, Marin Georgiev 12 February 2022 (has links)
This dissertation presents a self-contained study of periodically driven quantum systems. Following a brief introduction to Floquet theory, we introduce the inverse-frequency expansion, variants of which include the Floquet-Magnus, van Vleck, and Brillouin-Wigner expansions. We reveal that the convergence properties of these expansions depend strongly on the rotating frame chosen, and relate the former to the existence of Floquet resonances in the quasienergy spectrum. The theoretical design and experimental realisation ('engineering') of novel Floquet Hamiltonians is discussed, introducing three universal high-frequency limits for systems comprising single-particle and many-body, linear and nonlinear models. The celebrated Schrieffer-Wolff transformation for strongly correlated quantum systems is generalised to periodically driven systems, and a systematic approach to calculating higher-order corrections to the Rotating Wave Approximation is presented. Next, we develop Floquet adiabatic perturbation theory from first principles, and discuss extensively adiabatic state preparation and the corresponding leading-order non-adiabatic corrections. Special emphasis is thereby put on geometrical and topological objects, such as the Floquet Berry curvature and the Floquet Chern number, obtained within linear response in the presence of the drive. Finally, pre-thermalisation and thermalisation in closed, clean, periodically driven quantum systems are studied in detail, with the focus put on the crucial role of Floquet many-body resonances for energy absorption.
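The inverse-frequency expansion at the heart of this programme can be summarised schematically (a standard textbook sketch with ħ = 1, not the dissertation's specific notation):

```latex
% Floquet theorem: for a T-periodic Hamiltonian H(t+T) = H(t), the
% evolution over one period defines a time-independent Floquet
% Hamiltonian H_F:
U(T) \;=\; \mathcal{T}\exp\!\Big(-\mathrm{i}\!\int_0^T H(t)\,\mathrm{d}t\Big)
       \;\equiv\; \mathrm{e}^{-\mathrm{i}H_F T}.
% Leading terms of the Floquet-Magnus (inverse-frequency) expansion,
% with drive frequency \omega = 2\pi/T:
H_F^{(0)} \;=\; \frac{1}{T}\int_0^T H(t)\,\mathrm{d}t,
\qquad
H_F^{(1)} \;=\; \frac{1}{2\mathrm{i}\,T}\int_0^T\!\mathrm{d}t_1
                \int_0^{t_1}\!\mathrm{d}t_2\;[H(t_1),\,H(t_2)].
```

Higher orders carry nested commutators suppressed by powers of 1/ω, which is why, as the abstract notes, convergence hinges on the chosen rotating frame and on the presence of Floquet resonances.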
22

Parametric Bose-Hubbard Hamiltonians: Quantum Dissipation, Irreversibility, and Pumping / Parametrische Bose-Hubbard Hamiltonians: Dissipation, Irreversibilität und Quantenpumpen

Hiller, Moritz 19 December 2007 (has links)
No description available.
23

Towards Autonomic and Cognitive IoT Systems, Application to Patients’ Treatments Management / Vers les systèmes IoT autonomiques et cognitifs, application pour la gestion des traitements des patients

Mezghani, Emna 15 December 2016 (has links)
In this thesis, we propose a collaborative model-driven methodology for designing autonomic cognitive IoT systems to deal with IoT design complexity. Within this methodology, we defined a set of autonomic cognitive design patterns that aim at (1) delineating the dynamic coordination of the autonomic processes to handle the system's context changeability and requirements evolution at run-time, and (2) adding cognitive abilities to IoT systems so they can understand big data and generate new insights.
To address challenges related to big data and scalability, we propose a generic semantic big data platform that integrates heterogeneous, distributed data sources deployed on the cloud and generates knowledge that is exposed as a service (Knowledge as a Service, KaaS). As an application of the proposed contributions, we instantiated and combined a set of patterns to develop a prescriptive cognitive system for patient treatment management. We elaborated two ontological models describing the wearable devices and the patient context, as well as the medical knowledge for decision-making. The proposed system is evaluated from the clinical perspective through collaboration with medical experts, and from the performance perspective by deploying the system within the KaaS under different configurations.
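The pattern of generating knowledge from integrated data and exposing it as a service can be illustrated with a minimal, purely illustrative sketch (plain Python triples in place of the thesis's actual ontologies and cloud platform; all names and thresholds are hypothetical):

```python
# Minimal triple store: facts from heterogeneous sources are stored as
# (subject, predicate, object) triples, and a rule derives new knowledge
# that a service layer could then expose (the "Knowledge as a Service" idea).
facts = {
    ("patient1", "hasHeartRate", 112),
    ("patient1", "hasActivity", "resting"),
    ("patient2", "hasHeartRate", 74),
}

def derive_alerts(triples):
    """Derive new triples: a resting patient with heart rate > 100 is tagged."""
    derived = set()
    for (s, p, o) in triples:
        if p == "hasHeartRate" and isinstance(o, int) and o > 100:
            if (s, "hasActivity", "resting") in triples:
                derived.add((s, "hasAlert", "tachycardia-at-rest"))
    return derived

# The knowledge base is the raw facts plus everything derived from them.
knowledge = facts | derive_alerts(facts)
print(("patient1", "hasAlert", "tachycardia-at-rest") in knowledge)  # True
```

In the thesis's setting the rule base would come from the medical-knowledge ontology and the derived triples would be served to clients over the KaaS interface.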
24

An Integrative Framework for Model-Driven Systems Engineering : Towards the Co-Evolution of Simulation, Formal Analysis and Enactment Methodologies for Discrete Event Systems / Un cadre intégratif pour l'ingénierie dirigée par les modèles des systèmes complexes : vers une fusion méthodologique de la simulation à évènements discrets avec l'analyse formelle et le prototypage rapide

Aliyu, Hamzat Olanrewaju 15 December 2016 (has links)
Model-based systems engineering methodologies such as Simulation, Formal Methods (FM), and Enactment have been used extensively in recent decades to study, analyze, and forecast the properties and behaviors of complex systems. The results of these analyses often reveal subtle knowledge that can deepen understanding of an existing system or provide timely feedback to a design process, averting costly (and catastrophic) errors that might otherwise arise in the system.
Questions about different aspects of a system are usually best answered using specific analysis methodologies; for instance, a system's performance and behavior in specified experimental frames can be efficiently studied using appropriate simulation methodologies. Similarly, verification of properties such as liveness, safety, and fairness is better handled with appropriate formal methods, while enactment methodologies may be used to verify assumptions about time-based and human-in-the-loop activities and behaviors. Therefore, an exhaustive study of a complex (or even seemingly simple) system often requires different analysis methodologies to produce complementary answers to the likely questions. A combination of multiple analysis methodologies undoubtedly offers more powerful capabilities and greater rigor for testing system designs than any one methodology applied alone. While this exercise can provide (near) complete knowledge of complex systems and helps analysts make reliable assumptions and forecasts about their properties, its practical adoption has not kept pace with the theoretical advancements, evolving formalisms, and algorithms resulting from decades of research by practitioners of the different methodologies. This shortfall has been linked to the mathematical skills required to work with most formalisms, compounded by the poor portability of models between tools of different methodologies, which imposes the herculean task of creating and managing several models of the same system in different formalisms. Another contributing factor is that most existing computational analysis environments are dedicated to specific analysis methodologies (i.e., simulation or FM or enactment) and are usually difficult to extend to accommodate other approaches.
Thus, one must learn all the formalisms underlying the various methods to create the models, and then update each of them separately whenever parts of the system change. The contribution of this thesis to alleviating the burden on users of multiple analysis methodologies for exhaustive study of complex systems is a framework that uses Model-Driven Engineering (MDE) technologies to federate simulation, FM, and enactment analysis methodologies behind a unified high-level specification language, with support for automated synthesis of the artifacts required by the disparate methodologies. This framework comprises four contributions: i) a methodology that emulates the Model-Driven Architecture (MDA) to propose an independent formalism integrating the different analysis methodologies; ii) integration of concepts from the three methodologies into a common metamodel uniting selected formalisms for simulation, FM, and enactment; iii) mapping rules for automated synthesis of simulation, FM, and enactment artifacts from a common reference model of a system and its requirements; iv) a framework for the enactment of discrete event systems. We use a beverage vending system as a running example throughout the thesis. (...)
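The "one reference model, many synthesised artifacts" idea can be sketched with the thesis's running example of a beverage vending system (a toy Python illustration; the generator names and output notation are hypothetical, not the framework's actual ones):

```python
from dataclasses import dataclass

# A toy common reference model of a discrete event system:
# states and labelled transitions, captured once.
@dataclass(frozen=True)
class Transition:
    src: str
    event: str
    dst: str

# Beverage vending machine: insert a coin, get served, return to idle.
MODEL = [
    Transition("Idle", "coin", "Paid"),
    Transition("Paid", "serve", "Idle"),
]

def to_simulation_stub(model):
    """Synthesise a transition table a simulator could interpret."""
    return {(t.src, t.event): t.dst for t in model}

def to_formal_spec(model):
    """Synthesise a textual spec in a process-algebra-like flavour
    (illustrative notation only)."""
    return "\n".join(f"{t.src} -[{t.event}]-> {t.dst};" for t in model)

table = to_simulation_stub(MODEL)
print(table[("Idle", "coin")])                 # Paid
print(to_formal_spec(MODEL).splitlines()[0])   # Idle -[coin]-> Paid;
```

The point mirrored here is that both artifacts are derived from the single `MODEL`, so a change to the system is made in one place and re-synthesised, rather than edited in several formalism-specific models.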
25

Experimentální studium pole neutronů v podkritickém urychlovačem řízeném jaderném reaktoru / Experimental Investigation of the Neutron Field in an Accelerator Driven Subcritical Reactor

Zeman, Miroslav January 2020 (has links)
This dissertation focuses on irradiations of a spallation set-up, consisting of more than half a ton of natural uranium, executed with a 660 MeV proton beam at the Joint Institute for Nuclear Research in Dubna. Two types of irradiation were arranged: with and without lead shielding. In both, threshold activation detectors (Al-27, Mn-55, Co-59, and In-nat) were placed throughout the set-up, in both horizontal and vertical positions, and were activated by secondary neutrons produced in spallation reactions. The threshold activation detectors were analysed by gamma-ray spectroscopy. The radionuclides found in the detectors were identified, and a reaction rate was determined for each radionuclide. Ratios of the reaction rates were determined between the irradiations with and without lead shielding. Subsequently, the neutron spectra generated inside the spallation target at different positions were deduced using the Co-59 detectors. The experimental results were compared with Monte Carlo simulations performed with MCNPX 2.7.0.
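The reaction rates mentioned above follow from the standard activation equation; a minimal Python sketch with illustrative numbers (none taken from this work):

```python
import math

def reaction_rate(peak_counts, eff, intensity, n_nuclei,
                  half_life, t_irr, t_cool, t_meas):
    """Reaction rate per target nucleus per second, from a gamma-ray peak.

    Standard activation/decay correction (all times in seconds):
      S = R * N * (1 - e^{-lam*t_irr}) * e^{-lam*t_cool}
            * (1 - e^{-lam*t_meas}) * eff * I_gamma / lam
    solved for R, where S is the net peak area, eff the detection
    efficiency, I_gamma the gamma emission probability, N the number
    of detector nuclei.
    """
    lam = math.log(2) / half_life
    corr = ((1 - math.exp(-lam * t_irr))
            * math.exp(-lam * t_cool)
            * (1 - math.exp(-lam * t_meas)) / lam)
    return peak_counts / (n_nuclei * eff * intensity * corr)

# Illustrative: a Na-24-like product (T1/2 ~ 15 h) in an Al-27 detector.
R = reaction_rate(peak_counts=5.0e4, eff=0.02, intensity=1.0,
                  n_nuclei=2.2e22, half_life=15 * 3600,
                  t_irr=8 * 3600, t_cool=3600, t_meas=1800)
print(f"{R:.3e}")  # reactions per nucleus per second
```

Ratios of such rates between the shielded and unshielded configurations cancel most efficiency and geometry factors, which is why the abstract reports them directly.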
26

Monitoring a simulace chování experimentálních terčů pro ADS, vývinu tepla a úniku neutronů / Monitoring and Simulation of ADS Experimental Target Behaviour, Heat Generation, and Neutron Leakage

Svoboda, Josef January 2021 (has links)
Accelerator-driven subcritical systems (ADS), capable of transmuting long-lived radionuclides, could solve the problem of spent nuclear fuel from current nuclear reactors, as well as a potential shortage of today's fuel, U-235, since they can extract energy from U-238 or from the abundant thorium isotope Th-232. Within the framework of basic ADS research, this dissertation deals with spallation reactions and the heat generation of various experimental targets. The experimental measurements were carried out at the Joint Institute for Nuclear Research in Dubna, Russian Federation; thirteen experiments were performed during the doctoral study over the years 2015-2019. Various targets were irradiated with 660 MeV protons at the Phasotron accelerator: first the QUINTA spallation target, composed of 512 kg of natural uranium, then experimental targets made of lead and carbon, and a target assembled from lead bricks. A dedicated experiment was also performed to investigate in detail two proton-irradiated uranium cylinders of the kind from which the QUINTA target is assembled. The research focused primarily on monitoring the heat released by slowing protons and by spallation and fission reactions induced by spallation neutrons; pions and photons also contributed to the heat release. Temperature was measured with precise, specially calibrated thermocouples, and temperature differences were monitored both on the surface and inside the targets. Further research focused on monitoring neutrons escaping from the target by a comparative method using two detectors: the first contained a small amount of fissile material with a temperature sensor, while the second was made of a non-fissile material (W or Ta) with similar material properties and the same dimensions. Neutron leakage (i.e., the neutron flux outside the experimental target) was detected via the energy released by fission reactions. This thesis covers the precise measurement of temperature changes with thermocouples, using National Instruments electronics and LabVIEW software for data acquisition. Python 3.7 (with several libraries) was used for data processing, analysis, and visualization. Particle transport was simulated with MCNPX 2.7.0, and heat-transfer simulations determining the surface temperature of the modelled set-up were performed in ANSYS Fluent (ANSYS Transient Thermal for simpler calculations).
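The kind of heat-release estimate described above can be sketched as a linear fit to a beam-on temperature ramp (a pure-Python illustration with synthetic data and illustrative material values; the actual analysis used calibrated thermocouples and LabVIEW acquisition):

```python
# Estimate deposited power from the initial temperature ramp of a target:
# P ~ m * c_p * dT/dt, assuming heat losses are still negligible early on.
def linear_slope(times, temps):
    """Least-squares slope dT/dt of a temperature trace."""
    n = len(times)
    mt, mT = sum(times) / n, sum(temps) / n
    num = sum((t - mt) * (T - mT) for t, T in zip(times, temps))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# Synthetic trace: a 0.05 K/s ramp on a lead target segment, sampled
# every 10 s for 5 minutes.
times = [i * 10.0 for i in range(30)]          # s
temps = [295.0 + 0.05 * t for t in times]      # K
slope = linear_slope(times, temps)             # K/s

m, c_p = 5.0, 128.0   # kg, J/(kg K) for lead -- illustrative values
power = m * c_p * slope
print(round(power, 1))  # 32.0  (watts)
```

Comparing such power estimates between the fissile and non-fissile detector pair is what turns the temperature measurement into a neutron-leakage measurement.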
27

Studies of Accelerator-Driven Systems for Transmutation of Nuclear Waste / Studier av acceleratordrivna system för transmutation av kärnavfall

Dahlfors, Marcus January 2006 (has links)
Accelerator-driven systems for transmutation of nuclear waste have been suggested as a means for dealing with spent fuel components that pose a potential radiological hazard for long periods of time. While not entirely removing the need for underground waste repositories, this nuclear waste incineration technology provides a viable method for reducing both waste volumes and storage times. Potentially, the time spans could be diminished from hundreds of thousands of years to merely 1,000 years or even less. A central aspect of accelerator-driven systems design is the prediction of safety parameters and fuel economy. The simulations performed rely heavily on nuclear data, and especially on the precision of the neutron cross section representations of essential nuclides over a wide energy range, from the thermal to the fast regime. In combination with a more demanding neutron flux distribution than in ordinary light-water reactors, the expanded nuclear data energy regime makes exploration of the cross section sensitivity a necessity for simulations of accelerator-driven systems. This fact was observed throughout the work, and a significant portion of the study is devoted to investigations of nuclear data related effects. The computer code package EA-MC, based on 3-D Monte Carlo techniques, is the main computational tool employed for the analyses presented. Directly related to the development of the code is the extensive IAEA ADS Benchmark 3.2, and an account of the benchmark exercises as implemented with EA-MC is given. CERN's Energy Amplifier prototype is studied from the perspectives of neutron source types, nuclear data sensitivity, and transmutation. The commissioning of the n_TOF experiment, a neutron cross section measurement project at CERN, is also described.
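The role of subcriticality in such source-driven systems can be illustrated with the textbook source-multiplication relation (a generic sketch, not a result or parameter set from this thesis; the gain constant is purely illustrative):

```python
# In a source-driven subcritical core with multiplication factor k < 1,
# each source neutron spawns a chain of, on average, M = 1 / (1 - k)
# neutrons; the Energy Amplifier's energy gain is commonly written
# G = G0 * k / (1 - k), so gain diverges as k -> 1 while the system
# remains safely shut off without the accelerator.
def source_multiplication(k):
    assert 0.0 < k < 1.0, "the system must stay subcritical"
    return 1.0 / (1.0 - k)

def energy_gain(k, g0=2.4):
    # g0 is an illustrative proportionality constant, not a measured value.
    return g0 * k / (1.0 - k)

print(round(source_multiplication(0.98)))   # 50
print(round(energy_gain(0.98), 1))          # 117.6
```

The strong sensitivity of both quantities to k is one reason the cross-section uncertainties discussed in the abstract matter so much for ADS design.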
29

Glättungsmechanismen beim Ionenbeschuss rauer amorpher Oberflächen / Smoothing mechanisms due to ion bombardment of rough amorphous surfaces

Vauth, Sebastian 11 October 2007 (has links)
No description available.
