  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Zkoumání závislosti výpočtů v konečném řádu poruchové QCD na faktorizačním schématu / Investigation of the factorization scheme dependence of finite order perturbative QCD calculations

Kolář, Karel January 2012 (has links)
Title: Investigation of the factorization scheme dependence of finite order perturbative QCD calculations Author: Karel Kolář Institute: Institute of Particle and Nuclear Physics Supervisor of the doctoral thesis: prof. Jiří Chýla, CSc., Institute of Physics of the Academy of Sciences of the Czech Republic Abstract: The main aim of this thesis is the investigation of phenomenological implications of the freedom in the choice of the factorization scheme for the description of hard collisions, with the potential application for an improvement of current NLO Monte Carlo event generators. We analyze the freedom associated with the definition of parton distribution functions and we derive general formulae governing the dependence of parton distribution functions and hard scattering cross-sections on the unphysical quantities specifying the renormalization and factorization procedure. The issue of the specification of factorization schemes via the corresponding higher order splitting functions is discussed in detail. The main attention is paid to the so-called ZERO factorization scheme, which allows the construction of consistent NLO Monte Carlo event generators in which initial state parton showers can be taken formally at the LO. Unfortunately, it has turned out that the practical applicability of the ZERO...
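The "general formulae" the abstract refers to concern how parton distributions change with the factorization scale and scheme. Schematically, at leading order the non-singlet DGLAP evolution reads (standard notation, written here only to show where the splitting function — whose higher-order terms carry the scheme freedom discussed in the thesis — enters):

```latex
\frac{\mathrm{d}\, q_{\mathrm{NS}}(x, M)}{\mathrm{d} \ln M^2}
  = \frac{\alpha_s(M)}{2\pi} \int_x^1 \frac{\mathrm{d}z}{z}\,
    P_{qq}(z)\, q_{\mathrm{NS}}\!\left(\frac{x}{z}, M\right)
```

Fixing the higher-order terms of the splitting functions (as a scheme such as ZERO does) fixes the factorization scheme in which the parton distributions are defined.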
32

Les techniques Monte Carlo par chaînes de Markov appliquées à la détermination des distributions de partons / Markov chain Monte Carlo techniques applied to parton distribution functions determination : proof of concept

Gbedo, Yémalin Gabin 22 September 2017 (has links)
We have developed a new approach to determining parton distribution functions and quantifying their experimental uncertainties, based on Markov chain Monte Carlo methods. The main interest of such a study is that the standard χ² minimization with MINUIT can be replaced by procedures grounded in statistical methods, and in Bayesian inference in particular, thus offering additional insight into the determination of PDFs. After reviewing these Markov chain Monte Carlo techniques, we introduce the algorithm we have chosen to implement, namely Hybrid (or Hamiltonian) Monte Carlo.
This algorithm, initially developed for lattice quantum chromodynamics, turns out to be very interesting when applied to the determination of parton distribution functions by global analyses; we have shown that it makes it possible to circumvent the technical difficulties due to the high dimensionality of the problem, in particular those concerning the acceptance rate. The feasibility study performed and presented in this thesis indicates that the Markov chain Monte Carlo method can be applied successfully to the extraction of PDFs and of their experimental uncertainties.
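The acceptance-rate point can be illustrated with a minimal Hybrid (Hamiltonian) Monte Carlo sketch. The target here is a toy 10-dimensional Gaussian standing in for the χ²/2 of a real PDF fit; all names and parameter values are illustrative, not the thesis's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def neg_log_post(theta):
    # Stand-in for chi^2/2 of a PDF fit; here a standard Gaussian
    return 0.5 * np.sum(theta**2)

def grad_neg_log_post(theta):
    return theta

def hmc_step(theta, eps=0.1, n_leapfrog=20):
    p = rng.standard_normal(theta.shape)            # fresh momenta
    h0 = neg_log_post(theta) + 0.5 * p @ p          # initial Hamiltonian
    th, pp = theta.copy(), p.copy()
    pp -= 0.5 * eps * grad_neg_log_post(th)         # leapfrog integration
    for _ in range(n_leapfrog - 1):
        th += eps * pp
        pp -= eps * grad_neg_log_post(th)
    th += eps * pp
    pp -= 0.5 * eps * grad_neg_log_post(th)
    h1 = neg_log_post(th) + 0.5 * pp @ pp
    if rng.random() < np.exp(h0 - h1):              # Metropolis accept/reject
        return th, True
    return theta, False

theta = np.zeros(10)                                # 10-dim toy parameter vector
accepted, samples = 0, []
for _ in range(2000):
    theta, ok = hmc_step(theta)
    accepted += ok
    samples.append(theta.copy())
print("acceptance rate:", accepted / 2000)
```

Because the leapfrog integrator nearly conserves the Hamiltonian, the acceptance rate stays high even in many dimensions — the property that makes the method attractive for global PDF analyses.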
33

Influência da textura em medidas de tensão residual / Influence of texture on residual stress measurements

LIMA, NELSON B. de 09 October 2014 (has links)
Tese (Doutoramento) / Instituto de Pesquisas Energéticas e Nucleares - IPEN/CNEN-SP. No description available.
34

Uplatnění statistických metod pro zkoumání vlastností nejprodávanějších přípravků na ochranu rostlin a vztahů mezi nimi / Application of Statistical Methods to Investigate the Properties of Best-Selling Plant Protection Products and their Relationship

Haluzová, Dana January 2018 (has links)
This diploma thesis focuses on the statistical examination of the properties of plant protection products at Agro-Artikel, s.r.o. Using the empirical distribution function, it examines the sales prices and shelf lives of the products, tests hypotheses about the products' properties, and investigates the dependencies between them. The thesis also evaluates the results of a questionnaire survey and offers recommendations for the introduction of new products.
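The empirical distribution function mentioned above is simple to state: F(x) is the fraction of observations at or below x. A minimal sketch, with hypothetical sales prices (the data and units are invented for illustration, not taken from the thesis):

```python
import numpy as np

def ecdf(data):
    """Empirical distribution function: F(x) = fraction of observations <= x."""
    xs = np.sort(np.asarray(data, dtype=float))
    def F(x):
        # count of sorted values <= x, divided by the sample size
        return np.searchsorted(xs, x, side="right") / len(xs)
    return F

# Hypothetical sales prices (CZK) for a handful of products
prices = [129, 199, 249, 249, 349, 499, 799]
F = ecdf(prices)
print(F(249))   # fraction of products priced at or below 249 (4 of 7)
```

The same function can be evaluated for shelf lives, and two samples can be compared by testing the distance between their ECDFs.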
35

A constitutive material model for simulating texture evolution and anisotropy effects in cold spray.

Giles, Creston Michael 09 December 2022 (has links) (PDF)
Cold spray has seen rapid advancement since its inception and has shown significant potential as a method of additive manufacturing. However, the large plastic deformation and repeated heating/cooling cycles that the material undergoes during the cold spray process can result in gradients in material structure and large residual stresses. The purpose of this study is to extend the existing EMMI material model to include anisotropic material response through the use of orientation distribution functions to predict residual stresses and anisotropy resulting from cold spray and similar additive manufacturing processes. Through the use of a finite element simulation, yield surfaces for a two-step tension problem were generated and analyzed to capture the effects of the four coaxiality parameters that govern the model.
36

Direct calculation of parton distribution functions (PDFs) on the lattice

Manigrasso, Floriano 05 September 2022 (has links)
In this work, we address a number of crucial steps needed to evaluate the nucleon unpolarized, helicity, and transversity parton distribution functions within the framework of lattice QCD.
Discretization artifacts are investigated using an N_f=2+1+1 gauge ensemble of Wilson twisted mass fermions simulated at a pion mass of approximately M=370 MeV. The unpolarized and helicity parton distribution functions show a non-negligible dependence on the lattice spacing, with the continuum extrapolation producing better agreement with phenomenology. The direct computation of the Fourier transform from discrete lattice data may introduce artifacts; we therefore use a new data-driven method based on Gaussian process regression, the so-called Bayes-Gauss Fourier transform, to overcome the limitations of the discrete Fourier transform. We find that this data-driven approach can drastically reduce the artifacts introduced by the discretization of the Fourier transform; however, the final effect on the light-cone PDFs is small. Furthermore, we present results of the first ab initio calculation of the individual up, down, and strange unpolarized, helicity, and transversity parton distribution functions for the proton. The analysis is performed on an N_f=2+1+1 twisted mass clover-improved fermion ensemble simulated at a pion mass of 260 MeV. We employ the hierarchical probing algorithm to evaluate the disconnected quark loops, allowing us to obtain non-zero results for the disconnected isoscalar contribution and the strange quark matrix elements.
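The idea behind smoothing discrete lattice data with Gaussian process regression before Fourier transforming can be sketched in a few lines. This toy uses an invented smooth signal and an RBF kernel; it illustrates the general GP-then-transform idea only, not the actual Bayes-Gauss Fourier transform of the thesis:

```python
import numpy as np

# Coarse "lattice" samples of a smooth signal (a stand-in for lattice
# matrix-element data; the signal and all parameters are illustrative)
z = np.linspace(0.0, 6.0, 8)
y = np.exp(-0.5 * z**2)

def rbf(a, b, ell=1.0):
    # Squared-exponential (RBF) kernel between two sets of points
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# GP posterior mean on a dense grid (noise-free data, small jitter for stability)
zs = np.linspace(0.0, 6.0, 600)
K = rbf(z, z) + 1e-8 * np.eye(len(z))
mean = rbf(zs, z) @ np.linalg.solve(K, y)

# Cosine transform of the dense reconstruction; for this toy signal the
# exact value is sqrt(pi/2) * exp(-x**2 / 2)
x = 1.0
dz = zs[1] - zs[0]
dense_ft = np.sum(mean * np.cos(x * zs)) * dz
print(dense_ft)
```

Evaluating the transform on the dense GP reconstruction, rather than directly on the eight coarse points, is what suppresses the discretization artifacts the abstract describes.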
37

Offset Surface Light Fields

Ang, Jason January 2003 (has links)
For producing realistic images, reflection is an important visual effect. Reflections of the environment are important not only for highly reflective objects, such as mirrors, but also for more common objects such as brushed metals and glossy plastics. Generating these reflections accurately at real-time rates for interactive applications, however, is a difficult problem. Previous works in this area have made assumptions that sacrifice accuracy in order to preserve interactivity. I will present an algorithm that tries to handle reflection accurately in the general case for real-time rendering. The algorithm uses a database of prerendered environment maps to render both the original object itself and an additional bidirectional reflection distribution function (BRDF). The algorithm performs image-based rendering in reflection space in order to achieve accurate results. It also uses graphics processing unit (GPU) features to accelerate rendering.
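Rendering in reflection space starts from the mirror-reflection direction, which is then used to index the prerendered environment data. A minimal sketch with a hypothetical latitude-longitude environment-map lookup (this is generic reflection mapping, not the thesis's actual offset-surface algorithm):

```python
import numpy as np

def reflect(view, normal):
    """Mirror-reflection direction R = V - 2 (N·V) N (unit vectors assumed)."""
    return view - 2.0 * np.dot(normal, view) * normal

def env_lookup(envmap, direction):
    """Nearest-texel sample of a latitude-longitude environment map."""
    d = direction / np.linalg.norm(direction)
    h, w, _ = envmap.shape
    u = (np.arctan2(d[1], d[0]) / (2 * np.pi) + 0.5) * (w - 1)   # azimuth
    v = (np.arccos(np.clip(d[2], -1, 1)) / np.pi) * (h - 1)      # polar angle
    return envmap[int(round(v)), int(round(u))]

# Toy 4x8 RGB environment map and a glancing view of a z-up surface
env = np.random.default_rng(1).random((4, 8, 3))
view = np.array([0.0, 0.70710678, -0.70710678])    # incoming view direction
normal = np.array([0.0, 0.0, 1.0])
r = reflect(view, normal)
color = env_lookup(env, r)
```

A real renderer would additionally weight the lookup by a BRDF and perform filtered (not nearest-texel) sampling on the GPU.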
39

Simulation studies of molecular transport across the liquid-gas interface

Somasundaram, Theepaharan January 2000 (has links)
No description available.
40

Reliability prediction of complex repairable systems : an engineering approach

Sun, Yong January 2006 (has links)
This research has developed several models and methodologies with the aim of improving the accuracy and applicability of reliability predictions for complex repairable systems. A repairable system is usually defined as one that will be repaired to recover its functions after each failure. Physical assets such as machines, buildings and vehicles are often repairable. Optimal maintenance strategies require accurate prediction of the reliability of complex repairable systems. Numerous models and methods have been developed for predicting system reliability. After an extensive literature review, several limitations in the existing research and needs for future research were identified. These include the following: the need for an effective method to predict the reliability of an asset with multiple preventive maintenance intervals during its entire life span; the need to consider interactions among failures of components in a system; and the need for an effective method for predicting reliability with sparse or zero failure data. In this research, the Split System Approach (SSA), an Analytical Model for Interactive Failures (AMIF), the Extended SSA (ESSA) and the Proportional Covariate Model (PCM) were developed by the candidate to meet these needs in an effective manner. These new methodologies/models are expected to rectify the identified limitations of current models and significantly improve the accuracy of reliability prediction for repairable systems. The characteristics of the reliability of a system alter after regular preventive maintenance. This alteration makes prediction of the reliability of complex repairable systems difficult, especially when the prediction covers a number of imperfect preventive maintenance actions over multiple intervals during the asset's lifetime. SSA uses a new concept to address this issue effectively, virtually splitting a system into repaired and unrepaired parts.
SSA has been used to analyse system reliability at the component level and to address the different states of a repairable system after single or multiple preventive maintenance activities over multiple intervals. The results obtained from this investigation demonstrate that SSA provides excellent support for making optimal preventive maintenance decisions over an asset's whole life. It is noted that SSA, like most existing models, is based on the assumption that failures are independent of each other. This assumption is often unrealistic in industrial circumstances and may lead to unacceptable prediction errors. To ensure the accuracy of reliability prediction, interactive failures were considered. The concept of interactive failure presented in this thesis is a new variant of the definition of failure. The candidate has made several original contributions, such as introducing and defining related concepts and terminologies, developing a model to analyse interactive failures quantitatively, and revealing that an interactive failure relationship can be either stable or unstable. The research results effectively assist in avoiding unstable interactive relationships in machinery during its design phase. This research on interactive failures pioneers a new area of reliability prediction and enables failure probabilities to be estimated more precisely. ESSA was developed through an integration of SSA and AMIF. ESSA is the first effective method to address the reliability prediction of systems with interactive failures and with multiple preventive maintenance actions over multiple intervals. It enhances the capability of SSA and AMIF. PCM was developed to further enhance the capability of the above methodologies/models. It addresses the issue of reliability prediction using both failure data and condition data. The philosophy and procedure of PCM differ from those of existing models such as the Proportional Hazard Model (PHM).
PCM has been used successfully to investigate the hazard of gearboxes and truck engines. The candidate demonstrated that PCM has several unique features: 1) it automatically tracks the changing characteristics of the hazard of a system using symptom indicators; 2) it estimates the hazard of a system using symptom indicators without historical failure data; 3) it reduces the influence of fluctuations in condition monitoring data on hazard estimation. These newly developed methodologies/models have been verified using simulations, industrial case studies and laboratory experiments. The outcomes of this research are expected to enrich the body of knowledge in reliability prediction by effectively addressing some limitations of existing models and exploring the area of interactive failures.
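The second feature listed above — estimating hazard from symptom indicators — can be caricatured in a few lines. This is a schematic toy of the proportional-covariate idea under invented data and an assumed Weibull baseline; it is not the thesis's actual PCM formulation:

```python
import numpy as np

# Hypothetical condition-monitoring data: a symptom indicator (e.g. vibration
# RMS) sampled at inspection times; all numbers are invented for illustration
t = np.linspace(100.0, 1000.0, 10)        # operating hours
z = 0.2 + 0.001 * t                       # rising symptom indicator

# Assumed Weibull baseline hazard h0(t) = (b/a) * (t/a)**(b-1), as if fitted
# from sparse historical failure data
a, b = 1200.0, 2.5
h0 = (b / a) * (t / a) ** (b - 1)

# Proportional-covariate idea (schematic): tie the hazard to the symptom
# indicator through a proportionality coefficient; once the coefficient is
# known, the hazard can be tracked from condition data alone
c = np.mean(h0 / z)
h_est = c * z
print(h_est)
```

The appeal of this style of model is exactly what the abstract claims: once the covariate coefficient is established, new condition-monitoring readings update the hazard estimate without requiring further failure events.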
