  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
21

Two dimensional supersymmetric models and some of their thermodynamic properties from the context of SDLCQ

Proestos, Yiannis 06 August 2007 (has links)
No description available.
22

Fluctuation solution theory

Ploetz, Elizabeth Anne January 1900 (has links)
Doctor of Philosophy / Department of Chemistry / Paul E. Smith / The Kirkwood-Buff (KB) theory of solutions, published in 1951, established a route from integrals over radial (pair) distribution functions (RDFs) in the grand canonical ensemble to a set of thermodynamic quantities in an equivalent closed ensemble. These “KB integrals” (KBIs) can also be expressed in terms of the particle-particle (i.e., concentration or density) fluctuations within grand canonical ensemble regions. Contributions by Ben-Naim in 1977 provided the means to obtain the KBIs if one already knew the set of thermodynamic quantities for the mixture of interest; that is, he provided the inversion procedure. Thus, KB theory provides a two-way bridge between local (microscopic) and global (bulk/thermodynamic) properties. Due to its lack of approximations, its wide-ranging applicability, and the absence of a competitive theory for rigorously understanding liquid mixtures, it has been used to understand solution microheterogeneity, solute solubility, cosolvent effects on biomolecules, preferential solvation, etc. Here, after using KB theory to test the accuracy of pair potentials, we present and illustrate two extensions of the theory, resulting in a general Fluctuation Solution Theory (FST). First, we generalize KB theory to include two-way relationships between the grand canonical ensemble’s particle-energy and energy-energy fluctuations and additional thermodynamic quantities. This extension allows non-isothermal conditions to be considered, unlike traditional KB theory. We illustrate these new relationships using analyses of experimental data and molecular dynamics (MD) simulations for pure liquids and binary mixtures. Furthermore, we use it to obtain conformation-specific infinitely dilute partial molar volumes and compressibilities for proteins (other properties will follow) from MD simulations and compare the method to a non-FST method for obtaining the same properties.
The second extension of KB theory involves moving beyond doublet particle fluctuations to additionally consider triplet and quadruplet particle fluctuations, which are related to derivatives of the thermodynamic properties involved in regular KB theory. We present these higher order fluctuations obtained from experiment and simulation for pure liquids and binary mixtures. Using the newfound experimental third and fourth cumulants of the distribution of particles in solution, which can be extracted from bulk thermodynamic data using this extension, we also probe particle distributions’ non-Gaussian nature.
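The KB integral described above has a compact numerical form. The following is a minimal illustrative sketch (not taken from the thesis) that evaluates the truncated integral G = 4π ∫ (g(r) − 1) r² dr for a toy step-function RDF, where the exact answer is minus the excluded volume; the grid, function names, and the diameter σ = 0.3 nm are all invented for illustration:

```python
import numpy as np

def trapezoid(y, x):
    """Composite trapezoidal rule (kept explicit for NumPy-version portability)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def kirkwood_buff_integral(r, g):
    """G = 4*pi * int (g(r) - 1) r^2 dr over the sampled range, assuming the
    RDF has converged to 1 by the end of the grid."""
    return trapezoid(4.0 * np.pi * (g - 1.0) * r**2, r)

# Toy RDF: hard-core exclusion below an invented diameter sigma, g -> 1 beyond.
sigma = 0.3  # nm, illustrative
r = np.linspace(1e-3, 2.0, 4000)
g = np.where(r < sigma, 0.0, 1.0)

G = kirkwood_buff_integral(r, g)
# For this step RDF, G is minus the excluded volume, -(4/3)*pi*sigma^3.
print(G, -(4.0 / 3.0) * np.pi * sigma**3)
```

For a simulated RDF the same call applies once g(r) has converged to 1 within the sampled range; truncation before convergence is the usual practical difficulty with KBIs from MD.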
23

Rates of convergence of variance-gamma approximations via Stein's method

Gaunt, Robert E. January 2013 (has links)
Stein's method is a powerful technique that can be used to obtain bounds for approximation errors in a weak convergence setting. The method has been used to obtain approximation results for a number of distributions, such as the normal, Poisson and Gamma distributions. A major strength of the method is that it is often relatively straightforward to apply to problems involving dependent random variables. In this thesis, we consider the adaptation of Stein's method to the class of Variance-Gamma distributions. We obtain a Stein equation for the Variance-Gamma distributions. Uniform bounds for the solution of the Symmetric Variance-Gamma Stein equation and its first four derivatives are given in terms of the supremum norms of derivatives of the test function. New formulas and inequalities for modified Bessel functions are derived, which allow us to obtain these bounds. We then use local approach couplings to obtain bounds on the error in approximating two asymptotically Variance-Gamma distributed statistics by their limiting distribution. In both cases, we obtain a convergence rate of order n<sup>-1</sup> for suitably smooth test functions. The product of two normal random variables has a Variance-Gamma distribution, and this leads us to consider the extension of Stein's method to the product of r independent mean-zero normal random variables. An elegant Stein equation is obtained, which motivates a generalisation of the zero bias transformation. This new transformation has a number of interesting properties, which we exploit to prove some limit theorems for statistics that are asymptotically distributed as the product of two central normal distributions. The Variance-Gamma and Product Normal distributions arise as functions of the multivariate normal distribution.
We end this thesis by demonstrating how the multivariate normal Stein equation can be used to prove limit theorems for statistics that are asymptotically distributed as a function of the multivariate normal distribution. We establish some sufficient conditions for convergence rates to be of order n<sup>-1</sup> for smooth test functions, and thus faster than the O(n<sup>-1/2</sup>) rate that would arise from the Berry-Esseen Theorem. We apply the multivariate normal Stein equation approach to prove Variance-Gamma and Product Normal limit theorems, and we also consider an application to Friedman's χ<sup>2</sup> statistic.
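Stein's method rests on a characterising operator for the target distribution. The simplest instance, for the standard normal, is A f(x) = f′(x) − x f(x), with E[A f(Z)] = 0 exactly when Z ~ N(0,1). The Monte Carlo sketch below illustrates only this classical identity; the thesis's Variance-Gamma operators are second-order and involve modified Bessel functions, and are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(200_000)

# Any smooth test function with bounded derivative will do.
f = np.sin
fp = np.cos

# Characterising identity for N(0,1): E[f'(Z) - Z f(Z)] = 0.
stein_residual = float(np.mean(fp(z) - z * f(z)))
print(stein_residual)  # close to 0, up to Monte Carlo error
```

In applications one solves A f = h − E h(Z) for a test function h and bounds E[A f(W)] for the statistic W of interest, which is where bounds on the Stein solution and its derivatives, like those the thesis establishes, enter.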
24

Theoretical and numerical calculations for the dynamics of colloidal suspensions of molecular particles in flowing solution inside mesopores

Atwi, Ali 02 May 2012 (has links)
The purpose of this thesis is to develop a comprehensive model analysis, in a three-dimensional spatial frame, for the dynamics of molecular particles in dilute colloidal suspensions flowing inside pores of variable width, subject to hydrodynamic forces, Brownian motion, and diffusive collisions at the rough pore boundaries, using numerical simulations. The simulation approach is necessary because it is at present extremely complex to treat the problem of diffusive collisions of the particles at the solid pore boundaries with analytical tools. The algorithms we have developed, and the corresponding simulations, are sufficiently general and refined to be applied directly to the study of the dynamics of a wide variety of polymer and biological particles in dilute solutions, under diverse physical and hydrodynamic conditions inside pores. Moreover, the mechanisms leading to the adhesion of nanoscale particles under non-equilibrium conditions, arising from the competing influence of the mechanical diffusive collisions and the attractive Hamaker forces at the boundaries, are of major interest. We have therefore developed a theoretical model to calculate the restitution coefficient from basic physical principles. The objective is to quantify the energy balance during the diffusive collision of a nanoparticle with a wall, under the influence of the repulsive forces on one hand and the attractive Hamaker forces on the other. This is done by developing a model, based on the JKR and Hertz theories, that accounts for the energy losses during collisions and the energy gains due to the Hamaker interactions; adhesion is the outcome when the energy balance permits it. Our theoretical model proposes a special analytic approach based on the Hamaker potential: we derive a characteristic nonlinear equation for the restitution coefficient and analyse the properties that determine, under given initial physical conditions, whether or not adhesion occurs.
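The energy-balance idea behind a restitution coefficient can be illustrated with a deliberately simplified stand-in (not the thesis's JKR/Hamaker model): the particle gains kinetic energy falling into an attractive well, loses a fraction of its contact energy to deformation, and must climb back out; adhesion occurs when it cannot. The well depth `u_well` and loss fraction `eta` are invented placeholders:

```python
def restitution(v_in, m, u_well, eta):
    """Toy restitution coefficient e = v_out / v_in from an energy balance:
    the particle accelerates into an attractive well of depth u_well (a
    stand-in for the Hamaker interaction), loses a fraction eta of its
    contact energy to deformation (a stand-in for JKR/Hertz losses), then
    must climb back out of the well. Returns 0.0 on adhesion."""
    e_in = 0.5 * m * v_in**2
    e_contact = e_in + u_well           # kinetic energy on reaching the wall
    e_after = (1.0 - eta) * e_contact   # after the dissipative contact
    e_out = e_after - u_well            # left over after escaping the well
    if e_out <= 0.0:
        return 0.0                      # trapped by the well: adhesion
    return (e_out / e_in) ** 0.5

# A fast particle bounces; a slow one is captured by the same well.
print(restitution(1.0, 2.0, 0.5, 0.2))  # ~0.837
print(restitution(0.1, 2.0, 0.5, 0.2))  # 0.0 (adhesion)
```

Even this crude balance reproduces the qualitative outcome the thesis analyses: for a given wall interaction there is a threshold impact velocity below which the collision ends in adhesion.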
25

Hard scattering cross sections and parton distribution functions at the LHC

Kovačíková, Petra 19 August 2013 (has links)
In this thesis we explore a Mellin-space approach to the evaluation of precision cross sections at hadron colliders. We consider three processes with known analytic results for perturbative QCD corrections up to next-to-next-to-leading order, namely: the production of the vector bosons Z0 and W± via the Drell-Yan mechanism in the narrow-width approximation; the production of the Standard Model Higgs boson via gluon-gluon fusion in the large top-quark-mass limit; and neutral- and charged-current deep inelastic lepton-hadron scattering. We develop a C++ package, sbp, that implements the Mellin-space technique. The resulting program provides an elegant, fast and accurate solution for the evaluation of inclusive cross sections. We compare our program with available results that use standard momentum-space techniques. We present studies of the asymptotic convergence and scale dependence of the perturbative series, and we use the package to study different treatments of the factorisation and renormalisation scales in cross sections.
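Part of the appeal of working in Mellin space is that moments of x-distributions take simple analytic forms and convolutions become products. A small illustrative check, in plain Python and unrelated to the thesis's sbp C++ package: the Mellin moments M(N) = ∫₀¹ x^(N−1) f(x) dx of a toy PDF-like shape x^a (1−x)^b reduce to Beta functions, which a numerical quadrature should reproduce.

```python
import numpy as np
from math import gamma

def trapezoid(y, x):
    """Composite trapezoidal rule (kept explicit for NumPy-version portability)."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def mellin_moment(f, n, x_min=1e-8, num=200_000):
    """Numerical Mellin moment M(N) = int_0^1 x^(N-1) f(x) dx on a log grid."""
    x = np.logspace(np.log10(x_min), 0.0, num)
    return trapezoid(x**(n - 1.0) * f(x), x)

# Toy PDF-like shape; its Mellin moments are Beta functions: M(N) = B(N + a, b + 1).
a, b = 0.5, 3.0
f = lambda x: x**a * (1.0 - x)**b
beta_fn = lambda p, q: gamma(p) * gamma(q) / gamma(p + q)

for n in (2.0, 4.0):
    print(mellin_moment(f, n), beta_fn(n + a, b + 1.0))
```

In a real Mellin-space cross-section code the moments are evaluated at complex N along an inversion contour; the agreement above only illustrates the moment-to-special-function correspondence that makes that approach fast.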
27

Improving flood frequency analysis by integration of empirical and probabilistic regional envelope curves

Guse, Björn Felix January 2010 (has links)
Flood design necessitates discharge estimates for large recurrence intervals. In flood frequency analysis, however, the uncertainty of discharge estimates increases for higher recurrence intervals, particularly because of the small number of available flood observations. Furthermore, traditional distribution functions increase without bound, taking no account of an upper-bound discharge. Hence, additional information that is representative of high recurrence intervals needs to be considered. Envelope curves, which bound the maximum observed discharges of a region, are an adequate regionalisation method for providing additional spatial information on the upper tail of a distribution function. Probabilistic regional envelope curves (PRECs) are an extension of the traditional empirical envelope curve approach, in which a recurrence interval is estimated for a regional envelope curve (REC). The REC is constructed for a homogeneous pooling group of sites; the estimation of its recurrence interval is based on the effective sample years of data, accounting for the intersite dependence among all sites of the pooling group. The core idea of this thesis is to improve discharge estimates for high recurrence intervals by integrating empirical and probabilistic regional envelope curves into the flood frequency analysis. To this end, the method of probabilistic regional envelope curves was investigated in detail. Several pooling groups were derived by modifying candidate sets of catchment descriptors and the settings of two different pooling methods, and these were used to construct PRECs. A sensitivity analysis shows the variability of discharges and recurrence intervals for a given site under the different assumptions; the unit flood of record, which governs the intercept of the PREC, was found to be the most influential aspect. By separating the catchments into nested and unnested pairs, the calculation algorithm for the effective sample years of data was refined. This improves the estimation of the recurrence intervals, and the use of different parameter sets for nested and unnested pairs of catchments is therefore recommended. In the second part of this thesis, PRECs were introduced into a distribution function. Whereas the traditional approach uses only discharge values, PRECs provide a discharge and its corresponding recurrence interval. Hence, a novel approach was developed that combines the PREC results with the traditional systematic flood series while taking the PREC recurrence interval into consideration. An adequate mixed bounded distribution function is presented which, in addition to the PREC results, also uses an upper-bound discharge derived from an empirical envelope curve. In this way, two types of additional information representative of the upper tail of a distribution function are included in the flood frequency analysis. The integration of both types of information leads to improved discharge estimation for recurrence intervals between 100 and 1000 years.
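An empirical envelope curve of the kind described above can be sketched in a few lines. This toy example uses invented data and an assumed fixed slope, and omits the probabilistic step (assigning a recurrence interval) that distinguishes a PREC; it simply sets a log-log bound on unit floods so that every observed flood of record lies on or below the curve:

```python
import numpy as np

def envelope_intercept(area_km2, peak_m3s, slope=-0.4):
    """Empirical regional envelope curve in log-log space:
    log10(q) = intercept + slope * log10(A), with q = Q/A the unit flood.
    The intercept is raised until every observed unit flood lies on or
    below the curve, so it is governed by the record-holding site alone."""
    a = np.asarray(area_km2, dtype=float)
    q = np.asarray(peak_m3s, dtype=float) / a            # unit flood, m^3/s/km^2
    return float(np.max(np.log10(q) - slope * np.log10(a)))

def envelope_discharge(area_km2, intercept, slope=-0.4):
    """Bounding peak discharge Q (m^3/s) for a catchment of the given area."""
    return area_km2 * 10.0 ** (intercept + slope * np.log10(area_km2))

# Illustrative floods of record: (catchment area km^2, peak discharge m^3/s).
areas = np.array([50.0, 320.0, 1500.0, 8200.0])
peaks = np.array([210.0, 700.0, 1900.0, 5400.0])

c = envelope_intercept(areas, peaks)
bounds = envelope_discharge(areas, c)
print(np.all(bounds >= peaks - 1e-9))  # every site lies on or below the envelope
```

The construction makes the thesis's sensitivity finding plausible: because the intercept is set by the single largest unit flood, the curve (and hence any recurrence interval attached to it) is dominated by the flood of record.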
28

Probabilistic Seismic Hazard Assessment Of Eastern Marmara And Evaluation Of Turkish Earthquake Code Requirements

Ocak, Recai Soner 01 November 2011 (has links) (PDF)
The primary objective of this study is to evaluate the seismic hazard in the Eastern Marmara Region by a probabilistic approach, using improved seismic source models and enhanced ground motion prediction models. The geometry of the fault zones (length, width, dip angle, segmentation points, etc.) is determined with the help of available fault maps and source lines traced on satellite images. The state-of-the-art rupture model proposed by the USGS Working Group in 2002 is applied to the source system, and a composite recurrence model is used for all seismic sources in the region to represent the characteristic behavior of the North Anatolian Fault. New and improved global ground motion models (NGA models) are used to model the ground motion variability in this study. Previous studies generally used regional models or older ground motion prediction models that were updated by their developers during the NGA project; the NGA models are improved in terms of additional prediction parameters (such as depth of the source, basin effects, and site-dependent standard deviations), statistical approach, and a very well constrained global database. The use of NGA models reduces the epistemic uncertainty in the total hazard that regional or older models, based on smaller datasets, would introduce. The results of the study are presented as hazard curves, deaggregation of the hazard, and uniform hazard spectra for six main locations in the region (the Adapazari, Duzce, Golcuk, Izmit, Iznik, and Sapanca city centers), to provide a basis for the seismic design of special structures in the area. Hazard maps of the region for rock site conditions, at the levels of risk accepted by the Turkish Earthquake Code (TEC-2007), are provided to allow the user to perform site-specific hazard assessment for local site conditions and to develop a site-specific design spectrum. A comparison of the TEC-2007 design spectrum with the uniform hazard spectrum developed for the selected locations is also presented for future reference.
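The probabilistic approach referred to above combines a source's activity rate, a magnitude distribution, and a ground motion model into an annual rate of exceedance for each shaking level. The sketch below is a toy single-source hazard integral; the ground motion relation and all parameter values are invented for illustration (emphatically not an NGA model), and only the truncated Gutenberg-Richter magnitude distribution is standard:

```python
import numpy as np
from math import erf

def norm_sf(x):
    """Standard normal survival function P(Z > x)."""
    return 0.5 * (1.0 - erf(x / 2.0**0.5))

def hazard_curve(a_levels, nu=0.05, mmin=5.0, mmax=7.5, b=1.0, r_km=20.0,
                 c0=-1.0, c1=0.9, c2=1.0, sigma=0.6, nm=200):
    """Toy hazard integral for one source at one distance:
    lambda(a) = nu * sum_m P(m) * P(ln A > ln a | m, r).
    The median ln ground motion c0 + c1*m - c2*ln(r) with lognormal
    scatter sigma is an invented stand-in for a real GMPE."""
    m = np.linspace(mmin, mmax, nm)
    beta = b * np.log(10.0)
    w = np.exp(-beta * (m - mmin))
    pm = w / w.sum()                          # truncated Gutenberg-Richter pmf
    ln_med = c0 + c1 * m - c2 * np.log(r_km)  # median ln motion per magnitude
    lam = []
    for a in a_levels:
        p = np.array([norm_sf((np.log(a) - mu) / sigma) for mu in ln_med])
        lam.append(nu * float(np.sum(pm * p)))
    return np.array(lam)

a_levels = np.array([0.05, 0.1, 0.2, 0.4])  # shaking levels, arbitrary units
lam = hazard_curve(a_levels)
print(lam)  # annual exceedance rates, strictly decreasing in the level
```

A full study such as this one sums the same integral over many sources and rupture scenarios, carries the GMPE's site terms and epistemic branches through a logic tree, and deaggregates the result by magnitude and distance.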
29

Image-based Extraction Of Material Reflectance Properties Of A 3d Object

Erdem, Mehmet Erkut 01 January 2003 (has links) (PDF)
In this study, an appearance reconstruction method based on extraction of the material reflectance properties of a three-dimensional (3D) object from its two-dimensional (2D) images is explained. One of the main advantages of this system is that the reconstructed object can be rendered in real time with photorealistic quality under varying illumination conditions. Bidirectional Reflectance Distribution Functions (BRDFs) are used to represent the reflectance of the object. The reflectance of the object is decomposed into diffuse and specular components, and each component is estimated separately. While estimating the diffuse components, illumination-invariant images of the object are computed from the input images, and a global texture of the object is extracted from these images using surface particles. The specular reflectance data are collected from the residual images obtained by taking the difference between the input images and the corresponding illumination-invariant images, and a Lafortune BRDF model is fitted to these data. At the rendering phase, the diffuse and specular components are blended into each other to achieve a photorealistic appearance of the reconstructed object.
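The Lafortune model fitted in such a pipeline represents each specular lobe as a powered, weighted dot product of the incoming and outgoing directions. The following minimal single-lobe sketch uses invented parameters (not fitted ones): with Cxy = −Cz the lobe peaks in the mirror direction, generalising the Phong lobe.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def lafortune_brdf(wi, wo, rho_d, cxy, cz, n):
    """Diffuse term plus one Lafortune lobe (local frame, normal = z axis):
    f(wi, wo) = rho_d/pi + max(0, Cxy*(wi.x*wo.x + wi.y*wo.y) + Cz*wi.z*wo.z)^n
    Illustrative parameters only; a fitted model sums several such lobes."""
    wi = normalize(np.asarray(wi, dtype=float))
    wo = normalize(np.asarray(wo, dtype=float))
    lobe = cxy * (wi[0] * wo[0] + wi[1] * wo[1]) + cz * wi[2] * wo[2]
    return rho_d / np.pi + max(0.0, lobe) ** n

# Reflecting wi about the surface normal gives the mirror direction, where
# the lobe is maximal when Cxy = -Cz.
wi = normalize(np.array([0.5, 0.0, 0.8]))
mirror = np.array([-wi[0], -wi[1], wi[2]])
off = normalize(np.array([0.8, 0.0, 0.5]))

f_mirror = lafortune_brdf(wi, mirror, rho_d=0.3, cxy=-1.0, cz=1.0, n=20.0)
f_off = lafortune_brdf(wi, off, rho_d=0.3, cxy=-1.0, cz=1.0, n=20.0)
print(f_mirror > f_off)  # specular peak lies in the mirror direction
```

Fitting, as in the study, amounts to choosing rho_d and the lobe coefficients so that this function best matches the per-pixel residual reflectance samples.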
30

Influence of texture on residual stress measurements (Influencia da textura em medidas de tensao residual)

LIMA, NELSON B. de 09 October 2014 (has links)
Doctoral thesis (Tese de Doutoramento) / Instituto de Pesquisas Energeticas e Nucleares - IPEN/CNEN-SP / No description available.
