71

Modélisation stochastique, en mécanique des milieux continus, de l'interphase inclusion-matrice à partir de simulations en dynamique moléculaire / Stochastic modeling, in continuum mechanics, of the inclusion-matrix interphase from molecular dynamics simulations

Le, Tien-Thinh 21 October 2015 (has links)
Dans ce travail, nous nous intéressons à la modélisation stochastique continue et à l'identification des propriétés élastiques dans la zone d'interphase présente au voisinage des hétérogénéités dans un nano composite prototypique, composé d'une matrice polymère modèle renforcée par une nano inclusion de silice. Des simulations par dynamique moléculaire (DM) sont tout d'abord conduites afin d'extraire certaines caractéristiques de conformation des chaînes proches de la surface de l'inclusion, ainsi que pour estimer, par des essais mécaniques virtuels, des réalisations du tenseur apparent associé au domaine de simulation. Sur la base des résultats obtenus, un modèle informationnel de champ aléatoire est proposé afin de modéliser les fluctuations spatiales du tenseur des rigidités dans l'interphase. Les paramètres du modèle probabiliste sont alors identifiés par la résolution séquentielle de deux problèmes d'optimisation inverses (l'un déterministe et associé au modèle moyen, l'autre stochastique et lié aux paramètres de dispersion et de corrélation spatiale) impliquant une procédure d'homogénéisation numérique. On montre en particulier que la longueur de corrélation dans la direction radiale est du même ordre de grandeur que l'épaisseur de l'interphase, indiquant ainsi la non-séparation des échelles. Enfin, la prise en compte, par un modèle de matrices aléatoires, du bruit intrinsèque généré par les simulations de DM (dans la procédure de calibration) est discutée / This work is concerned with the stochastic modeling and identification of the elastic properties in the so-called interphase region surrounding the inclusions in nanoreinforced composites. For the sake of illustration, a prototypical nanocomposite made up with a model polymer matrix filled by a silica nanoinclusion is considered. 
Molecular Dynamics (MD) simulations are first performed in order to gain physical insight into the local conformation of the polymer chains in the vicinity of the inclusion surface. In addition, a virtual mechanical testing procedure is proposed so as to estimate realizations of the apparent stiffness tensor associated with the MD simulation box. An information-theoretic probabilistic representation is then proposed as a surrogate model for mimicking the spatial fluctuations of the elasticity field within the interphase. The hyperparameters defining the aforementioned model are subsequently calibrated by solving, in a sequential manner, two inverse problems involving a computational homogenization scheme. The first problem, related to the mean model, is formulated in a deterministic framework, whereas the second one involves a statistical metric allowing the dispersion parameter and the spatial correlation lengths to be estimated. It is shown in particular that the spatial correlation length in the radial direction is roughly equal to the interphase thickness, hence showing that the scales under consideration are not well separated. The calibration results are finally refined by taking into account, by means of a random matrix model, the MD finite-sampling noise.
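The kind of stiffness fluctuation described above can be illustrated with a minimal sketch: a lognormal random field over the radial coordinate, with exponential covariance whose correlation length L is of the same order as the interphase thickness h (the non-separated-scales situation the abstract describes). All numerical values (thickness, mean stiffness, dispersion) are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)
h = 2.0                                 # interphase thickness (illustrative)
L = 2.0                                 # correlation length ~ h
x = np.linspace(0.0, h, 200)            # radial coordinate through the interphase
# Exponential covariance kernel C(x, x') = exp(-|x - x'| / L).
C = np.exp(-np.abs(x[:, None] - x[None, :]) / L)
# Cholesky factor (with a small jitter for numerical positive-definiteness)
# turns independent Gaussians into a correlated Gaussian field realization.
A = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))
g = A @ rng.standard_normal(x.size)
E_mean, delta = 4.0, 0.2                # mean stiffness and dispersion (assumed)
# Lognormal transform keeps the stiffness field strictly positive.
field = E_mean * np.exp(delta * g - 0.5 * delta**2)
```

With L comparable to h, a single realization wanders slowly across the whole interphase rather than averaging out, which is exactly why homogenization-based identification is needed in the calibration.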
72

Maximum entropy regularization for calibrating a time-dependent volatility function

Hofmann, Bernd, Krämer, Romy 26 August 2004 (has links)
We investigate the applicability of the method of maximum entropy regularization (MER), including convergence and convergence rates of regularized solutions, to the specific inverse problem (SIP) of calibrating a purely time-dependent volatility function. In this context, we extend the results of [16] and [17] in some detail. Due to the explicit structure of the forward operator, based on a generalized Black-Scholes formula, the ill-posedness of the nonlinear inverse problem (SIP) can be verified. Numerical case studies illustrate the chances and limitations of (MER) versus Tikhonov regularization (TR) for smooth solutions and solutions with a sharp peak.
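As a rough illustration of MER (not the generalized Black-Scholes forward operator of the paper), the sketch below applies an entropy-type penalty to a toy ill-posed linear problem. The smoothing operator A, reference element x_star, and regularization weight alpha are all invented for the example; the penalty is the standard Kullback-Leibler-type functional, which also enforces positivity of the recovered "volatility".

```python
import numpy as np
from scipy.optimize import minimize

# Toy ill-posed problem: recover a positive vector x from noisy data
# y = A @ x, where A is a smoothing (hence ill-conditioned) kernel.
rng = np.random.default_rng(0)
n = 20
t = np.linspace(0.0, 1.0, n)
A = np.exp(-5.0 * np.abs(t[:, None] - t[None, :])) / n   # smoothing operator
x_true = 0.2 + 0.1 * np.sin(2 * np.pi * t) ** 2          # positive target
y = A @ x_true + 1e-4 * rng.standard_normal(n)

x_star = np.full(n, 0.25)    # reference element (prior guess)
alpha = 1e-3                 # regularization parameter

def mer_objective(x):
    # Least-squares misfit plus the negentropy penalty relative to x_star;
    # the penalty keeps x positive and close to x_star in a KL sense.
    misfit = 0.5 * np.sum((A @ x - y) ** 2)
    entropy = np.sum(x * np.log(x / x_star) - x + x_star)
    return misfit + alpha * entropy

res = minimize(mer_objective, x_star, method="L-BFGS-B",
               bounds=[(1e-8, None)] * n)
x_mer = res.x
```

In the paper's setting, decreasing alpha at a rate tied to the noise level is what yields the convergence rates under study; here alpha is simply fixed.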
73

Utilizing soil quality data for premium rate making in the federal crop insurance program

Moore, Rylan 08 August 2023 (has links) (PDF)
The federal crop insurance program provides crop insurance for millions of acres and many commodities every year. The Risk Management Agency of the USDA is responsible for determining the premium rates for these covered commodities. Currently, the quality of soil is not considered when determining baseline yields and expected premium rates. This study utilizes the moment-based maximum entropy method to assess the effect of incorporating soil in the rate making methodology. Several moments of upland cotton yield in Arkansas, Mississippi, and Texas are conditioned on weather, irrigation, and soil control variables. Ultimately, I find evidence of mispriced premium rates for counties in all three states for both irrigated and non-irrigated upland cotton yield.
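The moment-based maximum entropy method mentioned above can be sketched in its simplest form: given target values for the first two moments of a standardized yield variable s, find the density p(s) ∝ exp(λ₁s + λ₂s²) that matches them by minimizing the convex dual. With only two moments the answer is the standard Gaussian (λ ≈ (0, -1/2)), which gives a built-in check; the grid and starting point are assumptions of this toy, not the study's specification.

```python
import numpy as np
from scipy.optimize import minimize

s = np.linspace(-6.0, 6.0, 1201)       # standardized yield grid
target = np.array([0.0, 1.0])          # target moments E[s], E[s^2]

def dual(lam):
    # Dual objective log Z(lam) - lam . target; its gradient is
    # E_p[phi] - target with phi(s) = (s, s^2), so a zero gradient
    # means the maxent density matches the target moments.
    logw = lam[0] * s + lam[1] * s * s
    m = logw.max()
    w = np.exp(logw - m)               # stabilized unnormalized density
    ds = s[1] - s[0]
    Z = w.sum() * ds
    p = w / Z
    mom = np.array([(s * p).sum() * ds, (s * s * p).sum() * ds])
    return m + np.log(Z) - lam @ target, mom - target

res = minimize(dual, x0=np.array([0.0, -0.1]), jac=True, method="BFGS")
lam = res.x   # should approach (0, -0.5): the standard normal
```

In the study, moments like these are conditioned on weather, irrigation, and soil covariates, so λ becomes a function of those variables rather than a single vector.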
74

Improving Filtering of Email Phishing Attacks by Using Three-Way Text Classifiers

Trevino, Alberto 13 March 2012 (has links) (PDF)
The Internet has been plagued with endless spam for over 15 years. However, in the last five years spam has morphed from an annoying advertising tool into a social engineering attack vector. Much of today's unwanted email tries to deceive users into replying with passwords or bank account information, or into visiting malicious sites which steal login credentials and spread malware. These email-based attacks are known as phishing attacks. Much has been published about these attacks, which try to appear legitimate not only to users but also to spam filters. Several sources indicate that traditional content filters have a hard time detecting phishing attacks because the emails lack the traditional features and characteristics of spam messages. This thesis tests the hypothesis that by separating the messages into three categories (ham, spam and phish), content filters will yield better filtering performance. Even though experimentation showed three-way classification did not improve performance, several additional premises were tested, including the validity of the claim that phishing emails are too much like legitimate emails, and the ability of Naive Bayes classifiers to properly classify emails.
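The three-way setup can be sketched with a Naive Bayes text classifier over the three labels (ham, spam, phish). The six training messages below are invented for illustration; the thesis's experiments use real corpora and, as stated above, found that the third class did not improve filtering performance.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny invented corpus: two messages per class.
train = [
    "lunch meeting tomorrow at noon",               # ham
    "see you at the game tonight",                  # ham
    "buy cheap pills discount offer",               # spam
    "limited offer buy now discount",               # spam
    "verify your account password immediately",     # phish
    "your bank account needs login verification",   # phish
]
labels = ["ham", "ham", "spam", "spam", "phish", "phish"]

# Bag-of-words features feeding a multinomial Naive Bayes model,
# trained with three classes instead of the usual ham/spam two.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train, labels)
pred = clf.predict(["please verify your bank password"])[0]
```

The intuition being tested is that credential-harvesting vocabulary ("verify", "password", "account") separates phish from ordinary spam's sales vocabulary, giving the filter a cleaner decision boundary per class.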
75

A Polydispersed Gaussian-Moment Model for Polythermal, Evaporating, and Turbulent Multiphase Flow Applications

Allard, Benoit 06 April 2023 (has links)
A novel higher-order moment-closure method is applied for the Eulerian treatment of gas-particle multiphase flows characterized by a dilute, polydisperse, and polythermal particle phase. Based upon the polydisperse Gaussian-moment model (PGM) framework, the proposed model is derived by applying an entropy-maximization moment-closure formulation to the transport equation of the particle-number density function, which is equivalent to the Williams-Boltzmann equation for droplet sprays. The resulting set of first-order, robustly hyperbolic balance laws includes a direct treatment of local higher-order statistics, such as covariances between distinguishable particle properties (i.e., diameter and temperature) and particle velocity. Leveraging the additional distinguishing variables, classical hydrodynamic droplet evaporation theory is considered to describe unsteady droplet vaporization. Further, drawing on turbulent multiphase flow theory, a first-order, hyperbolicity-maintaining approximation to turbulent flow diffusion-inertia effects is proposed. The predictive capabilities of the model are evaluated relative to Lagrangian-based solutions for a range of flows, including aerosol dispersion and fuel sprays. Further, the model is implemented in a massively parallel discontinuous-Galerkin framework. Validation of the proposed turbulence coupling model is subsequently performed against experimental data, and a qualitative analysis of the model is given for a representative liquid fuel-spray problem.
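The core idea of an entropy-maximization moment closure can be shown in one dimension: the first three velocity moments determine a Gaussian (the maximum entropy distribution for those constraints), and that Gaussian then closes the system by predicting the next moment. The moment values below are illustrative, not from the thesis.

```python
import numpy as np

# Given moments of a 1D particle velocity distribution:
M0, M1, M2 = 1.2, 0.6, 1.0             # number density, momentum, energy moments
u = M1 / M0                            # bulk velocity
theta = M2 / M0 - u**2                 # variance (temperature-like parameter)

# The maximum entropy distribution matching (M0, M1, M2) is the Gaussian
# f(v) = M0/sqrt(2*pi*theta) * exp(-(v-u)^2 / (2*theta)); its third moment
# provides the closure: M3 = M0 * (u^3 + 3*u*theta).
M3_closed = M0 * (u**3 + 3.0 * u * theta)

# Verify the closed-form third moment against direct quadrature of f.
v = np.linspace(-20.0, 20.0, 200001)
f = M0 / np.sqrt(2.0 * np.pi * theta) * np.exp(-(v - u) ** 2 / (2.0 * theta))
M3_quad = np.sum(v**3 * f) * (v[1] - v[0])
```

The PGM framework extends this construction to joint distributions over velocity, diameter, and temperature, which is where the covariance terms mentioned above come from.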
76

Distributions of Large Mammal Assemblages in Thailand with a Focus on Dhole (Cuon alpinus) Conservation

Jenks, Kate Elizabeth 01 May 2012 (has links)
Biodiversity monitoring and predictions of species occurrence are essential to develop outcome-oriented conservation management plans for endangered species and assess their success over time. To assess distribution and patterns of habitat use of large mammal assemblages in Thailand, with a focus on the endangered dhole (Cuon alpinus), I first implemented a long-term camera-trapping project carried out with park rangers from October 2003 through October 2007 in Khao Yai National Park. This project was extremely successful and may serve as a regional model for wildlife conservation. I found significantly lower relative abundance indices for carnivore species, and collectively for all mammals compared to data obtained in 1999-2000, suggesting population declines resulting from increased human activity. I integrated this data into maximum entropy modeling (Maxent) to further evaluate whether ranger stations reduced poaching activity and increased wildlife diversity and abundances. I then conducted a focused camera trap survey from January 2008 through February 2010 in Khao Ang Rue Nai Wildlife Sanctuary to gather critical baseline information on dholes, one of the predator species that seemed to have declined over time and that is exposed to continued pressure from humans. Additionally, I led a collaborative effort with other colleagues in the field to collate and integrate camera trap data from 15 protected areas to build a country-wide habitat suitability map for dholes, other predators, and their major prey species. The predicted presence probability for sambar (Rusa unicolor) and leopards (Panthera pardus) were the most important variables in predicting dhole presence countrywide. Based on my experience from these different field ecological surveys and endeavors, it became clear that local people's beliefs may have a strong influence on dhole management and conservation. 
Thus, I conducted villager interview surveys to identify local attitudes towards dholes, document the status of dholes in wildlife sanctuaries adjacent to Cambodia, and determine the best approach to improve local support for dhole conservation before proceeding with further field studies of the species in Thailand. A photograph of a dhole was correctly identified by only 20% of the respondents. My studies provide evidence that some protected areas in Thailand continue to support a diversity of carnivore species of conservation concern, including clouded leopards (Neofelis nebulosa), dholes, and small felids. However, dholes' impact on prey populations may be increasing as tigers (Panthera tigris) and leopards are extirpated from protected areas. The next step in dhole conservation is to estimate the size and stability of their fragmented populations and also to focus on maintaining adequate prey bases that would support both large felids and dholes.
77

Interfacial Solid-Liquid Diffuseness and Instability by the Maximum Entropy Production Rate (MEPR) Postulate

Bensah, Yaw D. 10 September 2015 (has links)
No description available.
78

Probabilistic Modeling of Multi-relational and Multivariate Discrete Data

Wu, Hao 07 February 2017 (has links)
Modeling and discovering knowledge from multi-relational and multivariate discrete data is a crucial task that arises in many research and application domains, e.g. text mining, intelligence analysis, epidemiology, social science, etc. In this dissertation, we study and address three problems involving the modeling of multi-relational discrete data and multivariate multi-response count data, viz. (1) discovering surprising patterns from multi-relational data, (2) constructing a generative model for multivariate categorical data, and (3) simultaneously modeling multivariate multi-response count data and estimating covariance structures between multiple responses. To discover surprising multi-relational patterns, we first study the ``where do I start?'' problem originating from intelligence analysis. By studying nine methods with origins in association analysis, graph metrics, and probabilistic modeling, we identify several classes of algorithmic strategies that can supply starting points to analysts, and thus help to discover interesting multi-relational patterns from datasets. To actually mine for interesting multi-relational patterns, we represent the multi-relational patterns as dense and well-connected chains of biclusters over multiple relations, and model the discrete data by the maximum entropy principle, such that in a statistically well-founded way we can gauge the surprisingness of a discovered bicluster chain with respect to what we already know. We design an algorithm for approximating the most informative multi-relational patterns, and provide strategies to incrementally organize discovered patterns into the background model. We illustrate how our method is adept at discovering the hidden plot in multiple synthetic and real-world intelligence analysis datasets. Our approach naturally generalizes traditional attribute-based maximum entropy models for single relations, and further supports iterative, human-in-the-loop, knowledge discovery. 
To build a generative model for multivariate categorical data, we apply the maximum entropy principle to propose a categorical maximum entropy model such that in a statistically well-founded way we can optimally use given prior information about the data, and are unbiased otherwise. Generally, inferring the maximum entropy model could be infeasible in practice. Here, we leverage the structure of the categorical data space to design an efficient model inference algorithm to estimate the categorical maximum entropy model, and we demonstrate how the proposed model is adept at estimating underlying data distributions. We evaluate this approach against both simulated data and US census datasets, and demonstrate its feasibility using an epidemic simulation application. Modeling data with multivariate count responses is a challenging problem due to the discrete nature of the responses. Existing methods for univariate count responses cannot be easily extended to the multivariate case since the dependency among multiple responses needs to be properly accounted for. To model multivariate data with multiple count responses, we propose a novel multivariate Poisson log-normal model (MVPLN). By simultaneously estimating the regression coefficients and inverse covariance matrix over the latent variables with an efficient Monte Carlo EM algorithm, the proposed model takes advantage of the association among multiple count responses to improve the model prediction accuracy. Simulation studies and applications to real world data are conducted to systematically evaluate the performance of the proposed method in comparison with conventional methods. / Ph. D.
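The generative side of the MVPLN model is simple to sketch: correlated latent Gaussians pass through an exponential link to produce Poisson counts, so covariance in the latent layer induces association between the count responses. The parameter values below are illustrative only; the thesis's contribution is the inverse direction (estimating the coefficients and inverse covariance via Monte Carlo EM), which is not shown here.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = np.array([1.0, 0.5])              # latent means (illustrative)
Sigma = np.array([[0.3, 0.2],
                  [0.2, 0.3]])         # latent covariance couples the responses

# Latent Gaussian layer, then Poisson counts with log link: Y_j ~ Poisson(exp(z_j)).
z = rng.multivariate_normal(mu, Sigma, size=20000)
counts = rng.poisson(np.exp(z))        # two count responses per observation
corr = np.corrcoef(counts.T)[0, 1]     # positive association inherited from Sigma
```

Because the Poisson layer adds its own variance, the count correlation is attenuated relative to the latent correlation, which is one reason naive moment-based estimation of Sigma is unreliable and an EM-type algorithm is used instead.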
79

Contributions to the theory of unequal probability sampling

Lundquist, Anders January 2009 (has links)
This thesis consists of five papers related to the theory of unequal probability sampling from a finite population. Generally, it is assumed that we wish to make model-assisted inference, i.e. the inclusion probability for each unit in the population is prescribed before the sample is selected. The sample is then selected using some random mechanism, the sampling design. Mostly, the thesis is focused on three particular unequal probability sampling designs, the conditional Poisson (CP-) design, the Sampford design, and the Pareto design. They have different advantages and drawbacks: The CP design is a maximum entropy design but it is difficult to determine sampling parameters which yield prescribed inclusion probabilities, the Sampford design yields prescribed inclusion probabilities but may be hard to sample from, and the Pareto design makes sample selection very easy but it is very difficult to determine sampling parameters which yield prescribed inclusion probabilities. These three designs are compared probabilistically, and found to be close to each other under certain conditions. In particular the Sampford and Pareto designs are probabilistically close to each other. Some effort is devoted to analytically adjusting the CP and Pareto designs so that they yield inclusion probabilities close to the prescribed ones. The results of the adjustments are in general very good. Some iterative procedures are suggested to improve the results even further. Further, balanced unequal probability sampling is considered. In this kind of sampling, samples are given a positive probability of selection only if they satisfy some balancing conditions. The balancing conditions are given by information from auxiliary variables. Most of the attention is devoted to a slightly less general but practically important case. Also in this case the inclusion probabilities are prescribed in advance, making the choice of sampling parameters important. 
A complication which arises in the context of choosing sampling parameters is that certain probability distributions need to be calculated, and exact calculation turns out to be practically impossible, except for very small cases. It is proposed that Markov Chain Monte Carlo (MCMC) methods are used for obtaining approximations to the relevant probability distributions, and also for sample selection. In general, MCMC methods for sample selection do not occur very frequently in the sampling literature today, making it a fairly novel idea.
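The CP design itself is easy to sketch by rejection: draw Poisson samples (independent Bernoulli trials) and keep only those of the prescribed size n. The working probabilities p below are illustrative; as the abstract stresses, the resulting inclusion probabilities differ from p, which is exactly why the sampling parameters must be adjusted to hit prescribed values.

```python
import numpy as np

rng = np.random.default_rng(42)
p = np.array([0.8, 0.5, 0.5, 0.3, 0.2, 0.2])   # working probabilities (illustrative)
n = 2                                           # fixed sample size

def cp_sample(p, n, rng, max_tries=100_000):
    """Conditional Poisson sample: Bernoulli draws conditioned on size n."""
    for _ in range(max_tries):
        keep = rng.random(p.size) < p           # Poisson (independent) sampling
        if keep.sum() == n:                     # accept only samples of size n
            return np.flatnonzero(keep)
    raise RuntimeError("rejection sampling did not terminate")

# Empirical inclusion probabilities over repeated draws: these will not
# coincide with p, illustrating the calibration problem the thesis addresses.
draws = 20_000
freq = np.zeros(p.size)
for _ in range(draws):
    freq[cp_sample(p, n, rng)] += 1
freq /= draws
```

Since every accepted sample contains exactly n units, the empirical inclusion probabilities always sum to n; their individual values, however, must be steered to prescribed targets by adjusting p, analytically or iteratively as in the thesis.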
80

Microtomographie X de matériaux à comportement pseudo-fragile : Identification du réseau de fissures / X-ray microtomography of materials with pseudo-brittle behavior: identification of the crack network

Hauss, Grégory 06 December 2012 (has links)
L'étude de l'endommagement des matériaux à comportement pseudo-fragile fait l'objet de nombreuses études et la caractérisation du réseau de fissures constitue une étape nécessaire pour une meilleure compréhension de leur comportement. L'objectif principal est ici d'identifier de manière la plus fine possible cet espace fissuré en trois dimensions grâce à la technique d'imagerie nommée microtomographie X. Pour ce faire, une machine d'essai in-situ a été développée et une procédure d'analyse des images 3D a été validée. L'objectif du dispositif in-situ est de maintenir l'échantillon dans différents états fissurés pour rendre possible les acquisitions microtomographiques. Une fois les images 3D reconstruites, la procédure de traitement est appliquée et l'espace fissuré est identifié. Des mesures sont alors réalisées sur l'évolution du réseau de fissures au cours de l'endommagement. Ce travail constitue la première étape d'un traitement plus général qui a pour objectif de simuler numériquement le comportement mécanique de ces matériaux en se basant sur leur géométrie réelle. / Materials displaying a pseudo-brittle behavior have been well studied over the past decade and the characterization of the cracks network has become nowadays an important step for the understanding of their damaging behavior. The aim of this work is to characterize, in the finest available way, this crack space in 3D using X-ray computed microtomography. This was achieved: 1) by designing an in-situ compressive device which maintains a sample in a cracked state during microtomographic data acquisition and, 2) by processing the images with relevant image filtering techniques for a better cracks network characterization. Two parameters of choice are then measured: the cracks network surface and volume. This work is the first step of a global procedure which aims to numerically model the mechanical behavior of pseudo-brittle materials by using real 3D crack geometry.
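The core of the crack-identification step can be sketched on a synthetic volume: threshold the grayscale data to isolate dark (crack) voxels, label connected components in 3D, and measure the crack volume. The real pipeline deals with reconstruction artifacts, filtering, and multiple in-situ loading states not shown here; the volume below is entirely synthetic.

```python
import numpy as np
from scipy import ndimage

# Synthetic 8-bit volume: bright "solid" material with two dark planar
# cracks that intersect (so they form a single connected crack network).
vol = np.full((40, 40, 40), 200, dtype=np.uint8)
vol[10:30, 20, 5:35] = 20               # first planar crack
vol[15, 5:35, 8:12] = 20                # second crack branch, crossing the first

cracks = vol < 100                       # simple global threshold on gray level
labels, n_cracks = ndimage.label(cracks) # 3D connected-component labeling
sizes = ndimage.sum(cracks, labels, index=range(1, n_cracks + 1))
crack_volume_voxels = int(cracks.sum())  # total crack volume, in voxels
```

Tracking `crack_volume_voxels` (and a surface estimate) across successive loading states is what yields the evolution measurements described in the abstract.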
