About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
121

DEPOSITION OF COATINGS ONTO NANOFIBERS

Moore, Kevin Charles 05 October 2006 (has links)
No description available.
122

Systematic Tire Testing and Model Parameterization for Tire Traction on Soft Soil

He, Rui 30 January 2020 (has links)
Tire performance over soft soil influences the performance of off-road vehicles, as the tire is the only force-transmitting element between an off-road vehicle and the soil during operation. One aspect of this performance is tire tractive performance on soft soil, which attracts the attention of both vehicle and geotechnical engineers. The vehicle engineer is interested in it because it is related to vehicle mobility and energy efficiency; the geotechnical engineer is concerned about the soil compaction, brought about by tire traffic, that accompanies it. To improve vehicle mobility and energy efficiency over soft soil and to mitigate soil compaction, it is essential to develop an in-depth understanding of tire tractive performance on soft soil. This study has enhanced that understanding and advanced terramechanics and tire model parameterization methods through experimental tests. The tests consisted of static tire deflection tests, static tire-soil tests, soil properties tests, and dynamic tire-soil tests. The test program presented herein produced parameterization and validation data that can be used in tire off-road traction dynamics modeling and terramechanics modeling. The 225/60R16 97S Uniroyal (Michelin) Standard Reference Test Tire (SRTT) and a loamy sand were chosen for the test program. The tests included the quantification and/or measurement of the soil properties of the test soil, the pre-traffic soil condition, the pressure distribution in the tire contact patch, tire off-road tractive performance, and post-traffic soil compaction.
The influence of operational parameters, e.g., tire inflation pressure, tire normal load, tire slip ratio, initial soil compaction, or the number of passes, on the measured tire performance parameters and soil response parameters was also analyzed. New methods for estimating the rolling radius of a tire on soft soil and for 3-D rut reconstruction were developed. A multi-pass effect phenomenon, different from any previously reported in the existing literature, was discovered. The test data were fed into optimization programs for the parameterization of Bekker's model, a modified Bekker's model, the Magic Formula tire model, and a bulk density estimation model. The modified Bekker's model accounts for the slip-sinkage effect, which the original Bekker pressure-sinkage model does not. The Magic Formula tire model was adapted to account for the combined influence of tire inflation pressure and initial soil compaction on tractive performance and was validated against the test data. The parameterization methods presented herein are new, effective terramechanics model parameterization methods; they capture tire-soil interaction that conventional parameterization methods, such as the plate-sinkage test and the shear test (which does not use a tire as the shear tool), cannot capture sufficiently, and hence can be used to develop tire off-road dynamics models that are heavily based on terramechanics models. This study has been partially supported by the U.S. Army Engineer Research and Development Center (ERDC) and by the Terramechanics, Multibody, and Vehicle Systems (TMVS) Laboratory at Virginia Tech. / Doctor of Philosophy / Big differences exist between a tire moving in on-road conditions, such as asphalt lanes, and a tire moving in off-road conditions, such as soft soil.
For example, for passenger cars commonly driven on asphalt lanes, the tire inflation pressure is normally suggested to be between 30 and 35 psi, and very low inflation pressure is discouraged. By contrast, for off-road vehicles operated on soft soil, low inflation pressure is recommended: the inflation pressure of a tractor tire can be as low as 12 psi, for the sake of lower post-traffic soil compaction and better traction. Moreover, unlike research on tire on-road dynamics, research on off-road dynamics is still immature, even though the physics behind off-road dynamics can be more complex than that of on-road dynamics. In this dissertation, experimental tests were completed to study the factors influencing tire tractive performance and soil behavior, and model parameterization methods were developed for better prediction by tire off-road dynamics models. Tire and vehicle manufacturers can use the results and methods presented here to offer suggestions for tire and vehicle operation on soft soil that maximize tractive performance and minimize post-traffic soil compaction.
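As a concrete sketch of the kind of parameterization the test data support, the classical Bekker pressure-sinkage relation p = (k_c/b + k_phi) z^n can be identified from plate-sinkage measurements by least squares. The numbers below are synthetic and illustrative, not the dissertation's test data; note that with a single contact width b only the lumped coefficient k_eq = k_c/b + k_phi is identifiable, which is why the classical bevameter procedure uses plates of more than one width.

```python
import numpy as np
from scipy.optimize import curve_fit

# Bekker's pressure-sinkage relation: p = (k_c / b + k_phi) * z**n.
# With a single plate width b, only the lumped stiffness k_eq = k_c/b + k_phi
# is identifiable, so we fit (k_eq, n). All values here are synthetic.
def bekker_pressure(z, k_eq, n):
    return k_eq * z**n          # z: sinkage [m], returns pressure [Pa]

rng = np.random.default_rng(0)
z_obs = np.linspace(0.01, 0.10, 20)                       # sinkage [m]
k_eq_true, n_true = 8.1e5, 1.1                            # assumed soil values
p_obs = bekker_pressure(z_obs, k_eq_true, n_true) \
        * (1.0 + 0.02 * rng.standard_normal(z_obs.size))  # 2% noise

# Least-squares identification of (k_eq, n) from the measurements.
(k_eq_hat, n_hat), _ = curve_fit(bekker_pressure, z_obs, p_obs,
                                 p0=[1.0e6, 1.0])
print(f"k_eq = {k_eq_hat:.3g}, n = {n_hat:.3g}")
```

The same least-squares idea extends to the modified (slip-sinkage) model once slip ratio enters the regression as an extra input.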
123

General Weighted Optimality of Designed Experiments

Stallings, Jonathan W. 22 April 2014 (has links)
Design problems involve finding optimal plans that minimize cost and maximize information about the effects of changing experimental variables on some response. Information is typically measured through statistically meaningful functions, or criteria, of a design's corresponding information matrix. The most common criteria implicitly assume equal interest in all effects and certain forms of information matrices tend to optimize them. However, these criteria can be poor assessments of a design when there is unequal interest in the experimental effects. Morgan and Wang (2010) addressed this potential pitfall by developing a concise weighting system based on quadratic forms of a diagonal matrix W that allows a researcher to specify relative importance of information for any effects. They were then able to generate a broad class of weighted optimality criteria that evaluate a design's ability to maximize the weighted information, ultimately targeting those designs that efficiently estimate effects assigned larger weight. This dissertation considers a much broader class of potential weighting systems, and hence weighted criteria, by allowing W to be any symmetric, positive definite matrix. Assuming the response and experimental effects may be expressed as a general linear model, we provide a survey of the standard approach to optimal designs based on real-valued, convex functions of information matrices. Motivated by this approach, we introduce fundamental definitions and preliminary results underlying the theory of general weighted optimality. A class of weight matrices is established that allows an experimenter to directly assign weights to a set of estimable functions and we show how optimality of transformed models may be placed under a weighted optimality context. Straightforward modifications to SAS PROC OPTEX are shown to provide an algorithmic search procedure for weighted optimal designs, including A-optimal incomplete block designs. 
Finally, a general theory is given for design optimization when only a subset of all estimable functions is assumed to be in the model. We use this to develop a weighted criterion to search for A-optimal completely randomized designs for baseline factorial effects assuming all high-order interactions are negligible. / Ph. D.
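The weighting idea can be made concrete with a small search sketch: a diagonal W (a special case of the general symmetric positive definite W considered in the dissertation) ranks candidate 6-run designs on a two-factor grid by the weighted A-criterion trace(W (X'X)^{-1}). The grid, weights, and run size below are illustrative assumptions, not examples from the text.

```python
import numpy as np
from itertools import combinations

# Weighted A-criterion: minimize trace(W @ inv(X'X)) over candidate designs,
# where diag(W) encodes relative interest in each coefficient.
def weighted_a_value(X, W):
    info = X.T @ X
    if np.linalg.matrix_rank(info) < info.shape[0]:
        return float("inf")            # design cannot estimate all effects
    return np.trace(W @ np.linalg.inv(info))

# Candidate runs: a 3x3 grid on two factors, with an intercept column.
levels = [-1.0, 0.0, 1.0]
grid = np.array([[1.0, a, b] for a in levels for b in levels])

W = np.diag([1.0, 4.0, 1.0])           # four times the weight on factor 1

# Exhaustive search over all 6-run subsets of the 9 candidate runs.
best = min(combinations(range(len(grid)), 6),
           key=lambda idx: weighted_a_value(grid[list(idx)], W))
best_val = weighted_a_value(grid[list(best)], W)
print("best runs:", best, " criterion:", round(best_val, 4))
```

In practice the dissertation's approach uses an algorithmic search (e.g., modified SAS PROC OPTEX) rather than exhaustion, but the criterion being minimized has this shape.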
124

Understanding Mercury's Thermochemical Evolution Using a Geochemical and Geophysical Lens

Bose, Priyanka 20 May 2024 (has links)
Master of Science / Mercury is the most mysterious planet in the inner Solar System, as suggested by observations from the MESSENGER mission. These observations shine a light on potential processes occurring within Mercury as it evolved over time. Scientific instruments aboard MESSENGER indicate that Mercury has a very thin surface layer of broken rocks, a thin crustal layer covered by lavas erupted from a melt formed in a relatively thin, FeO-poor mantle, and a large metal-rich core made of Fe and some quantity of a light element. These conditions differ from those seen on Earth: a thick crust covered by a layer of varied thickness made up of loose, unconsolidated rocks and dust; a large mantle with more FeO; and a smaller core-to-planet ratio. To understand how these non-Earth-like conditions affect how the planet's interior changes with time, a modified evolution model was created to track the changes in heat and chemistry within Mercury. This model accounts for complications such as a dynamic core density that changes with a growing inner core, the formation mechanism of the inner core, and the FeO-poor mantle composition. Using this model sheds light on the conditions Mercury experienced after it formed. The model is limited, but its results suggest that Mercury's mantle began at an initial temperature of 1600 K with a reference viscosity of 10^21–10^22 Pa s, indicating a mantle less likely to flow easily. Model results also suggest the core contained some sulfur, from 0.05 to 8.9 wt.% S, consistent with the MESSENGER data. BepiColombo, a new Mercury mission, will provide new perspectives on the interior of Mercury, leading to more detailed information about conditions present after planetary formation and the effect of non-Earth-like conditions on a planet's interior as it cools.
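The quoted reference viscosity can be read through the Arrhenius-type law commonly used in parameterized thermal-evolution models: viscosity drops sharply as the mantle warms. The sketch below is generic; the activation energy is an assumed textbook-scale value, not a number calibrated in the thesis.

```python
import math

# Temperature-dependent mantle viscosity via an Arrhenius law, as is common
# in parameterized thermal-evolution models:
#   eta(T) = eta_ref * exp(E/(R*T) - E/(R*T_ref))
# E and eta_ref below are illustrative assumptions; T_ref is the initial
# mantle temperature quoted in the abstract.
R = 8.314          # J/(mol K), gas constant
E = 3.0e5          # J/mol, activation energy (assumed)
T_REF = 1600.0     # K, initial mantle temperature from the text
ETA_REF = 1.0e21   # Pa s, low end of the quoted reference-viscosity range

def viscosity(T):
    return ETA_REF * math.exp(E / (R * T) - E / (R * T_REF))

for T in (1500.0, 1600.0, 1700.0):
    print(f"T = {T:6.0f} K  ->  eta = {viscosity(T):.2e} Pa s")
```

A cooling mantle thus stiffens by roughly half an order of magnitude per 100 K under these assumed values, which is why the reference viscosity strongly controls how quickly the interior loses heat.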
125

On the validation of multivariate space-time series models

Saint-Frard, Robinson 06 1900 (has links)
In this master's thesis, the R software was used for the programming. / This thesis studies time series that, in addition to being observed over time, also have a spatial component. More particularly, we study a certain class of models, the Generalized Space-Time AutoRegressive (GSTAR) time series models. First, links are drawn between Vector AutoRegressive (VAR) models and GSTAR models. We obtain explicitly the asymptotic distribution of the residual autocovariances for GSTAR models, assuming that the error term is a Gaussian white noise, which is a first original contribution. From that result, test statistics of the portmanteau type are proposed, and their asymptotic distributions are studied. In order to illustrate the behaviour of the test statistics, a simulation study is conducted in which GSTAR models are simulated and correctly fitted. The methodology is illustrated with real data on monthly tea production in 24 cities of West Java for the period January 1992 to December 1999.
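The portmanteau idea can be sketched in a few lines: a Hosking-type multivariate statistic built from residual autocovariance matrices, asymptotically chi-square under a correctly specified model. This is a generic simplification for illustration, not the memoir's GSTAR-specific statistic with its exact asymptotic distribution.

```python
import numpy as np

# Multivariate portmanteau statistic on model residuals:
#   Q_H = n * sum_{h=1..H} trace(G_h' G_0^{-1} G_h G_0^{-1}),
# where G_h is the lag-h residual autocovariance matrix. Under a correctly
# specified model, Q_H is approximately chi-square distributed.
def autocov(resid, h):
    n = resid.shape[0]
    centered = resid - resid.mean(axis=0)
    return centered[h:].T @ centered[:n - h] / n

def portmanteau(resid, H):
    n = resid.shape[0]
    G0_inv = np.linalg.inv(autocov(resid, 0))
    return n * sum(np.trace(autocov(resid, h).T @ G0_inv
                            @ autocov(resid, h) @ G0_inv)
                   for h in range(1, H + 1))

rng = np.random.default_rng(1)
resid = rng.standard_normal((500, 3))   # 3 "cities", Gaussian white noise
Q = portmanteau(resid, H=10)
print(f"Q_10 = {Q:.1f}  (compare to chi-square quantiles under H0)")
```

For white-noise residuals as here, Q_10 should land near the mean of its reference chi-square distribution; a fitted model's residuals would be tested the same way, with degrees of freedom adjusted for the estimated parameters.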
126

Parametric models for Bayesian seismic tomography

Belhadj, Jihane 02 December 2016 (has links)
First-arrival time tomography aims at recovering a seismic wave propagation velocity model from measured first-arrival times. This requires solving an inverse problem, which is ill-posed: there is no guarantee that the solution is unique. In this study, we rely on a Bayesian approach to estimate the spatial distribution of the wave velocity; it incorporates both the information provided by the data and prior knowledge of the velocity model, and it yields a better quantification of the associated uncertainties. However, the approach remains fairly expensive in computing time, since the Markov chain Monte Carlo (MCMC) algorithms classically used to sample the posterior distribution of the parameters are efficient only as long as the number of parameters remains within reason. Its use therefore requires careful reflection both on the parameterization of the velocity model, in order to reduce the problem's dimension, and on the definition of the prior distribution of the parameters; this is the central topic of the thesis. The first parametric model that we propose is based on a random Johnson-Mehl tessellation, a variation of the Voronoï tessellations already proposed in Bayesian tomography. Unlike Voronoï cells, Johnson-Mehl cells are not necessarily convex and are bounded by portions of hyperboloids, yielding smooth boundaries between cells. The second model is described by a linear combination of Gaussian kernels centered at the realization of a Poisson point process; it is especially adapted to the detection of seismic wave velocity anomalies.
We first illustrate the tomography results with synthetic velocity fields, including a model containing two small anomalies. We then apply our methodology to a more advanced and more realistic synthetic model that serves as a benchmark in the oil industry. The tomography results reveal the ability of our algorithm to map the velocity heterogeneities with precision using few parameters. Finally, we propose a velocity model based on compressed sensing techniques to reconstruct the velocity field. The first results are encouraging, but the model still has some weaknesses related to uncertainty estimation, and it opens several avenues for future research. In addition, we analyse a real data set acquired in the context of hydraulic fracturing (induced microseismicity). In this context, we develop a trans-dimensional and hierarchical Bayesian inference approach in order to deal efficiently with the full complexity of the layered model.
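The Gaussian-kernel parameterization described above can be sketched in a few lines: kernel centers drawn from a homogeneous Poisson point process, anomaly amplitudes as kernel weights, and the velocity field evaluated as a background plus the weighted kernel sum. All numeric values below are illustrative assumptions, not the thesis's calibrated settings.

```python
import numpy as np

# Velocity field as background + linear combination of Gaussian kernels
# centered at the points of a homogeneous Poisson point process.
rng = np.random.default_rng(2)

v_background = 2500.0                       # m/s, assumed background velocity
intensity = 8.0                             # expected number of kernels
n_kernels = rng.poisson(intensity)          # Poisson number of anomalies
centers = rng.uniform(0.0, 1.0, (n_kernels, 2))    # kernel centers
weights = rng.uniform(-300.0, 300.0, n_kernels)    # anomaly amplitudes [m/s]
sigma = 0.08                                       # kernel width

def velocity(x, y):
    d2 = (centers[:, 0] - x)**2 + (centers[:, 1] - y)**2
    return v_background + np.sum(weights * np.exp(-d2 / (2 * sigma**2)))

grid = np.array([[velocity(x, y) for x in np.linspace(0, 1, 50)]
                 for y in np.linspace(0, 1, 50)])
print(f"{n_kernels} kernels, velocity range "
      f"[{grid.min():.0f}, {grid.max():.0f}] m/s")
```

In the Bayesian setting, the number of kernels, their centers, and their weights are all unknowns sampled by a trans-dimensional MCMC; this sketch only shows the forward parameterization.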
127

Bayesian calibration of a grassland ecosystem model: tools and applications at the European scale

Ben Touhami, Haythem 07 March 2014 (has links)
Grasslands cover 45% of the agricultural area in France and 40% in Europe. Grassland ecosystems have a central role in the climate change context, not only because they are impacted by climate change but also because they contribute to greenhouse gas emissions. The aim of this thesis was to contribute to the assessment of uncertainties in the outputs of grassland simulation models, which are used in impact studies, with a focus on model parameterization. In particular, we used Bayesian statistical methods, based on Bayes' theorem, to calibrate the parameters of a reference model, and thus improve its performance by reducing the uncertainty in its parameters and, consequently, in its outputs. Our approach is essentially based on the grassland ecosystem model PaSim (Pasture Simulation model), already applied in a variety of international projects to simulate the impact of climate change on grassland systems. The originality of this thesis was to adapt the Bayesian method to a complex ecosystem model such as PaSim (applied in the context of an altered climate and across the European territory) and to show its potential benefits in reducing uncertainty and improving the quality of model outputs. This was obtained by combining statistical methods (Bayesian techniques and sensitivity analysis with the method of Morris) with computing tools (coupling of R code with PaSim and use of cluster computing resources). We first produced a new parameterization for grassland sites under drought conditions, and then a common parameterization for European grasslands. We also provided a generic software calibration tool that can be reused with other models and sites.
Finally, we evaluated the performance of the calibrated model against data from validation sites; the results confirmed the efficiency of the Bayesian technique for reducing uncertainty and improving the reliability of simulation outputs.
128

Function bases on manifolds

Vallet, Bruno 10 July 2008 (has links)
Function bases are fundamental tools in geometry processing, as they make it possible to represent functions as vectors, that is, to apply the tools of linear algebra to functional analysis. In this thesis, we present several constructions of function bases on surfaces for geometry processing. We start by presenting the usual function bases of finite elements and discrete exterior calculus, their theory, and their limits.
We then study the Laplacian operator and its discretization, and use it to define a particular function basis: the eigenfunctions of the Laplace-Beltrami operator, or manifold harmonics. The manifold harmonics form a function basis that generalizes the Fourier transform and spectral filtering to functions defined on surfaces. We present applications and extensions of this basis for geometry processing; in particular, we show that once the basis has been computed, it allows the geometry to be filtered at interactive rates. To define function bases more independently of the surface mesh, we then turn to global parameterizations, and in particular to the N-symmetry direction fields required to define them. Thus, in the last part of this thesis, we study N-symmetry direction fields on surfaces, in particular their geometry and topology, and we give tools to build, edit, control, and visualize them.
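The spectral-filtering idea transfers directly to the discrete setting: compute eigenvectors of a Laplacian, project a signal onto the low-frequency eigenvectors, and reconstruct. A mesh implementation would use a cotangent Laplacian; the sketch below uses the simplest possible analogue, a 1-D path-graph Laplacian, purely for illustration.

```python
import numpy as np

# Path-graph Laplacian: the 1-D analogue of a mesh Laplacian.
n = 100
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1                 # Neumann-like boundary rows

# Its eigenvectors are the discrete "harmonics", ordered by frequency.
evals, evecs = np.linalg.eigh(L)

# A smooth signal corrupted by high-frequency noise.
t = np.linspace(0, 2 * np.pi, n)
rng = np.random.default_rng(4)
signal = np.sin(t) + 0.3 * rng.standard_normal(n)

# Low-pass filter: keep only the 10 lowest-frequency harmonics.
coeffs = evecs.T @ signal
filtered = evecs[:, :10] @ coeffs[:10]

noise_before = np.linalg.norm(signal - np.sin(t))
noise_after = np.linalg.norm(filtered - np.sin(t))
print(f"residual noise: {noise_before:.2f} -> {noise_after:.2f}")
```

The expensive step is the eigendecomposition; once the harmonics are precomputed, filtering is just two matrix-vector products, which is what makes interactive-rate geometry filtering possible.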
129

Automatic mesh to spline conversion

Li, Wan-Chiu 16 November 2006 (has links)
Aiming at converting a triangular mesh into a CAGD/CAM spline surface, this thesis focuses on one of the most crucial problems of the conversion process: extracting a "good" quadrilateral control mesh of the surface.
By good we mean that the edges of the control mesh cross orthogonally and are aligned with the principal directions of curvature of the surface. These two properties allow the control mesh to approximate the surface well with few control points, and they greatly help to reduce unwanted oscillations on the final spline surface built from it. To solve this problem, we propose a new automatic algorithm called periodic global parameterization. The basic idea is to find a geometrically meaningful parameterization, guided by the curvature of the surface as represented by a pair of orthogonal direction fields. The iso-value lines of this parameterization are then extracted to define an initial control mesh that satisfies the two criteria of a good control mesh. From this control mesh, we explain how to construct a T-spline approximation of the initial triangulated surface. We show several examples of triangular mesh to T-spline conversion. The results show that, thanks to the anisotropic control meshes, the final spline surfaces have far fewer oscillations than those built by previous methods, which do not take the anisotropy of the surface into account.
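The iso-line extraction step can be sketched per triangle: for a scalar parameter u stored at the vertices, linearly interpolate along each edge that straddles the iso-value to get the two endpoints of one iso-line segment. This toy version ignores the periodicity handling that the actual algorithm requires.

```python
import numpy as np

# Extract the segment where the iso-line u = iso crosses one triangle.
def isoline_segment(verts, u, iso):
    """verts: 3x2 vertex positions, u: 3 parameter values at the vertices.
    Returns the crossing points on the triangle's edges (0 or 2 points)."""
    points = []
    for i, j in ((0, 1), (1, 2), (2, 0)):
        if (u[i] - iso) * (u[j] - iso) < 0:        # edge straddles iso-value
            t = (iso - u[i]) / (u[j] - u[i])       # linear interpolation
            points.append(verts[i] + t * (verts[j] - verts[i]))
    return points

tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
u_vals = np.array([0.0, 1.0, 1.0])
seg = isoline_segment(tri, u_vals, 0.5)
print("iso-line crosses at:", seg)
```

Chaining such segments across all triangles, for a family of iso-values in both parameter directions, yields the quadrilateral control-mesh edges described above.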
130

Normal regression models with general parameterization in the R software

Perette, André Casagrandi 23 August 2019 (has links)
This work develops a package for the R statistical software implementing estimators for univariate normal regression models with general parameterization, the model class defined in Patriota and Lemonte (2011). This class encompasses a wide range of well-known models, such as nonlinear and heteroscedastic regression models. Corrections are implemented for the maximum likelihood estimators and for the likelihood-ratio statistic; such corrections are effective when the sample size is small. For the maximum likelihood estimator, the second-order bias correction presented in Patriota and Lemonte (2009) is considered, while for the likelihood-ratio statistic the correction developed in Skovgaard (2001) is applied. All functionalities of the package are described in detail in this work. To assess the quality of the developed algorithm, Monte Carlo simulations were carried out under different scenarios, evaluating convergence rates, estimation errors, and the efficiency of the bias and Skovgaard corrections.
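Why small-sample corrections matter can be seen in a generic Monte Carlo experiment (written in Python here, although the package itself is in R): the maximum likelihood estimator of the variance in the normal linear model is biased downward in small samples, and a degrees-of-freedom correction removes the bias. This illustrates the phenomenon only; the package's second-order corrections for general parameterizations are broader than this special case.

```python
import numpy as np

# Monte Carlo: small-sample bias of the variance MLE in normal regression.
rng = np.random.default_rng(5)
n, p, sigma2, reps = 10, 3, 4.0, 20000

X = np.column_stack([np.ones(n), rng.standard_normal((n, p - 1))])
H = X @ np.linalg.inv(X.T @ X) @ X.T          # hat matrix (fixed design)
beta = np.array([1.0, 2.0, -1.0])

mle, corrected = [], []
for _ in range(reps):
    y = X @ beta + np.sqrt(sigma2) * rng.standard_normal(n)
    rss = np.sum(((np.eye(n) - H) @ y)**2)    # residual sum of squares
    mle.append(rss / n)                       # MLE: biased in small samples
    corrected.append(rss / (n - p))           # degrees-of-freedom correction

print(f"true sigma^2 = {sigma2}")
print(f"MLE mean       = {np.mean(mle):.2f}")
print(f"corrected mean = {np.mean(corrected):.2f}")
```

With n = 10 and p = 3, the MLE averages about 70% of the true variance, which is exactly the kind of distortion that second-order bias corrections are designed to remove.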
