  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Contribution de la planification expérimentale à la modélisation de phénomènes complexes en formulation / Experimental designs for complex phenomena in formulation

Gomes, Charles 20 December 2018 (has links)
In some formulation domains, such as cosmetics, the phenomena studied can be highly chaotic, with discontinuities or nonlinear zones. The formulator must therefore ask many questions before proposing the optimal experimental strategy, one best adapted to the problem and to the constraints imposed by the experimenters. For such phenomena, classical designs of experiments, such as Scheffé simplex lattices or D-optimal designs, prove insufficient because the experimental points do not cover the experimental space uniformly. In these studies it is essential to explore the whole experimental domain and to distribute the points uniformly in space. For that purpose, uniform designs, or Space-Filling Designs (SFD), frequently used with orthogonal variables but rarely with mixture variables, are particularly interesting.
The objective of this thesis is to adapt algorithms for constructing uniform designs to the case of mixture designs, and to propose simple guidelines for choosing the nature and the number of points of the mixture experimental design.
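The thesis itself develops dedicated construction algorithms that are not reproduced here, but the core idea of a space-filling design on mixture variables (components constrained to sum to one) can be sketched with a simple greedy maximin selection over candidate points on the simplex. The function names below are illustrative, not taken from the thesis.

```python
import numpy as np

def simplex_candidates(n_components, n_candidates, rng):
    """Draw candidate mixtures uniformly on the simplex (components sum to 1).

    A flat Dirichlet distribution is uniform over the simplex.
    """
    return rng.dirichlet(np.ones(n_components), size=n_candidates)

def maximin_subset(candidates, n_points, rng):
    """Greedy space-filling selection: each new point maximizes its
    distance to the set of points already chosen."""
    chosen = [int(rng.integers(len(candidates)))]
    for _ in range(n_points - 1):
        # distance from every candidate to its nearest chosen point
        dists = np.min(
            np.linalg.norm(candidates[:, None, :] - candidates[chosen][None, :, :], axis=2),
            axis=1)
        dists[chosen] = -np.inf  # never re-pick an already chosen point
        chosen.append(int(np.argmax(dists)))
    return candidates[chosen]

rng = np.random.default_rng(0)
# a 10-run space-filling design for a 3-component mixture
design = maximin_subset(simplex_candidates(3, 2000, rng), 10, rng)
```

Every row of `design` is a valid mixture (non-negative, summing to one), and the greedy maximin step spreads the runs over the whole simplex rather than clustering them at vertices as a D-optimal design for a low-order model would.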
2

Optimal Latin Hypercube Designs for Computer Experiments Based on Multiple Objectives

Hou, Ruizhe 22 March 2018 (has links)
Latin hypercube designs (LHDs) have broad applications in constructing computer experiments and in sampling for Monte Carlo integration, owing to their property of having projections evenly distributed over the univariate distribution of each input variable. LHDs have been combined with commonly used computer-experiment design criteria to enhance design performance. For example, Maximin-LHDs were developed to improve the space-filling property in the full dimension of all input variables, and MaxPro-LHDs were proposed in recent years to obtain better projections in any subspace of the input variables. This thesis integrates both space-filling and projection characteristics for LHDs and develops new algorithms for constructing optimal LHDs that perform well on both criteria, using a Pareto-front optimization approach. The new LHDs are evaluated through case studies and compared with traditional methods to demonstrate their improved performance.
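The two ingredients the abstract combines can be illustrated in a few lines: a random Latin hypercube (one stratum per run in each dimension) and the maximin criterion (the minimum pairwise distance, to be maximized). This is only a crude random search for illustration, not the Pareto-front algorithms developed in the thesis.

```python
import numpy as np

def latin_hypercube(n, k, rng):
    """One random n-point LHD in k dimensions: each column is an
    independent permutation of the n equal-probability strata,
    jittered within each stratum."""
    strata = rng.permuted(np.tile(np.arange(n), (k, 1)), axis=1).T
    return (strata + rng.random((n, k))) / n

def min_pairwise_distance(x):
    """Maximin criterion: the smallest distance between any two design points."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=2)
    return d[np.triu_indices(len(x), k=1)].min()

rng = np.random.default_rng(1)
# crude search: keep the random LHD with the largest minimum distance
best = max((latin_hypercube(20, 3, rng) for _ in range(200)),
           key=min_pairwise_distance)
```

Every candidate already has perfect one-dimensional projections by construction; the selection step only improves the full-dimensional space-filling property, which is exactly the tension (projection vs. space-filling) that motivates the multi-objective approach.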
3

Some Advances in Local Approximate Gaussian Processes

Sun, Furong 03 October 2019 (has links)
The Gaussian process (GP) is now recognized as an indispensable statistical tool in computer experiments. Due to its computational complexity and storage demands, however, its application to real-world problems, especially in "big data" settings, is quite limited. Among the many strategies for tailoring GPs to such settings, Gramacy and Apley (2015) proposed the local approximate GP (laGP), which builds approximate predictive equations from small local designs constructed around each predictive location under a chosen criterion. This dissertation proposes several methodological extensions of laGP. The first contribution is multilevel global/local modeling, which deploys global hyperparameter estimates to perform local prediction. The second extends the laGP notion of "locale" from a single point to a set of predictive locations along paths in the input space. Both contributions are applied to satellite drag emulation, illustrated in Chapter 3. The multilevel GP modeling strategy is also applied, combined with inverse-variance weighting, to synthesize field data and computer-model outputs of solar irradiance across the continental United States, detailed in Chapter 4. Finally, Chapter 5 tests laGP's performance in emulating daytime land surface temperatures estimated from satellites on irregular grid locations. / Doctor of Philosophy / In many real-life settings we want to understand a physical relationship or phenomenon, but limited resources or ethical constraints make physical experiments to collect data impossible. We therefore rely on computer experiments, whose evaluation usually requires expensive simulation involving complex mathematical equations. To reduce the computational effort, we look for a relatively cheap alternative, called an emulator, to serve as a surrogate model.
The Gaussian process (GP) is such an emulator, and has been very popular thanks to excellent out-of-sample predictive performance and appropriate uncertainty quantification. However, due to its computational complexity, full GP modeling is not suitable for "big data" settings. Gramacy and Apley (2015) proposed the local approximate GP (laGP), whose core idea is to use a subset of the data for inference and for prediction at unobserved inputs. This dissertation provides several extensions of laGP, applied to several real-life "big data" settings. The first application, detailed in Chapter 3, is to emulate satellite drag from large simulation experiments: global input information is captured comprehensively using a small subset of the data, and local prediction is performed subsequently. This "multilevel GP modeling" method is also deployed to synthesize field measurements and computational outputs of solar irradiance across the continental United States (Chapter 4), and to emulate daytime land surface temperatures estimated by satellites (Chapter 5).
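The "subset of the data" idea behind laGP can be sketched in its simplest form: fit a small GP only to the training points nearest the prediction location. Note that laGP proper grows the local design sequentially under criteria such as ALC or MSPE; the nearest-neighbor choice, kernel, and hyperparameter values below are illustrative assumptions only.

```python
import numpy as np

def local_gp_predict(X, y, xstar, n_local=30, lengthscale=0.2, nugget=1e-6):
    """Predict at xstar with a zero-mean GP fit only to the n_local
    nearest training points -- the simplest possible 'local design'
    (plain nearest neighbours, not laGP's sequential criterion)."""
    idx = np.argsort(np.linalg.norm(X - xstar, axis=1))[:n_local]
    Xl, yl = X[idx], y[idx]

    def kern(a, b):
        # squared-exponential (Gaussian) covariance
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * lengthscale ** 2))

    K = kern(Xl, Xl) + nugget * np.eye(n_local)   # local covariance, jittered
    w = np.linalg.solve(K, kern(Xl, xstar[None, :]))[:, 0]
    return float(yl @ w)                          # GP predictive mean

rng = np.random.default_rng(2)
X = rng.random((500, 2))                          # 500 "simulation" inputs
y = np.sin(4 * X[:, 0]) + np.cos(3 * X[:, 1])     # cheap stand-in response
pred = local_gp_predict(X, y, np.array([0.5, 0.5]))
```

Each prediction only requires factorizing a 30 x 30 matrix instead of a 500 x 500 one, which is the computational saving that makes the approach attractive at "big data" scale; the multilevel extension in the dissertation additionally estimates hyperparameters globally before predicting locally.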
4

Analyse d'une base de données pour la calibration d'un code de calcul / Analysis of a database for the calibration of a simulation code

Feuillard, Vincent 21 May 2007 (has links) (PDF)
This research falls within the general context of calibration, with a view to industrial applications. Its objective is to assess the quality of a database, that is, how well it covers its domain of variation with respect to the intended objectives. The work presented here provides a synthesis of the mathematical and algorithmic tools for carrying out such an assessment. We further propose techniques for selecting or importing new observations so as to improve the overall quality of the database. Among other things, the methods developed make it possible to identify defects in the structure of the data. Their application is illustrated in the evaluation of functional parameters, in a context of estimation by orthogonal functions.
5

Multi-layer designs and composite Gaussian process models with engineering applications

Ba, Shan 21 May 2012 (has links)
This thesis consists of three chapters, covering topics in both the design and modeling aspects of computer experiments as well as their engineering applications. The first chapter systematically develops a new class of space-filling designs for computer experiments by splitting two-level factorial designs into multiple layers. The new design is easy to generate, and our numerical study shows that it can have better space-filling properties than the optimal Latin hypercube design. The second chapter proposes a novel modeling approach for approximating computationally expensive functions that are not second-order stationary. The new model is a composite of two Gaussian processes, where the first one captures the smooth global trend and the second one models local details. The new predictor also incorporates a flexible variance model, which makes it more capable of approximating surfaces with varying volatility. The third chapter is devoted to a two-stage sequential strategy which integrates analytical models with finite element simulations for a micromachining process.
6

Systèmes optiques interférentiels et incertitudes / Interferential optical systems and uncertainties

Vasseur, Olivier 07 September 2012 (has links) (PDF)
Technological advances now allow the fabrication of interferential optical systems composed of a large number of components. Multilayer dielectric filter formulas comprising tens or hundreds of thin layers have thus been proposed. The coherent combination of tens to hundreds of fibered laser sources is likewise the subject of extensive research, and other systems, such as two- and three-dimensional diffractive arrays with a large number of apertures, can be studied as well. Assessing the robustness of such interferential systems to manufacturing uncertainties is an important challenge, made all the more difficult as the number of parameters describing the system grows. This synthesis document first reviews the methodologies of numerical designs of experiments and the results concerning their quality of exploration of high-dimensional spaces, assessed through the construction of a graph: the Minimum Spanning Tree (Arbre de Longueur Minimale). The second part illustrates the analysis of the influence of input-parameter uncertainties on the performance of interferential systems through two applications: multilayer dielectric interference filters and the coherent combination of fibered laser sources. The methodology makes it possible, in particular, to identify the most critical uncertainties and interactions within the system while building representative metamodels. Building on these results, the spatial characterization of speckle from rough surfaces and, more generally, the characterization of the spatial variability of optical phenomena are then presented. Finally, the scientific perspectives arising from this body of research are developed. (Note: the slides presented at the defense have been added as an appendix to the original document, pages 170 to 202.)
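The Minimum Spanning Tree criterion mentioned above can be demonstrated concretely: the mean and spread of the MST edge lengths of a design are a classical diagnostic of how uniformly it fills space (a regular or uniform design has long, nearly equal edges; a clustered one has a large spread). The sketch below only shows the statistic itself, not the memoir's analysis.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

def mst_edge_stats(points):
    """Mean and standard deviation of the minimum-spanning-tree edge
    lengths of a point set: uniform designs tend toward a large mean
    and a small spread of edge lengths."""
    mst = minimum_spanning_tree(squareform(pdist(points)))
    edges = mst.data  # the n-1 edge lengths kept in the tree
    return edges.mean(), edges.std()

rng = np.random.default_rng(3)
# perfectly regular 8x8 grid vs. 64 i.i.d. uniform points
grid = np.stack(np.meshgrid(np.linspace(0, 1, 8),
                            np.linspace(0, 1, 8)), -1).reshape(-1, 2)
m_grid, s_grid = mst_edge_stats(grid)
m_rand, s_rand = mst_edge_stats(rng.random((64, 2)))
```

The grid's MST edges all have length 1/7, so their standard deviation is essentially zero, while the random cloud shows a clearly positive spread; plotting designs in the (mean, std) plane of MST edges is how such graphs discriminate exploration quality in high dimension.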
