681

Exploring the network’s world: From omics-driven machine learning workflow for drug target identification to quantification of signaling model diversity.

Dalpedri, Beatrice 30 October 2024 (has links)
The drug discovery process is challenging, time-consuming, and costly, with drug target identification being an essential step in developing effective therapies. Drug repurposing offers a strategy for identifying new uses for existing drugs, aiming to simplify the process. Machine learning models and network analysis methods have demonstrated promise in both drug target identification and repurposing, providing powerful tools for analyzing complex biological data. This thesis explores the applications of neural networks and multilayer biological networks to drug repurposing and to network inference problems applied to signaling pathways. A novel machine learning and network-based workflow is presented for identifying drug targets for cystinosis, a rare disease that causes progressive kidney disease and currently lacks effective therapies to prevent kidney failure. This approach makes it possible to recapitulate the disease mechanisms in the context of renal tubular physiology and to identify candidate drug targets for further validation using a cross-species workflow and disease-relevant screening technologies. While machine learning approaches have shown promise, they often lack the mechanistic grounding that is necessary for robust drug target identification and repurposing strategies. Mechanistic models provide crucial insights into the underlying biological mechanisms, complementing machine learning techniques. However, inferring mechanistic signaling networks from omics data poses challenges due to non-identifiability, resulting in multiple valid solutions consistent with the data. The focus then shifts to quantifying signaling model diversity through solver-agnostic solution sampling with CORNETO, an ongoing effort that aims to unify network inference problems via constrained optimization. Mechanistic signaling networks can be inferred from omics data and prior knowledge using combinatorial optimization and mathematical solvers to find the optimal network. However, this problem is, in general, non-identifiable, and several solutions may be equally valid. Ignoring these alternative solutions leads to an incomplete picture of the hypothesis space of consistent mechanistic signaling networks. To alleviate this issue, algorithms to explore the space of alternative solutions and to conduct sensitivity analysis on the optimal solution are implemented and presented. These algorithms are applied to data from pancreatic cancer cell lines treated with kinase inhibitors to study cellular responses to drug perturbations by inferring mechanistic signaling networks from omics data.
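To make the non-identifiability issue concrete, the sketch below enumerates all equally optimal solutions of a toy edge-selection problem by repeatedly adding a "no-good" cut that excludes the incumbent solution and re-solving, stopping once the objective degrades. It uses the generic MILP library PuLP rather than CORNETO, and the toy network, costs, and constraints are invented purely for illustration; it is not the thesis's actual formulation or sampling algorithm.

```python
# Toy illustration (not the CORNETO API): two different subnetworks turn out to be
# equally optimal, which is exactly the non-identifiability discussed above.
import pulp

edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")]
cost = {("A", "B"): 1, ("B", "C"): 1, ("A", "C"): 2, ("C", "D"): 1, ("B", "D"): 2}

def solve(excluded):
    prob = pulp.LpProblem("toy_network_inference", pulp.LpMinimize)
    x = {(u, v): pulp.LpVariable(f"x_{u}{v}", cat="Binary") for (u, v) in edges}
    prob += pulp.lpSum(cost[e] * x[e] for e in edges)          # prefer the cheapest/sparsest network
    prob += x[("A", "B")] + x[("A", "C")] >= 1                 # perturbed node A must act
    prob += x[("B", "C")] + x[("A", "C")] >= 1                 # intermediate node C must receive signal
    prob += x[("C", "D")] + x[("B", "D")] >= 1                 # measured node D must be reached
    for sol in excluded:                                       # "no-good" cuts: forbid previous optima
        prob += pulp.lpSum(x[e] if sol[e] else 1 - x[e] for e in edges) <= len(edges) - 1
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    if pulp.LpStatus[prob.status] != "Optimal":
        return None, None
    return {e: int(round(x[e].value())) for e in edges}, pulp.value(prob.objective)

solutions, best = [], None
while True:
    sol, obj = solve(solutions)
    if sol is None or (best is not None and obj > best + 1e-6):
        break
    best = obj if best is None else best
    solutions.append(sol)
print(f"{len(solutions)} equally optimal networks at cost {best}")
```

Reporting only the first solution would hide the fact that the data support several mechanistic hypotheses, which is the gap the sampling and sensitivity-analysis algorithms described above are designed to close.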
682

Functional analytic approaches to some stochastic optimization problems

Backhoff, Julio Daniel 17 February 2015 (has links)
In dieser Arbeit beschäftigen wir uns mit Nutzenoptimierungs- und stochastischen Kontrollproblemen unter mehreren Gesichtspunkten. Wir untersuchen die Parameterunsicherheit solcher Probleme im Sinne des Robustheits- und des Sensitivitätsparadigmas. Neben der Betrachtung dieser Probleme widmen wir uns auch einem Zweiagentenproblem, bei dem der eine dem anderen das Management seines Portfolios vertraglich überträgt. Wir betrachten das robuste Nutzenoptimierungsproblem in Finanzmarktmodellen, wobei wir Bedingungen für seine Lösbarkeit formulieren, ohne jegliche Kompaktheit der Unsicherheitsmenge zu fordern, welche die Maße enthält, auf die der Optimierer robustifiziert. Unsere Bedingungen sind über gewisse Funktionenräume beschrieben, die allgemein Modularräume sind, mittels derer wir eine Min-Max-Gleichung und die Existenz optimaler Strategien beweisen. In vollständigen Märkten ist der Raum ein Orlicz-Raum, und nachdem man seine Reflexivität explizit überprüft hat, erhält man zusätzlich die Existenz eines Worst-Case-Maßes, das wir charakterisieren. Für die Parameterabhängigkeit stochastischer Kontrollprobleme entwickeln wir einen Sensitivitätsansatz. Das Kernargument ist die Korrespondenz zwischen dem adjungierten Zustand zur schwachen Formulierung des Pontryaginschen Prinzips und den Lagrange-Multiplikatoren, die der Kontrollgleichung assoziiert werden, wenn man sie als eine Bedingung betrachtet. Der Sensitivitätsansatz wird dann auf konvexe Probleme mit additiver oder multiplikativer Störung angewendet. Das Zweiagentenproblem formulieren wir in diskreter Zeit. Wir wenden in größter Verallgemeinerung die Methoden der bedingten Analysis auf den Fall linearer Verträge an und zeigen, dass sich die Mehrheit der in der Literatur unter sehr spezifischen Annahmen bekannten Ergebnisse auf eine deutlich umfassendere Klasse von Modellen verallgemeinern lässt. Insbesondere erhalten wir die Existenz eines first-best-optimalen Vertrags und dessen Implementierbarkeit. / In this thesis we deal with utility maximization and stochastic optimal control from several points of view. We are interested in understanding how such problems behave under parameter uncertainty, under the robustness and the sensitivity paradigms respectively. Afterwards, we leave the single-agent world and tackle a two-agent problem where the first one delegates her investments to the second through a contract. First, we consider the robust utility maximization problem in financial market models, where we formulate conditions for its solvability without assuming compactness of the densities of the uncertainty set, which is a set of measures upon which the maximizing agent performs robust investments. These conditions are stated in terms of functional spaces which generally correspond to modular spaces, through which we prove a minimax equality and the existence of optimal strategies. In complete markets the space is an Orlicz one, and upon explicitly verifying its reflexivity we obtain in addition the existence of a worst-case measure, which we fully characterize. Secondly, we turn our attention to stochastic optimal control, where we provide a sensitivity analysis of some parameterized variants of such problems. The main tool is the correspondence between the adjoint states appearing in a (weak) stochastic Pontryagin principle and the Lagrange multipliers associated with the controlled equation when viewed as a constraint. The sensitivity analysis is then deployed in the case of convex problems and additive or multiplicative perturbations. In a final part, we turn to Principal-Agent problems in discrete time. Here we apply, in great generality, the tools of conditional analysis to the case of linear contracts and show that most results known in the literature for very specific instances of the problem carry over to a much broader setting. In particular, the existence of a first-best optimal contract and its implementability by the Agent are obtained.
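For readers skimming this entry, the robust problem and the minimax equality it refers to can be stated schematically as follows; the notation (admissible strategies \(\mathcal{A}(x)\), uncertainty set \(\mathcal{Q}\), terminal wealth \(X_T^{x,\pi}\)) is a simplified generic choice, not necessarily the author's exact setting:

\[
u(x) \;=\; \sup_{\pi \in \mathcal{A}(x)} \, \inf_{Q \in \mathcal{Q}} \, \mathbb{E}_{Q}\!\left[U\!\left(X_T^{x,\pi}\right)\right]
\;=\; \inf_{Q \in \mathcal{Q}} \, \sup_{\pi \in \mathcal{A}(x)} \mathbb{E}_{Q}\!\left[U\!\left(X_T^{x,\pi}\right)\right].
\]

The conditions on the modular (or, in complete markets, Orlicz) spaces are what justify exchanging the supremum and infimum without compactness of \(\mathcal{Q}\); a measure attaining the outer infimum on the right-hand side is the worst-case measure characterized in the complete-market case.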
683

Sensitivity Analysis and Material Parameter Estimation using Electromagnetic Modelling / Känslighetsanalys och estimering av materialparametrar med elektromagnetisk modellering

Sjödén, Therese January 2012 (has links)
Estimating parameters is the problem of finding their values from measurements and modelling. Parameters describe properties of a system; materials, for instance, are defined by mechanical, electrical, and chemical parameters. Fisher information is an information measure, giving information about how changes in the parameter affect the estimation. The Fisher information includes the physical model of the problem and the statistical model of noise. The Cramér-Rao bound is the inverse of the Fisher information and gives the best possible variance for any unbiased estimator. This thesis considers aspects of sensitivity analysis in two applied material parameter estimation problems. Sensitivity analysis with the Fisher information and the Cramér-Rao bound is used as a tool for evaluation of measurement feasibility, comparison of measurement set-ups, and as a quantitative measure of the trade-off between accuracy and resolution in inverse imaging. The first application is the estimation of the wood grain angle parameter in trees and logs. The grain angle is the angle between the direction of the wood fibres and the direction of growth; a large grain angle correlates strongly with twist in sawn timber. In the thesis, microwave measurements are argued to be a fast and robust measurement technique, and electromagnetic modelling is applied, exploiting the anisotropic properties of wood. Both two-dimensional and three-dimensional models are considered. Mathematical modelling is essential, lowering the complexity and speeding up the computations. According to a sensitivity analysis with the Cramér-Rao bound, estimation of the wood grain angle with microwaves is feasible. The second application is electrical impedance tomography, where the conductivity of an object is estimated from surface measurements. Electrical impedance tomography has applications in, for example, medical imaging, geological surveillance, and wood evaluation. Different configurations and noise models are evaluated with sensitivity analysis for a two-dimensional electrical impedance tomography problem. The relation between accuracy and resolution is also analysed using the Fisher information. To conclude, sensitivity analysis is employed in this thesis as a method to enhance material parameter estimation. The sensitivity analysis methods are general and applicable also to other parameter estimation problems. / Estimering av parametrar är att finna deras värde utifrån mätningar och modellering. Parametrar beskriver egenskaper hos system och till exempel material kan definieras med mekaniska, elektriska och kemiska parametrar. Fisherinformation är ett informationsmått som ger information om hur ändringar i en parameter påverkar estimeringen. Fisherinformationen ges av en fysikalisk modell av problemet och en statistisk modell av mätbruset. Cramér-Rao-gränsen är inversen av Fisherinformationen och ger den bästa möjliga variansen för alla väntevärdesriktiga estimatorer. Den här avhandlingen behandlar aspekter av känslighetsanalys i två tillämpade estimeringsproblem för materialparametrar. Känslighetsanalys med Fisherinformation och Cramér-Rao-gränsen används som ett redskap för utvärdering av möjligheten att mäta och för jämförelser av mätuppställningar, samt som ett kvantitativt mått på avvägningen mellan noggrannhet och upplösning för inversa bilder. Den första tillämpningen är estimering av fibervinkeln hos träd och stockar. Fibervinkeln är vinkeln mellan växtriktningen och riktningen hos träfibern och en stor fibervinkel är relaterad till problem med formstabilitet i färdiga brädor. Mikrovågsmätningar av fibervinkeln presenteras som en snabb och robust mätteknik. I avhandlingen beskrivs två- och tredimensionella elektromagnetiska modeller som utnyttjar anisotropin hos trä. Eftersom matematisk modellering minskar komplexiteten och beräkningstiden är det en viktig del i estimeringen. Enligt känslighetsanalys med Cramér-Rao-gränsen är estimering av fibervinkeln hos trä möjlig. Den andra tillämpningen är elektrisk impedanstomografi, där ledningsförmågan hos objekt bestäms genom mätningar på ytan. Elektrisk impedanstomografi har tillämpningar inom till exempel medicinska bilder, geologisk övervakning och trämätningar. Olika mätkonfigurationer och brusmodeller utvärderas med känslighetsanalys för ett tvådimensionellt exempel på elektrisk impedanstomografi. Relationen mellan noggrannhet och upplösning analyseras med Fisherinformation. För att sammanfatta beskrivs känslighetsanalys som en metod för att förbättra estimeringen av materialparametrar. Metoderna för känslighetsanalys är generella och kan tillämpas också på andra estimeringsproblem för parametrar.
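As a quick reminder of the two quantities this abstract builds on, in generic notation (a measurement vector \(\mathbf{y}\) with density \(p(\mathbf{y};\boldsymbol{\theta})\), rather than the specific wood or tomography models of the thesis):

\[
\mathcal{I}(\boldsymbol{\theta}) \;=\; \mathbb{E}\!\left[\left(\frac{\partial \ln p(\mathbf{y};\boldsymbol{\theta})}{\partial \boldsymbol{\theta}}\right)\!\left(\frac{\partial \ln p(\mathbf{y};\boldsymbol{\theta})}{\partial \boldsymbol{\theta}}\right)^{\!\mathsf{T}}\right],
\qquad
\operatorname{Cov}\bigl(\hat{\boldsymbol{\theta}}\bigr) \;\succeq\; \mathcal{I}(\boldsymbol{\theta})^{-1}.
\]

The Fisher information \(\mathcal{I}(\boldsymbol{\theta})\) combines the physical model (through \(p\)) with the statistical noise model, and the Cramér-Rao bound \(\mathcal{I}(\boldsymbol{\theta})^{-1}\) is the smallest covariance attainable by any unbiased estimator \(\hat{\boldsymbol{\theta}}\), which is why it serves here as a feasibility and comparison measure.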
684

Mathematical modelling of metabolism and acidity in cancer

McGillen, Jessica Buono January 2014 (has links)
Human cancers exhibit the common phenotype of elevated glycolytic metabolism, which causes acidification of the tissue microenvironment and may facilitate tumour invasion. In this thesis, we use mathematical models to address a series of open problems underlying the glycolytic tumour phenotype and its attendant acidity. We first explore tissue-scale consequences of metabolically derived acid. Incorporating more biological detail into a canonical model of acidity at the tumour-host interface, we extend the range of tumour behaviours captured by the modelling framework. We then carry out an asymptotic travelling wave analysis to express invasive tumour properties in terms of fundamental parameters, and find that interstitial gaps between an advancing tumour and retreating healthy tissue, characteristic of aggressive invasion and comprising a controversial feature of the original model, are less significant under our generalised formulation. Subsequently, we evaluate a potential role of lactate---historically assumed to be a passive byproduct of glycolytic metabolism---in a perfusion-dependent metabolic symbiosis that was recently proposed as a beneficial tumour behaviour. Upon developing a minimal model of dual glucose-lactate consumption in vivo and employing a multidimensional sensitivity analysis, we find that symbiosis may not be straightforwardly beneficial for our model tumour. Moreover, new in vitro experiments, carried out by an experimental collaborator, place U87 glioblastoma tumours in a weakly symbiotic parameter regime despite their clinical malignancy. These results suggest that supporting intratumoural metabolic cooperation is unlikely to be an important role for lactate. Finally, we examine the complex pH regulation system that governs expulsion of metabolically derived acid loads across tumour cell membranes. This system differs from the healthy system by expression of only a few key proteins, yet its dynamics are non-intuitive in the crowded and poorly perfused in vivo environment. We systematically develop a model of tumour pH regulation, beginning with a single-cell scenario and progressing to a spheroid, within a Bayesian framework that incorporates information from in vitro data contributed by a second experimental collaborator. We predict that a net effect of pH regulation is a straightforward transmembrane pH gradient, but also that existing treatments are unable to disrupt the system strongly enough to cause tumour cell death. Taken together, our models help to elucidate previously unresolved features of glycolytic tumour metabolism, and illustrate the utility of a combined mathematical, statistical, and experimental approach for testing biological hypotheses. Opportunities for further investigation are discussed.
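For orientation, one standard non-dimensional statement of the canonical acid-mediated invasion model that the first chapter generalises reads as follows (this is the textbook form with our symbols; the thesis's extended formulation adds further biological detail):

\[
\frac{\partial N}{\partial t} = r_N N\!\left(1-\frac{N}{K_N}\right) - d_N L N,
\qquad
\frac{\partial T}{\partial t} = r_T T\!\left(1-\frac{T}{K_T}\right) + \nabla\!\cdot\!\left[D_T\!\left(1-\frac{N}{K_N}\right)\nabla T\right],
\qquad
\frac{\partial L}{\partial t} = r_L T - d_L L + D_L \nabla^2 L,
\]

where \(N\), \(T\) and \(L\) denote healthy tissue, tumour and excess acid. Acid produced by the tumour (\(r_L T\)) degrades healthy tissue (\(-d_N L N\)), and the travelling-wave solutions of such a system are what the asymptotic analysis characterises, including the width of any interstitial gap between the advancing tumour front and the retreating healthy tissue.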
685

Real-time Structural Health Monitoring of Nonlinear Hysteretic Structures

Nayyerloo, Mostafa January 2011 (has links)
The great social and economic impact of earthquakes has made it necessary to develop novel structural health monitoring (SHM) solutions for increasing the level of structural safety and assessment. SHM is the process of comparing the current state of a structure's condition with a healthy baseline state to detect the existence, location, and degree of likely damage during or after a damaging input, such as an earthquake. Many SHM algorithms have been proposed in the literature. However, a large majority of these algorithms cannot be implemented in real time. Therefore, their results would not be available during or immediately after a major event for urgent post-event response and decision making. Further, these off-line techniques are not capable of providing the input information required by structural control systems for damage mitigation. The small number of real-time SHM (RT-SHM) methods proposed in the past resolve these issues. However, those approaches have significant computational complexity and typically do not manage the nonlinear cases directly associated with relevant damage metrics. Finally, many available SHM methods require full structural response measurement, including velocities and displacements, which are typically difficult to measure. All these issues make implementation of many existing SHM algorithms very difficult, if not impossible. This thesis proposes simpler, more suitable algorithms utilising a nonlinear Bouc-Wen hysteretic baseline model for RT-SHM of a large class of nonlinear hysteretic structures. The RT-SHM algorithms are devised so that they can accommodate different levels of availability of design data or measured structural responses, and are therefore applicable to both existing and new structures. The second focus of the thesis is on developing a high-speed, high-resolution seismic structural displacement measurement sensor to enable these methods and many other SHM approaches, using line-scan cameras as a low-cost and powerful means of measuring structural displacements at high sampling rates and high resolution. Overall, the results presented are thus significant steps towards developing smart, damage-free structures and providing more reliable information for post-event decision making.
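For context, the Bouc-Wen hysteretic baseline model referred to above can be written, in one common single-degree-of-freedom form (our notation; the thesis may use a different parametrisation), as

\[
m\ddot{x} + c\dot{x} + \alpha k x + (1-\alpha)k z = -m\ddot{x}_g,
\qquad
\dot{z} = A\dot{x} - \beta\,\lvert\dot{x}\rvert\,\lvert z\rvert^{\,n-1} z - \gamma\,\dot{x}\,\lvert z\rvert^{\,n},
\]

where \(x\) is the structural displacement, \(\ddot{x}_g\) the ground acceleration, \(z\) the hysteretic displacement governing the nonlinear part of the restoring force, and \(\alpha, k, A, \beta, \gamma, n\) the parameters whose identified values, and their changes over time, provide the damage metrics that the RT-SHM algorithms track.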
686

雙變量Gamma與廣義Gamma分配之探討 / A study of bivariate gamma and generalized gamma distributions

曾奕翔 Unknown Date (has links)
Stacy (1962)首先提出廣義伽瑪分配 (generalized gamma distribution)，此分布被廣泛應用於存活分析 (survival analysis) 以及可靠度 (reliability) 中壽命時間的資料描述。事實上，像是指數分配 (exponential distribution)、韋伯分配 (Weibull distribution) 以及伽瑪分配 (gamma distribution) 都是廣義伽瑪分配的一個特例。 Bologna (1987)提出一個特殊的雙變量廣義伽瑪分配 (bivariate generalized gamma distribution) 可以經由雙變量常態分配 (bivariate normal distribution) 所推得。我們根據他的想法，提出多變量廣義伽瑪分配可以經由多變量常態分配所推得。在過去的研究中，學者們做了許多有關雙變量伽瑪分配的研究。當我們提到雙變量常態分配，由於其分配的型式為唯一的，所以沒有任何人對其分配的型式有疑問。然而，雙變量伽瑪分配卻有很多不同的型式。 這篇論文的架構如下。在第二章中，我們介紹並討論雙變量廣義伽瑪分配可以經由雙變量常態分配所推得，接著推導參數估計以及介紹模擬的程序。在第三章中，我們介紹一些對稱以及非對稱的雙變量伽瑪分配，接著拓展到雙變量廣義伽瑪分配，有關參數的估計以及模擬結果也將在此章中討論。在第三章最後，我們建構參數的敏感度分析 (sensitivity analysis)。最後，在第四章中，我們陳述結論以及未來研究方向。 / The generalized gamma distribution was introduced by Stacy (1962). This distribution is useful for describing lifetime data in survival analysis and reliability. In fact, it includes the widely used exponential, Weibull, and gamma distributions as special cases. Bologna (1987) showed that a special bivariate generalized gamma distribution can be derived from a bivariate normal distribution. Following his idea, we show that a multivariate generalized gamma distribution can be derived from a multivariate normal distribution. In the past, researchers have spent much time working on bivariate gamma distributions. When a bivariate normal distribution is mentioned, no one is puzzled about its form, since it has only one form. However, there are various forms of bivariate gamma distributions. The structure of this paper is as follows. In Chapter 2, we introduce and discuss the bivariate generalized gamma distribution, and the multivariate generalized gamma distribution is then derived; we also develop parameter estimation and a simulation procedure. In Chapter 3, we introduce some symmetrical and asymmetrical bivariate gamma distributions and then extend them to bivariate generalized gamma distributions. Problems of parameter estimation and simulation results are also discussed in Chapter 3. In addition, sensitivity analyses of the parameter estimation are conducted. Finally, we state conclusions and future work in Chapter 4.
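For reference, Stacy's generalized gamma density in one common parametrisation (scale \(a>0\), shape \(d>0\), power \(p>0\); the thesis may use an equivalent but differently labelled parametrisation) is

\[
f(x;a,d,p) \;=\; \frac{p}{a^{d}\,\Gamma(d/p)}\; x^{d-1} \exp\!\left\{-\left(\frac{x}{a}\right)^{p}\right\},
\qquad x>0,
\]

which reduces to the gamma distribution for \(p=1\), the Weibull distribution for \(d=p\), and the exponential distribution for \(d=p=1\), exactly the special cases mentioned above.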
687

Conception Optimale Intégrée d'une chaîne éolienne "passive" : analyse de robustesse, validation expérimentale / Integrated Optimal Design of a passive wind turbine system : robust analysis, experimental validation

Tran, Duc-Hoan 27 September 2010 (has links)
Ce travail présente une méthodologie de Conception Optimale Intégrée (COI) d'un système éolien entièrement passif pour offrir un compromis coût-fiabilité–performance très satisfaisant. En l'absence d'électronique de puissance et de contrôle par MPPT, le dispositif n'est efficace que si l'adaptation des constituants est optimale. L'extraction de vent ainsi que les pertes globales du système sont donc optimisées à l'aide d'un algorithme génétique multicritère pour augmenter l'efficacité énergétique et minimiser la masse pour un profil de vent donné. La globalité du système (turbine – génératrice – redresseur – stockage) a été modélisée pour parvenir aux résultats d'optimisation et à la réalisation d'un prototype correspondant à une solution particulière. Les résultats obtenus montrent, d'une part, la cohérence entre modèles et expérience. D'autre part, il est possible, pour un profil de vent donné, d'obtenir une configuration optimale de l'ensemble génératrice – pont redresseur présentant des caractéristiques analogues à celles d'architectures « actives » plus complexes, associées à des lois de contrôle par MPPT. Suite à une analyse de sensibilité des performances aux paramètres, une de nos contributions concerne une approche de conception intégrant les questions de robustesse au sein même du processus d'optimisation. / This work deals with an Integrated Optimal Design (IOD) methodology for a fully passive wind turbine system offering a very good trade-off in terms of cost, reliability, and performance. Without active electronic devices (power electronics and MPPT control), such an architecture is efficient only if all components are mutually adapted: this can be achieved through an Integrated Optimal Design (IOD) approach. Wind energy extraction as well as overall system losses are then optimized with a multiobjective genetic algorithm which aims at concurrently maximizing energy efficiency and minimizing the weight of the wind turbine system for a given wind cycle. The whole system (turbine, generator, diode rectifier, battery DC bus) has been modelled to obtain the optimization results and, finally, to select a particular solution for experimental validation. On the one hand, the results obtained show the consistency between models and experiments. On the other hand, for a reference wind cycle, it is possible to obtain an optimal generator, rectifier, and DC bus combination whose energy efficiency is nearly equivalent to that obtained with active, more complex architectures with MPPT control. Based on a sensitivity analysis of performance versus parametric uncertainties, one major contribution deals with a design methodology integrating robustness issues inside the optimization process.
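To illustrate the kind of trade-off the multiobjective optimization explores, the following self-contained sketch screens randomly generated candidate designs and keeps the Pareto front between extracted energy (to be maximized) and mass (to be minimized). The design variables and the surrogate model are invented for illustration only; the thesis optimizes a full electromagnetic model of the turbine-generator-rectifier-battery chain with a multiobjective genetic algorithm rather than random screening.

```python
# Illustrative only: random screening of invented designs followed by Pareto filtering.
import random

random.seed(0)

def evaluate(design):
    """Toy surrogate (not the thesis's model): more poles or a larger radius
    yields more extracted energy but also more mass."""
    poles, radius = design
    energy = 10.0 * poles * radius - 0.5 * radius ** 2
    mass = 2.0 * poles + 15.0 * radius
    return energy, mass

candidates = [(random.randint(4, 24), random.uniform(0.1, 0.6)) for _ in range(200)]
scored = [(d, *evaluate(d)) for d in candidates]

def dominated(a, b):
    """True if design b dominates design a: no worse on both objectives, strictly better on one."""
    return b[1] >= a[1] and b[2] <= a[2] and (b[1] > a[1] or b[2] < a[2])

pareto = [a for a in scored if not any(dominated(a, b) for b in scored)]
for design, energy, mass in sorted(pareto, key=lambda item: item[2]):
    print(f"poles={design[0]:2d}  radius={design[1]:.2f}  energy={energy:7.1f}  mass={mass:5.1f}")
```

Every design kept by the filter is a defensible compromise; choosing one point on this front for prototyping corresponds to the particular solution selected for experimental validation in the thesis.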
688

Prédiction des impacts pharmacocinétiques des interactions médicamenteuses impliquant des CYP3A et les glycoprotéines-P : développement de modèles physiologiques et analyse de sensibilité / Prediction of the pharmacokinetic impact of drug-drug interactions involving CYP3A and P-glycoproteins: development of physiological models and sensitivity analysis

Fenneteau, Frédérique 11 1900 (has links)
Les propriétés pharmacocinétiques d’un nouveau médicament et les risques d’interactions médicamenteuses doivent être investigués très tôt dans le processus de recherche et développement. L’objectif principal de cette thèse était de concevoir des approches prédictives de modélisation du devenir du médicament dans l’organisme en présence et en absence de modulation d’activité métabolique et de transport. Le premier volet de recherche consistait à intégrer, dans un modèle pharmacocinétique à base physiologique (PBPK), le transport d’efflux membranaire gouverné par les glycoprotéines-P (P-gp) dans le cœur et le cerveau. Cette approche, basée sur des extrapolations in vitro-in vivo, a permis de prédire la distribution tissulaire de la dompéridone chez des souris normales et des souris déficientes pour les gènes codant pour la P-gp. Le modèle a confirmé le rôle protecteur des P-gp au niveau cérébral, et a suggéré un rôle négligeable des P-gp dans la distribution tissulaire cardiaque pour la dompéridone. Le deuxième volet de cette recherche était de procéder à l’analyse de sensibilité globale (ASG) du modèle PBPK précédemment développé, afin d’identifier les paramètres importants impliqués dans la variabilité des prédictions, tout en tenant compte des corrélations entre les paramètres physiologiques. Les paramètres importants ont été identifiés et étaient principalement les paramètres limitants des mécanismes de transport à travers la membrane capillaire. Le dernier volet du projet doctoral consistait à développer un modèle PBPK apte à prédire les profils plasmatiques et paramètres pharmacocinétiques de substrats de CYP3A administrés par voie orale à des volontaires sains, et à quantifier l’impact d’interactions médicamenteuses métaboliques (IMM) sur la pharmacocinétique de ces substrats. Les prédictions des profils plasmatiques et des paramètres pharmacocinétiques des substrats des CYP3A ont été très comparables à ceux mesurés lors d’études cliniques. Quelques écarts ont été observés entre les prédictions et les profils plasmatiques cliniques mesurés lors d’IMM. Cependant, l’impact de ces inhibitions sur les paramètres pharmacocinétiques des substrats étudiés et l’effet inhibiteur des furanocoumarins contenus dans le jus de pamplemousse ont été prédits dans un intervalle d’erreur très acceptable. Ces travaux ont contribué à démontrer la capacité des modèles PBPK à prédire les impacts pharmacocinétiques des interactions médicamenteuses avec une précision acceptable et prometteuse. / Early knowledge of the pharmacokinetic properties of a new drug candidate and good characterization of the impact of drug-drug interactions (DDIs) on those properties are of crucial importance in the process of drug research and development. The main objective of this thesis was to develop PBPK models able to predict drug disposition in the absence and presence of modulation of metabolic and transport activity. The first part of this work aimed to develop a PBPK model that incorporates the efflux function of P-gp expressed in various tissues, in order to predict the impact of P-gp activity modulation on drug distribution. This approach, based on in vitro-in vivo extrapolation for estimating the transport-related parameters, allowed prediction of domperidone distribution in the heart and brain of wild-type mice and P-gp-deficient mice. The model pointed out the protective function of P-gp in the brain, whereas it showed a negligible protective effect of P-gp in the heart. The second part of the project aimed to perform a global sensitivity analysis of the previously developed PBPK model, in order to investigate how the uncertainty and variability of the correlated physiological parameters influence the outcome of the drug distribution process. While a moderate variability of the model predictions was observed, this analysis confirmed the importance of a better quantitative characterization of the parameters related to transport processes through the blood-tissue membrane. Accounting for the input correlations allowed delineation of the true contribution of each input to the variability of the model outcome. The last part of the project consisted in predicting the pharmacokinetics of selected CYP3A substrates administered as a single oral dose to humans, alone or with an inhibitor. Successful predictions were obtained for a single administration of the CYP3A substrates. Some deviations were observed between the predictions and in vivo plasma profiles in the presence of DDIs. However, the impact of inhibition on the PK parameters of the selected substrates and the impact of grapefruit-juice-mediated inhibition on the extent of intestinal pre-systemic elimination were predicted within a very acceptable error range. Overall, this thesis demonstrated the ability of PBPK models to predict DDIs with promising accuracy.
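As a purely illustrative companion to the first part, the sketch below simulates a minimal two-compartment plasma/brain model in which brain uptake is governed by a passive influx clearance and a P-gp-like efflux clearance, then probes how brain exposure responds to each clearance. The model structure and every parameter value are invented; the thesis relies on a full whole-body PBPK model and on a global, correlation-aware sensitivity analysis rather than the crude one-at-a-time perturbations shown here.

```python
# Toy model with invented parameters: not the thesis's PBPK model of domperidone.

def simulate(cl_efflux, cl_influx=2.0, cl_elim=5.0, v_plasma=3.0, v_brain=0.4,
             dose=10.0, t_end=24.0, dt=0.01):
    """Euler integration of drug amounts (mg) in plasma and brain; returns the brain AUC."""
    a_plasma, a_brain, auc_brain = dose, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        c_plasma, c_brain = a_plasma / v_plasma, a_brain / v_brain
        a_plasma += (-cl_elim * c_plasma - cl_influx * c_plasma + cl_efflux * c_brain) * dt
        a_brain += (cl_influx * c_plasma - cl_efflux * c_brain) * dt
        auc_brain += (a_brain / v_brain) * dt
    return auc_brain

wild_type = simulate(cl_efflux=8.0)     # active P-gp-like efflux at the barrier
knockout = simulate(cl_efflux=0.5)      # "P-gp-deficient" scenario: efflux nearly absent
print(f"brain AUC: wild-type {wild_type:.2f}, knockout {knockout:.2f}, "
      f"ratio {knockout / wild_type:.1f}")

# Crude one-at-a-time sensitivity: relative change in brain AUC per +10% parameter change.
for name, kwargs in [("cl_efflux", dict(cl_efflux=8.8)),
                     ("cl_influx", dict(cl_efflux=8.0, cl_influx=2.2)),
                     ("cl_elim", dict(cl_efflux=8.0, cl_elim=5.5))]:
    sensitivity = (simulate(**kwargs) - wild_type) / (0.1 * wild_type)
    print(f"normalized sensitivity of brain AUC to {name}: {sensitivity:+.2f}")
```

Even in this toy setting, removing the efflux clearance sharply increases brain exposure, mirroring the protective role of cerebral P-gp reported above, while the sensitivity step shows why identifying the limiting transport parameters matters for the variability of the predictions.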
689

Comparaison de quatre méthodes pour le traitement des données manquantes au sein d’un modèle multiniveau paramétrique visant l’estimation de l’effet d’une intervention / Comparison of four methods for handling missing data in a parametric multilevel model aimed at estimating the effect of an intervention

Paquin, Stéphane 03 1900 (has links)
Les données manquantes sont fréquentes dans les enquêtes et peuvent entraîner d’importantes erreurs d’estimation de paramètres. Ce mémoire méthodologique en sociologie porte sur l’influence des données manquantes sur l’estimation de l’effet d’un programme de prévention. Les deux premières sections exposent les possibilités de biais engendrées par les données manquantes et présentent les approches théoriques permettant de les décrire. La troisième section porte sur les méthodes de traitement des données manquantes. Les méthodes classiques sont décrites ainsi que trois méthodes récentes. La quatrième section contient une présentation de l’Enquête longitudinale et expérimentale de Montréal (ELEM) et une description des données utilisées. La cinquième expose les analyses effectuées, elle contient : la méthode d’analyse de l’effet d’une intervention à partir de données longitudinales, une description approfondie des données manquantes de l’ELEM ainsi qu’un diagnostic des schémas et du mécanisme. La sixième section contient les résultats de l’estimation de l’effet du programme selon différents postulats concernant le mécanisme des données manquantes et selon quatre méthodes : l’analyse des cas complets, le maximum de vraisemblance, la pondération et l’imputation multiple. Ils indiquent (I) que le postulat sur le type de mécanisme MAR des données manquantes semble influencer l’estimation de l’effet du programme et que (II) les estimations obtenues par différentes méthodes d’estimation mènent à des conclusions similaires sur l’effet de l’intervention. / Missing data are common in empirical research and can lead to significant errors in parameter estimation. This dissertation in the field of methodological sociology addresses the influence of missing data on the estimation of the impact of a prevention program. The first two sections outline the potential bias caused by missing data and present the theoretical background used to describe them. The third section focuses on methods for handling missing data; conventional methods are presented, as well as three recent ones. The fourth section contains a description of the Montreal Longitudinal Experimental Study (MLES) and of the data used. The fifth section presents the analyses performed; it contains the method for analysing the effect of an intervention from longitudinal data, a detailed description of the missing data in the MLES, and a diagnosis of their patterns and mechanism. The sixth section contains the results of estimating the effect of the program under different assumptions about the mechanism of missing data and by four methods: complete-case analysis, maximum likelihood, weighting, and multiple imputation. They indicate (I) that the assumption on the type of MAR mechanism seems to affect the estimate of the program's impact and (II) that the estimates obtained using different estimation methods lead to similar conclusions about the intervention's effect.
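As an illustration of two of the four methods compared (complete-case analysis and multiple imputation), the following self-contained sketch simulates a toy dataset with an outcome missing at random, then estimates a treatment effect both ways and pools the imputations with Rubin's rules. The data-generating process, the single-level regression, and the simplified ("improper") imputation step are invented for illustration and are much simpler than the multilevel models and MLES data analysed in the thesis.

```python
# Simulated toy example, not the MLES data; single-level OLS instead of a multilevel model.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
covariate = rng.normal(size=n)
treatment = rng.integers(0, 2, size=n)
outcome = 1.0 + 0.5 * treatment + 0.8 * covariate + rng.normal(size=n)

# MAR mechanism: the outcome is more often missing when the observed covariate is low.
p_missing = 1.0 / (1.0 + np.exp(2.0 + 1.5 * covariate))
y_obs = np.where(rng.random(n) < p_missing, np.nan, outcome)

def ols(y, X):
    """Least-squares coefficients and their estimated sampling variances."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.sum((y - X @ beta) ** 2) / (len(y) - X.shape[1])
    return beta, sigma2 * np.linalg.inv(X.T @ X).diagonal()

X = np.column_stack([np.ones(n), treatment, covariate])
observed = ~np.isnan(y_obs)

# 1) Complete-case analysis: drop every record with a missing outcome.
beta_cc, _ = ols(y_obs[observed], X[observed])

# 2) Simplified multiple imputation: draw missing outcomes from a regression fitted on
#    the observed cases, then pool the m estimates with Rubin's rules.
m = 20
beta_fit, _ = ols(y_obs[observed], X[observed])
resid_sd = np.std(y_obs[observed] - X[observed] @ beta_fit)
estimates, variances = [], []
for _ in range(m):
    y_imp = y_obs.copy()
    y_imp[~observed] = X[~observed] @ beta_fit + rng.normal(scale=resid_sd, size=(~observed).sum())
    b, v = ols(y_imp, X)
    estimates.append(b[1])
    variances.append(v[1])
pooled = np.mean(estimates)
total_var = np.mean(variances) + (1 + 1 / m) * np.var(estimates, ddof=1)
print(f"true effect 0.50 | complete-case {beta_cc[1]:.3f} | "
      f"multiple imputation {pooled:.3f} (SE {np.sqrt(total_var):.3f})")
```

A proper multiple imputation would also draw the regression coefficients from their posterior before imputing, and weighting or full-likelihood approaches are set up differently, but the pooling logic shown here is the same one applied to the program-effect estimates described above.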
690

Développement de nouveaux plans d'expériences uniformes adaptés à la simulation numérique en grande dimension / Development of new uniform experimental designs suited to high-dimensional numerical simulation

Santiago, Jenny 04 February 2013 (has links)
Cette thèse propose une méthodologie pour des études en simulation numérique en grande dimension. Elle se décline en différentes étapes : construction de plan d'expériences approprié, analyse de sensibilité et modélisation par surface de réponse. Les plans d'expériences adaptés à la simulation numérique sont les "Space Filling Designs", qui visent à répartir uniformément les points dans l'espace des variables d'entrée. Nous proposons l'algorithme WSP pour construire ces plans, rapidement, avec de bons critères d'uniformité, même en grande dimension. Ces travaux proposent la construction d'un plan polyvalent, qui sera utilisé pour les différentes étapes de l'étude : de l'analyse de sensibilité aux surfaces de réponse. L'analyse de sensibilité sera réalisée avec une approche innovante sur les points de ce plan, pour détecter le sous-ensemble de variables d'entrée réellement influentes. Basée sur le principe de la méthode de Morris, cette approche permet de hiérarchiser les variables d'entrée selon leurs effets. Le plan initial est ensuite "replié" dans le sous-espace des variables d'entrée les plus influentes, ce qui nécessite au préalable une étude pour vérifier l'uniformité de la répartition des points dans l'espace réduit et ainsi détecter d'éventuels amas et/ou lacunes. Ainsi, après réparation, ce plan est utilisé pour l'étape ultime : étude de surfaces de réponse. Nous avons alors choisi d'utiliser l'approche des Support Vector Regression, indépendante de la dimension et rapide dans sa mise en place. Obtenant des résultats comparables à ceux de l'approche classique (Krigeage), cette technique semble prometteuse pour étudier des phénomènes complexes en grande dimension. / This thesis proposes a methodology for numerical simulation studies in high dimensions. It proceeds in several steps: setting up an experimental design, performing a sensitivity analysis, and then modelling with a response surface. In numerical simulation we use Space Filling Designs, which spread the points uniformly over the entire domain of the input variables. The construction of an experimental design in high dimensions must be efficient, with good uniformity properties, and it must also be fast. We propose the WSP algorithm to construct such designs. The design is then used in all steps of the methodology, from sensitivity analysis to modelling, making it a versatile design. A sensitivity analysis allows the identification of the influential factors. Adapting the principle of the Morris method, this approach classifies the inputs into three groups according to their effects. The experimental design is then folded over into the subspace of the influential inputs. This operation can degrade the uniformity properties of the design by creating gaps and clusters, so it is necessary to repair it by removing clusters and filling gaps. We propose a step-by-step approach to provide a suitable repair for each experimental design. The repaired design is then used for the final step: modelling from the response surface. We consider a Support Vector Machines method because its construction is not affected by the dimension. Easy to set up and giving good results, similar to those obtained by Kriging, the Support Vector Regression method is an alternative for studying complex phenomena in high dimensions.
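To make the screening step concrete, here is a minimal sketch of elementary-effects ranking in the spirit of the Morris method; it is a simplified radial variant with a fixed step applied to an invented test function, not the thesis's simulators, three-group classification, or WSP-based design.

```python
# Illustrative only: rank input factors of a toy black box by mean absolute elementary effect.
import numpy as np

rng = np.random.default_rng(7)

def simulator(x):
    """Toy function on [0, 1]^5: x0 and x2 are strongly influential, x4 mildly, x1 and x3 barely."""
    return 5.0 * x[0] + 4.0 * x[2] ** 2 + 0.5 * np.sin(2.0 * np.pi * x[4]) + 0.01 * x[1] * x[3]

k, r, delta = 5, 30, 0.2                      # number of factors, repetitions, step size
effects = np.zeros((r, k))
for t in range(r):
    base = rng.random(k) * (1.0 - delta)      # keep every +delta step inside [0, 1]
    f_base = simulator(base)
    for i in range(k):
        stepped = base.copy()
        stepped[i] += delta
        effects[t, i] = (simulator(stepped) - f_base) / delta   # elementary effect of factor i

mu_star = np.abs(effects).mean(axis=0)        # overall influence (mean absolute effect)
sigma = effects.std(axis=0)                   # indicator of nonlinearity and interactions
for i in np.argsort(mu_star)[::-1]:
    print(f"x{i}: mu* = {mu_star[i]:6.2f}, sigma = {sigma[i]:5.2f}")
```

Factors with both a small mu* and a small sigma would be discarded before folding the design over into the reduced subspace, which is the point at which the repair of clusters and gaps described above becomes necessary.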
