11

Critérios robustos de seleção de modelos de regressão e identificação de pontos aberrantes / Robust model selection criteria in regression and outliers identification

Guirado, Alia Garrudo 08 March 2019 (has links)
A Regressão Robusta surge como uma alternativa ao ajuste por mínimos quadrados quando os erros são contaminados por pontos aberrantes ou existe alguma evidência de violação das suposições do modelo. Na regressão clássica existem critérios de seleção de modelos e medidas de diagnóstico que são muito conhecidos. O objetivo deste trabalho é apresentar os principais critérios robustos de seleção de modelos e medidas de detecção de pontos aberrantes, assim como analisar e comparar o desempenho destes de acordo com diferentes cenários para determinar quais deles se ajustam melhor a determinadas situações. Os critérios de validação cruzada usando simulações de Monte Carlo e o Critério de Informação Bayesiano são conhecidos por desenvolver-se de forma adequada na identificação de modelos. Na dissertação confirmou-se este fato e além disso, suas alternativas robustas também destacam-se neste aspecto. A análise de resíduos constitui uma forte ferramenta da análise diagnóstico de um modelo, no trabalho detectou-se que a análise clássica de resíduos sobre o ajuste do modelo de regressão linear robusta, assim como a análise das ponderações das observações, são medidas de detecção de pontos aberrantes eficientes. Foram aplicados os critérios e medidas analisados ao conjunto de dados obtido da Estação Meteorológica do Instituto de Astronomia, Geofísica e Ciências Atmosféricas da Universidade de São Paulo para detectar quais variáveis meteorológicas influem na temperatura mínima diária durante o ano completo, e ajustou-se um modelo que permite identificar os dias associados à entrada de sistemas frontais. / Robust regression arises as an alternative to the least squares fit when the errors are contaminated by outliers or there is some evidence of violation of the model assumptions. In classical regression there are well-known model selection criteria and diagnostic measures.
The objective of this work is to present the main robust model selection criteria and outlier detection measures, as well as to analyze and compare their performance under different scenarios in order to determine which of them fit better in certain situations. Cross-validation criteria using Monte Carlo simulations and the Bayesian Information Criterion are known to perform well in model identification. This fact was confirmed in the dissertation, and in addition their robust alternatives also stand out in this respect. Residual analysis is a powerful diagnostic tool; in this work it was found that classical residual analysis of the robust linear regression fit, as well as the analysis of the observation weights, are efficient outlier detection measures. The criteria and measures analyzed were applied to a data set obtained from the Meteorological Station of the Institute of Astronomy, Geophysics and Atmospheric Sciences of the University of São Paulo to detect which meteorological variables influence the daily minimum temperature over the whole year, and a model was fitted that identifies the days associated with the arrival of frontal systems.
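The weight-based outlier detection this abstract describes can be sketched numerically. The following is a minimal illustration, not the thesis's actual methodology: an iteratively reweighted least-squares fit with Huber weights, in which observations whose final weight falls well below 1 are flagged as outlier candidates. All names, the tuning constant c = 1.345, and the flagging threshold are illustrative choices.

```python
import numpy as np

def huber_irls(x, y, c=1.345, n_iter=50):
    """Robust line fit by iteratively reweighted least squares with Huber weights.

    Returns the coefficients (intercept, slope) and the final observation
    weights; weights well below 1 mark outlier candidates.
    """
    n = len(y)
    Xd = np.column_stack([np.ones(n), x])          # design matrix with intercept
    w = np.ones(n)
    beta = np.zeros(2)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(Xd * sw[:, None], y * sw, rcond=None)
        r = y - Xd @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745   # robust scale (MAD)
        u = r / max(s, 1e-12)
        w = np.where(np.abs(u) <= c, 1.0, c / np.abs(u))   # Huber weight function
    return beta, w

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.5 * x + rng.normal(0, 0.2, 40)
y[5] += 8.0                                        # plant a single outlier
beta, w = huber_irls(x, y)
outliers = np.where(w < 0.5)[0]                    # low-weight observations
```

The planted point receives a weight near zero and is recovered by the weight scan, while the coefficient estimates stay close to the true line, which is the behavior the abstract reports for weight-based diagnostics on robust fits.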
12

Uncertainty Assessment of Hydrogeological Models Based on Information Theory / Bewertung der Unsicherheit hydrogeologischer Modelle unter Verwendung informationstheoretischer Grundlagen

De Aguinaga, José Guillermo 17 August 2011 (has links) (PDF)
There is a great deal of uncertainty in hydrogeological modeling. Overparametrized models increase uncertainty, since the information in the observations is spread across all of the parameters. The present study proposes a new option to reduce this uncertainty: select a model which performs well with as few calibrated parameters as possible (a parsimonious model) and calibrate it using many sources of information. Akaike's Information Criterion (AIC), proposed by Hirotugu Akaike in 1973, is a probabilistic criterion grounded in information theory that allows us to select a parsimonious model. AIC formulates parsimonious model selection as an optimization problem across a set of proposed conceptual models. AIC assessment is relatively new in groundwater modeling, and applying it with different sources of observations remains a challenge. In this dissertation, important findings on the application of AIC in hydrogeological modeling using different sources of observations are discussed. AIC is tested on groundwater models using three sets of synthetic data: hydraulic pressure, horizontal hydraulic conductivity, and tracer concentration. The impact of the following factors is analyzed: the number of observations, the types of observations, and the order of the calibrated parameters. These analyses reveal not only that the number of observations determines how complex a model can be, but also that their diversity allows for further complexity in the parsimonious model. However, a truly parsimonious model was only achieved when the order of the calibrated parameters was properly considered, meaning that parameters which provide larger improvements in model fit should be considered first.
The approach to obtain a parsimonious model applying AIC with different types of information was successfully applied to an unbiased lysimeter model using two different types of real data: evapotranspiration and seepage water. With this additional independent model assessment it was possible to underpin the general validity of this AIC approach. / Hydrogeologische Modellierung ist von erheblicher Unsicherheit geprägt. Überparametrisierte Modelle erhöhen die Unsicherheit, da gemessene Informationen auf alle Parameter verteilt sind. Die vorliegende Arbeit schlägt einen neuen Ansatz vor, um diese Unsicherheit zu reduzieren. Eine Möglichkeit, um dieses Ziel zu erreichen, besteht darin, ein Modell auszuwählen, das ein gutes Ergebnis mit möglichst wenigen Parametern liefert („parsimonious model“), und es zu kalibrieren, indem viele Informationsquellen genutzt werden. Das 1973 von Hirotugu Akaike vorgeschlagene Informationskriterium, bekannt als Akaike-Informationskriterium (engl. Akaike’s Information Criterion; AIC), ist ein statistisches Wahrscheinlichkeitskriterium basierend auf der Informationstheorie, welches die Auswahl eines Modells mit möglichst wenigen Parametern erlaubt. AIC formuliert das Problem der Entscheidung für ein gering parametrisiertes Modell als ein modellübergreifendes Optimierungsproblem. Die Anwendung von AIC in der Grundwassermodellierung ist relativ neu und stellt eine Herausforderung in der Anwendung verschiedener Messquellen dar. In der vorliegenden Dissertation werden maßgebliche Forschungsergebnisse in der Anwendung des AIC in hydrogeologischer Modellierung unter Anwendung unterschiedlicher Messquellen diskutiert. AIC wird an Grundwassermodellen getestet, bei denen drei synthetische Datensätze angewendet werden: Wasserstand, horizontale hydraulische Leitfähigkeit und Tracer-Konzentration. Die vorliegende Arbeit analysiert den Einfluss folgender Faktoren: Anzahl der Messungen, Arten der Messungen und Reihenfolge der kalibrierten Parameter. 
Diese Analysen machen nicht nur deutlich, dass die Anzahl der gemessenen Parameter die Komplexität eines Modells bestimmt, sondern auch, dass seine Diversität weitere Komplexität für gering parametrisierte Modelle erlaubt. Allerdings konnte ein solches Modell nur erreicht werden, wenn eine bestimmte Reihenfolge der kalibrierten Parameter berücksichtigt wurde. Folglich sollten zuerst jene Parameter in Betracht gezogen werden, die deutliche Verbesserungen in der Modellanpassung liefern. Der Ansatz, ein gering parametrisiertes Modell durch die Anwendung des AIC mit unterschiedlichen Informationsarten zu erhalten, wurde erfolgreich auf einen Lysimeterstandort übertragen. Dabei wurden zwei unterschiedliche reale Messwertarten genutzt: Evapotranspiration und Sickerwasser. Mit Hilfe dieser weiteren, unabhängigen Modellbewertung konnte die Gültigkeit dieses AIC-Ansatzes gezeigt werden.
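For least-squares calibration, the AIC-based selection this abstract relies on reduces to minimizing n·ln(RSS/n) + 2k over candidate models, where k counts the fitted parameters. A small illustrative sketch (synthetic polynomial candidates, not the dissertation's groundwater models):

```python
import numpy as np

def aic_least_squares(y, y_hat, k):
    """AIC in its least-squares form: n*ln(RSS/n) + 2k, k = number of parameters."""
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 60)
y = 1.0 - 0.8 * x + 0.6 * x**2 + rng.normal(0, 0.3, x.size)  # true model: quadratic

scores = {}
for degree in range(6):                 # candidate models of increasing complexity
    coef = np.polyfit(x, y, degree)
    scores[degree] = aic_least_squares(y, np.polyval(coef, x), degree + 1)

best = min(scores, key=scores.get)      # parsimonious choice: lowest AIC
```

Underfit candidates (degrees 0 and 1) are penalized through the RSS term, while the 2k term discourages needless extra parameters, so the criterion settles on a model near the true complexity.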
13

Uncertainty Assessment of Hydrogeological Models Based on Information Theory

De Aguinaga, José Guillermo 03 December 2010 (has links)
There is a great deal of uncertainty in hydrogeological modeling. Overparametrized models increase uncertainty, since the information in the observations is spread across all of the parameters. The present study proposes a new option to reduce this uncertainty: select a model which performs well with as few calibrated parameters as possible (a parsimonious model) and calibrate it using many sources of information. Akaike's Information Criterion (AIC), proposed by Hirotugu Akaike in 1973, is a probabilistic criterion grounded in information theory that allows us to select a parsimonious model. AIC formulates parsimonious model selection as an optimization problem across a set of proposed conceptual models. AIC assessment is relatively new in groundwater modeling, and applying it with different sources of observations remains a challenge. In this dissertation, important findings on the application of AIC in hydrogeological modeling using different sources of observations are discussed. AIC is tested on groundwater models using three sets of synthetic data: hydraulic pressure, horizontal hydraulic conductivity, and tracer concentration. The impact of the following factors is analyzed: the number of observations, the types of observations, and the order of the calibrated parameters. These analyses reveal not only that the number of observations determines how complex a model can be, but also that their diversity allows for further complexity in the parsimonious model. However, a truly parsimonious model was only achieved when the order of the calibrated parameters was properly considered, meaning that parameters which provide larger improvements in model fit should be considered first.
The approach to obtain a parsimonious model applying AIC with different types of information was successfully applied to an unbiased lysimeter model using two different types of real data: evapotranspiration and seepage water. With this additional independent model assessment it was possible to underpin the general validity of this AIC approach. / Hydrogeologische Modellierung ist von erheblicher Unsicherheit geprägt. Überparametrisierte Modelle erhöhen die Unsicherheit, da gemessene Informationen auf alle Parameter verteilt sind. Die vorliegende Arbeit schlägt einen neuen Ansatz vor, um diese Unsicherheit zu reduzieren. Eine Möglichkeit, um dieses Ziel zu erreichen, besteht darin, ein Modell auszuwählen, das ein gutes Ergebnis mit möglichst wenigen Parametern liefert („parsimonious model“), und es zu kalibrieren, indem viele Informationsquellen genutzt werden. Das 1973 von Hirotugu Akaike vorgeschlagene Informationskriterium, bekannt als Akaike-Informationskriterium (engl. Akaike’s Information Criterion; AIC), ist ein statistisches Wahrscheinlichkeitskriterium basierend auf der Informationstheorie, welches die Auswahl eines Modells mit möglichst wenigen Parametern erlaubt. AIC formuliert das Problem der Entscheidung für ein gering parametrisiertes Modell als ein modellübergreifendes Optimierungsproblem. Die Anwendung von AIC in der Grundwassermodellierung ist relativ neu und stellt eine Herausforderung in der Anwendung verschiedener Messquellen dar. In der vorliegenden Dissertation werden maßgebliche Forschungsergebnisse in der Anwendung des AIC in hydrogeologischer Modellierung unter Anwendung unterschiedlicher Messquellen diskutiert. AIC wird an Grundwassermodellen getestet, bei denen drei synthetische Datensätze angewendet werden: Wasserstand, horizontale hydraulische Leitfähigkeit und Tracer-Konzentration. Die vorliegende Arbeit analysiert den Einfluss folgender Faktoren: Anzahl der Messungen, Arten der Messungen und Reihenfolge der kalibrierten Parameter. 
Diese Analysen machen nicht nur deutlich, dass die Anzahl der gemessenen Parameter die Komplexität eines Modells bestimmt, sondern auch, dass seine Diversität weitere Komplexität für gering parametrisierte Modelle erlaubt. Allerdings konnte ein solches Modell nur erreicht werden, wenn eine bestimmte Reihenfolge der kalibrierten Parameter berücksichtigt wurde. Folglich sollten zuerst jene Parameter in Betracht gezogen werden, die deutliche Verbesserungen in der Modellanpassung liefern. Der Ansatz, ein gering parametrisiertes Modell durch die Anwendung des AIC mit unterschiedlichen Informationsarten zu erhalten, wurde erfolgreich auf einen Lysimeterstandort übertragen. Dabei wurden zwei unterschiedliche reale Messwertarten genutzt: Evapotranspiration und Sickerwasser. Mit Hilfe dieser weiteren, unabhängigen Modellbewertung konnte die Gültigkeit dieses AIC-Ansatzes gezeigt werden.
14

Risk Management Project

Yan, Lu 02 May 2012 (has links)
In order to evaluate and manage portfolio risk, we separated this project into three sections. In the first section we constructed a portfolio with 15 different stocks and six options with different strategies. The portfolio was implemented in Interactive Brokers and rebalanced weekly over five holding periods. In the second section we modeled the loss distribution of the whole portfolio with normal and Student's t distributions, computed the Value-at-Risk and expected shortfall of the portfolio loss in each holding week in detail, and then evaluated the differences between the normal and Student's t distributions. In the third section we applied the ARMA(1,1)-GARCH(1,1) model to simulate our assets and compared the polynomial tails obtained with Gaussian and t-distribution innovations.
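Under the normal loss model mentioned here, Value-at-Risk and expected shortfall have standard closed forms. A short sketch using only the Python standard library (the Student's t analogue needs that distribution's quantile and density functions, which are omitted; the parameter values are illustrative, not the project's fitted ones):

```python
from statistics import NormalDist
import math

def normal_var_es(mu, sigma, alpha=0.99):
    """VaR and expected shortfall of a loss L ~ N(mu, sigma^2) at level alpha.

    VaR_a = mu + sigma * z_a, where z_a is the standard normal alpha-quantile.
    ES_a  = mu + sigma * phi(z_a) / (1 - a), the standard closed form for the
    mean loss beyond VaR.
    """
    z = NormalDist().inv_cdf(alpha)
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)   # standard normal density
    var = mu + sigma * z
    es = mu + sigma * phi / (1 - alpha)
    return var, es

var99, es99 = normal_var_es(mu=0.0, sigma=1.0, alpha=0.99)
```

Expected shortfall always exceeds VaR at the same level; a Student's t loss model with the same scale yields larger tail measures, which is the kind of difference the project evaluates week by week.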
15

Risk Management Project

Shen, Chen 02 May 2012 (has links)
In order to evaluate and manage portfolio risk, we separated this project into three sections. In the first section we constructed a portfolio with 15 different stocks and six options with different strategies. The portfolio was implemented in Interactive Brokers and rebalanced weekly over five holding periods. In the second section we modeled the loss distribution of the whole portfolio with normal and Student's t distributions, computed the Value-at-Risk and expected shortfall of the portfolio loss in each holding week in detail, and then evaluated the differences between the normal and Student's t distributions. In the third section we applied the ARMA(1,1)-GARCH(1,1) model to simulate our assets and compared the polynomial tails obtained with Gaussian and t-distribution innovations.
16

Model selection criteria in the presence of missing data based on the Kullback-Leibler discrepancy

Sparks, JonDavid 01 December 2009 (has links)
An important challenge in statistical modeling involves determining an appropriate structural form for a model to be used in making inferences and predictions. Missing data is a very common occurrence in most research settings and can easily complicate the model selection problem. Many useful procedures have been developed to estimate parameters and standard errors in the presence of missing data; however, few methods exist for determining the actual structural form of a model when the data are incomplete. In this dissertation, we propose model selection criteria based on the Kullback-Leibler discrepancy that can be used in the presence of missing data. The criteria are developed by accounting for missing data using principles related to the expectation-maximization (EM) algorithm and bootstrap methods. We formulate the criteria for three specific modeling frameworks: the normal multivariate linear regression model, the generalized linear model, and the normal longitudinal regression model. In each framework, a simulation study is presented to investigate the performance of the criteria relative to their traditional counterparts. We consider a setting where the missingness is confined to the outcome, and also a setting where the missingness may occur in the outcome and/or the covariates. The results from the simulation studies indicate that our criteria provide better protection against underfitting than their traditional analogues. We outline the implementation of our methodology for a general discrepancy measure. An application is presented where the proposed criteria are utilized in a study that evaluates the driving performance of individuals with Parkinson's disease under low-contrast (fog) conditions in a driving simulator.
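The EM principle this abstract invokes alternates between completing the data under the current fit and refitting on the completed data. A schematic illustration, not the dissertation's criteria: a linear model with outcomes missing at random, where the E-step imputes the missing outcomes from the current coefficients and the M-step refits by least squares. In this linear-Gaussian case the fixed point coincides with the complete-case fit, which makes the mechanics easy to check.

```python
import numpy as np

def em_style_fit(X, y, missing, n_iter=200):
    """EM-flavoured fit for a linear model with missing outcomes.

    E-step: replace missing y values by predictions under the current fit.
    M-step: refit by least squares on the completed data.
    """
    beta = np.zeros(X.shape[1])
    yc = y.copy()
    for _ in range(n_iter):
        yc[missing] = X[missing] @ beta                      # E-step: impute
        beta, *_ = np.linalg.lstsq(X, yc, rcond=None)        # M-step: refit
    return beta

rng = np.random.default_rng(2)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
y = X @ np.array([1.0, 2.0]) + rng.normal(0, 0.1, 50)
missing = np.zeros(50, dtype=bool)
missing[:10] = True                                          # first 10 outcomes unobserved

beta_em = em_style_fit(X, y, missing)
beta_cc, *_ = np.linalg.lstsq(X[~missing], y[~missing], rcond=None)  # complete-case fit
```

The interest of the dissertation's criteria lies in settings (GLMs, longitudinal models, missing covariates) where this equivalence breaks down and the EM completion genuinely changes the selection problem; the sketch only shows the alternating structure itself.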
17

Statistical model building and inference about the normalized site attenuation (NSA) measurements for electromagnetic interference (EMI)

Chiu, Shih-ting 09 August 2004 (has links)
Open-site measurement of electromagnetic interference is the most direct and universally accepted standard approach for measuring radiated emissions from equipment or the radiation susceptibility of a component or equipment. Whether a site is qualified for EMI testing is decided by antenna measurements. In this work, we use data from setups with different factors to relate the measurements to the antenna configuration. A one-change-point model was used to fit the observed measurements and compare the differences between two kinds of antenna (broadband and dipole). However, a single-change-point model may not fit all data sets in this work well, so we also tried other models and applied them to the data. Furthermore, we try to set up another standard, stricter than ±4 dB, based on statistical inference results, for deciding whether a site is a better one with more precision in measuring EMI values. Finally, a Matlab program implementing the complete analysis procedure is provided, so that it may serve as a standard tool for evaluating whether a site has good measurement quality in practice.
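The one-change-point model in its simplest form can be fitted by scanning every split of the series and keeping the split that minimizes the total residual sum of squares. A minimal sketch (in Python rather than the Matlab the abstract mentions, with synthetic data in place of the antenna measurements):

```python
import numpy as np

def one_change_point(y):
    """Single change point in the mean, found by exhaustive scan.

    For each split index k, fit a constant mean to y[:k] and y[k:] and
    return the k minimizing the total residual sum of squares.
    """
    n = len(y)
    best_k, best_rss = None, np.inf
    for k in range(1, n):
        left, right = y[:k], y[k:]
        rss = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k, best_rss

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0.0, 0.5, 30),   # regime before the change
                    rng.normal(3.0, 0.5, 30)])  # regime after the change
k_hat, _ = one_change_point(y)
```

The same scan generalizes to piecewise regression segments instead of constant means, which is closer to fitting measurement curves from two antenna types.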
18

Cyanide-catalyzed C-C bond formation: synthesis of novel compounds, materials and ligands for homogeneous catalysis

Reich, Blair Jesse Ellyn 25 April 2007 (has links)
Cyanide-catalyzed aldimine coupling was employed to synthesize compounds with 1,2-ene-diamine and α-imine-amine structural motifs: 1,2,N,N'-tetraphenylethylene-1,2-diamine (13) and (±)-2,3-di-(2-hydroxyphenyl)-1,2-dihydroquinoxaline (17), respectively. Single-crystal X-ray diffraction provided solid-state structures, and density functional theory calculations were used to probe isomeric preferences within this and the related hydroxy-ketone/ene-diol system. The ene-diamine and imine-amine core structures were calculated to be essentially identical in energy. However, additional effects, such as π conjugation, in 13 render an ene-diamine structure that is slightly more stable than the imine-amine tautomer (14). In contrast, the intramolecular hydrogen bonding present in 17 significantly favors the imine-amine isomer over the ene-diamine tautomer (18). Aldimine coupling (AIC) is the nitrogen analogue of the benzoin condensation and has been applied to dialdimines, providing the first examples of cyclizations effected by cyanide-catalyzed AIC. Sodium cyanide promoted the facile, intramolecular cyclization of several dialdimines in N,N-dimethylformamide, methanol, or dichloromethane/water (phase-transfer conditions), yielding a variety of six-membered heterocycles. Under aerobic conditions, an oxidative cyclization occurs to provide the diimine heterocycle. Cyanide-catalyzed aldimine coupling was also employed as a new synthetic method for oligomerization. Nine rigidly spaced dialdimines were oxidatively coupled under aerobic conditions to yield conjugated oligoketimines and polyketimines with unprecedented structure and molecular weight (DP = 2-23, ~700-8200 g/mol). The α-diimine linkage was established based on IR spectroscopy, NMR spectroscopy, size exclusion chromatography, and X-ray crystallographic characterization of the model oxidized dimer of N-benzylidene-(p-phenoxy)-aniline.
Cyclic voltammetry indicates p-type electrical conductivity, suggesting these materials are promising candidates for plastic electronic devices. The cyanide-catalyzed benzoin condensation of 4-substituted benzaldehydes, followed by oxidation to the diketone and Schiff base condensation with two equivalents of o-aminophenol, provides 2,3-(4-X-phenyl)2-1,4-(2-hydroxyphenyl)2-1,4-diazabutadiene. The ligand is given the moniker X-dabphol. These ligands are readily metallated to form M-X-dabphol complexes. The copper complexes catalytically fix CO2 with propylene oxide to yield propylene carbonate. DFT studies, along with a comparison with Hammett parameters, help validate and elaborate the catalytic cycle and the catalytic results obtained. The nickel complex is competent for olefin epoxidation. The synthesis, characterization, X-ray structure, DFT analysis, and catalytic activity of the parent nickel dabphol complex are reported.
19

A Monte Carlo Approach to Change Point Detection in a Liver Transplant

Makris, Alexia Melissa 01 January 2013 (has links)
Patient survival post liver transplant (LT) is important to both the patient and the center's accreditation, but over the years physicians have noticed that distant patients struggle with post-LT care. I hypothesized that a patient's distance from the transplant center had a detrimental effect on post-LT survival. I suspected that Hepatitis C (HCV) and Hepatocellular Carcinoma (HCC) patients would deteriorate due to their recurrent disease and would need close monitoring post LT. From the current literature it was not clear whether patients' distance from a transplant center affects outcomes post LT; Firozvi et al. (Firozvi AA, 2008) reported no difference in outcomes of LT recipients living 3 hours away or less. This study aimed to examine outcomes of LT recipients based on distance from a transplant center. I hypothesized that the effect of distance from an LT center was detrimental after adjusting for HCV and HCC status. Methods: This was a retrospective single-center study of LT recipients transplanted between 1996 and 2012; 821 LT recipients were identified who qualified for inclusion. Survival analysis was performed using standard methods as well as a newly developed Monte Carlo (MC) approach for change point detection. My new methodology allowed detection of both a change point in distance and one in time by maximizing the two-parameter score function (M2p) over a two-dimensional grid of distance and time values. Extensive simulations using both standard distributions and data resembling the LT data structure were used to prove the functionality of the model. Results: Five-year survival was 0.736 with a standard error of 0.018. Using the Cox PH model it was demonstrated that patients living beyond 180 miles had a hazard ratio (HR) of 2.68 (p-value < 0.004) compared to those within 180 miles of the transplant center.
I was able to confirm these results using KM and HCV/HCC-adjusted AFT models, while HCV- and HCC-adjusted LR confirmed the distance effect at 180 miles (p = 0.0246) one year post LT. The new statistic, labeled M2p, allows simultaneous dichotomization of distance in conjunction with the identification of a change point in the hazard function; it performed much better than previously available statistics in the standard simulations. The best model for the data was found to be extension 3, which dichotomizes the distance Z, replacing it by I(Z>c), and then estimates the change point c and tau. Conclusions: Distance had a detrimental effect, observed at 180 miles from the transplant center. Patients living beyond 180 miles from the transplant center had 2.68 times the death rate of those living within the 180-mile radius. Recipients with HCV fared the worst, with the distance effect being more pronounced (HR of 3.72 vs. 2.68). Extensive simulations using different parameter values, in both standard simulations and simulations resembling the LT data, proved that these new approaches work for dichotomizing a continuous variable and finding a point beyond which there is an incremental effect from this variable. The recovered values were very close to the true values and the p-values were small.
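The dichotomization step behind a statistic like M2p can be sketched as a grid search for the cutoff c that best separates survival between the I(Z>c) groups. This is a simplified one-dimensional version of the idea (the thesis maximizes over a two-dimensional grid of distance and time, and uses proper survival-score functions); the data, the separation score, and the grid below are all hypothetical illustrations, not the LT cohort or the M2p statistic.

```python
import numpy as np

def best_cutoff(z, t, grid):
    """Grid search for the distance cutoff best separating survival times.

    Score: difference in mean survival between the I(z > c) groups, scaled
    by a sqrt(n1*n2/n) balance factor (a t-statistic-like weighting).
    """
    best_c, best_score = None, -np.inf
    n = len(z)
    for c in grid:
        near, far = t[z <= c], t[z > c]
        if len(near) < 5 or len(far) < 5:
            continue                                  # skip degenerate splits
        score = abs(near.mean() - far.mean()) * np.sqrt(len(near) * len(far) / n)
        if score > best_score:
            best_c, best_score = c, score
    return best_c

rng = np.random.default_rng(4)
z = rng.uniform(0, 400, 2000)                 # hypothetical distances (miles)
rate = np.where(z > 180, 1.0, 0.2)            # higher hazard beyond 180 miles
t = rng.exponential(1.0 / rate)               # exponential survival times
c_hat = best_cutoff(z, t, grid=[90, 135, 180, 225, 270, 315])
```

With a true threshold effect at 180, mislabeling on either side of the cutoff mixes the groups and lowers the separation score, so the scan recovers the planted change point.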
20

Apport des réalités virtuelles et augmentées dans la planification et le suivi in situ de travaux de rénovation / Contribution of virtual and augmented reality to the planning and in situ monitoring of renovation work

Landrieu, Jeremie 18 December 2013 (has links) (PDF)
This doctoral thesis presents the evaluation of a mixed reality system used on a construction site. The aim is notably to determine the relevance of such a mobile tool for operators, assisting them in their daily tasks, and in particular in the preparation and monitoring of construction or renovation operations. The tool is intended as a decision-support aid for project owners and project managers. The case study concerns the virtual renovation of old window bays (dated to the 18th century) in a cell of the conventual buildings of Cluny (Saône-et-Loire, France). The approach consists of comparing the effectiveness, precision, and speed of operators performing identical tasks. Three methods related to the use of new technologies in the AEC (Architecture, Engineering and Construction) domain are compared; they guided the definition of the experimental protocol. The first, traditional method provides the operator with paper documentation. The second dematerializes the construction data, made accessible from a desktop computer. The third, more innovative method adds to the previous one co-located access to the database through a mobile device. A follow-up study addressed the visualization and interpretation of thermal simulation results in virtual reality. The comparison of the first three methods gave rise to a first experiment whose results showed a slight advantage for the second scenario (working on a desktop computer). However, beyond ergonomics and user-interface aspects, further investigation is needed to assess the opportunity of developing in situ BIM, that is, the use of the Building Information Model (BIM) on the construction site.
