1051

Eco-inspired Robust Control Design for Linear Dynamical Systems with Applications

Devarakonda, Nagini 20 October 2011 (has links)
No description available.
1052

Eco-Inspired Robustness Analysis of Linear Uncertain Systems Using Elemental Sensitivities

Dande, Ketan Kiran 19 June 2012 (has links)
No description available.
1053

Sliding Mode Approaches for Robust Control, State Estimation, Secure Communication, and Fault Diagnosis in Nuclear Systems

Ablay, Gunyaz 19 December 2012 (has links)
No description available.
1054

ESSAYS IN NONSTATIONARY TIME SERIES ECONOMETRICS

Xuewen Yu (13124853) 26 July 2022 (has links)
This dissertation is a collection of four essays on nonstationary time series econometrics, which are grouped into four chapters. The first chapter investigates the inference in mildly explosive autoregressions under unconditional heteroskedasticity. The second chapter develops a new approach to forecasting a highly persistent time series that employs feasible generalized least squares (FGLS) estimation of the deterministic components in conjunction with Mallows model averaging. The third chapter proposes new bootstrap procedures for detecting multiple persistence shifts in a time series driven by nonstationary volatility. The last chapter studies the problem of testing partial parameter stability in cointegrated regression models.
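For readers unfamiliar with the model-averaging step mentioned above, the following sketch illustrates the generic Mallows criterion of Hansen (2007), which the second chapter builds on: candidate models are combined with simplex weights chosen to minimize the in-sample squared error plus a parsimony penalty. The trend-plus-AR data, the candidate set, and all tuning numbers are invented for illustration; the chapter's FGLS step and theoretical refinements are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: a persistent AR(1) around a linear trend (illustration only).
n = 200
t = np.arange(n)
u = np.zeros(n)
for i in range(1, n):
    u[i] = 0.95 * u[i - 1] + rng.standard_normal()
y = 1.0 + 0.02 * t + u

# Candidate nested models: autoregressions with trend, lag orders 1..M.
M = 4
fits, ks = [], []
for p in range(1, M + 1):
    X = np.column_stack([np.ones(n - M), t[M:]] +
                        [y[M - j:n - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y[M:], rcond=None)
    fits.append(X @ beta)
    ks.append(X.shape[1])
F = np.column_stack(fits)          # fitted values of each candidate model
yM = y[M:]

# sigma^2 taken from the largest candidate model (Hansen 2007 convention).
sigma2 = np.sum((yM - F[:, -1]) ** 2) / (len(yM) - ks[-1])

# Mallows criterion over the weight simplex.
def mallows(w):
    resid = yM - F @ w
    return resid @ resid + 2.0 * sigma2 * np.dot(w, ks)

cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
w0 = np.full(M, 1.0 / M)
res = minimize(mallows, w0, bounds=[(0, 1)] * M, constraints=cons)
print("Mallows weights:", np.round(res.x, 3))
```

The penalty term 2σ²Σ w_m k_m is what distinguishes the Mallows weights from a pure least-squares fit of the candidate forecasts.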
1055

Study of the Impact of the Sampling Process on the Measurement Uncertainty of Some Physical-Chemical Parameters in Water and Diesel Oil

VIVIANE DE JESUS LEITE 26 June 2020 (has links)
The measurement process consists of a set of sub-processes, beginning with the selection of the sample and ending with the analytical procedure. The uncertainty of the measurement process can therefore be treated as the combination of the uncertainties arising from this set of sub-processes. The literature organizes these sub-processes into two broad groups, the sampling process and the analytical process, each with its own characteristic uncertainties. To estimate the measurement uncertainty properly, the uncertainty from both processes must be considered in full, without deciding in advance which component will be more significant. This work was motivated by how little is known about the contribution of sampling uncertainty to measurement uncertainty: sampling is usually carried out by relatively inexperienced professionals, who are unaware of how strongly sampling uncertainty can affect the measurement uncertainty. The objective of this dissertation was to evaluate how significant the sampling uncertainty is within the measurement uncertainty, and to encourage the dissemination of this practice in physical-chemical measurements. The methodologies used were an analysis of the state of the art of sampling uncertainty and the treatment of a data set by four different statistical techniques: classical analysis of variance, robust analysis of variance, and two different models of the range statistics.
The results showed no consistent pattern in the relationship between sampling uncertainty and analytical uncertainty, as well as an absence of methodologies that take the characteristics of the data into account. It is concluded that the sampling uncertainty can impact the measurement uncertainty, regardless of the matrix and/or quantity in question.
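As context for the four statistical techniques compared, the sketch below works through the classical analysis-of-variance route on a balanced duplicate design (two samples per target, each analysed twice), splitting the nested mean squares into sampling and analytical variance components, in the spirit of the Eurachem/CITAC guidance on sampling uncertainty. All data values are invented; the robust ANOVA and range-statistics variants treated in the dissertation are not shown.

```python
import numpy as np

# Duplicate-method design: at each sampling target, two samples are taken
# and each sample is analysed twice. Values are made up for illustration.
# data[target, sample, analysis]
data = np.array([
    [[10.1, 10.3], [10.8, 10.6]],
    [[ 9.7,  9.9], [ 9.4,  9.6]],
    [[10.5, 10.4], [10.9, 11.1]],
    [[ 9.9, 10.0], [10.2, 10.4]],
])
t, s, a = data.shape                    # targets, samples/target, analyses/sample

sample_means = data.mean(axis=2)        # mean of the two analyses per sample
target_means = sample_means.mean(axis=1)

# Classical nested ANOVA mean squares.
ms_analysis = np.sum((data - sample_means[..., None]) ** 2) / (t * s * (a - 1))
ms_sampling = a * np.sum((sample_means - target_means[:, None]) ** 2) / (t * (s - 1))

s2_analysis = ms_analysis
s2_sampling = max((ms_sampling - ms_analysis) / a, 0.0)

u = np.sqrt(s2_sampling + s2_analysis)  # combined measurement uncertainty
print(f"s(sampling) = {np.sqrt(s2_sampling):.3f}")
print(f"s(analysis) = {np.sqrt(s2_analysis):.3f}")
print(f"expanded uncertainty U (k=2) = {2 * u:.3f}")
```

The combined uncertainty u = sqrt(s²_sampling + s²_analysis) makes explicit how a dominant sampling component can swamp the analytical one.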
1056

Robust Reinforcement Learning in Continuous Action/State Space

Grönland, Axel, Eriksson Möllerstedt, Viktor January 2020 (has links)
In this project we aim to apply Robust Reinforcement Learning algorithms, presented by Doya and Morimoto [1], [2], to control problems. Specifically, we train an agent to balance a pendulum in the unstable equilibrium, which is the inverted state. We investigate the performance of controllers based on two different function approximators. One is quadratic, and the other makes use of a Radial Basis Function neural network. To achieve robustness we make use of an approach similar to H∞ control, which amounts to introducing an adversary in the control system. By changing the mass of the pendulum after training, we aimed to show, as in [2], that the supposedly robust controllers could handle this disruption better than their non-robust counterparts. This was not the case. We also added a random disturbance signal after training and performed similar tests, but we were again unable to show robustness. / Bachelor's thesis in electrical engineering 2020, KTH, Stockholm
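For orientation, here is a minimal sketch of the actor-disturber-critic idea the project applies: a TD critic, a Gaussian torque policy that ascends the TD error, and an adversarial disturbance policy that descends it, with a penalty η·w² playing the role of γ² in H∞ control. The pendulum dynamics, reward, features, and learning rates are assumptions chosen for illustration, not the exact algorithm or settings of Doya and Morimoto [1], [2].

```python
import numpy as np

rng = np.random.default_rng(1)

# Pendulum dynamics; torque u (agent) and disturbance w (adversary) enter
# additively. Parameters are illustrative, not taken from [1], [2].
m, l, g, b, dt = 1.0, 1.0, 9.81, 0.1, 0.02

def step(x, u, w):
    th, om = x
    domega = (g / l) * np.sin(th) - b * om + (u + w) / (m * l * l)
    th, om = th + dt * om, om + dt * domega
    return np.array([np.arctan2(np.sin(th), np.cos(th)), om])

# Radial-basis-function features over (angle, angular velocity).
centers = np.array([(ang, vel) for ang in np.linspace(-np.pi, np.pi, 7)
                               for vel in np.linspace(-8, 8, 7)])
widths = np.array([0.5, 2.0])

def phi(x):
    d = (centers - x) / widths
    return np.exp(-0.5 * np.sum(d * d, axis=1))

nf = len(centers)
v_w = np.zeros(nf)        # critic weights
a_w = np.zeros(nf)        # actor (torque) mean weights
d_w = np.zeros(nf)        # disturber mean weights
gamma, eta = 0.98, 0.3    # discount; eta plays the role of gamma^2 in H-inf
alpha_v, alpha_a, alpha_d, sigma = 0.1, 0.02, 0.02, 0.5

for episode in range(300):
    x = np.array([np.pi, 0.0])          # start hanging down
    for _ in range(500):
        f = phi(x)
        u = a_w @ f + sigma * rng.standard_normal()
        w = d_w @ f + sigma * rng.standard_normal()
        x2 = step(x, np.clip(u, -5, 5), np.clip(w, -1, 1))
        r = np.cos(x2[0]) - 0.01 * u**2      # reward peaks at the upright state
        # Differential-game TD error: the adversary pays eta*w^2 to disturb.
        delta = r + eta * w**2 + gamma * (v_w @ phi(x2)) - v_w @ f
        v_w += alpha_v * delta * f
        a_w += alpha_a * delta * (u - a_w @ f) / sigma**2 * f   # ascend delta
        d_w -= alpha_d * delta * (w - d_w @ f) / sigma**2 * f   # descend delta
        x = x2

print("learned value at upright:", v_w @ phi(np.array([0.0, 0.0])))
```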
1057

Contributions to statistical methods for meta-analysis of diagnostic test accuracy studies / Methods for meta-analysis of diagnostic test accuracy studies

Negeri, Zelalem January 2019 (has links)
Meta-analysis is a popular statistical method that synthesizes evidence from multiple studies. Conventionally, both the hierarchical and bivariate models for meta-analysis of diagnostic test accuracy (DTA) studies assume that the random effects follow the bivariate normal distribution. However, this assumption is restrictive, and inferences could be misleading when it is violated. On the other hand, subjective methods such as inspection of forest plots are used to identify outlying studies in a meta-analysis of DTA studies. Moreover, inferences made using the well-established bivariate random-effects models, when outlying or influential studies are present, may lead to misleading conclusions. Thus, the aim of this thesis is to address these issues by introducing alternative and robust statistical methods. First, we extend the current bivariate linear mixed model (LMM) by assuming a flexible bivariate skew-normal distribution for the random effects. The marginal distribution of the proposed model is analytically derived so that parameter estimation can be performed using standard likelihood methods. Overall, the proposed model performs better in terms of the confidence interval width of the overall sensitivity and specificity, and with regard to the bias and root mean squared error of the between-study (co)variances, than the traditional bivariate LMM. Second, we propose objective methods based on solid statistical reasoning for identifying outlying and/or influential studies in a meta-analysis of DTA studies. The performance of the proposed methods is evaluated using a simulation study. The proposed methods outperform, and avoid the subjectivity of, the currently used ad hoc approaches. Finally, we develop a new robust bivariate random-effects model which accommodates outlying and influential observations and leads to robust statistical inference by down-weighting the effect of outlying and influential studies. The proposed model produces robust point estimates of sensitivity and specificity compared to the standard models, and also generates similar point and interval estimates of sensitivity and specificity as the standard models in the absence of outlying or influential studies. / Thesis / Doctor of Philosophy (PhD) / Diagnostic tests vary from the noninvasive rapid strep test used to identify whether a patient has a bacterial sore throat to the much more complex and invasive biopsy test used to examine the presence, cause, and extent of a severe condition, say cancer. Meta-analysis is a widely used statistical method that synthesizes evidence from several studies. In this thesis, we develop novel statistical methods extending the traditional methods for meta-analysis of diagnostic test accuracy studies. Our proposed methods address the issues of modelling asymmetric data, identifying outlier studies, and optimally accommodating these outlying studies in a meta-analysis of diagnostic test accuracy studies. Using both real-life and simulated datasets, we show that our proposed methods perform better than conventional methods in a wide range of scenarios.
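To fix the baseline being extended, the sketch below fits the conventional bivariate normal random-effects model by maximum likelihood, using the standard normal approximation for within-study variability on the logit scale, so each study contributes a bivariate normal likelihood with covariance Σ + S_i. The 2x2 counts are invented; the skew-normal extension, outlier diagnostics, and robust down-weighting developed in the thesis are not implemented here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Toy 2x2 counts per study: (TP, FN, FP, TN). Made-up numbers for illustration.
counts = np.array([
    [45,  5, 10,  90],
    [30,  8,  4,  60],
    [80, 12, 20, 150],
    [22,  3,  6,  70],
    [55, 10,  9, 110],
], dtype=float) + 0.5                 # continuity correction on all cells

tp, fn, fp, tn = counts.T
y = np.column_stack([np.log(tp / fn), np.log(tn / fp)])   # logit Se, logit Sp
S = np.stack([np.diag([1/a + 1/b, 1/c + 1/d])
              for a, b, c, d in zip(tp, fn, tn, fp)])     # within-study cov

def nll(theta):
    """Negative log-likelihood: y_i ~ N(mu, Sigma + S_i)."""
    mu = theta[:2]
    sd = np.exp(theta[2:4])
    rho = np.tanh(theta[4])
    Sigma = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                      [rho * sd[0] * sd[1], sd[1]**2]])
    total = 0.0
    for yi, Si in zip(y, S):
        V = Sigma + Si
        resid = yi - mu
        _, logdet = np.linalg.slogdet(V)
        total += 0.5 * (logdet + resid @ np.linalg.solve(V, resid))
    return total

res = minimize(nll, np.array([1.0, 1.5, -0.5, -0.5, 0.0]),
               method="Nelder-Mead", options={"maxiter": 5000})
se, sp = expit(res.x[:2])
print(f"pooled sensitivity = {se:.3f}, pooled specificity = {sp:.3f}")
```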
1058

Integrated and Coordinated Relief Logistics Planning Under Uncertainty for Relief Logistics Operations

Kamyabniya, Afshin 22 September 2022 (has links)
In this thesis, we explore three critical emergency logistics problems faced by healthcare and humanitarian relief service providers in short-term post-disaster management.

In the first manuscript, we investigate various integration mechanisms (fully integrated horizontal-vertical, horizontal, and vertical resource-sharing mechanisms) following a natural disaster for a multi-patient logistics network handling multiple types of whole-blood-derived platelets. The goal is to reduce the shortage and wastage of platelets across blood groups in the response phase of relief logistics operations. To solve the logistics model for large-scale problems, we develop a hybrid exact solution approach combining an augmented epsilon-constraint method with a Lagrangian relaxation algorithm, and we demonstrate the model's applicability in a case study of an earthquake. Because of uncertainty in the number of injuries needing multi-type blood-derived platelets, we apply a robust optimization version of the proposed model that captures the expected performance of the system. The results show that the platelets logistics network under coordinated and integrated mechanisms controls the levels of shortage and wastage better than a non-integrated network.

In the second manuscript, we propose a two-stage casualty evacuation model that routes patients with different injury levels during wildfires. The first stage deals with field hospital selection, and the second stage determines the number of patients that can be transferred to the selected hospitals or shelters via different routes of the evacuation network. The goal of this model is to reduce the evacuation response time, which ultimately increases the number of people evacuated from evacuation assembly points under limited time windows. To solve the model for large-scale problems, we develop a two-step meta-heuristic algorithm. To consider multiple sources of uncertainty, a flexible robust approach that considers the worst-case and expected performance of the system simultaneously is applied to handle any realization of the uncertain parameters. The results show that the fully coordinated evacuation model, in which vehicles can freely pick up and drop off patients at different locations and are allowed to start their next operations without being forced to return to the departure point (the evacuation assembly points), outperforms the non-coordinated and non-integrated evacuation models in terms of the number of evacuated patients.

In the third manuscript, we propose an integrated transportation and hospital capacity model to optimize the assignment of relevant medical resources to patients with multiple injury levels during a mass casualty incident (MCI). We develop a finite-horizon Markov decision process (MDP) to efficiently allocate resources and hospital capacities to injured people in a dynamic fashion under a limited time horizon. We solve this model using the linear programming approach to approximate dynamic programming (ADP) and by developing a two-phase heuristic based on a column generation algorithm. The results show that better policies can be derived for allocating limited resources (i.e., vehicles) and hospital capacities to injured people compared with the benchmark.

Each paper makes a worthwhile contribution to the humanitarian relief operations literature and can help relief and healthcare providers optimize resource and service logistics by applying the proposed integration and coordination mechanisms.
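The augmented epsilon-constraint half of the first manuscript's hybrid approach can be illustrated on a toy bi-objective LP: one objective is minimized while the other is capped by a sweep of ε values, with a small bonus on the slack variable so that only efficient points are returned. The "shortage" and "wastage" cost vectors, capacities, and demands below are invented, and neither the Lagrangian relaxation component nor the actual platelets network model is shown.

```python
import numpy as np
from scipy.optimize import linprog

# Toy bi-objective allocation: x = units shipped from two suppliers to two
# demand points (flattened). f1 = shortage-related cost, f2 = wastage-related
# cost. All coefficients are illustrative.
c1 = np.array([4.0, 6.0, 5.0, 3.0])
c2 = np.array([2.0, 1.0, 3.0, 2.5])
A_ub = np.array([[1, 1, 0, 0],          # supplier capacities
                 [0, 0, 1, 1]])
b_ub = np.array([50.0, 40.0])
A_lo = np.array([[1, 0, 1, 0],          # minimum demand coverage
                 [0, 1, 0, 1]])
b_lo = np.array([30.0, 25.0])

def augmecon(eps, delta=1e-3):
    # Variables: x (4 flows) plus the slack s of the epsilon constraint.
    c = np.concatenate([c1, [-delta]])                   # min f1 - delta*s
    A_eq = np.concatenate([c2, [1.0]])[None, :]          # f2 + s = eps
    A = np.block([[A_ub, np.zeros((2, 1))],
                  [-A_lo, np.zeros((2, 1))]])
    b = np.concatenate([b_ub, -b_lo])
    return linprog(c, A_ub=A, b_ub=b, A_eq=A_eq, b_eq=[eps],
                   bounds=[(0, None)] * 5)

# Sweep epsilon over the f2 range to trace the Pareto front.
for eps in np.linspace(90, 160, 5):
    r = augmecon(eps)
    if r.success:
        x = r.x[:4]
        print(f"eps={eps:6.1f}  f1={c1 @ x:7.2f}  f2={c2 @ x:7.2f}")
```

Each feasible ε yields one Pareto-efficient trade-off between the two objectives; the slack bonus prevents the solver from returning weakly dominated points.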
1059

Polynomial Chaos Approaches to Parameter Estimation and Control Design for Mechanical Systems with Uncertain Parameters

Blanchard, Emmanuel 03 May 2010 (has links)
Mechanical systems operate under parametric and external excitation uncertainties. The polynomial chaos approach has been shown to be more efficient than Monte Carlo approaches for quantifying the effects of such uncertainties on the system response. This work uses the polynomial chaos framework to develop new methodologies for the simulation, parameter estimation, and control of mechanical systems with uncertainty. This study has led to new computational approaches for parameter estimation in nonlinear mechanical systems. The first approach is a polynomial-chaos-based Bayesian approach in which maximum likelihood estimates are obtained by minimizing a cost function derived from the Bayesian theorem. The second approach is based on the Extended Kalman Filter (EKF). The error covariances needed for the EKF approach are computed from polynomial chaos expansions, and the EKF is used to update the polynomial chaos representation of the uncertain states and the uncertain parameters. The advantages and drawbacks of each method have been investigated. This study has demonstrated the effectiveness of the polynomial chaos approach for control systems analysis. For control system design, the study has focused on the LQR problem when dealing with parametric uncertainties. The LQR problem was written as an optimality problem using Lagrange multipliers in an extended form associated with the polynomial chaos framework. The solutions to the H∞ problem as well as the H2 problem can be seen as extensions of the LQR problem. This method might therefore be a first step towards the development of computationally efficient numerical methods for H∞ design with parametric uncertainties. I would like to gratefully acknowledge the support provided for this work under NASA Grant NNL05AA18A. / Ph. D.
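As a small illustration of the efficiency claim above, the sketch below propagates an uncertain stiffness through a damped oscillator by non-intrusive polynomial chaos: the response is projected onto probabilists' Hermite polynomials at a few Gauss-Hermite quadrature points, and the mean and variance follow from the expansion coefficients. All physical parameters are assumed values; none of the dissertation's estimation or LQR machinery is reproduced.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Uncertain stiffness k = k0 * (1 + 0.1 * xi), xi ~ N(0,1) (illustrative).
m, c, k0, x0, v0, T = 1.0, 0.4, 100.0, 0.01, 0.0, 1.0

def response(xi):
    """Free-vibration displacement at time T for a given stiffness sample."""
    k = k0 * (1.0 + 0.1 * xi)
    om = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    omd = om * np.sqrt(1.0 - zeta**2)
    return np.exp(-zeta * om * T) * (x0 * np.cos(omd * T) +
                                     (v0 + zeta * om * x0) / omd * np.sin(omd * T))

# Non-intrusive PCE: project onto probabilists' Hermite polynomials He_k
# using Gauss-Hermite quadrature (weight exp(-x^2/2), total mass sqrt(2*pi)).
order, nq = 6, 20
xq, wq = He.hermegauss(nq)
fq = response(xq)

coeffs = np.empty(order + 1)
for k in range(order + 1):
    basis = He.hermeval(xq, np.eye(order + 1)[k])
    # c_k = E[f * He_k] / E[He_k^2], with E[He_k^2] = k!
    coeffs[k] = (wq * fq * basis).sum() / np.sqrt(2 * np.pi) / math.factorial(k)

mean_pce = coeffs[0]
var_pce = sum(math.factorial(k) * coeffs[k]**2 for k in range(1, order + 1))
print(f"PCE mean = {mean_pce:.6f}, PCE std = {np.sqrt(var_pce):.6f}")

# Monte Carlo check (the approach PCE is meant to outperform here).
xi = np.random.default_rng(0).standard_normal(200_000)
print(f"MC  mean = {response(xi).mean():.6f}, MC  std = {response(xi).std():.6f}")
```

Twenty quadrature evaluations recover what the Monte Carlo check needs hundreds of thousands of samples to match, which is the efficiency argument the abstract makes.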
1060

Robust and Equitable Public Health Screening Strategies, with Application to Genetic and Infectious Diseases

El Hajj, Hussein Mohammad 07 June 2021 (has links)
Public health screening plays an important role in the overall healthcare system. As an example, consider newborn screening, a state-level initiative that screens newborns for life-threatening genetic disorders for which early treatment can substantially improve health outcomes. Another topical example is in the realm of infectious disease screening, e.g., screening for COVID-19. The common features of both public health screening problems include large testing populations and resource limitations that inhibit screening efforts. Cost is a major barrier to the inclusion of genetic disorders in newborn screening, and thus screening must be both highly accurate and efficient; for COVID-19, limited testing kits and other shortages have been major barriers to screening efforts. Further, for both newborn screening and infectious disease screening, equity (reducing health disparities among different sub-populations) is an important consideration. We study the testing process design for newborn screening for genetic diseases, considering cystic fibrosis as a model disorder. Our optimization-based models take into account disease-related parameters, subject risk factors, test characteristics, parameter uncertainty, and limited testing resources so as to design equitable, accurate, and robust screening processes that classify newborns as positive or negative for cystic fibrosis. Our models explicitly consider the trade-off between false negatives, which lead to missed diagnoses, and the required testing resources; and the trade-off between the accuracy and equity of screening. We also study the testing process design for infectious disease screening, considering COVID-19 as a model disease. Our optimization-based models account for key subject risk factors, including the likelihood of being disease-positive and the potential harm that could be averted through testing and the subsequent interventions. Our objectives include the minimization of harm (through detection and mitigation) or the maximization of testing coverage. These are complex problems. We develop novel mathematical models and characterize key structural properties of optimal solutions. This, in turn, allows the development of effective and efficient algorithms that exploit these structural properties. These algorithms are either polynomial- or pseudo-polynomial-time algorithms and are able to solve realistic-sized problems efficiently. Our case studies on cystic fibrosis screening and COVID-19 screening, based on realistic data, underscore the value of the proposed optimization-based approaches for public health screening, compared to current practices. Our findings have important implications for public policy. / Doctor of Philosophy / Public health screening plays an important role in the overall healthcare system. As an example, consider newborn screening, a state-level initiative that screens newborns for life-threatening genetic disorders for which early treatment can substantially improve health outcomes. Another topical example is in the realm of infectious disease screening, e.g., screening for COVID-19. The common features of both public health screening problems include large testing populations and resource limitations that inhibit screening efforts.
Cost is a major barrier to the inclusion of genetic disorders in newborn screening, and thus screening must be both highly accurate and efficient; for COVID-19, limited testing kits and other shortages have been major barriers to screening efforts. Further, for both newborn screening and infectious disease screening, equity (reducing health disparities among different sub-populations) is an important consideration. We study the testing process design for newborn screening for genetic diseases, considering cystic fibrosis as a model disorder. Our optimization-based models take into account disease-related parameters, subject risk factors, test characteristics, parameter uncertainty, and limited testing resources so as to design screening processes that classify newborns as positive or negative for cystic fibrosis. Our models explicitly consider the trade-off between false negatives, which lead to missed diagnoses, and the required testing resources; and the trade-off between the accuracy and equity of screening. We also study the testing process design for infectious disease screening, considering COVID-19 as a model disease. Our optimization-based models account for key subject risk factors, including the likelihood of being disease-positive and the potential harm that could be averted through testing and the subsequent interventions. Our objectives include the minimization of harm (through detection and mitigation) or the maximization of testing coverage. These are complex problems. We develop novel mathematical models and characterize key structural properties of optimal solutions. This, in turn, allows the development of effective and efficient algorithms that exploit these structural properties. Our case studies on cystic fibrosis screening and COVID-19 screening, based on realistic data, underscore the value of the proposed optimization-based approaches for public health screening, compared to current practices. Our findings have important implications for public policy.
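A toy version of the harm-minimization objective helps fix ideas: if the harm averted is linear in the number of tests allocated to each risk group, then allocating a limited test budget is a fractional knapsack, solved greedily by expected harm averted per test. The group names, prevalences, harms, and assumed test sensitivity below are hypothetical, and the thesis's actual models, equity considerations, and structural results go well beyond this sketch.

```python
# Hypothetical risk groups: population size, probability of being
# disease-positive, and harm averted per detected case (all invented).
groups = [
    ("healthcare workers",  5_000, 0.10, 9.0),
    ("symptomatic",         8_000, 0.25, 6.0),
    ("exposed contacts",   12_000, 0.08, 5.0),
    ("general public",     60_000, 0.02, 3.0),
]
budget = 20_000                      # available test kits
sensitivity = 0.9                    # assumed probability a positive is detected

# Expected harm averted per test in each group; greedy allocation by this
# rate is optimal for a linear objective with a single budget constraint.
rate = [(sensitivity * p * h, name, pop) for name, pop, p, h in groups]
remaining, plan, total = budget, [], 0.0
for per_test, name, pop in sorted(rate, reverse=True):
    n = min(pop, remaining)
    plan.append((name, n))
    total += per_test * n
    remaining -= n
    if remaining == 0:
        break

for name, n in plan:
    print(f"{name:20s} -> {n:6d} tests")
print(f"expected harm averted: {total:,.0f}")
```

Once tests have different costs per group, or equity constraints bound the coverage gap between sub-populations, this greedy rule no longer suffices, which is where the structural analysis described in the abstract comes in.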
