391

An Efficient Randomized Approximation Algorithm for Volume Estimation and Design Centering

Asmus, Josefine 03 July 2017 (has links) (PDF)
The design of systems or models that work robustly under uncertainty and environmental fluctuations is a key challenge in both engineering and science. This is formalized in the design centering problem: finding a design that fulfills given specifications and has a high probability of still doing so if the system parameters or the specifications fluctuate randomly. Design centering is often accompanied by the problem of quantifying the robustness of a system. Here we present a novel adaptive statistical method that addresses both problems simultaneously. Our method, Lp-Adaptation, is inspired by how robustness evolves in biological systems and by randomized schemes for convex volume computation. It addresses both problems in the general, non-convex case and at low computational cost. In this thesis, we describe the concepts of the algorithm and detail its steps. We then test it on known benchmarks and demonstrate its real-world applicability in electronic and biological systems. In all cases, the present method outperforms the previous state of the art. This enables reformulating optimization problems in engineering and biology as design centering problems that take global system robustness into account.
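The abstract leaves the algorithm itself to the thesis. As a purely illustrative sketch of the underlying idea — adaptive sampling that simultaneously tracks a design center and a feasible-volume estimate — consider the following Python toy. The function names, the L2 ball in place of a general Lp body, the drift and step constants, and the crude final-ball volume estimate are all assumptions of this sketch, not the Lp-Adaptation algorithm itself:

```python
import math
import numpy as np

def design_center(oracle, x0, r0=1.0, p_target=0.35, iters=2000, seed=0):
    """Toy design-centering loop: sample candidate designs in a ball around
    the current center, drift the center toward feasible samples, and adapt
    the radius so the empirical hit rate stays near p_target."""
    rng = np.random.default_rng(seed)
    x, r, hits = np.asarray(x0, dtype=float), r0, 0
    for _ in range(iters):
        # uniform sample in an L2 ball of radius r around the current center
        d = rng.normal(size=x.size)
        u = rng.random() ** (1.0 / x.size)
        cand = x + r * u * d / np.linalg.norm(d)
        if oracle(cand):                        # does the design meet all specs?
            hits += 1
            x += 0.1 * (cand - x)               # drift center toward feasible mass
            r *= math.exp(0.2 * (1.0 - p_target))   # expand on success ...
        else:
            r *= math.exp(-0.2 * p_target)          # ... shrink on failure
    # crude volume estimate: final ball volume times the overall hit fraction
    ball_vol = math.pi ** (x.size / 2) / math.gamma(x.size / 2 + 1) * r ** x.size
    return x, r, ball_vol * hits / iters

# example: recentering within the unit square [0,1]^2 from a point near a corner
center, radius, volume = design_center(
    lambda z: bool(np.all((z >= 0.0) & (z <= 1.0))), x0=[0.9, 0.9])
```

The multiplicative radius updates form a stochastic approximation whose equilibrium hit rate is exactly p_target, which is what lets a scheme of this kind balance exploration of the feasible region against reliable center estimates.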
392

The evolutionary dynamics of biochemical networks in fluctuating environments

Platt, Robert John January 2011 (has links)
Typically, systems biology focuses on the form and function of networks of biochemical interactions. Questions inevitably arise as to the evolutionary origin of those networks' properties. Such questions are of interest to a growing number of systems biologists, and several groups have published studies showing how varying environments can affect network topology and lead to increased evolvability. For decades, evolutionary biologists have also investigated the evolution of evolvability and its relationship to the interactions between genotype and phenotype. While the perspectives of systems and evolutionary biologists sometimes differ, their interests in patterns of interactions and evolvability have much in common. This thesis attempts to bring together the perspectives of systems and evolutionary theory to investigate the evolutionary effects of fluctuating environments. Chapter 1 introduces the necessary themes, terminology and literature from these fields. Chapter 2 explores how rapid environmental fluctuations, or "noise", affect network size and robustness. In Chapter 3, we use the Avida platform to investigate the relationship between genetic architecture, fluctuating environments and population biology. Chapter 4 examines contingency loci as a physical basis for evolvability, while Chapter 5 presents a 500-generation laboratory evolution experiment exposing E. coli to varying environments. The final discussion concludes that the evolution of generalism can lead to genetic architectures that confer evolvability, and that such architectures may arise in rapidly fluctuating environments as a by-product of generalism rather than as a selected trait.
393

Testabilité des services Web / Web services testability

Rabhi, Issam 09 January 2012 (has links)
This PhD thesis addresses several forms of automated Web service testing: the first part is dedicated to functional testing through robustness testing; the second extends that work to the testing of non-functional properties, such as testability and security. We explore these issues from both a theoretical and a practical perspective. We first propose a robustness testing method for non-composed Web services, both stateless and stateful, which generates and executes test cases automatically from WSDL descriptions: each declared operation is invoked with hazard values, the responses are examined, and the SOAP environment is taken into account. We analyze which hazards can actually be exploited for testing, show that only a few can really be handled, and improve robustness-issue detection by separating the SOAP processor's behavior from that of the Web service itself. Stateful Web services are modeled with symbolic transition systems. Our robustness testing method for stateful services completes the Web service specification so that it describes both correct and incorrect behaviors; using this completed specification, the services are tested with relevant hazards and a verdict is returned. We also study the testability of Web services composed with BPEL, focusing on observability, a well-known testability criterion that limits the feasibility of Web service testing. To evaluate it, we propose an approach that first transforms ABPEL specifications into STSs by successively and recursively converting each structured activity into a graph of sub-activities, so that existing methods can be applied. From the STS testability issues we deduce patterns of ABPEL testability degradation, which lead to enhancement algorithms that reduce these problems. Finally, we propose a security testing method for stateful Web services. It evaluates security properties such as authentication, authorization and availability through a set of rules written in the formal language Nomad. The rules are first transformed into test purposes based on the WSDL description; in parallel, the stateful Web service specification is completed; and the synchronized product is then computed to generate test cases. To validate the proposal, we applied the approach to real-size case studies.
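As a rough illustration of the hazard-injection loop described above (invoking each declared operation with malformed values and classifying the responses), here is a minimal Python sketch. The HAZARDS list, the operations/invoke interface and the pass/fail criterion are hypothetical simplifications; the thesis's actual method additionally parses WSDL descriptions and separates SOAP-processor behavior from service behavior:

```python
import random
import string

# hypothetical hazard generators: malformed values substituted for parameters
HAZARDS = [
    lambda: None,
    lambda: "",
    lambda: "x" * 10_000,
    lambda: random.choice([-1, 2**31, float("nan")]),
    lambda: "".join(random.choices(string.printable, k=64)),
]

def robustness_test(operations, invoke, n_cases=50):
    """Toy robustness harness. `operations` maps each operation declared in
    the service description to its parameter count; `invoke` performs one
    call and returns the (possibly fault) response, raising on crash/hang.
    A structured fault response counts as robust behavior; an unhandled
    exception counts as a robustness issue."""
    verdicts = {}
    for op, arity in operations.items():
        failures = 0
        for _ in range(n_cases):
            args = [random.choice(HAZARDS)() for _ in range(arity)]
            try:
                failures += invoke(op, args) is None
            except Exception:
                failures += 1
        verdicts[op] = "pass" if failures == 0 else f"{failures}/{n_cases} failed"
    return verdicts
```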
394

Interactive 3D segmentation repair with image-foresting transform, supervoxels and seed robustness / Reparação interativa de segmentações 3D com transformada imagem-floresta, supervoxels, robustez de sementes

Anderson Carlos Moreira Tavares 02 June 2017 (has links)
Image segmentation consists in partitioning an image into relevant regions, for instance to isolate the pixels belonging to desired objects; it is an important step in computer vision, medical image processing, and other applications. Automatic segmentation often produces results with imperfections. The user can correct them by manual or interactive editing, or simply discard the segmentation and try to generate another result with a different automatic method. Interactive methods combine the benefits of manual and automatic ones, reducing user effort while exploiting the user's high-level knowledge. In seed-based methods, continuing or repairing a prior segmentation (presegmentation) without forcing the user to start from scratch requires solving the Reverse Interactive Segmentation Problem (RISP): automatically estimating the seeds that would generate it. To this end, we first divide the segmented object into its composing cores. Inside a core, any two seeds taken separately always produce the same result, making one of them redundant; hence only one seed per core is required. Cores whose segmentations are contained in the result of other cores are themselves redundant and can also be discarded, further reducing the seed set — a process called Redundancy Analysis. A minimal seed set for the presegmentation is thus generated, and interactive repair can proceed by adding or removing seeds. Within the framework of the Image-Foresting Transform (IFT), methods such as the Oriented Image-Foresting Transform (OIFT) and Oriented Relative Fuzzy Connectedness (ORFC) have been developed, but no algorithms were known for computing their cores. This work develops such algorithms, with proofs of correctness. The cores also indicate how robust each method is to the positioning of seeds; accordingly, a hybrid method combining GraphCut with the ORFC cores, as well as a Robustness Coefficient (RC), have been developed. We also present a further solution for repairing segmentations, based on IFT-SLIC, originally used to generate supervoxels. Experimental results analyze, compare and demonstrate the potential of these solutions.
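For readers unfamiliar with the IFT framework the abstract builds on, the following is a minimal sketch of seed-based IFT segmentation on a 2D grid: the standard Dijkstra-like seed competition under a max-arc path cost. It implements only the basic IFT, not the OIFT/ORFC variants or the core-computation algorithms developed in the thesis:

```python
import heapq
import numpy as np

def ift_segmentation(image, seeds):
    """Minimal seed-based IFT on a 2D grid: seeds compete for pixels via a
    Dijkstra-like propagation, where a path's cost is the maximum intensity
    step along it (the classic fmax / watershed-style path cost)."""
    h, w = image.shape
    cost = np.full((h, w), np.inf)
    label = np.zeros((h, w), dtype=int)
    heap = []
    for (r, c), lab in seeds.items():           # seeds: {(row, col): label}
        cost[r, c], label[r, c] = 0.0, lab
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        f, r, c = heapq.heappop(heap)
        if f > cost[r, c]:
            continue                             # stale heap entry, skip
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w:
                g = max(f, abs(float(image[rr, cc]) - float(image[r, c])))
                if g < cost[rr, cc]:             # better path found: conquer pixel
                    cost[rr, cc], label[rr, cc] = g, label[r, c]
                    heapq.heappush(heap, (g, rr, cc))
    return label

# two seeds, one per flat region: the label image splits along the edge
img = np.array([[0, 0, 9, 9], [0, 0, 9, 9]], dtype=float)
print(ift_segmentation(img, {(0, 0): 1, (0, 3): 2}))
```

In this toy, moving either seed anywhere within its flat region yields the same labeling — the intuition behind a "core" as the set of seed positions that produce identical segmentations.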
395

Sistemas de informação cientes de processos, robustos e confiáveis / Robust and reliable process-aware information systems

André Luis Schwerz 08 December 2016 (has links)
Nowadays, many corporations and organizations are making increasing efforts to transform their ideas quickly and effectively into products and services. These efforts have also stimulated the evolution of information systems, which are now supported by higher-level abstract models that describe the process logic. In this context, several sophisticated Process-Aware Information Systems (PAIS) have been successfully proposed for managing business processes and automating large-scale scientific (e-Science) processes. Much of this success is due to their ability to provide generic functionality for modeling, executing and monitoring processes. These functionalities work well when process models have a well-behaved path towards their objectives. However, anomalous situations that fall outside the well-behaved execution path still pose a significant challenge to PAIS. Because many types of failures can deviate execution from the expected behavior, providing robust and reliable execution is a complex task for current PAIS, since not all failure situations can be efficiently modeled within the traditional flow structure. As a consequence, handling such situations usually requires manual intervention by human operators, which results in significant additional costs for businesses. In this work, we introduce a cost/benefit-aware recovery composition method that can find and follow alternative paths to reduce the financial side effects of exception handling. From a practical point of view, the method provides automated and optimized exception handling by calculating the cost and benefit of each recovery path and choosing the path with the best cost/benefit ratio available. More specifically, our recovery method extends the WED-flow (Workflow, Event processing and Data-flow) approach to enable cost/benefit-aware composition of forward and/or backward transactional recovery steps. Finally, experiments show that this recovery method can be suitably incorporated into exception handling in a wide variety of processes.
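The selection step described above — choosing the recovery path with the best cost/benefit ratio — can be pictured with a small sketch. The RecoveryPath fields and the ratio criterion are illustrative assumptions; the WED-flow-specific composition of backward/forward transactional steps is not modeled here:

```python
from dataclasses import dataclass

@dataclass
class RecoveryPath:
    name: str
    steps: list       # ordered backward and/or forward recovery steps
    cost: float       # e.g. cost of compensating already-executed work
    benefit: float    # e.g. value of the work the path preserves

def choose_recovery(paths):
    """Pick the candidate recovery path with the best benefit-to-cost ratio."""
    candidates = [p for p in paths if p.cost > 0]
    return max(candidates, key=lambda p: p.benefit / p.cost, default=None)

best = choose_recovery([
    RecoveryPath("full rollback", ["undo_all"], cost=8.0, benefit=1.0),
    RecoveryPath("partial redo", ["undo_step3", "retry_step3"], cost=2.0, benefit=3.0),
])
# -> "partial redo": ratio 1.5 beats full rollback's 0.125
```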
396

Robust strategies for glucose control in type 1 diabetes

Revert Tomás, Ana 15 October 2015 (has links)
Type 1 diabetes mellitus is a chronic and incurable disease that affects millions of people around the world. Its main characteristic is the total or partial destruction of the beta cells of the pancreas, the cells in charge of producing insulin, the main hormone involved in the control of blood glucose. Sustained high blood glucose has negative health effects, causing complications of various kinds, which is why patients with type 1 diabetes mellitus need to receive insulin exogenously. Since insulin was first isolated for human clinical use in 1921 and the first glucose monitoring techniques were developed, great advances have been made in clinical treatment with insulin. Currently, two main research lines aim to improve the quality of life of diabetic patients. The first concentrates on stem cell research to replace the damaged beta cells. The second has a more technological orientation: the development of new insulin analogs that emulate endogenous pancreatic secretion more faithfully, non-invasive continuous glucose monitoring systems, insulin pumps capable of administering different insulin profiles, and decision-support tools and telemedicine. The most important challenge for the scientific community is the development of an artificial pancreas, that is, of algorithms that allow automatic control of blood glucose. The main difficulty preventing tight glucose control is the high variability of glucose metabolism, which is especially significant during meal compensation. This variability, together with the delay in subcutaneous insulin absorption and action, causes controller overcorrection that leads to late hypoglycemia, the most important acute complication of insulin treatment. The proposals in this work pay special attention to overcoming these difficulties. Interval models are used to represent the patient's physiology and account for parametric uncertainty; this strategy is applied both in the open-loop insulin dosing proposal and in the closed-loop algorithm. Moreover, the closed-loop design avoids controller overcorrection to minimize hypoglycemia while adding robustness against glucose sensor failures and over- or under-estimation of meal carbohydrates. The proposed algorithms have been validated both in simulation and in clinical trials. / Revert Tomás, A. (2015). Robust strategies for glucose control in type 1 diabetes [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/56001
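As a toy picture of why interval models suit this problem, the sketch below propagates an interval-valued insulin sensitivity through a deliberately simplistic one-compartment glucose model, producing a glucose envelope rather than a single predicted curve. The dynamics, units and parameter values are invented for illustration and bear no relation to the models validated in the thesis:

```python
def interval_mul(a, b):
    """Product of two intervals (lo, hi): take min/max over endpoint products."""
    products = [x * y for x in a for y in b]
    return (min(products), max(products))

def glucose_envelope(g0, s_i, insulin, meals, dt=5.0):
    """Propagate the toy dynamics g' = -s_i*u(t) + m(t) with an interval-valued
    insulin sensitivity s_i = (lo, hi); returns (lower, upper) glucose bounds
    per step instead of a single predicted trajectory."""
    lo = hi = g0
    env = []
    for u, m in zip(insulin, meals):
        d_lo, d_hi = interval_mul(s_i, (u, u))
        lo = lo - d_hi * dt + m * dt   # insulin subtracts, so the bounds swap
        hi = hi - d_lo * dt + m * dt
        env.append((lo, hi))
    return env

# sensitivity known only to within +/-25%: the envelope widens over the hour
bounds = glucose_envelope(g0=140.0, s_i=(0.015, 0.025),
                          insulin=[1.0] * 12, meals=[0.3] * 12)
```

A controller that keeps the whole envelope (not just a nominal prediction) above the hypoglycemia threshold is inherently conservative against parametric uncertainty, which matches the design intent described in the abstract.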
397

Robustness and stability in dynamic constraint satisfaction problems

Climent Aunés, Laura Isabel 07 January 2014 (has links)
Constraint programming is a paradigm wherein relations between variables are stated in the form of constraints. It is well known that many real-life problems can be modeled as Constraint Satisfaction Problems (CSPs), and much effort has been spent on increasing the efficiency of algorithms for solving them. However, many of these techniques assume that the set of variables, domains and constraints involved in the CSP is known and fixed when the problem is modeled. This is a strong limitation, because many problems come from uncertain and dynamic environments, where the original problem may evolve because of the environment, the user or other agents. In such situations, a solution that holds for the original problem can become invalid after changes. There are two main approaches for dealing with these situations: reactive and proactive. Reactive approaches entail re-solving the CSP after each solution loss, which is time-consuming — a clear disadvantage, especially with short-term changes, where solution loss is frequent. In addition, in many applications, such as on-line planning and scheduling, the delivery time of a new solution may be too long for actions to be taken on time, so a solution loss can produce several negative effects in the modeled problem. In a task-assignment production system with several machines, it could cause the shutdown of the production system, the breakage of machines, or the loss of the material or object in production. In a transport timetabling problem, a solution loss due to a disruption at one point may produce a delay that propagates through the entire schedule. All these negative effects will probably also entail economic losses. In this thesis we develop several proactive approaches, which use knowledge about possible future changes in order to avoid or minimize their effects and are applied before the changes occur. Our approaches search for robust solutions, which have a high probability of remaining valid after changes. Some of them also consider whether solutions can be easily adapted when they do not survive the changes to the original problem; these approaches search for stable solutions, which have a similar alternative solution that can be used in case of a value breakage. In this context, knowledge about the uncertain and dynamic environment sometimes exists, but in many cases it is unknown or hard to obtain. For this reason, in the majority of our approaches (3 of the 4 developed), the only assumptions made about changes are those inherent in the structure of problems with ordered domains. Given this framework, and therefore the existence of a significant order over domain values, it is reasonable to assume that the original bounds of the solution space may undergo restrictive or relaxing modifications; solution loss is only possible when the changes are restrictive. The main objective when searching for robust solutions in this framework is therefore to find solutions located as far away as possible from the bounds of the solution space. To meet this criterion, we propose several approaches, which can be divided into enumeration-based techniques and a search algorithm. / Climent Aunés, LI. (2013). Robustness and stability in dynamic constraint satisfaction problems [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/34785
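One simplified reading of the robustness criterion above — prefer solutions far from the bounds — can be sketched by brute force: enumerate solutions and keep the one whose values maximize the minimum distance to their domain bounds. This ignores the thesis's enumeration-based techniques and search algorithm, and measures distance to domain bounds only rather than to the bounds of the full solution space, so treat it as a conceptual illustration:

```python
from itertools import product

def most_robust_solution(domains, constraints):
    """Brute-force search for the solution whose values maximize the minimum
    distance to their (ordered) domain bounds."""
    best, best_margin = None, -1
    for assignment in product(*domains):
        if all(check(assignment) for check in constraints):
            margin = min(min(v - dom[0], dom[-1] - v)
                         for v, dom in zip(assignment, domains))
            if margin > best_margin:
                best, best_margin = assignment, margin
    return best, best_margin

# x + y <= 10 over domains 0..9: the robust solution avoids the domain edges,
# so restrictive changes to the bounds are less likely to invalidate it
sol, margin = most_robust_solution([range(10), range(10)],
                                   [lambda a: a[0] + a[1] <= 10])
```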
398

Bootstrap and uniform bounds for Harris Markov chains / Bootstrap et bornes uniformes pour des chaînes de Markov Harris récurrentes

Ciolek, Gabriela 14 December 2018 (has links)
This thesis concentrates on extensions of empirical process theory when the data are Markovian. More specifically, we focus on developments of bootstrap, robustness and statistical learning theory in a positive Harris recurrent framework. Our approach relies on regenerative methods, which boil down to dividing the sample paths of the regenerative Markov chain under study into independent and identically distributed (i.i.d.) blocks of observations. These regeneration blocks correspond to path segments between random times of visits to a well-chosen set (the atom) forming a renewal sequence. In the first part of the thesis we derive uniform bootstrap central limit theorems for Harris recurrent Markov chains over uniformly bounded classes of functions, and show that the result also generalizes to the unbounded case. We use these results to obtain uniform bootstrap central limit theorems for Fréchet differentiable functionals of Harris Markov chains. Motivated by a wide range of applications, we discuss how to extend some concepts of robustness from the i.i.d. framework to a Markovian setting; in particular, we consider the case where the data are piecewise-deterministic Markov processes. Next, we propose residual and wild bootstrap procedures for periodically autoregressive processes and show their consistency. In the second part of the thesis we establish maximal versions of Bernstein, Hoeffding and polynomial tail concentration inequalities, expressed in terms of covering numbers and moments of return times and blocks. Finally, we use these tail inequalities to derive generalization bounds for minimum volume set estimation for regenerative Markov chains.
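The regenerative device described above is concrete enough to sketch: cut the trajectory at successive visits to the atom, treat the resulting blocks as i.i.d., and bootstrap by resampling whole blocks. The sketch below assumes a discrete chain with a singleton atom; the thesis's results concern general Harris chains and uniform (functional) central limit theorems, which this toy does not reproduce:

```python
import random

def regeneration_blocks(path, atom):
    """Cut a trajectory at successive visits to the atom; the resulting blocks
    are independent and identically distributed by the regenerative property."""
    cuts = [i for i, state in enumerate(path) if state == atom]
    return [path[cuts[j]:cuts[j + 1]] for j in range(len(cuts) - 1)]

def regenerative_bootstrap(path, atom, stat, n_boot=1000, seed=0):
    """Resample whole regeneration blocks with replacement, concatenate them
    into a pseudo-trajectory, and recompute the statistic on each replicate."""
    rng = random.Random(seed)
    blocks = regeneration_blocks(path, atom)
    replicates = []
    for _ in range(n_boot):
        pseudo = [x for _ in blocks for x in rng.choice(blocks)]
        replicates.append(stat(pseudo))
    return replicates

# bootstrap distribution of the mean for a chain on {0,1,2} with atom state 0
chain = [0, 1, 2, 0, 2, 0, 1, 1, 0, 2, 2, 0]
dist = regenerative_bootstrap(chain, atom=0, stat=lambda xs: sum(xs) / len(xs))
```

Resampling blocks rather than individual observations is what preserves the chain's dependence structure within each regeneration cycle.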
399

Robust Treatment Planning and Robustness Evaluation for Proton Therapy of Head and Neck Cancer

Cubillos Mesías, Macarena Yasmara 19 January 2021 (has links)
Intensity modulated proton therapy (IMPT) for head and neck squamous cell carcinoma (HNSCC) offers clear advantages over conventional photon therapy, generating highly conformal doses to the target volume and improved sparing of the organs at risk (OARs). In addition, robust treatment planning approaches, which account for uncertainties directly in the plan optimization process, can generate high-quality plans that are more robust against uncertainties than a PTV margin expansion approach. During radiation treatment, patients are prone to anatomical variations over the treatment course, including random deviations in patient positioning as well as treatment-induced tumor shrinkage and weight changes. For IMPT plans using a PTV margin expansion, these anatomical variations may disturb the calculated nominal plan, decreasing the dose delivered to the target volume and/or increasing the dose to the OARs above tolerance, so that a plan adaptation may be needed. However, the influence of these anatomical variations on robustly optimized plans for HNSCC had not been determined. The first part of this thesis compared two proton therapy methods, single-field optimization (SFO) and multi-field optimization (MFO), applied to the treatment of unilateral HNSCC target volumes in a cohort of 8 patients. For each method, a PTV-based and a robustly optimized plan were generated, resulting in four plans per patient. All four plans showed adequate target coverage in the nominal case, with larger doses to the ipsilateral parotid gland for both SFO approaches. No plan showed a clear advantage when anatomical variations over the treatment course were considered, and the same was observed when additional setup and range uncertainties were included; hence, no plan was decisively superior regarding robustness and the potential need for replanning. In the second part of this thesis, an anatomically robust optimization approach (aRO) was proposed, which includes additional CT datasets, representing random non-rigid variations in patient positioning, directly in the plan optimization. The aRO approach was compared with classical robust optimization (cRO) and a PTV-based approach in a cohort of 20 bilateral HNSCC patients. The PTV-based and cRO approaches were not sufficient to account for weekly anatomical variations, showing degraded target coverage in 10 and 5 of 20 cases, respectively. Conversely, the proposed aRO approach preserved target coverage in 19 of 20 cases, with only one patient requiring plan adaptation. An extended robustness analysis of both the cRO and aRO approaches, considering weekly anatomical variations together with setup and range errors, showed that anatomical variation was the most critical factor for loss of target coverage, while setup and range uncertainties played a minor role. The price of the increased robustness of the aRO approach was a significantly larger integral dose to healthy tissue compared with the cRO plan. However, this increase was not reflected in the planned dose to the OARs, which was comparable between the two plans; the price for superior plan robustness can therefore be considered low. In current clinical practice, implementing the aRO approach could reduce the need for plan adaptation.
Its application requires the acquisition of additional planning CT datasets with complete patient repositioning between scans, in order to capture random non-rigid position variations, as simulated in this study by using the first two weekly cCTs in the plan optimization. Further studies using multiple planning CT acquisitions, including strategies to reduce the patient CT dose such as dual-energy CT and iterative reconstruction algorithms, are needed to confirm the presented findings; the aRO approach might also be investigated for other body sites and entities. In the near future, in-room imaging methods such as cone-beam CT and magnetic resonance imaging, optimized for proton therapy, might be used to acquire the additional datasets. Moreover, alternative approaches capable of modeling variations in patient positioning, such as biomechanical models and deep learning methods, might generate additional image datasets in silico for use in proton treatment planning. In summary, this thesis contributes to robust treatment planning in IMPT by generating treatment plans that are robust against anatomical variations as well as setup and range uncertainties, which can benefit the clinical workflow by reducing the need for plan adaptation.
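To make the robust-optimization ingredient concrete, here is a generic minimax-style sketch: a plan (pencil-beam weight vector) is evaluated under a dictionary of error scenarios and optimized against the worst one. The dose-influence matrices, objective and optimizer are placeholder simplifications, not the planning system used in the thesis; the aRO idea corresponds to adding matrices computed on extra weekly anatomy CTs as further scenarios:

```python
import numpy as np

def worst_case_objective(w, dose_mats, target):
    """Evaluate the plan under every error scenario; return the worst squared
    deviation of the delivered dose from the prescription."""
    return max(float(np.sum((D @ w - target) ** 2)) for D in dose_mats.values())

def robust_optimize(dose_mats, target, steps=500, lr=1e-4):
    """Crude subgradient descent on the currently worst scenario; clinical
    systems use large-scale constrained solvers, this shows the principle."""
    n = next(iter(dose_mats.values())).shape[1]
    w = np.full(n, 0.5)
    for _ in range(steps):
        D = max(dose_mats.values(),
                key=lambda M: float(np.sum((M @ w - target) ** 2)))
        w = np.maximum(w - lr * 2 * D.T @ (D @ w - target), 0.0)  # weights >= 0
    return w

# toy scenario set: nominal anatomy plus two setup shifts; anatomical robust
# optimization would add matrices computed on extra weekly CTs to this dict
rng = np.random.default_rng(0)
dose_mats = {s: rng.random((30, 10)) for s in ("nominal", "shift+x", "shift-x")}
weights = robust_optimize(dose_mats, target=np.ones(30))
```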
400

Nástroj pro testování odolnosti webových služeb / A Tool for Robustness Testing of Web-Services

Zelinka, Tomáš January 2013 (has links)
This project deals with the testing of web services. The result of the work is a tool for load testing of web services using fault injection in their communication. The first part of the project discusses the basic aspects of testing web services; the second part focuses on testing under high loads in combination with fault injection. The tool supports automated test runs, and its distributed model was designed to simulate realistic loads. The last chapter summarizes the achieved results.
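A minimal Python sketch of the load-plus-fault-injection idea such a tool implements might look as follows (the corrupt fault set, the send interface and the pass/fail bookkeeping are assumptions of this sketch, not the tool's actual design):

```python
import concurrent.futures
import random

def corrupt(message: str) -> str:
    """Inject a communication fault: truncate, duplicate or garble the payload."""
    fault = random.choice(["truncate", "duplicate", "garble", "none"])
    if fault == "truncate":
        return message[: len(message) // 2]
    if fault == "duplicate":
        return message + message
    if fault == "garble" and message:
        i = random.randrange(len(message))
        return message[:i] + "\x00" + message[i + 1:]
    return message

def load_test(send, request: str, n_requests=1000, workers=50):
    """Fire many concurrent, possibly corrupted requests at the service and
    count controlled responses vs. failures. `send` performs one exchange
    and raises on transport error or timeout."""
    def one_call(_):
        try:
            send(corrupt(request))
            return True
        except Exception:
            return False
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(one_call, range(n_requests)))
    return sum(results), n_requests - sum(results)   # (ok, failed)
```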
