21

[en] COMBINATORIAL GAMES AND THE NEIGHBORHOOD CONJECTURE / [pt] JOGOS COMBINATÓRIOS E A CONJECTURA DA VIZINHANÇA

HANDEL SCHOLZE MARQUES 22 June 2021
[en] The theory of combinatorial games is the study of games with perfect information: every player knows all possible moves, and no luck or physical skill is involved in making a move, so in theory perfect play is possible. Examples of such games are tic-tac-toe, chess, checkers, Nim, and many more. In this dissertation we focus on the Maker-Breaker game, in which two players alternately claim vertices of a hypergraph. Maker's goal is to claim all vertices of some edge, and Breaker's goal is to prevent this. To understand on which types of hypergraphs Maker or Breaker wins, and what the winning strategies are, we make use of SAT, probability, general graph theory and more.
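For a concrete sense of the game described in this abstract, here is a minimal Python sketch (not taken from the dissertation) of a Maker-Breaker solver: the hypergraph is a collection of vertex sets, the players alternately claim free vertices with Maker moving first, and a brute-force game-tree search decides who wins under perfect play on small instances.

```python
from functools import lru_cache

def maker_wins(edges, vertices):
    """Decide whether Maker (moving first) wins the Maker-Breaker game on the
    hypergraph given by `edges` (an iterable of vertex sets) by exhaustive
    game-tree search. Only practical for small hypergraphs."""
    edges = tuple(frozenset(e) for e in edges)
    all_v = frozenset(vertices)

    @lru_cache(maxsize=None)
    def win(maker, breaker, maker_to_move):
        if any(e <= maker for e in edges):        # Maker owns every vertex of some edge
            return True
        if all(e & breaker for e in edges):       # every edge contains a Breaker vertex
            return False
        free = all_v - maker - breaker
        if not free:                              # board exhausted, no edge fully Maker's
            return False
        if maker_to_move:
            return any(win(maker | {v}, breaker, False) for v in free)
        return all(win(maker, breaker | {v}, True) for v in free)

    return win(frozenset(), frozenset(), True)

# Triangle hypergraph: after Maker's first move, two edges need only one more
# vertex each, and Breaker can block just one of them, so Maker wins.
print(maker_wins([{1, 2}, {2, 3}, {1, 3}], {1, 2, 3}))   # True
```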
22

Ice load prediction for design of ice-going ships for inland waterways

Zhang, Meng January 2019
With increasing interest in utilizing the inland waterways (IWW) of European countries, the design of IWW vessels is gaining attention from both a transport-efficiency and an emission-control point of view. Unlike in western and central Europe, however, IWW ships in Nordic countries such as Sweden must deal with ice on the fairway every winter. IWW ships are usually designed without considering ice and are structurally weaker than ships designed according to the ice class notations of the classification societies. Developing such ships requires particular care, since there are no strict ice class requirements for IWW ships. A primary challenge is to estimate both the global and the local ice loads acting on the hull structure.

To make the design problem concrete, Lake Mälaren is selected as the case study. Ice conditions (ice type and concentration) and ice data (ice thickness and flexural strength) are extracted and analysed for the ice load estimation. The mechanical properties of the ice strongly influence the ice load; they are studied using empirical formulae and calibrated against reference data.

The deterministic approach is widely used to predict ice loads. It is suitable when all variables (ship geometry and ice properties) are known, and is referred to here as rule-based design. For first-year light ice conditions such as those in Lake Mälaren, the Finnish-Swedish Ice Class Rules (FSICR) are widely applied; the thesis uses the FSICR guidelines as a reference and compares the results with other methods. The probabilistic approach, by contrast, is useful when certain variables are unknown and must be treated as random, for instance the ice breaking pattern. Here the probabilistic method and the ice-hull interaction mechanism are studied. The probabilistic method relates the ice pressure to the contact area between the ice and the hull, and predicts the maximum ice pressure acting on the hull from field ice-test data and ice exposure conditions. Such a semi-empirical method can be used regardless of ship type and size.

In addition, a numerical model is introduced based on the ice-hull collision mechanism and the essential ice-breaking characteristics. The physical mechanism is studied in order to idealize the ship-ice impact model; the idealization covers the ice failure process, the ice conditions and the ship geometry. Ice failure is assumed to start with crushing and to be followed by bending failure, the ice properties are set to constant values, and the stochasticity of the interaction is represented by randomness in the collision location and in the number of ice-floe pieces formed after breaking. An energy method is used to calculate the ice crushing force, the indentation displacement and the contact area, while the ice bending scenario is simplified as an infinite plate resting on an elastic foundation under a concentrated load. The resulting ice impact load and critical load can be used for global and local structural assessment, respectively.

Finally, the structural response and strength of a representative panel are investigated in both linear and nonlinear analyses. Ship structures are commonly designed with the material yield strength as the limit; however, the study shows that a lighter structure can be achieved if plastic deformation is allowed without causing failure. The design can therefore be optimized with regard to ice loading capacity and weight control.
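As a rough illustration of the two load components mentioned in the abstract (bending failure of the ice sheet and local crushing pressure), the following Python sketch uses generic semi-empirical forms; the coefficients and the example ice properties are illustrative assumptions, not the values or formulas used in the thesis or in the FSICR.

```python
def bending_failure_load(sigma_f, h, C=1.5):
    """Vertical load [N] that breaks an ice sheet idealized as an infinite
    plate on an elastic foundation under a concentrated load, using the
    common semi-empirical form P_f = C * sigma_f * h**2. The coefficient C
    (order 1-3, depending on load footprint and foundation stiffness) is an
    assumed value here."""
    return C * sigma_f * h ** 2

def crushing_pressure(area, p1=1.0e6, ex=-0.5):
    """Pressure-area relation p = p1 * A**ex of the kind often used for
    local ice crushing pressure; p1 and ex are illustrative placeholders."""
    return p1 * area ** ex

# Example: 0.4 m thick first-year ice with 500 kPa flexural strength (illustrative)
h = 0.4          # ice thickness [m]
sigma_f = 500e3  # flexural strength [Pa]
print(f"bending failure load   ~ {bending_failure_load(sigma_f, h) / 1e3:.0f} kN")
print(f"crushing pressure, 0.5 m^2 contact ~ {crushing_pressure(0.5) / 1e6:.2f} MPa")
```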
23

Contrôle en temps réel de la précision du suivi indirect de tumeurs mobiles en radiothérapie

Remy, Charlotte 08 1900
The goal of radiotherapy is to irradiate cancer cells while maintaining a low dose of radiation to the surrounding healthy tissue. In the case of lung cancer, the patient's breathing causes the tumor to move during treatment. One possible solution is to continuously reposition the irradiation beam on the moving target. The effectiveness and safety of this approach rely on accurate real-time localization of the tumor. Indirect strategies derive the target positions from a correlation model with a surrogate signal, which is continuously monitored without the need for radiation-based imaging. In addition, to compensate for system latency, the tracking algorithm must be able to anticipate the future position of the target. Because the patient's breathing varies over time, prediction and correlation models can become inaccurate. Ideally, the prediction of the tumor location would therefore also include an estimation of the associated uncertainty; in current clinical practice, however, these real-time positioning uncertainties are not explicitly predicted. This doctoral thesis focuses on real-time control of the accuracy of indirect tracking of mobile tumors in radiotherapy.

First, a Bayesian method for indirect tracking is developed. Based on Kalman filter theory, it predicts in real time both the future target motion from a surrogate signal and the associated uncertainty. This work offers a first proof of concept and also shows the potential of the liver as an internal surrogate, which appears more robust and reliable than the external markers commonly used in clinical practice. Second, an adaptation of the method is proposed to improve its robustness against changes in breathing. It enables the prediction of adaptive confidence regions that can detect large prediction errors based exclusively on the observation of the surrogate signal. The results show that at high sensitivity (90%), a specificity of about 50% is obtained. A validation process based on these adaptive confidence regions is then evaluated and compared to the conventional process of measuring the target at fixed, predetermined time intervals; to this end, an adaptive version of the Bayesian method is developed that incorporates occasional measurements of the target position. The results confirm that the uncertainties predicted by the Bayesian method can detect large prediction errors, and demonstrate that a validation process based on these uncertainties has the potential to be more efficient and effective than regular validations. The proposed Bayesian methods are validated on respiratory sequences of volunteers acquired by dynamic magnetic resonance imaging (MRI) and interpolated to high frequency; to complete the evaluation, a preliminary experimental validation is conducted on clinical data from lung cancer patients. The work of this doctoral project promises to improve the real-time control of prediction accuracy during radiotherapy treatments.

Finally, since ultrasound imaging could be used to visualize internal surrogates, a preliminary study on automatic ultrasound image quality assessment is presented. These results can later be used for indirect tracking in radiotherapy to optimize ultrasound acquisitions during treatment and to facilitate the automatic estimation of surrogate motion.
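To make the Kalman-filter idea concrete, here is a minimal Python sketch (not the thesis's actual motion or correlation model): a constant-velocity filter tracks a 1-D breathing-like trace, extrapolates a few samples ahead to compensate for latency, and reports a 95% confidence half-width from the predicted covariance. The sampling period, latency and noise levels are illustrative assumptions, and the surrogate-to-target correlation step is omitted.

```python
import numpy as np

# Constant-velocity model for a 1-D trace; all values below are assumptions.
dt = 0.1                                   # surrogate sampling period [s]
latency_steps = 3                          # system latency in samples
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # we observe position only
Q = 1e-3 * np.eye(2)                       # process noise covariance
R = np.array([[1e-2]])                     # measurement noise covariance

def step(x, P, z):
    """One Kalman predict/update with measurement z, then extrapolate
    `latency_steps` ahead; returns the updated (x, P) and the future
    position with its 95% confidence half-width."""
    # predict
    x_pred, P_pred = F @ x, F @ P @ F.T + Q
    # update with the new position measurement z
    y = np.array([[z]]) - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(2) - K @ H) @ P_pred
    # extrapolate to compensate for latency, propagating the covariance too
    x_fut, P_fut = x_new, P_new
    for _ in range(latency_steps):
        x_fut, P_fut = F @ x_fut, F @ P_fut @ F.T + Q
    return x_new, P_new, float(x_fut[0, 0]), 1.96 * float(np.sqrt(P_fut[0, 0]))

# Feed a synthetic breathing-like trace (amplitude in mm, 4 s period).
x, P = np.zeros((2, 1)), np.eye(2)
for t in np.arange(0.0, 10.0, dt):
    z = 10.0 * np.sin(2 * np.pi * t / 4.0)
    x, P, pos, hw = step(x, P, z)
print(f"predicted position {pos:.2f} mm +/- {hw:.2f} mm (95% region)")
```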
24

Variabilité et incertitudes en géotechnique : de leur estimation à leur prise en compte

Dubost, Julien 08 June 2009
The current evolution of geotechnical engineering places the management of risks of geotechnical origin at the heart of its objectives. The complexity of development projects (through the cost, schedule and performance objectives they pursue) keeps increasing, while the sites chosen for them present increasingly difficult geotechnical conditions. These unfavourable conditions translate into a strong variability of soil properties, which makes site investigation and data analysis more complex. This thesis deals with the characterization of the natural variability of soils and of the uncertainties attached to geotechnical investigations, with the aim of better accounting for them in the design of structures; it is set in the context of managing project risks of geotechnical origin. The main statistical tools used to describe the scatter of the data and their spatial structure (geostatistics), as well as the probabilistic methods that allow their results to be used in calculations, are presented from the point of view of their application in geotechnical design.

The approach is applied to a railway platform project. This infrastructure was built on a geologically and geotechnically complex site and now shows significant deformations due to soil settlement. A new analysis of the geotechnical data was therefore undertaken. The data were first gathered in a database to ease their statistical and geostatistical treatment, and their statistical and spatial variability was characterized, allowing a better understanding of the site. The geological and geotechnical model established in this way was then used to compute the settlements. A three-level analysis is proposed (global, local and spatialized) which provides estimates of the settlements and of their uncertainty at, respectively, the scale of the site, the borehole locations, and across the study area, accounting for the spatial structure of the soil properties.

The results clearly show the value of statistical and geostatistical methods for characterizing complex sites and building a suitable geological and geotechnical model. The proposed settlement analysis highlights that parameter uncertainties propagate into the design calculations and explain the overall behaviour of the infrastructure. These results can be expressed as a probability of failure, which can then be used in decision-making and risk management. More broadly, this thesis contributes to the planning and analysis of geotechnical investigation campaigns, with the aim of identifying, estimating and accounting for the variability and uncertainties of the data at the various stages of a project, in order to better control risks of geotechnical origin.
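As an example of the geostatistical tools the abstract refers to, the following Python sketch computes a classical empirical semivariogram, the basic description of the spatial structure of a soil property measured at scattered borings. The synthetic dataset (boring coordinates and soil modulus values) is invented for illustration and has no relation to the thesis's site data.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Classical estimator gamma(h) = (1 / (2 N(h))) * sum (z_i - z_j)^2 over
    the pairs whose separation distance falls within each lag bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    upper = np.triu(np.ones_like(d, dtype=bool), 1)       # each pair counted once
    gam = []
    for h in lags:
        mask = (d > h - tol) & (d <= h + tol) & upper
        gam.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(gam)

# Synthetic example: a soil modulus measured at 50 borings over a 200 m site.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 200.0, size=(50, 2))       # boring coordinates [m]
z = 20.0 + 0.05 * xy[:, 0] + rng.normal(0.0, 2.0, size=50)   # modulus [MPa]
lags = np.arange(10.0, 110.0, 10.0)              # lag distances [m]
print(np.round(empirical_semivariogram(xy, z, lags, tol=5.0), 2))
```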
25

[pt] MÉTODO PROBABILÍSTICO PARA CONSIDERAÇÃO DE INCERTEZAS BASEADO NO MÉTODO DAS FUNÇÕES DE GREEN E NO MÉTODO ESTATÍSTICO FIRST-ORDER SECOND-MOMENT / [en] PROBABILISTIC METHOD FOR UNCERTAINTIES CONSIDERATION IN GEOMECHANICAL PROBLEMS BASED ON GREEN'S FUNCTION APPROACH AND FIRST-ORDER SECOND-MOMENT METHOD

LEONARDO CARVALHO MESQUITA 04 May 2023
[en] The present work proposes a computationally efficient statistical method (called Green-FOSM) for considering uncertainties in geomechanical problems, with the objective of improving decision-making when analysing problems associated with fluid injection or depletion. The novelty of the method lies in the use of the Green's function approach (GFA), which, together with the first-order second-moment (FOSM) statistical method, is used to propagate the uncertainties associated with the mechanical properties of the material into the displacement field of the geological formation. Furthermore, using the concepts of a stochastic grid and an autocorrelation function, the proposed method allows the spatial variability of the random input variables representing these mechanical properties to be taken into account. The GFA uses the fundamental solutions of classical mechanics (the Kelvin and Melan fundamental solutions, among others) and the reciprocity theorem to compute the displacement field of a geological formation with irregular geometry and different material types. Its great advantage over the classical finite element method (FEM) is that it does not require the imposition of boundary conditions, and the analysis can be performed considering only the reservoir or other regions of interest. This modelling strategy reduces the degrees of freedom of the model and the CPU time of the deterministic analysis; since the GFA requires less computational effort, it is well suited to propagating uncertainties in geomechanical problems.

Initially, based on the original Green's function approach proposed by Peres et al. (2021), an iterative version of the Green-FOSM method was developed, which yields statistical results similar to those obtained with classical Monte Carlo simulation (MCS). In this version the displacement field is computed with an iterative numerical scheme, which reduces the computational performance of the method and can cause convergence problems; such limitations hinder the application of the original GFA and of the iterative Green-FOSM method to real problems. The present work therefore also developed a new version of the GFA based on a non-iterative numerical scheme. For the validation problems analysed, the non-iterative method proved to be up to 17.5 times faster than the original version. It also expands the applicability of the GFA, since the convergence problems were eliminated and the results obtained when analysing a geological profile representative of the Brazilian pre-salt are similar to those found via FEM. Finally, a non-iterative version of the Green-FOSM method was derived from the non-iterative GFA. This version is capable of probabilistically analysing complex geological formations, such as those of the Brazilian pre-salt, and, using the same computational resources, it is at least 200 times faster than the iterative Green-FOSM method. In general, the results of the analyses performed (deterministic and probabilistic) are close to those obtained with the reference methods (FEM and MCS, respectively).
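To illustrate the FOSM half of the proposed method, here is a small Python sketch of first-order second-moment propagation with a Monte Carlo cross-check. The "displacement model" is a toy closed-form expression standing in for the Green's-function solver, and all parameter values and the model itself are illustrative assumptions, not data or formulas from the thesis.

```python
import numpy as np

def fosm(model, mean, cov, rel_step=1e-6):
    """First-order second-moment propagation: linearize `model` around the
    mean input and return the mean response g(mu) and the response
    covariance J C J^T, with the Jacobian J from central finite differences."""
    mean = np.asarray(mean, dtype=float)
    g0 = np.atleast_1d(model(mean))
    J = np.zeros((g0.size, mean.size))
    for i in range(mean.size):
        h = rel_step * max(abs(mean[i]), 1.0)
        xp, xm = mean.copy(), mean.copy()
        xp[i] += h
        xm[i] -= h
        J[:, i] = (np.atleast_1d(model(xp)) - np.atleast_1d(model(xm))) / (2.0 * h)
    return g0, J @ cov @ J.T

# Toy "subsidence" model standing in for the Green's-function displacement
# solution: inputs are Young's modulus E, Poisson ratio nu and pressure change dp.
def toy_displacement(x):
    E, nu, dp = x
    return np.array([2.0 * (1.0 - nu ** 2) * dp * 100.0 / E])   # metres, illustrative

mu = np.array([5e9, 0.25, 5e6])                       # mean E [Pa], nu [-], dp [Pa]
C = np.diag([(1e9) ** 2, 0.03 ** 2, (0.5e6) ** 2])    # uncorrelated, for simplicity
m, V = fosm(toy_displacement, mu, C)

# Cross-check against plain Monte Carlo, mirroring the comparison in the abstract
rng = np.random.default_rng(1)
samples = rng.multivariate_normal(mu, C, size=20000)
mc = np.array([toy_displacement(s)[0] for s in samples])
print(f"FOSM: mean {m[0]:.4f} m, std {np.sqrt(V[0, 0]):.4f} m")
print(f"MCS : mean {mc.mean():.4f} m, std {mc.std():.4f} m")
```

The point of the sketch is the cost structure: FOSM needs only a handful of model evaluations (one per input parameter for the sensitivities), whereas the Monte Carlo reference needs thousands, which is why pairing FOSM with a cheap displacement solver is attractive.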
