About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Modélisation de corps mous appliquée à la commande de procédé robotisé de découpe anatomique de muscles / Soft material modeling applied to the control of robotized technology of deboning and muscle separation in meat cutting

Essahbi, Nabil 13 December 2013
This PhD thesis was carried out within the framework of the ANR ARMS project, whose overall objective is to design a multi-arm robotic system for the anatomical separation of muscles, studying the robotization of deboning and muscle-separation operations applied to beef rounds. This work aims to develop the mechanical models needed to feed the process control strategy. It outlines the development cycle of a mechanical model, from the construction of geometric models from MRI images and the experimental identification of the rheological parameters of the modeled materials, through the steps of meshing, parameterization, implementation and validation of such models. It presents and tests a new way of setting the parameters of the mass-spring model that accounts for material anisotropy: the non-linear transversally isotropic mass-spring model, which places the model in a non-linear mechanical behavior regime and combines realism with interactivity in the resulting simulations. The work also addresses the dynamic estimation of robotized cutting trajectories, proposing new approaches to modeling the cutting of soft bodies and developing a new algorithm based on vision perception and curvature estimation of 3D surface meshes. The thesis further tackles the variability of beef muscles and proposes a method for dimensionally readjusting the generic geometric model through geometric transformations defined by multi-criteria optimization of an objective function. Finally, in order to synchronize the information flow between the control modules of the robotic cell, a combination of the finite element method with Guyan static condensation is used to develop a reduced quasi-static mechanical model that rapidly predicts the evolution of the robotized cutting trajectory.
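For context, the Guyan static condensation mentioned above is the standard reduction of a partitioned stiffness system to its master degrees of freedom; the form below is the textbook one, not taken from the thesis itself.

```latex
% Standard Guyan (static) condensation; the thesis' own choice of
% master (m) / slave (s) degrees of freedom is not given in the abstract.
K =
\begin{pmatrix} K_{mm} & K_{ms} \\ K_{sm} & K_{ss} \end{pmatrix},
\qquad
K u = f
\;\Rightarrow\;
\underbrace{\left( K_{mm} - K_{ms} K_{ss}^{-1} K_{sm} \right)}_{K_{\mathrm{red}}} u_m
= f_m - K_{ms} K_{ss}^{-1} f_s .
```

Eliminating the slave unknowns in this way is what makes the reduced quasi-static model cheap enough for fast trajectory prediction.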
192

[en] MATERIALIZATION AND MAINTENANCE OF OWL:SAMEAS LINKS / [pt] MATERIALIZAÇÃO E MANUTENÇÃO DE LIGAÇÕES OWL:SAMEAS

CARLA GONCALVES OUROFINO 17 January 2017
The Web of Data has grown significantly in recent years, both in the amount of data and in the number of data sources. In parallel with this trend, owl:sameAs links have been increasingly used to connect equivalent data published by different sources. As a consequence, a routine for identifying and maintaining these connections becomes necessary. In order to automate this task, we developed the MsA Framework (sameAs Materialization) to materialize and recompute owl:sameAs links between local databases and data published on the Web. Once identified, these links are materialized alongside the local data and recomputed only when necessary. To achieve this, the tool monitors the operations (insertion, update and deletion) performed on local and remote records and, for each type of operation, implements a maintenance strategy for the links involved.
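As a rough illustration of what materializing and retracting an owl:sameAs link looks like in practice, here is a minimal sketch using rdflib; the URIs and the deletion handler are hypothetical and do not reproduce the MsA Framework's actual code.

```python
# Illustrative sketch only: materializing and retracting an owl:sameAs link.
from rdflib import Graph, URIRef
from rdflib.namespace import OWL

graph = Graph()
local = URIRef("http://local.example.org/resource/42")    # hypothetical local record
remote = URIRef("http://remote.example.org/entity/abc")   # hypothetical Web resource

# Materialization: store the equivalence alongside the local data.
graph.add((local, OWL.sameAs, remote))

# Maintenance: if the local record is deleted, retract every link it participates in.
def on_local_delete(g: Graph, subject: URIRef) -> None:
    for triple in list(g.triples((subject, OWL.sameAs, None))):
        g.remove(triple)

on_local_delete(graph, local)
assert (local, OWL.sameAs, remote) not in graph
```

Insertion and update operations would trigger analogous handlers that recompute only the links affected by the change, rather than relinking the whole dataset.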
193

Vibration-Based Structural Health Monitoring of Structures Using a New Algorithm for Signal Feature Extraction and Investigation of Vortex-Induced Vibrations

Qarib, Hossein January 2020
No description available.
194

AKTUALIZACE DTMM S VYUŽITÍM MOBILNÍHO SKENOVACÍHO SYSTÉMU / UPDATE OF DTMM USING MOBILE SCANNING SYSTEM

Cimpl, Tomáš January 2013
This diploma thesis deals with updating a digital technical city map, specifically with map updating using data from a mobile scanning system, compared with map updating performed without the use of a mobile mapping system. The aim of the thesis was to update the digital technical map of Pardubice within the range of map sheet Pardubice 8-1/21.
195

Numerical analysis and model updating of a steel-concrete composite bridge : Parametric study & Statistical evaluation

Abdulrahman, Keiwan, Potrus, Fadi January 2015
In 2006, only 10 years after the steel-concrete composite Vårby bridge was built, fatigue cracks were found during an inspection. To further investigate the causes and the potential danger of the cracks, an investigation commissioned by the Swedish Transport Administration was initiated in 2009. After the detection of the fatigue cracks, several measurement campaigns were carried out to monitor the static behavior using strain gauges at selected positions along the bridge. The measurements from the strain gauges monitoring the global behavior were then used to calibrate a finite element model.

The present report is part of the research effort to understand the behavior of steel-concrete composite bridges. Numerical analysis and model updating have been used to understand and determine how different parameters affect the strain range and the global behavior. The numerical analysis and parameter study were performed in the finite element software Abaqus and the programming language Python. The outcome of the parameter study was then used to perform model updating by the method of falsification in MATLAB.

The results from the parameter study and the model updating showed that the measured strains could be reproduced by a wide range of parameter combinations; even unreasonable parameter values matched the measured strains. To investigate the reason for this, a multiple linear regression analysis was performed, which showed that the strain range is strongly correlated with the Young's modulus of steel and concrete and with the connector elasticity, which represents the studs in the real bridge.

Two finite element models with completely different input parameter values can thus produce the same strain range for the global behavior. It is therefore not safe to assume that a model is accurate and valid simply because the predicted strain range is close to the measured one, since the global behavior of a steel-concrete composite bridge can be modeled by many different sets of parameters.
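A minimal sketch of the falsification idea described above is given below: candidate parameter sets whose predicted strain range falls outside the measurement bounds are discarded. The surrogate function, parameter ranges, measured value and tolerance are illustrative stand-ins for the Abaqus model and field data used in the thesis.

```python
# Minimal sketch of falsification-based model updating (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)

def predicted_strain_range(E_steel, E_concrete, k_connector):
    # Hypothetical surrogate; in the actual study this is an Abaqus FE run.
    return 7.0e4 / (0.004 * E_steel + 0.010 * E_concrete + 2.0e-5 * k_connector)

measured = 55.0          # "measured" strain range, illustrative
uncertainty = 5.0        # combined measurement/model uncertainty, illustrative

# Sample candidate parameter sets and falsify those outside the error bounds.
candidates = np.column_stack([
    rng.uniform(180e3, 220e3, 5000),   # E_steel [MPa]
    rng.uniform(25e3, 40e3, 5000),     # E_concrete [MPa]
    rng.uniform(1e5, 1e7, 5000),       # connector stiffness [N/m]
])
predictions = np.array([predicted_strain_range(*c) for c in candidates])
surviving = candidates[np.abs(predictions - measured) <= uncertainty]
print(f"{len(surviving)} of {len(candidates)} parameter sets not falsified")
```

The large surviving set mirrors the thesis finding that many (even physically unreasonable) parameter combinations reproduce the measured strain range.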
196

Mise à jour d’une base de données d’occupation du sol à grande échelle en milieux naturels à partir d’une image satellite THR / Updating large-scale land-use database on natural environments from a VHR satellite image

Gressin, Adrien 12 December 2014
Land-cover geospatial databases (LC-DBs) are essential inputs for various purposes such as natural resource monitoring, land planning and public policy management. To improve this monitoring, users look for finer levels of detail, both geometric and semantic. To fulfill such requirements, a large-scale LC-DB is being established at the French National Mapping Agency (IGN). However, to meet users' needs, this DB must be updated as regularly as possible while keeping the initial accuracy, so automatic updating methods must be set up to allow such large-scale computation. Earth observation satellites have already been used successfully to build LC-DBs at scales comparable to Corine Land Cover (CLC), and today's very high resolution (VHR) sensors, such as the Pléiades satellite, make it natural to ask whether such images are suitable for updating a large-scale LC-DB. The purpose of this thesis is therefore to propose an automatic method for updating such large-scale LC-DBs from a single monoscopic VHR satellite image (to limit acquisition costs) while ensuring the robustness of the detected changes.

The proposed method is based on a multilevel supervised learning algorithm, MLMOL, which best takes into account the possibly multiple appearances of each DB class. This algorithm can be applied to a wide variety of images and DB data sets, independently of the classifier and of the attributes extracted from the input image. Moreover, stacking classifications improves the robustness of the method, especially for classes with multiple appearances (e.g., plowed or unplowed fields, stand-alone houses or industrial warehouse buildings, ...). In addition, the learning algorithm is integrated into a processing chain (LUPIN) that, first, automatically adapts to the different DB themes that may exist and, second, is robust to inhomogeneous themes.

The method is successfully applied to a Pléiades image over an area near Tarbes (southern France) covered by the IGN large-scale LC-DB. The results show the contribution of Pléiades images in terms of both sub-meter resolution and spectral dynamics: thanks to texture and shape attributes (morphological profiles, SFS, ...), VHR satellite images give good classification results even for classes such as roads and buildings that usually require specific methods. The proposed method also provides relevant change indicators for the area. Furthermore, it offers significant support for creating an LC-DB by merging several existing DBs: it can take a decision when the fusion of the initial DBs generates overlapping areas, a common situation when the data come from different sources with their own specifications; it can fill potential gaps in the coverage of the resulting DB; and it can extend that coverage to the footprint of an image covering a larger extent.

Finally, the LUPIN processing chain is applied to different remote sensing data sets to assess its versatility and the relevance of such data. The results show that the method can handle data sets of different spatial resolutions (Pléiades at 0.5 m, SPOT 6 at 1.5 m and RapidEye at 5 m) and can exploit the strengths of each sensor, e.g., the RapidEye red-edge channel for discriminating the forest theme, the good resolution trade-off of SPOT 6 for built-up areas, and the VHR capability of Pléiades for discriminating objects of small spatial extent such as roads or hedges.
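To give a flavour of per-class learning with multiple appearances, the sketch below clusters each class into sub-appearances and trains one classifier per appearance; it is a generic illustration under assumed synthetic data, not the actual MLMOL implementation, whose details go beyond what the abstract states.

```python
# Generic illustration of per-class learning with multiple "appearances" per class.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 5))                      # hypothetical image features
y = rng.integers(0, 3, size=600)                   # hypothetical DB classes

models = {}
for cls in np.unique(y):
    # Split one DB class into several appearance clusters (e.g. plowed / unplowed fields).
    Xc = X[y == cls]
    appearances = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xc)
    # One binary classifier per appearance: this appearance vs. everything else.
    for a in np.unique(appearances):
        target = np.zeros(len(X), dtype=int)
        target[np.flatnonzero(y == cls)[appearances == a]] = 1
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        models[(cls, a)] = clf.fit(X, target)

def predict(x):
    # The class whose best-scoring appearance model is most confident wins.
    scores = {k: m.predict_proba(x.reshape(1, -1))[0, 1] for k, m in models.items()}
    return max(scores, key=scores.get)[0]

print(predict(X[0]))
```

Stacking the per-appearance decisions in this way is what makes classes with heterogeneous appearances easier to recover than with a single monolithic classifier.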
197

Évaluation du risque sismique par approches neuronales / A framework for seismic risk assessment based on artificial neural networks

Wang, Zhiyi 27 November 2018
Seismic probabilistic risk assessment (SPRA) is one of the most widely used methodologies to assess and ensure the performance of critical infrastructures, such as nuclear power plants (NPPs), under earthquake events. SPRA adopts a probabilistic approach to estimate the frequency of occurrence of severe consequences for NPPs under seismic conditions. The thesis discusses the following aspects: (i) construction of meta-models with artificial neural networks (ANNs) to build the relations between seismic intensity measures (IMs) and engineering demand parameters of the structures, in order to accelerate the fragility analysis; the uncertainty related to substituting the finite element models with ANNs is investigated; (ii) proposal of a Bayesian framework with adaptive ANNs to take into account different sources of information in the fragility analysis, including numerical simulation results, reference values provided in the literature and damage data obtained from post-earthquake observations; (iii) computation of ground motion prediction equations (GMPEs) with ANNs, where the epistemic uncertainties of the GMPE input parameters, such as the magnitude and the averaged thirty-meter shear wave velocity, are taken into account; (iv) calculation of the annual failure rate by combining results from the fragility and hazard analyses, with fragility curves determined by the adaptive ANN and hazard curves obtained from the GMPEs calibrated with ANNs. The proposed methodologies are applied to various industrial case studies, such as the KARISMA benchmark and the SMART model.
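The combination of fragility and hazard mentioned in point (iv) usually takes the standard form below (generic textbook notation; the thesis' exact formulation is not given in the abstract).

```latex
% Standard fragility-hazard convolution used in seismic PRA.
\lambda_{\mathrm{fail}}
  = \int_{0}^{\infty} P_f(a)\,
    \left| \frac{\mathrm{d}H(a)}{\mathrm{d}a} \right| \mathrm{d}a,
\qquad
P_f(a) = \Phi\!\left( \frac{\ln(a/\theta)}{\beta} \right)
```

Here H(a) is the annual rate of exceedance of the intensity measure a (the hazard curve, in this work derived from the ANN-calibrated GMPEs), P_f the lognormal fragility with median θ and log-standard deviation β, and Φ the standard normal CDF.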
198

Stratégies numériques innovantes pour l’assimilation de données par inférence bayésienne / Development of innovative numerical strategies for Bayesian data assimilation

Rubio, Paul-Baptiste 15 October 2019
This work falls within the framework of data assimilation in structural mechanics. It aims at developing new numerical tools for robust, real-time data assimilation that can be used in various engineering activities. A specific target is the implementation of DDDAS (Dynamic Data Driven Application System) applications, in which a continuous exchange between simulation tools and experimental measurements is envisioned in order to create feedback control loops on connected mechanical systems. In this context, and in order to take the various sources of uncertainty (modeling error, measurement noise, ...) into account, a powerful and general stochastic methodology based on Bayesian inference is considered. However, a well-known drawback of such an approach is its computational complexity, which makes real-time simulation and sequential data assimilation difficult.

The PhD work therefore proposes to couple Bayesian inference with attractive and advanced numerical techniques so that real-time, sequential assimilation can be envisioned. First, PGD model reduction is introduced to facilitate the computation of the likelihood function, the propagation of uncertainty through complex models, and the sampling of the posterior density. Then, Transport Map sampling is investigated as a substitute for classical MCMC procedures for posterior sampling. It is shown that this technique leads to deterministic computations, with clear convergence criteria, and that it is particularly suited to sequential data assimilation. Here again, the use of PGD model reduction greatly facilitates the process by providing gradient and Hessian information in a straightforward manner. Finally, to increase robustness, on-the-fly correction of model bias is addressed using data-based enrichment terms, and the selection of the data most relevant to the assimilation objective is also considered.

The overall cost-effective methodology is applied and illustrated on several academic and real-life test cases, including the real-time updating of models for the control of welding processes, and mechanical tests involving damageable concrete structures instrumented with full-field measurements.
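A minimal sketch of sequential Bayesian assimilation with a cheap surrogate as forward model is shown below; the linear "reduced_model", the noise level and the parameter grid are illustrative stand-ins for the PGD reduced-order model and the measurements used in the thesis.

```python
# Minimal sketch of sequential Bayesian assimilation on a 1D parameter grid.
import numpy as np

theta = np.linspace(0.0, 2.0, 401)          # candidate parameter values
posterior = np.ones_like(theta)             # flat prior
posterior /= posterior.sum()

def reduced_model(t):
    # Hypothetical fast surrogate replacing the full FE computation.
    return 3.0 * t + 1.0

sigma = 0.2                                 # assumed measurement noise (std)
true_theta = 1.3
rng = np.random.default_rng(2)

for _ in range(5):                          # measurements arriving sequentially
    obs = reduced_model(true_theta) + rng.normal(0.0, sigma)
    likelihood = np.exp(-0.5 * ((obs - reduced_model(theta)) / sigma) ** 2)
    posterior *= likelihood                 # Bayes update, then renormalize
    posterior /= posterior.sum()

print("posterior mean:", float((theta * posterior).sum()))
```

In the thesis, the grid evaluation of the likelihood is replaced by Transport Map sampling and the surrogate by the PGD reduced model, but the sequential update logic is the same.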
199

Dynamic Soil-Structure Interaction of Soil-Steel Composite Bridges: A Frequency Domain Approach Using PML Elements and Model Updating

FERNANDEZ BARRERO, DIEGO January 2019
This master thesis covers the dynamic soil-structure interaction of soil-steel culverts, applying a methodology based on the frequency-domain response. In the first stage, field tests were performed on one bridge using controlled excitation. The methodology then builds on previous research, the field tests, finite element models (FEM) and perfectly matched layer (PML) elements.

Firstly, a 2D model of the analysed bridge, Hårestorp, was made to compare the frequency response functions (FRF) with those obtained from the field tests. Simultaneously, a 3D model of the bridge was created for the following purposes: to compare it against the 2D model and the field tests, and to implement a model updating procedure with the particle swarm algorithm to calibrate the model parameters. Both models use PML elements, which are verified against previous solutions from the literature. The verification concludes that the PML elements behave correctly except for extreme parameter values.

In the course of this thesis, relatively advanced computation techniques were required to ensure the computational feasibility of the problem with the resources available. To that end, a literature review of the theoretical aspects of parallel computing was performed, as well as of the practical aspects in Comsol. In collaboration with Comsol Support, and with the help given by PDC at KTH, it was possible to reduce the computational time to a feasible level of around two weeks for the model updating of the 3D model.

The results are inconclusive in terms of finding a perfectly fitting model, so further research is required to adequately address the problem. Nevertheless, some accelerometers show a considerable level of agreement. The thesis concludes by discarding the 2D models, due to their inability to represent reality correctly, and establishes a model optimisation methodology using Comsol in connection with Matlab.
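For readers unfamiliar with particle swarm model updating, the sketch below minimizes a toy misfit between "computed" and "measured" quantities; the objective function, parameter bounds and tuning constants are illustrative and do not reproduce the Comsol/Matlab workflow of the thesis, where the misfit would be evaluated on FRFs.

```python
# Bare-bones particle swarm sketch for model updating (illustrative values only).
import numpy as np

rng = np.random.default_rng(3)

def misfit(params):
    # Hypothetical objective: normalized distance to a "measured" parameter set.
    target = np.array([2.0e8, 0.03])        # e.g. soil stiffness, damping ratio
    return float(np.sum(((params - target) / target) ** 2))

n_particles, n_dims, n_iter = 20, 2, 100
lo = np.array([1.0e7, 0.005])
hi = np.array([1.0e9, 0.10])

pos = rng.uniform(lo, hi, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
best_pos = pos.copy()
best_val = np.array([misfit(p) for p in pos])
g_idx = best_val.argmin()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, n_dims))
    # Inertia + attraction to personal best + attraction to global best.
    vel = 0.7 * vel + 1.5 * r1 * (best_pos - pos) + 1.5 * r2 * (best_pos[g_idx] - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([misfit(p) for p in pos])
    improved = vals < best_val
    best_pos[improved], best_val[improved] = pos[improved], vals[improved]
    g_idx = best_val.argmin()

print("best parameters:", best_pos[g_idx], "misfit:", best_val[g_idx])
```

In the actual study each misfit evaluation is a full Comsol FRF computation, which is why reducing the wall-clock time of the 3D model updating was a central concern.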
200

[pt] AVALIAÇÃO DO CICLO DE VIDA DE PONTES DE AÇO CONSIDERANDO O DANO POR FADIGA E CORROSÃO / [en] LIFE CYCLE ASSESSMENT OF STEEL BRIDGES CONSIDERING FATIGUE AND CORROSION DAMAGE

VERISSA PINTO MARQUES QUEIROZ 14 December 2020
Highway bridges are structures subjected to the combined action of fatigue and corrosion. In this context, this research proposes a prototype system to assist in the maintenance decision-making process for steel girder bridges, in which only simply supported steel girder bridges are considered. The system accounts for corrosion-fatigue effects and estimates the lifetime of these structures based on a structural reliability analysis. The analysis is built on the AASHTO S-N fatigue curves and Miner's damage accumulation rule. In order to account for the material and resistance loss caused by corrosion, adjustments are made to the fatigue parameters of the reliability model. In addition, some maintenance strategies that reduce the corrosion rate are considered in the system by correcting the corrosion parameters of the material. The system uses an artificial neural network model to estimate the stresses in the steel girders under the passage of an AASHTO fatigue vehicle, using geometric and material data from the bridge; the database used to develop the neural network was created from finite element simulation results. The life cycle of a steel bridge designed according to the AASHTO standard is simulated using the proposed system. The simulation shows that the structure's lifetime depends on the combination of traffic parameters and the corrosivity of the environment. It is also observed that an increase in the average daily truck traffic (ADTT) can reduce the bridge's lifetime by 48 to 76 percent and, depending on the increase in its annual growth rate, the reduction can reach 80 percent. Maintenance alternatives such as washing the superstructure provide a lifetime gain in all environments, especially the marine environment (up to 30 percent). However, in some of the traffic scenarios tested in the simulations this gain is not sufficient to guarantee the minimum service life recommended by AASHTO.
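For reference, Miner's rule with an AASHTO-type S-N curve, as used in the reliability model described above, takes the standard textbook form below (the constants are generic, not taken from the thesis).

```latex
% Miner's linear damage accumulation with an AASHTO-type S-N curve.
D = \sum_i \frac{n_i}{N_i}, \qquad
N_i = \frac{A}{(\Delta\sigma_i)^3}, \qquad
\text{failure assumed when } D \ge 1
```

Here n_i is the number of cycles applied at stress range Δσ_i, N_i the allowable number of cycles from the S-N curve with detail constant A, and D the accumulated damage; corrosion enters by degrading the fatigue parameters over time.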
