  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Design centrado no usuário de um ambiente de reunião instrumentado / User-centered design of an instrumented meeting environment

Vivian Genaro Motti 18 February 2009 (has links)
A computação ubíqua explora o fato de que é possível embutir em um ambiente sistemas computacionais que transparentemente apóiem tarefas cotidianas do usuário. Sistemas desse tipo podem ser aplicados a diferentes domínios contribuindo, por exemplo, com atividades educacionais, médicas ou administrativas. A captura automática de informação em ambientes de computação ubíqua visa documentar atividades cotidianas de modo a possibilitar, posteriormente, o acesso às informações capturadas. O nível de transparência da interação do usuário com as chamadas aplicações ubíquas de captura e acesso é um desafio para projetistas, uma vez que os requisitos da aplicação devem ser conhecidos em profundidade para que a interação usuário-ambiente seja eficiente, eficaz e satisfatória. O trabalho realizado teve por objetivo especificar uma versão da aplicação para comunicação síncrona entre usuários que participem de reuniões distribuídas usando o DiGaE (Distributed Gathering Environment), desenvolver protótipos de interfaces, avaliá-los e sugerir soluções que facilitem a interação do usuário, aproximando o modelo conceitual da aplicação de seu modelo mental. Foram utilizados conceitos de Engenharia de Software para especificação e modelagem do sistema, e de Interação Humano-Computador para o desenvolvimento e a avaliação das interfaces. O projeto realizado orientou a implementação de um protótipo do ambiente DiGaE pela equipe do Projeto TIDIA-Ae, em cujo contexto este trabalho se insere. A principal contribuição do trabalho é um projeto que considera usabilidade em ambientes ubíquos para comunicação remota entre usuários, de modo a tornar a interação fácil, eficiente, eficaz e satisfatória até mesmo para usuários não especialistas em informática / Ubiquitous computing explores the possibility of instrumenting an environment with computational infrastructure that transparently supports users in their daily tasks. 
Systems of this kind can be applied in different domains, contributing to medical, educational, or administrative activities, for instance. The automatic capture of information in ubiquitous computing environments aims at documenting daily activities so that the captured information can later be accessed for review. Achieving a high level of transparency in the user interaction with such capture-and-access applications is a challenge for designers, since the application requirements must be known in depth for the user-environment interaction to be efficient, effective, and satisfactory. The main objective of the work reported here was to specify a version of the DiGaE (Distributed Gathering Environment) application supporting synchronous communication among remote users in distributed meetings, to develop interface prototypes, to evaluate them, and to suggest solutions that facilitate user interaction, bringing the application's conceptual model closer to the users' mental model. Software Engineering concepts were employed to specify and model the application, and Human-Computer Interaction concepts were employed to develop and evaluate the interfaces. The project guided the implementation of a DiGaE prototype by the TIDIA-Ae Project team, in whose context this work is inserted. The main contribution of this work is a design that considers usability in ubiquitous environments for remote communication among users, making the interaction easy, efficient, effective, and satisfactory even for users who are not computing experts.
42

A System for Foot Joint Kinetics – Integrating Plantar Pressure/Shear with Multisegment Foot Modeling

Petersen, Spencer Ray 04 June 2020 (has links)
Introduction: Instrumented gait analysis and inverse dynamics are commonly used in research and clinical practice to calculate lower extremity joint kinetics, such as power and work. However, multisegment foot (MSF) model kinetics have been limited by ground reaction force (GRF) measurements. New technology enables simultaneous capture of plantar pressure and shear stress distributions but has not yet been used with motion capture. Integrating MSF models and pressure/shear measurements will enhance the analysis of foot joint kinetics. The purpose of this study was to develop methodology to integrate these systems and then analyze the effects of speed on foot joint kinetics. Methods: Custom software was developed to synchronize motion capture and pressure/shear data using measured offsets between reference frame origins and the time between events. Marker trajectories were used to mask pressure/shear data and construct segment-specific GRFs. Inverse dynamics were performed in commercial software. Demonstrative data were from 5 healthy adults walking unshod at 3 fixed speeds (1.0, 1.3, and 1.6 m/s) wearing retroreflective markers according to an MSF model. Plantar shear forces and ankle, midtarsal, and first metatarsophalangeal (MTP) joint kinetics were reported. Speed effects on joint net work were evaluated with a repeated measures ANOVA. Results: Plantar shear forces during stance showed some spreading effects (directionally opposing shear forces) that were relatively unaffected by walking speed. Midtarsal joint power appeared to lag slightly behind the ankle, particularly in late stance. Net work at the ankle (p = 0.024), midtarsal (p = 0.023), and MTP (p = 0.009) joints increased with speed. Conclusions: Functionally, the ankle and midtarsal joints became more motor-like with increasing speed, generating more energy than they absorbed, while the MTP joint became more damper-like, absorbing more energy than it generated.
System integration appears to be an overall success. Limitations and suggestions for future work are presented.
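The synchronization step described in the Methods can be pictured with a small sketch: shift one stream's clock by the measured offset, then resample it onto the other stream's time base. This is a minimal illustration, not the thesis's custom software; all rates, names, and signals here are assumptions.

```python
import math

# Hypothetical sketch: align a pressure/shear stream with a motion-capture
# stream given a measured clock offset between the two systems, resampling
# by linear interpolation onto the motion-capture time base.

def interp(t, xs, ys):
    """Piecewise-linear interpolation of (xs, ys) at time t (xs ascending)."""
    if t <= xs[0]:
        return ys[0]
    if t >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if t <= xs[i]:
            w = (t - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + w * (ys[i] - ys[i - 1])

def align(mocap_t, pressure_t, pressure_x, offset_s):
    """Shift the pressure clock, then resample onto the mocap time base."""
    shifted = [t + offset_s for t in pressure_t]
    return [interp(t, shifted, pressure_x) for t in mocap_t]

def truth(t):
    return math.sin(2 * math.pi * t)  # true signal in mocap time

# Motion capture at 100 Hz, pressure at 50 Hz, pressure clock 0.02 s behind.
mocap_t = [i * 0.01 for i in range(100)]
pressure_t = [i * 0.02 for i in range(50)]
pressure_x = [truth(t + 0.02) for t in pressure_t]  # what pressure logged
aligned = align(mocap_t, pressure_t, pressure_x, 0.02)
# After alignment the two signals should agree closely on the overlap.
err = max(abs(a - truth(t)) for a, t in zip(aligned[5:95], mocap_t[5:95]))
print(round(err, 4))
```

The residual error is just the linear-interpolation error of the coarser stream, which is why the thesis's event-based offsets matter: a wrong offset shows up directly as systematic misalignment between GRFs and kinematics.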
43

Investigation of the Effect of Particle Size and Particle Loading on Thermal Conductivity and Dielectric Strength of Thermoset Polymers

Warner, Nathaniel A. 05 1900 (has links)
Semiconductor die attach materials for high-voltage, high-reliability analog devices require high thermal conductivity and retention of dielectric strength. A comparative study of the effective thermal conductivity and dielectric strength of selected thermoset/ceramic composites was conducted to determine the effect of ceramic particle size and particle loading on thermoset polymers. The polymer chosen for this study is bismaleimide, a common aerospace material selected for its strength and thermal stability. The reinforcing material is a ceramic, hexagonal boron nitride (hBN). Thermal conductivity and dielectric breakdown strength are measured at low and high concentrations of hBN. Adhesive fracture toughness of the composite is evaluated on copper to determine the composite's adhesive qualities. SEM imaging of composite cross-sections is used to visualize particle orientation within the matrix. Micro-indentation is used to measure mechanical properties of the composites, which display increased mechanical performance at loadings beyond the percolation threshold of the material. Thermal conductivity of the base polymer increases by a factor of 50 at 80 wt% loading of 50 µm hBN, accompanied by a 10% increase in composite dielectric strength. A relationship between particle size and effective thermal conductivity is established by comparing the experimental data with an empirical model of the effective thermal conductivity of composite materials.
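The abstract compares data against an empirical model of composite thermal conductivity without naming it; as an illustrative stand-in, here is a minimal sketch of one classical effective-medium estimate, the Maxwell model for spherical fillers. The property values are illustrative assumptions, not the thesis's data.

```python
# Maxwell effective-medium estimate for the thermal conductivity of a
# particle-filled composite (spherical filler, dilute loading). This is a
# generic illustrative model, not necessarily the one used in the thesis.

def maxwell_k_eff(k_matrix, k_filler, phi):
    """Effective thermal conductivity for volume fraction phi of filler."""
    num = k_filler + 2 * k_matrix + 2 * phi * (k_filler - k_matrix)
    den = k_filler + 2 * k_matrix - phi * (k_filler - k_matrix)
    return k_matrix * num / den

# Illustrative values: thermoset matrix ~0.2 W/m·K, hBN filler ~30 W/m·K.
# At low loading the enhancement is modest, which is why very high loadings
# (and percolating particle networks) are needed for factor-of-50 gains.
print(round(maxwell_k_eff(0.2, 30.0, 0.1), 3))
```

The dilute-limit assumption is exactly what breaks down past the percolation threshold, motivating the empirical-model comparison the abstract describes.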
44

Analysis of free-riding behaviour using instrumented bicycles

Johansson, Jonathan January 2023 (has links)
The use of the bicycle as a transportation mode has grown in popularity during the last four decades. Many reasons could explain this growth; three likely ones are the health benefits, reduced motorised traffic congestion, and reduced air pollution. As bicycle traffic flows increase, evaluating the bicycle traffic infrastructure becomes more important for bicyclist safety and comfort. One possible evaluation tool for bicycle traffic is microscopic traffic simulation, and one of its key components is free-riding: a free-riding bicyclist is one who does not interact with other bicyclists or other road users during a ride. Developing a suitable simulation model of free-riding requires data, and one possible data-collection methodology is the use of instrumented bicycles. This thesis therefore investigates a methodology to collect, process, and analyse data for two bicyclists and their interactions with the infrastructure. The data-collection methodology uses a pre-defined travel route and two types of bicycle: a conventional bicycle and an electric bicycle. The equipment should be easy to move between bicycles, requiring only a few tools, so that bicyclist behaviour remains as natural as possible. The data collection also includes an interview survey of each participating bicyclist, covering the weather conditions and the effort experienced during the ride. Results indicate negative acceleration, i.e. deceleration, at intersections, in curves, and on uphill sections when a conventional bicycle is used.
With an electric bicycle, deceleration instead occurs on downhill sections. Furthermore, using an electric bicycle leads, as expected, to higher average travel speed and lower average power output than using a conventional bicycle. The analysis of free-riding behaviour also shows that, on downhills, speed can still increase even when the power output is zero. In addition, the instrumented bicycle collects other measurements of the bicyclist's effort, such as heart rate and cadence. / The thesis work was carried out at the Department of Science and Technology (ITN), Faculty of Science and Engineering, Linköping University.
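The deceleration analysis above can be pictured with a small sketch: estimate acceleration from a logged speed trace by finite differences and flag negative values. This is a hypothetical illustration; the sampling rate, data, and variable names are assumptions, not the thesis methodology.

```python
# Hypothetical sketch: flagging deceleration phases in a speed trace
# logged by an instrumented bicycle.

def accelerations(speeds_ms, dt_s):
    """Central-difference acceleration estimate (m/s^2) for a speed series."""
    acc = []
    for i in range(1, len(speeds_ms) - 1):
        acc.append((speeds_ms[i + 1] - speeds_ms[i - 1]) / (2 * dt_s))
    return acc

# A rider slows from 5 m/s to 3 m/s approaching an intersection, then
# accelerates away; samples are 1 s apart (illustrative values).
speeds = [5.0, 4.5, 4.0, 3.5, 3.0, 3.5, 4.0]
acc = accelerations(speeds, 1.0)
decel_samples = [a for a in acc if a < 0]
print(acc)                 # negative values mark the deceleration phase
print(len(decel_samples))
```

In practice GPS/wheel-sensor noise would require smoothing before differentiation, but the deceleration-flagging logic is the same.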
45

Strengthening Mechanisms in Nanostructured Materials

Yailuth Alexandra Loaiza Lopera (13176354) 29 July 2022 (has links)
Understanding the behavior of materials with nanoscale features is important because of both the size of engineering devices and the internal microstructure of bulk materials. Many electronic components have been miniaturized in recent years to meet the high demand of technology development. Similarly, new stronger bulk metallic materials use nm-scale grain sizes or precipitates to increase their strength over more conventionally processed alloys. Nanoscale testing also offers a route to understanding mechanical behavior at the microscale. Nanoindentation has been used to find structure-property relationships of nanostructured materials owing to its high load-depth resolution and the versatility of the test. Nanoindentation can be used to find the hardness and modulus of materials, important characteristics for evaluating mechanical performance. An introduction to strengthening mechanisms and the generalities of nanoindentation is given in Chapter 1.

This thesis explores how traditional strengthening mechanisms for bulk materials can be applied to nanomaterials and how the microstructure could be tailored to achieve the desired outcomes in the specific materials studied. The first study concerns the mechanical properties of nanometallic foams (NMF) and their relationship with the nanostructure. NMFs of pure copper and of CuNi and CuZn alloys were fabricated and tested to find the structural and chemical parameters that dominate the mechanical properties. Research on how to control and tailor the structural parameters of NMFs through the viscosity of the precursors is shown in Chapter 2. Relative density was the most dominant of the structural parameters studied. However, when the relative density parameter is isolated, NMFs are more susceptible to strengthening by second-phase precipitation than by solid solution. The solid solution strengthening mechanism was validated with MD simulation, which agrees with the experimental finding that the addition of Ni atoms to Cu has a moderate effect on the mechanical properties; Chapter 3 presents these findings. The second example shows the strengthening effect of precipitates in nanometallic multilayers, where precipitation was achieved by an aging treatment; high-temperature nanomechanical testing is also presented in Chapter 4. The third and final example, presented in Chapter 5, compares the second-phase precipitation and dispersion strengthening of the lead-free solder SAC 305 between samples aged for nine years at body temperature and samples given accelerated aging treatments.
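The hardness and modulus extraction the thesis relies on follows the standard Oliver-Pharr relations: hardness is peak load over contact area, and the reduced modulus comes from the unloading stiffness. A minimal sketch with illustrative inputs:

```python
import math

# Oliver-Pharr quantities from a nanoindentation test:
#   H   = P_max / A_c
#   E_r = sqrt(pi) * S / (2 * beta * sqrt(A_c))
# beta ~ 1.034 for a Berkovich tip; all inputs below are illustrative.

def hardness(p_max_mN, area_um2):
    """Indentation hardness in GPa (mN per um^2 equals GPa)."""
    return p_max_mN / area_um2

def reduced_modulus(stiffness_mN_per_um, area_um2, beta=1.034):
    """Reduced elastic modulus in GPa from the unloading stiffness."""
    return math.sqrt(math.pi) * stiffness_mN_per_um / (2 * beta * math.sqrt(area_um2))

# Illustrative indent: 10 mN peak load, 2.5 um^2 contact area, S = 150 mN/um.
print(round(hardness(10.0, 2.5), 2))          # GPa
print(round(reduced_modulus(150.0, 2.5), 1))  # GPa
```

The reduced modulus still mixes sample and indenter compliance; separating them requires the indenter's known elastic constants.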
46

Data Analytics for Statistical Learning

Komolafe, Tomilayo A. 05 February 2019 (has links)
The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. Big data is a widely used term without a clear definition. The difference between big data and traditional data can be characterized by four Vs: velocity (the speed at which data is generated), volume (the amount of data generated), variety (the data can take on different forms), and veracity (the data may be of poor or unknown quality). As many industries begin to recognize the value of big data, organizations try to capture it through means such as side-channel data in a manufacturing operation, unstructured text data reported by healthcare personnel, demographic information of households from census surveys, and the range of communication data that define communities and social networks. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; finally, the pre-processed data is analyzed using statistical tools. In this stage, called statistical learning of the data, analysts have two main objectives: (1) develop a statistical model that captures the behavior of the process from a sample of the data, and (2) identify anomalies in the process. However, several open challenges still exist in this framework for big data analytics. Recently, data types such as free-text data are also being captured. Although many established processing techniques exist for other data types, free-text data comes from a wide range of individuals and is subject to syntax, grammar, language, and colloquialisms that require substantially different processing approaches. Once the data is processed, open challenges still exist in the statistical learning step of understanding the data.
Statistical learning aims to satisfy two objectives: (1) develop a model that highlights general patterns in the data, and (2) create a signaling mechanism to identify whether outliers are present in the data. Statistical modeling is widely utilized, as researchers have created a variety of statistical models to explain everyday phenomena such as energy usage behavior, traffic patterns, and stock market behavior, among others. However, new applications of big data with increasingly varied designs present interesting challenges. Consider the example of free-text analysis posed above. There is renewed interest in modeling free-text narratives from sources such as online reviews, customer complaints, or patient safety event reports into intuitive themes or topics. As previously mentioned, documents describing the same phenomena can vary widely in their word usage and structure. Another recent interest area of statistical learning is using the environmental conditions that people live, work, and grow in to infer their quality of life. It is well established that social factors play a role in overall health outcomes; however, the clinical application of these social determinants of health is a recent and open problem. These examples are just a few of many wherein new applications of big data pose complex challenges requiring thoughtful and inventive approaches to processing, analyzing, and modeling data. Although a large body of research exists in the area of anomaly detection, increasingly complicated data sources (such as side-channel or network-based data) present equally convoluted challenges. For effective anomaly detection, analysts define parameters and rules so that, when large collections of raw data are aggregated, pieces of data that do not conform are easily noticed and flagged.
In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods. This work focuses on the healthcare, manufacturing, and social-networking industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:
• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
o I address the research area of statistical modeling in two ways:
- There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
- In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing relationships between the social determinants of health of a cohort and general medical risk factors.
o I address the research area of anomaly detection in two ways:
- A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for a practitioner. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
- Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks.
I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process. / PHD / The prevalence of big data has rapidly changed the usage and mechanisms of data analytics within organizations. The fields of manufacturing and healthcare are two examples of industries currently undergoing significant transformations due to the rise of big data. The addition of large sensory systems is changing how parts are manufactured and inspected, and the prevalence of Health Information Technology (HIT) systems is changing the way healthcare services are delivered. These industries are turning to big data analytics in the hope of acquiring many of the benefits other sectors are experiencing, including reduced cost, improved safety, and boosted productivity. However, many challenges exist within the framework of big data analytics, from pre-processing raw data, to statistical modeling of the data, to identifying anomalies present in the data or process. This work offers significant contributions in each of the aforementioned areas and includes practical real-world applications. Big data analytics generally follows this framework: first, a digitized process generates a stream of data; this raw data stream is pre-processed to convert the data into a usable format; finally, the pre-processed data is analyzed using statistical tools. In this stage, called 'statistical learning of the data', analysts have two main objectives: (1) develop a statistical model that captures the behavior of the process from a sample of the data, and (2) identify anomalies or outliers in the process. In this work, I investigate the different steps of the data analytics framework and propose improvements for each step, paired with practical applications, to demonstrate the efficacy of my methods.
This work focuses on the healthcare and manufacturing industries, but the materials are broad enough to have wide applications across data analytics generally. My main contributions can be summarized as follows:
• In the big data analytics framework, raw data initially goes through a pre-processing step. Although many pre-processing techniques exist, several challenges remain in pre-processing text data, and I develop a pre-processing tool for text data.
• In the next step of the data analytics framework, there are challenges in both statistical modeling and anomaly detection.
o I address the research area of statistical modeling in two ways:
- There are open challenges in defining models to characterize text data. I introduce a community extraction model that autonomously aggregates text documents into intuitive communities/groups.
- In health care, it is well established that social factors play a role in overall health outcomes; however, developing a statistical model that characterizes these relationships is an open research area. I developed statistical models for generalizing relationships between the social determinants of health of a cohort and general medical risk factors.
o I address the research area of anomaly detection in two ways:
- A variety of anomaly detection techniques already exist; however, some of these methods lack a rigorous statistical investigation, making them ineffective for a practitioner. I identify critical shortcomings of a proposed network-based anomaly detection technique and introduce methodological improvements.
- Manufacturing enterprises, which are now more connected than ever, are vulnerable to anomalies in the form of cyber-physical attacks. I developed a sensor-based side-channel technique for anomaly detection in a manufacturing process.
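The signaling mechanism described above can be pictured with a deliberately simple sketch: learn "normal" behavior from a baseline sample, then flag points that fall outside a 3-sigma band. The dissertation's methods are far richer; this only illustrates the general anomaly-flagging idea, with made-up data.

```python
import statistics

# Minimal anomaly-signaling sketch: a 3-sigma band fitted to baseline data.
# Real side-channel or network data would need far richer models; the
# threshold and the numbers here are illustrative.

def three_sigma_flags(baseline, new_points):
    """Return the points in new_points outside mean +/- 3 * sample stdev."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    lo, hi = mu - 3 * sigma, mu + 3 * sigma
    return [x for x in new_points if not (lo <= x <= hi)]

# Baseline sensor readings from the "normal" process, then three new points.
baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9]
print(three_sigma_flags(baseline, [10.05, 9.95, 12.5]))  # 12.5 is flagged
```

The practitioner's rigor issue the contributions mention is exactly about when such rules hold: the 3-sigma band presumes roughly normal, independent baseline data.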
47

Comportement mécanique d'alliages pour couches de liaison de barrière thermique par microindentation instrumentée à haute température / Analysis of the mechanical behaviour of bondcoat alloys for thermal barrier systems from high temperature instrumented microindentation experiments

Villemiane, Arnaud 15 December 2008 (has links)
Les systèmes barrières thermiques protégeant les aubes de turbine sont des multicouches constitués d’une couche céramique isolante appliquée sur un superalliage par l’intermédiaire d’une couche de liaison qui, dans les systèmes actuels est à base de NiAl(Pt). Pour en comprendre et décrire le comportement thermomécanique, il est nécessaire de connaître le comportement de chaque couche, en particulier celui de la couche de liaison dont le rôle est critique. Nous avons employé une technique originale, la microindentation instrumentée à chaud (jusqu’à 850°C), pour obtenir des informations sur le comportement mécanique de matériaux de couches de liaison. Il a fallu d’abord fiabiliser le dispositif pour minimiser les effets d’oxydation et caractériser la stabilité thermique pour s’assurer de la validité et la reproductibilité des résultats. Un second volet a consisté à mettre en place une méthode de traitement de données et une méthode d’analyse inverse des résultats associant une approche analytique et une simulation de l’essai par éléments finis. Les essais menés sur des matériaux massifs élaborés sous forme de couples de diffusion pour explorer une large gamme de compositions ont permis de déterminer la loi de comportement élastoviscoplastique du composé NiAl(Pt) sous forme [bêta] et sous forme martensitique. Des propriétés mécaniques ont été également été déterminées sur les composés NiAl(Ru) et NiAl(Zr) envisagés pour des systèmes futurs. L’influence des divers éléments (Al, Pt et Ru) a pu ainsi être mise en évidence. Finalement des essais ont été effectués sur des couches de liaison de barrière thermique et les résultats corrélés à ceux obtenus sur matériaux massifs / Thermal barrier systems, which protect turbine blades, are multilayers constituted of an insulating ceramic layer applied on a metallic bondcoat itself in contact with the superalloy substrate. A widely used bondcoat is composed of a NiAl(Pt) compound. 
In order to understand and describe the thermomechanical behaviour of such systems, the mechanical behaviour of each layer must be known, in particular that of the bondcoat, whose role is critical for maintaining the integrity of the system. In this study, we employed an original technique, high temperature instrumented microindentation (up to 850°C), to extract information on the mechanical behaviour of bondcoat materials. A preliminary phase consisted in improving the experimental procedure, in particular to minimise oxidation phenomena, and in characterising the thermal stability of the equipment at high temperature, to ensure the reliability, validity, and reproducibility of the results. We then developed a systematic data treatment and an inverse-problem analysis combining analytical approaches with a FEM simulation of the experiment, to extract a mechanical behaviour law for the materials investigated. Tests performed on bulk diffusion couples, selected to explore a wide range of compositions representative of aging bondcoats, allowed an elasto-viscoplastic behaviour law of NiAl(Pt) to be extracted, both in the B2 phase and in the martensitic phase. Some mechanical properties could also be determined for the NiAl(Ru) and NiAl(Zr) systems. Finally, the results of a few tests performed on actual thermal barrier bondcoats could be correlated with those obtained on bulk materials.
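The inverse-problem analysis can be pictured with a toy sketch: choose the parameter of a constitutive law so that a simulated indentation response best matches the measured one. Here the "simulation" is just Kick's law P = C·h² with a one-parameter least-squares scan, whereas the study couples analytical models with FEM; all numbers are illustrative.

```python
# Toy inverse analysis: fit the Kick's-law coefficient C so that the
# simulated loading curve P = C * h**2 matches "measured" data.

def simulate_load(C, depths):
    """Forward model: indentation load at each depth under Kick's law."""
    return [C * h ** 2 for h in depths]

def fit_C(depths, measured, candidates):
    """Pick the candidate C minimizing the sum of squared residuals."""
    def sse(C):
        return sum((p - q) ** 2 for p, q in zip(simulate_load(C, depths), measured))
    return min(candidates, key=sse)

depths = [0.5, 1.0, 1.5, 2.0]            # um (illustrative)
measured = [0.75, 3.0, 6.75, 12.0]       # mN, generated with C = 3.0
candidates = [c / 10 for c in range(10, 51)]  # scan C over [1.0, 5.0]
print(fit_C(depths, measured, candidates))    # -> 3.0
```

A real identification replaces the forward model with an FEM run per candidate and fits several law parameters at once, which is why the thesis pairs the analytical approach with simulation.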
48

Comportamento de estacas escavadas em solos colapsíveis / Behavior of excavated piles in collapsible soils

Teixeira, Cornélio Zampier 28 June 1993 (has links)
Este trabalho se propõe a analisar os efeitos produzidos pela inundação do solo em estacas escavadas, isoladas, submetidas a esforços de compressão. Os resultados obtidos referem-se à execução de três provas de carga lentas em estacas com diâmetros de 0,40 m e 0,50 m (2 provas de carga distintas) e comprimentos de 10,0 m, submetidas anteriormente a uma série de carregamentos (duas provas de carga rápidas e duas lentas). Os resultados relativos aos deslocamentos medidos no topo das estacas e os obtidos pela instrumentação instalada ao longo do fuste são então comparados com a sequência dos resultados anteriores, para avaliação do efeito de umedecimento no comportamento "carga x recalque" e nos mecanismos de transferência de carga. É feita ainda uma descrição minuciosa do processo utilizado para inundação do solo e uma breve consideração sobre os efeitos da não reposição de cargas em ensaios lentos, de longa duração. / This work analyzes the effects produced by soil flooding on isolated excavated piles subjected to compression loads. The results refer to three slow-maintained load tests on piles with diameters of 0.40 m and 0.50 m (two distinct load tests) and lengths of 10.0 m, previously submitted to a series of loadings (two quick and two slow load tests). The displacements measured at the pile heads and the data obtained from the instrumentation installed along the shafts are then compared with the sequence of previous results, in order to evaluate the effect of wetting on the load-settlement behavior and on the load transfer mechanisms. A detailed description of the procedure used to flood the soil is also given, together with a brief consideration of the effects of not restoring loads in slow, long-duration tests.
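The role of the instrumentation along the pile shaft can be pictured with a small sketch: the axial load at each gauge level follows from N = E·A·ε, and differences between successive levels give the load shed to the soil by shaft friction. The section properties and strain values below are illustrative assumptions, not data from the tests.

```python
import math

# Hypothetical load-transfer sketch for an instrumented pile: convert
# strain-gauge readings along the shaft to axial loads, then take level
# differences to see how much load the shaft sheds to the soil.

E_GPa = 25.0                 # assumed concrete Young's modulus
d_m = 0.40                   # pile diameter (matches one test pile)
A_m2 = math.pi * d_m ** 2 / 4

depths_m = [0.5, 3.0, 6.0, 9.5]                 # gauge levels (illustrative)
strains = [4.0e-4, 3.0e-4, 1.8e-4, 0.5e-4]      # strain profile at peak load

# GPa * m^2 * strain -> GN; the 1e6 factor converts to kN.
loads_kN = [E_GPa * 1e6 * A_m2 * eps for eps in strains]
shed_kN = [round(loads_kN[i] - loads_kN[i + 1], 1)
           for i in range(len(loads_kN) - 1)]
print([round(q, 1) for q in loads_kN])
print(shed_kN)  # load transferred to the soil between gauge levels
```

Comparing such profiles before and after flooding is what reveals the loss of shaft friction in collapsible soil that the thesis investigates.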
49

Diagnóstico e tratamento de falhas críticas em sistemas instrumentados de segurança. / Diagnosis and treatment of critical faults in safety instrumented systems.

Squillante Júnior, Reinaldo 02 December 2011 (has links)
Sistemas Instrumentados de Segurança (SIS) são projetados para prevenir e/ou mitigar acidentes, evitando indesejáveis cenários com alto potencial de risco, assegurando a proteção da saúde das pessoas, proteção do meio ambiente e economia de custos com equipamentos industriais. Desta forma, é extremamente recomendado neste projeto de SIS o uso de métodos formais para garantir as especificações de segurança em conformidade com as normas regulamentadoras vigentes, principalmente para atingir o nível de integridade de segurança (SIL) desejado. Adicionalmente, algumas das normas de segurança como ANSI / ISA S.84.01; IEC 61508, IEC 61511, entre outras, recomendam uma série de procedimentos relacionados ao ciclo de vida de segurança de um projeto de SIS. Desta forma, destacam-se as atividades que compreendem o desenvolvimento e a validação dos algoritmos de controle em que se separam semanticamente os aspectos voltados para o diagnóstico de falhas críticas e o tratamento destas falhas associado a um controle de coordenação para filtrar a ocorrência de falhas espúrias. Portanto, a contribuição deste trabalho é propor um método formal para a modelagem e análise de SIS, incluindo o diagnóstico e o tratamento de falhas críticas, baseado em rede Bayesiana (BN) e rede de Petri (PN). Este trabalho considera o diagnóstico e o tratamento para cada função instrumentada de segurança (SIF) a partir do resultado do estudo de análise de riscos, de acordo com a metodologia de HAZOP (Hazard and Operability). / Safety Instrumented Systems (SIS) are designed to prevent and/or mitigate accidents, avoiding undesirable high-risk scenarios, assuring the protection of people's health, protecting the environment, and saving costs related to industrial equipment. The use of formal methods in SIS design is therefore strongly recommended to assure that the safety specifications comply with the applicable regulatory standards, mainly in order to reach the desired safety integrity level (SIL).
Additionally, some of the safety standards, such as ANSI/ISA S84.01, IEC 61508, and IEC 61511, among others, recommend a series of activities related to the Safety Life Cycle (SLC) of a SIS design. In particular, there are design activities involving the development and validation of control algorithms that semantically separate the aspects oriented to the diagnosis of critical faults from the treatment of these faults, associated with a coordination control to filter the occurrence of spurious failures. In this context, the contribution of this work is to propose a formal method for the modeling and analysis of SIS, including the diagnosis and treatment of critical faults, based on Bayesian networks (BN) and Petri nets (PN). The approach considers diagnosis and treatment for each safety instrumented function (SIF), derived from the results of a risk analysis study conducted according to the HAZOP (Hazard and Operability) methodology.
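The Bayesian-network side of the proposed method can be pictured, in miniature, as a single Bayes-rule update of the belief that a critical fault is present given an alarm. A real SIS model involves full networks per SIF; all probabilities here are illustrative.

```python
# Minimal Bayesian diagnosis sketch for one safety instrumented function:
# update the probability of a critical fault after an alarm fires, given
# the sensor's true-positive and false-positive rates (illustrative values).

def posterior_fault(prior, p_alarm_given_fault, p_alarm_given_ok):
    """Bayes' rule: P(fault | alarm)."""
    num = p_alarm_given_fault * prior
    den = num + p_alarm_given_ok * (1 - prior)
    return num / den

prior = 0.01                 # assumed base rate of the critical fault
p = posterior_fault(prior, p_alarm_given_fault=0.99, p_alarm_given_ok=0.05)
print(round(p, 3))           # a single alarm is far from conclusive
p2 = posterior_fault(p, 0.99, 0.05)   # a second independent alarm
print(round(p2, 3))
```

The jump from the first to the second update illustrates why the method pairs diagnosis with a coordination control that filters spurious (single-alarm) failures before treatment is triggered.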
50

Modelamento do ensaio de indentação instrumentada usando elementos finitos e análise dimensional - análise de unicidade, variações experimentais, atrito e geometria e deformações do indentador. / Modeling of the instrumented indentation test using finite element simulations and dimensional analysis - analysis of uniqueness, experimental variation, friction, and elastic deformation and geometry of the indenter.

Rodríguez Pulecio, Sara Aida 31 March 2010 (has links)
A caracterização de materiais utilizando a técnica de indentação instrumentada difundiu-se significativamente na última década, devido às melhorias dos instrumentos que permitem ensaios por esta técnica e à necessidade de se fazer ensaios em pequenos volumes de materiais, como, por exemplo, em filmes finos e materiais com superfícies modificadas por tratamentos superficiais. Neste texto, abordou-se a elaboração de um algoritmo que permita o estudo da resposta de indentação de superfícies de diferentes materiais metálicos, durante e após o contato com um indentador agudo, empregando simulações por elementos finitos e análise dimensional. Na obtenção do algoritmo analisaram-se os efeitos da formação de borda (pile-up) ou da retração (sink-in) do material indentado, o coeficiente de atrito indentador-amostra, as deformações elásticas do indentador, a geometria do indentador e a variação experimental. As relações obtidas permitiram identificar uma falta fundamental de relação única entre as propriedades mecânicas e a forma da curva de indentação para curvas com razão dr/dmax > 0,9, onde dr é a profundidade residual da indentação e dmax é o deslocamento máximo do indentador durante o ensaio. Da mesma forma, concluiu-se que a relação de Sneddon tem que ser corrigida tanto pela geometria da área de contato indentador-amostra quanto pela razão entre os módulos elásticos do material e do indentador (E/Ei). Como a área de contato indentador-amostra é afetada não só pela geometria do indentador mas também pelo nível de pile-up ou sink-in nos indentadores piramidais, uma relação foi identificada entre o nível de pile-up/sink-in e o fator de correção b da equação de Sneddon para os indentadores Vickers e Berkovich. 
Adicionalmente, pequenas diferenças foram observadas entre as curvas P-h (onde h é a profundidade de indentação abaixo da superfície original da amostra) e as curvas P-d (onde d é o valor da aproximação mútua entre indentador e amostra durante a indentação) para um mesmo valor de módulo reduzido Er quando a razão E/Ei é grande. Assim, o módulo reduzido pode sobreestimar ou subestimar a rigidez do indentador, dependendo das propriedades do material indentado. As análises neste trabalho permitiram igualmente observar que as principais limitações na obtenção das propriedades elasto-plásticas usando a curva de indentação instrumentada vêm da falta de unicidade, seguida pela geometria do indentador, isto é, diferenças entre o cone equivalente e os piramidais Vickers e Berkovich, assim como desvios em relação à geometria ideal do indentador, o que inclui arredondamento da sua ponta. Quando não há unicidade, sabendo-se informações adicionais à curva P-d, por exemplo, o valor da área residual da indentação ou o módulo elástico, uma solução única das propriedades mecânicas pode ser obtida. Em paralelo, a variação experimental pode limitar de forma significativa a precisão no cálculo das propriedades, enquanto o atrito indentador-amostra e as deformações do indentador têm efeitos menos significativos. 
Em termos gerais, observa-se que as funções que compõem o algoritmo desenvolvido neste trabalho permitem: (i) predizer as curvas carga-deslocamento (curvas P-d), produto do ensaio de indentação instrumentada, para um conjunto de propriedades elasto-plásticas conhecidas; (ii) identificar quando uma mesma curva P-d pode ser obtida de mais de um conjunto de propriedades do material indentado; (iii) calcular as propriedades mecânicas (dureza, módulo elástico, coeficiente de encruamento e limite de escoamento) de um material usando a curva P-d quando dr/dmax < 0,8, ou faixas de propriedades de materiais quando dr/dmax > 0,8; e (iv) calcular as propriedades mecânicas (dureza, módulo elástico, coeficiente de encruamento e limite de escoamento) de um material usando a curva P-d e a área residual da indentação. / The interest in material characterization using instrumented indentation techniques has significantly increased in the last decade, due to improvements in testing instruments and the need to carry out tests on small volumes of materials, such as thin films or materials with surfaces modified by surface treatments. This work addresses the development of an algorithm to analyze the indentation response of a group of metallic materials, during and after the contact with a sharp indenter, using finite element simulations and dimensional analysis. The formulation of the algorithm considered the effects of pile-up or sink-in of the indented material around the indentation, the friction coefficient between the indenter and the sample, the elastic deformation of the indenter, and the defects of the indenter tip. An analysis considering the algorithm output and experimental variation was also conducted. 
The results identified a fundamental lack of a unique relationship between the mechanical properties and the shape of the indentation curve for curves with ratio dr/dmax > 0.9, where dr is the residual indentation depth and dmax is the maximum indenter displacement in the test. Similarly, the results allowed concluding that Sneddon's equation requires a correction by both the geometry of the contact area and the ratio between the elastic moduli of the material and the indenter (E/Ei). As the shape of the contact area is affected not only by the geometry of the indenter but also by the level of pile-up or sink-in for pyramidal indenters, a relationship was observed between the level of pile-up/sink-in and the correction factor b in Sneddon's equation for Vickers and Berkovich indenters. Additionally, it was found that the deformation of the indenter is not fully incorporated into indentation data analysis by the consideration of a reduced modulus (Er). Small differences between P-h curves (where h is the indentation depth below the original surface) and P-d curves (where d is the indenter/sample mutual approach) were observed for the same Er when the ratio E/Ei is large. Thus, the reduced modulus can overestimate or underestimate the indenter stiffness, depending on the mechanical properties of the indented material. The analysis in this work also identified that the most important limitations in estimating mechanical properties from the indentation curve arise from the lack of uniqueness, followed by deviations in indenter geometry, such as differences between the equivalent cone and the pyramidal Vickers or Berkovich shapes, as well as tip defects. When non-uniqueness is present, a unique solution may be obtained with the knowledge of additional information, in conjunction with the P-d data, such as the residual indentation area or the elastic modulus. 
Furthermore, even when a unique solution is available, the experimental variation may significantly decrease the accuracy of the estimated mechanical properties, whereas friction and indenter deformation have less significant effects. In general, it was observed that the proposed algorithm allows: (i) predicting the load-displacement curves (P-d curves) of instrumented indentation tests for a set of known elastic-plastic mechanical properties; (ii) identifying when the same P-d curve can be obtained from more than one set of mechanical properties of the indented material; (iii) calculating the mechanical properties (hardness, elastic modulus, yield stress, and strain hardening coefficient) from P-d curves when dr/dmax < 0.8, or possible ranges for each mechanical property when dr/dmax > 0.8; and (iv) calculating the mechanical properties (hardness, elastic modulus, yield stress, and strain hardening coefficient) from P-d curves together with the knowledge of the residual indentation area.
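The corrections discussed in this abstract can be illustrated numerically. The sketch below is an assumption-laden illustration, not the thesis's algorithm: it computes the reduced modulus Er and the contact stiffness via the corrected Sneddon relation S = b·(2/√π)·Er·√Ac, where b is the geometry-dependent correction factor mentioned above. The sample properties and the diamond-indenter constants (Ei = 1141 GPa, νi = 0.07) are conventional illustrative values.

```python
import math

def reduced_modulus(E, nu, E_i, nu_i):
    """Reduced modulus Er from 1/Er = (1 - nu^2)/E + (1 - nu_i^2)/E_i."""
    return 1.0 / ((1.0 - nu**2) / E + (1.0 - nu_i**2) / E_i)

def sneddon_stiffness(E_r, A_c, b=1.05):
    """Corrected Sneddon relation: S = b * (2/sqrt(pi)) * Er * sqrt(Ac).
    The factor b depends on indenter geometry and the pile-up/sink-in
    level; b = 1.05 here is purely illustrative."""
    return b * (2.0 / math.sqrt(math.pi)) * E_r * math.sqrt(A_c)

# Illustrative: steel-like sample (E = 210 GPa, nu = 0.3) and a diamond
# indenter (Ei = 1141 GPa, nu_i = 0.07).
Er = reduced_modulus(210.0, 0.3, 1141.0, 0.07)  # ~192 GPa
```

Note that, as the abstract points out, folding the indenter deformation into Er alone is only an approximation when E/Ei is large.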
