181

Water and Heat Transport in Road Structures : Development of Mechanistic Models

Hansson, Klas January 2005 (has links)
The coupled transport of water and heat, involving freezing and thawing, in the road structure and its immediate environment is important to consider for optimal design and maintenance of roads and when assessing solute transport, e.g. of de-icing salt, from roads. The objective of this study was to develop mechanistic models, and measurement techniques, suitable to describe and understand water flow and heat flux in road structures exposed to a cold climate.
Freezing and thawing were accounted for by implementing new routines in two numerical models (HYDRUS1D/2D). The sensitivity of the model output to changes in parameter values and operational hydrological data was investigated by uncertainty and sensitivity analyses. The effect of rainfall event characteristics and asphalt fractures on the subsurface flow pattern was investigated by scenario modelling. The performance of water content reflectometers (WCR), which measure water content, was evaluated using measurements in two road structure materials. A numerical model was used to simulate the WCR sensor response. The freezing/thawing routines were stable and provided results in agreement with laboratory measurements. Frost depth, thawing period, and freezing-induced water redistribution in a model road were greatly affected by groundwater level and type of subgrade. The simulated subsurface flow patterns corresponded well with published field observations. A new method successfully enabled the application of time domain reflectometry (TDR) calibration equations to WCR output. The observed distortion in sampling volume for one of the road materials could be explained by the WCR sensor numerical model. Soil physical, hydrological, and hydraulic modules proved successful in simulating the coupled transport of water and heat in and on the road structure. This thesis demonstrates that numerical models can improve the interpretation and explanation of measurements. The HYDRUS model was an accurate and pedagogical tool, clearly useful in road design and management.
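Freezing/thawing routines of this kind are often built around an apparent-heat-capacity formulation, in which the latent heat released or absorbed during phase change is folded into the heat-capacity term. The sketch below illustrates that idea for a 1D profile; it is a minimal illustration with invented material parameters, not the HYDRUS implementation.

```python
import numpy as np

# Minimal 1D heat-conduction sketch with an apparent-heat-capacity treatment of
# freezing/thawing, loosely in the spirit of the routines described above.
# All material values are illustrative placeholders, not HYDRUS parameters.

L_f = 3.34e8          # volumetric latent heat of fusion of water [J m^-3]
theta_w = 0.25        # volumetric water content [-]
C_soil = 2.0e6        # volumetric heat capacity of the soil matrix [J m^-3 K^-1]
k_th = 1.5            # thermal conductivity [W m^-1 K^-1]
dT_freeze = 0.5       # width of the freezing interval below 0 degC [K]

def apparent_heat_capacity(T):
    """Add latent heat as extra capacity while T is inside the freezing interval."""
    C = np.full_like(T, C_soil)
    freezing = (T > -dT_freeze) & (T < 0.0)
    C[freezing] += L_f * theta_w / dT_freeze
    return C

def step(T, dz, dt):
    """One explicit time step of dT/dt = d/dz(k dT/dz) / C_app; boundaries held fixed."""
    flux = -k_th * np.diff(T) / dz
    dTdt = np.zeros_like(T)
    dTdt[1:-1] = -(flux[1:] - flux[:-1]) / dz / apparent_heat_capacity(T[1:-1])
    return T + dt * dTdt

# Example: 1 m profile, air at -5 degC on top, ground at +4 degC below.
z = np.linspace(0.0, 1.0, 101)
T = np.full_like(z, 4.0)
T[0] = -5.0
for _ in range(20000):
    T = step(T, dz=z[1] - z[0], dt=10.0)
frost_depth = z[np.argmax(T > 0.0)]   # depth of the first node above 0 degC
print(f"approximate frost depth after ~2.3 days: {frost_depth:.2f} m")
```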
183

Uncertainty analysis of a particle tracking algorithm developed for super-resolution particle image velocimetry

Joseph, Sujith 11 August 2003 (has links)
Particle Image Velocimetry (PIV) is a powerful technique to measure the velocity at many points in a flow simultaneously by performing correlation analysis on images of particles being transported by the flow. These images are acquired by illuminating the flow with two light pulses so that each particle appears once on each image.
The spatial resolution is an important parameter of this measuring system since it determines its ability to resolve features of interest in the flow. The super-resolution technique maximises the spatial resolution by augmenting the PIV analysis with a second pass that identifies specific particles and measures the distance between them.
The accuracy of the procedure depends on both the success with which the proper pairings are identified and the accuracy with which their centre-to-centre distance can be measured. This study presents an analysis of both the systematic uncertainty and the random uncertainty associated with this process. The uncertainty is analysed as a function of several key parameters that define the quality of the image. The uncertainty analysis is performed by preparing 4000-member ensembles of simulated images with specific setpoints of each parameter.
It is shown that the systematic uncertainty is negligible compared to the random uncertainty for all conditions tested. The image contrast and the selection of a threshold for the particle search are the most critical parameters influencing both success rate and uncertainty. It is also shown that high image intensities still yield accurate results. The search radius used by the super-resolution algorithm is also shown to be a critical parameter: increasing the search radius increases the success rate, although this is accompanied by an increase in random uncertainty.
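As an illustration of the ensemble-based uncertainty analysis described above, the sketch below builds a 4000-member ensemble of synthetic particle images, estimates each particle centre with a thresholded centroid, and splits the resulting error into a systematic part (mean) and a random part (standard deviation). The particle model, threshold and noise levels are invented placeholders, and the centroid estimator is a stand-in for the tracking algorithm studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

SIZE = 16            # interrogation window [pixels]
DIAMETER = 3.0       # e^-2 particle image diameter [pixels]
CONTRAST = 200.0     # peak intensity above background [counts]
NOISE_STD = 5.0      # sensor noise [counts]
THRESHOLD = 20.0     # particle-search threshold [counts]
N_ENSEMBLE = 4000    # ensemble size, as in the study

def synthetic_particle(xc, yc):
    """Gaussian particle image plus white noise on a SIZE x SIZE window."""
    y, x = np.mgrid[0:SIZE, 0:SIZE]
    img = CONTRAST * np.exp(-8.0 * ((x - xc) ** 2 + (y - yc) ** 2) / DIAMETER ** 2)
    return img + rng.normal(0.0, NOISE_STD, img.shape)

def centroid(img):
    """Intensity-weighted centroid of pixels above the search threshold."""
    mask = img > THRESHOLD
    w = np.where(mask, img, 0.0)
    y, x = np.mgrid[0:SIZE, 0:SIZE]
    return np.array([(w * x).sum() / w.sum(), (w * y).sum() / w.sum()])

errors = []
for _ in range(N_ENSEMBLE):
    true = SIZE / 2 + rng.uniform(-0.5, 0.5, size=2)   # random sub-pixel position
    est = centroid(synthetic_particle(*true))
    errors.append(est - true)
errors = np.asarray(errors)

print("systematic uncertainty (mean error, px):", errors.mean(axis=0))
print("random uncertainty (std of error, px):  ", errors.std(axis=0))
```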
185

Value-informed space systems design and acquisition

Brathwaite, Joy Danielle 16 December 2011 (has links)
Investments in space systems are substantial, indivisible, and irreversible, characteristics that make them high-risk, especially when coupled with an uncertain demand environment. Traditional approaches to system design and acquisition, derived from a performance- or cost-centric mindset, incorporate little information about the spacecraft in relation to its environment and its value to its stakeholders. These traditional approaches, while appropriate in stable environments, are ill-suited for the current, distinctly uncertain and rapidly changing technical and economic conditions; as such, they have to be revisited and adapted to the present context. This thesis proposes that in uncertain environments, decision-making with respect to space system design and acquisition should be value-based, or at a minimum value-informed. This research advances the value-centric paradigm by providing the theoretical basis, foundational frameworks, and supporting analytical tools for value assessment of priced and unpriced space systems. For priced systems, stochastic models of the market environment and financial models of stakeholder preferences are developed and integrated with a spacecraft-sizing tool to assess the system's net present value. The analytical framework is applied to a case study of a communications satellite, with market, financial, and technical data obtained from the satellite operator, Intelsat. The case study investigates the implications of value-centric versus cost-centric design and acquisition choices. Results identify the ways in which value-optimal spacecraft design choices are contingent on both technical and market conditions, and show, for example, that larger spacecraft, which reap economies-of-scale benefits reflected in their decreasing cost per transponder, are not always the best (most valuable) choices. Market conditions and technical constraints for which convergence occurs between design choices under a cost-centric and a value-centric approach are identified and discussed. In addition, an innovative approach for characterizing value uncertainty through partial moments, a technique used in finance, is adapted to an engineering context and applied to priced space systems. Partial moments disaggregate uncertainty into upside potential and downside risk, and as such, they provide the decision-maker with additional insights for value-uncertainty management in design and acquisition. For unpriced space systems, this research first posits that their value derives from, and can be assessed through, the value of information they provide. To this effect, a Bayesian framework is created to assess system value, in which the system is viewed as an information provider and the stakeholder as an information recipient. Information has value to stakeholders because it changes their rational beliefs, enabling them to achieve higher expected pay-offs. Based on this marginal increase in expected pay-offs, a new metric, Value-of-Design (VoD), is introduced to quantify the unpriced system's value. The Bayesian framework is applied, using nested Monte Carlo modeling and simulation, to the case of an Earth Science satellite that provides hurricane information to oil rig operators. Probability models of stakeholders' beliefs and economic models of pay-offs are developed and integrated with a spacecraft payload generation tool.
The case study investigates the information value generated by each payload, with results pointing to clusters of payload instruments that yielded higher information value, and minimum information thresholds below which it is difficult to justify the acquisition of the system. In addition, an analytical decision tool, probabilistic Pareto fronts, is developed in the Cost-VoD trade space to provide the decision-maker with additional insights into the coupling of a system's probable value generation and its associated cost risk.
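As a small illustration of the partial-moment characterization of value uncertainty mentioned above, the sketch below splits a set of simulated net-present-value outcomes into downside risk and upside potential about a break-even target. The NPV distribution is a synthetic placeholder, not the Intelsat case-study data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic NPV outcomes [$M]; in the thesis these would come from the
# integrated market / spacecraft-sizing simulation.
npv = rng.normal(loc=120.0, scale=80.0, size=100_000)
target = 0.0                                    # break-even target value

def lower_partial_moment(x, tau, order=2):
    """E[max(tau - X, 0)^order] -- downside risk about target tau."""
    return np.mean(np.maximum(tau - x, 0.0) ** order)

def upper_partial_moment(x, tau, order=2):
    """E[max(X - tau, 0)^order] -- upside potential about target tau."""
    return np.mean(np.maximum(x - tau, 0.0) ** order)

print("downside risk (LPM2):   ", lower_partial_moment(npv, target))
print("upside potential (UPM2):", upper_partial_moment(npv, target))
print("P(NPV < target):        ", np.mean(npv < target))
```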
186

Assessing reservoir performance and modeling risk using real options

Singh, Harpreet 02 August 2012 (has links)
Reservoir economic performance is based upon the future cash flows that can be generated from a reservoir. Future cash flows are a function of the hydrocarbon volumetric flow rates a reservoir can produce and of market conditions, and both of these drivers are uncertain. Estimates of future hydrocarbon flow rates are uncertain because of uncertainty in the geological model, the limited availability and type of data, and the complexities involved in the reservoir modeling process. The second source of uncertainty in future cash flows comes from changing oil prices, rates of return, etc., which are all functions of market dynamics. Robust integration of these two sources of uncertainty, i.e. future hydrocarbon flow rates and market dynamics, in a model to predict cash flows from a reservoir is an essential part of risk assessment, but a difficult task. Current practices that assess a reservoir's economic performance using Deterministic Cash Flow (DCF) methods have been unsuccessful in their predictions because they lack the parametric capability to robustly and completely incorporate both types of uncertainty. This thesis presents a procedure that accounts for uncertainty in hydrocarbon production forecasts due to incomplete geologic information, and a novel real options methodology to assess project economics for the upstream petroleum industry. The modeling approach entails determining future hydrocarbon production rates under incomplete geologic information, with and without secondary information. The price of hydrocarbons is modeled separately, and the costs to produce them are determined based on market dynamics. A real options methodology is used to assess the effective cash flows from the reservoir and, hence, to determine the project economics. This methodology associates realistic probabilities, quantified using the method's parameters, with benefits and costs. The results from this methodology are compared against results from the DCF methodology to examine whether the real options methodology can identify hidden potential in a reservoir's performance that DCF might not be able to uncover. The methodology is then applied to various case studies and to strategies for planning and decision making.
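To make the contrast between DCF and real-options valuation concrete, the sketch below values a notional development project two ways: investing immediately (the DCF view) and holding an option to defer the investment, priced on a standard binomial lattice. All figures are illustrative, and the lattice is a generic real-options device rather than the specific methodology developed in the thesis.

```python
import numpy as np

V0 = 100.0      # present value of expected production revenues [$M]
I = 95.0        # development (investment) cost [$M]
sigma = 0.35    # volatility of project value (price / rate uncertainty)
r = 0.05        # risk-free rate
T = 2.0         # years the investment decision can be deferred
N = 200         # lattice steps

dt = T / N
u, d = np.exp(sigma * np.sqrt(dt)), np.exp(-sigma * np.sqrt(dt))
p = (np.exp(r * dt) - d) / (u - d)              # risk-neutral up-probability

# Terminal project values and option payoffs max(V - I, 0)
j = np.arange(N + 1)
V_T = V0 * u ** j * d ** (N - j)
option = np.maximum(V_T - I, 0.0)

# Backward induction: at each node, defer (continuation value) or invest now
for step in range(N, 0, -1):
    option = np.exp(-r * dt) * (p * option[1:step + 1] + (1 - p) * option[:step])
    V_now = V0 * u ** np.arange(step) * d ** (step - 1 - np.arange(step))
    option = np.maximum(option, V_now - I)      # early-exercise (invest) check

print("DCF value (invest now):     ", max(V0 - I, 0.0))
print("Real-options value (defer): ", float(option[0]))
```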
187

Constraining uncertainty in climate sensitivity : an ensemble simulation approach based on glacial climate

Schneider von Deimling, Thomas January 2006 (has links)
Uncertainty about the sensitivity of the climate system to changes in the Earth’s radiative balance constitutes a primary source of uncertainty for climate projections. Given the continuous increase in atmospheric greenhouse gas concentrations, constraining the uncertainty range of this sensitivity is of vital importance. A common measure for expressing this key characteristic of climate models is the climate sensitivity, defined as the simulated change in global-mean equilibrium temperature resulting from a doubling of the atmospheric CO2 concentration. The broad range of climate sensitivity estimates (1.5-4.5°C, as given in the last Assessment Report of the Intergovernmental Panel on Climate Change, 2001), inferred from comprehensive climate models, illustrates that the strength of simulated feedback mechanisms varies strongly among different models. The central goal of this thesis is to constrain uncertainty in climate sensitivity. For this objective we first generate a large ensemble of model simulations, covering different feedback strengths, and then test their consistency with present-day observational data and proxy data from the Last Glacial Maximum (LGM). Our analyses are based on an ensemble of fully coupled simulations realized with a climate model of intermediate complexity (CLIMBER-2). These model versions cover a broad range of climate sensitivities, from 1.3 to 5.5°C, and were generated by simultaneously perturbing a set of 11 model parameters. The analysis of the simulated model feedbacks reveals that the spread in climate sensitivity results from different realizations of the feedback strengths in water vapour, clouds, lapse rate and albedo. The calculated spread in the sum of all feedbacks spans almost the entire plausible range inferred from a sampling of more complex models. We show that the requirement for consistency between simulated pre-industrial climate and a set of seven global-mean data constraints represents a comparatively weak test of model sensitivity (the data constrain climate sensitivity to 1.3-4.9°C). Analyses of the simulated latitudinal profile and of the seasonal cycle suggest that additional present-day data constraints, based on these characteristics, do not further constrain uncertainty in climate sensitivity. The novel approach presented in this thesis consists of systematically combining a large set of LGM simulations with data information from reconstructed regional glacial cooling. Irrespective of uncertainties in model parameters and feedback strengths, the set of our model versions reveals a close link between the simulated warming due to a doubling of CO2 and the cooling obtained for the LGM. Based on this close relationship between past and future temperature evolution, we define a method (based on linear regression) that allows us to estimate robust 5-95% quantiles for climate sensitivity. We thus constrain the range of climate sensitivity to 1.3-3.5°C using proxy data from the LGM at low and high latitudes. Uncertainties in glacial radiative forcing enlarge this estimate to 1.2-4.3°C, whereas the assumption of large structural uncertainties may increase the upper limit by an additional degree. Using proxy-based data constraints for tropical and Antarctic cooling we show that very different absolute temperature changes in high and low latitudes all yield very similar estimates of climate sensitivity.
On the whole, this thesis highlights that LGM proxy-data information can offer an effective means of constraining the uncertainty range in climate sensitivity and thus underlines the potential of paleo-climatic data to reduce uncertainty in future climate projections.
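A minimal sketch of the regression-based constraint is given below: a synthetic ensemble stands in for the CLIMBER-2 model versions, the linear relation between simulated LGM cooling and 2xCO2 warming is fitted, and a proxy estimate of LGM cooling (with its uncertainty) is mapped through it to a 5-95% range for climate sensitivity. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic ensemble of model versions: each member has a 2xCO2 warming
# (climate sensitivity) and a correlated simulated LGM cooling.
n = 1000
sensitivity = rng.uniform(1.3, 5.5, n)                     # 2xCO2 warming [K]
lgm_cooling = -1.4 * sensitivity + rng.normal(0, 0.4, n)   # LGM temperature change [K]

# Linear regression of sensitivity on simulated LGM cooling
slope, intercept = np.polyfit(lgm_cooling, sensitivity, 1)

# Proxy reconstruction of LGM cooling, sampled with its uncertainty
proxy_cooling = rng.normal(-5.0, 0.7, 100_000)
constrained = slope * proxy_cooling + intercept

lo, hi = np.percentile(constrained, [5, 95])
print(f"constrained 5-95% range of climate sensitivity: {lo:.1f} to {hi:.1f} K")
```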
188

Méthodes probabilistes pour l'évaluation de risques en production industrielle / Probabilistic methods for risk evaluation in industrial production

Oger, Julie 16 April 2014 (has links)
In competitive industries, reliable yield forecasting is a prime factor in accurately determining production costs and therefore ensuring profitability. Quantifying the risks long before the actual manufacturing process enables fact-based decision-making: from the development stage, improvement efforts can be identified early and prioritized. In order to measure the impact of industrial process fluctuations on product performance, the construction of a failure risk probability estimator is developed in this thesis. The complex relationship between the process technology and the product design (non-linearities, multi-modal features...) is handled via random process regression. A random field encodes, for each product configuration, the available information regarding the risk of non-compliance. After a presentation of the Gaussian model approach, we describe a Bayesian reasoning that avoids a priori choices of location and scale parameters. In our model, the Gaussian mixture prior, conditioned on measured (or calculated) data, yields a posterior characterized by a multivariate Student distribution. The probabilistic nature of the model is then exploited to derive a failure risk probability, defined as a random variable. To do this, our approach is to treat as random all unknown, inaccessible or fluctuating data.
In order to propagate uncertainties, a fuzzy-set approach provides an appropriate framework for the implementation of a Bayesian model mimicking expert elicitation. The underlying idea is to add minimal a priori information to the failure risk model. Our methodology has been implemented in a software tool called GoNoGo. The relevance of this approach is illustrated with theoretical examples and on a real-data example from the company STMicroelectronics.
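The step from a Student-t predictive distribution to a failure-risk probability can be sketched with a conjugate Bayesian linear regression, as below. This is a simplified stand-in for the random-field model and the GoNoGo implementation described above, with invented data and specification limit.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Measured performance y at process settings x (synthetic data)
x = np.linspace(-2, 2, 30)
y = 1.0 + 0.8 * x + rng.normal(0, 0.3, x.size)
X = np.column_stack([np.ones_like(x), x])

# Conjugate Normal-Inverse-Gamma prior (weakly informative, zero-mean coefficients)
V0_inv = 1e-6 * np.eye(2)        # prior precision on coefficients
a0, b0 = 1e-3, 1e-3              # prior on the noise variance

# Posterior updates
Vn_inv = V0_inv + X.T @ X
Vn = np.linalg.inv(Vn_inv)
wn = Vn @ (X.T @ y)
an = a0 + x.size / 2
bn = b0 + 0.5 * (y @ y - wn @ Vn_inv @ wn)

# Student-t predictive distribution at a new process configuration x*
x_star = np.array([1.0, 1.5])
mean = x_star @ wn
scale = np.sqrt(bn / an * (1.0 + x_star @ Vn @ x_star))
df = 2 * an

# Failure risk: probability that the performance falls below the spec limit
spec_limit = 1.5
p_fail = stats.t.cdf((spec_limit - mean) / scale, df)
print(f"predictive mean {mean:.2f}, failure probability {p_fail:.3f}")
```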
189

Projeto, otimização e análise de incertezas de um dispositivo coletor de energia proveniente de vibrações mecânicas utilizando transdutores piezelétricos e circuito ressonante / Design, optimization and uncertainty analysis of a mechanical vibration energy harvesting device using piezoelectric transducers and resonant circuit

Tatiane Corrêa de Godoy 05 November 2012 (has links)
The use of piezoelectric materials in the development of devices to harvest energy from mechanical vibrations (energy harvesting) has been widely studied in the last decade. Piezoelectric materials can be found in the form of thin layers or patches easily integrated into structures without significant mass increase. The conversion of mechanical energy into electric power is provided by the electromechanical coupling of piezoelectric materials. Most publications in the literature explore the use of resonant electromechanical devices, tuned to the operating frequency of the host structure, thus maximizing the power output under a given operating condition. The performance of these resonant devices in harvesting and storing energy is highly dependent on the proper tuning of their resonance frequency to the operating frequency of the system/structure. This work presents the design, optimization and uncertainty analysis of an energy harvesting device consisting of a plate with a tip mass and piezoelectric materials connected to shunt circuits. Two boundary conditions are considered for the plate: cantilever (EL) and sliding-free (DL). A coupled finite element model with R and RL circuits, combining equivalent single layer and first-order shear deformation theories, was used.
The distribution and volume of piezoelectric material and the tip mass coupled to the structure were optimized using a Genetic Algorithm, accounting for both mechanical (mechanical model, geometry, weight) and electrical (electrical model, storage circuit) analyses. Furthermore, the effect of uncertainties in the transducer dielectric and piezoelectric constants and in the electric inductance connected in series with the harvesting circuit was studied. The results indicate that the inclusion of a synthetic inductance can improve energy harvesting over a frequency band, and that the geometric optimization may reduce the piezoelectric material volume without significantly diminishing the harvested energy.
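The benefit of a resonant (RL) shunt over a purely resistive load can be sketched with a lumped single-degree-of-freedom harvester model, as below. Parameter values are illustrative and the model is a generic textbook formulation, not the optimized laminate plate design of the thesis.

```python
import numpy as np

# Lumped SDOF piezoelectric harvester under base excitation; electrodes
# connected either to a resistor (R) or to a resistor plus synthetic inductor (RL).
m, c, k = 0.01, 0.05, 4.0e3        # mass [kg], damping [N s/m], stiffness [N/m]
theta = 1.0e-3                     # electromechanical coupling [N/V]
Cp = 50e-9                         # piezoelectric capacitance [F]
R = 50e3                           # load resistance [ohm]
a0 = 9.81                          # base acceleration amplitude [m/s^2]

wn = np.sqrt(k / m)                # mechanical resonance [rad/s]
L = 1.0 / (wn ** 2 * Cp)           # synthetic inductance tuned to resonance
w = np.linspace(0.5 * wn, 1.5 * wn, 2000)

def harvested_power(Zload):
    """Average power dissipated in R for a given electrical load impedance."""
    Y = 1j * w * Cp + 1.0 / Zload                  # electrical admittance seen by the patch
    H = 1.0 / (k - m * w ** 2 + 1j * w * c + 1j * w * theta ** 2 / Y)
    X = H * (m * a0)                               # displacement amplitude
    V = 1j * w * theta * X / Y                     # electrode voltage
    return 0.5 * np.abs(V / Zload) ** 2 * R        # power in the resistor

P_r = harvested_power(R)                           # purely resistive shunt
P_rl = harvested_power(R + 1j * w * L)             # resonant RL shunt

print(f"peak power, R shunt : {P_r.max() * 1e3:.2f} mW")
print(f"peak power, RL shunt: {P_rl.max() * 1e3:.2f} mW")
print("half-peak bandwidth ratio (RL/R):",
      round(float(np.sum(P_rl > P_rl.max() / 2) / np.sum(P_r > P_r.max() / 2)), 2))
```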
190

Valor da flexibilização e informação em desenvolvimento de campo por modulos / Value of flexibility and information in oil field development by modules

Hayashi, Suzana Hisako Deguchi 15 August 2018 (has links)
Advisor: Denis Jose Schiozer / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica e Instituto de Geociencias / Previous issue date: 2006 / Abstract: Risk is inherent to the several phases of a petroleum field's life, owing to the geological, economic and technological uncertainties that influence the value of a project. The acquisition of additional information and the addition of flexibility in the implementation of a project are the main processes that permit mitigation of the associated risks. The concept of Value of Information (VoI) makes it possible to measure quantitatively the benefits of acquiring additional data, which allows the development project to be defined more precisely and can bring significant modifications relative to the initial conception (conceptual project). The concept of Value of Flexibility (VoF) makes it possible to measure the benefits of adding flexibility, for example in the implementation schedule of a project, with the objective of enabling better reservoir management across possible scenarios. The concepts of VoI and VoF are used in this work to determine the value of acquiring new information for the project, considering a delay in the schedule caused by the flexibility in the moment of definition and approval of the final project. A decision tree technique, associated with Geological Representative Models (GRM), is applied in the decision-analysis process.
Based on the results of this work, it is possible to conclude that the methodology is useful for large fields; that the relevance of information acquisition increases in low-price scenarios; and that it is important to analyze risk mitigation in addition to financial gain in decision-making processes like the one studied in this work. / Mestrado / Engenharia de Petroleo / Mestre em Ciências e Engenharia de Petróleo
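As a minimal illustration of the Value of Information concept used here, the sketch below compares the expected NPV of committing to one development plan under the prior over representative geological models with the expected NPV when the plan can be chosen after the geological uncertainty is resolved (perfect information). Scenario probabilities and NPVs are invented placeholders, not the case-study figures.

```python
import numpy as np

# Three representative geological models (e.g. pessimistic / base / optimistic)
prior = np.array([0.3, 0.5, 0.2])

# NPV [$M] of each development plan (rows) under each geological model (columns)
npv = np.array([
    [120.0, 260.0, 340.0],   # plan A: full development from the start
    [150.0, 210.0, 250.0],   # plan B: development by modules (flexible)
    [ 90.0, 140.0, 160.0],   # plan C: minimal development
])

# Without information: pick the single plan with the best prior-expected NPV
ev_without = (npv @ prior).max()

# With (perfect) information: pick the best plan for each revealed model, then
# average over the prior -- an upper bound on what any data acquisition can add
ev_with = (npv.max(axis=0) * prior).sum()

voi = ev_with - ev_without
print(f"EV without information: {ev_without:.1f} $M")
print(f"EV with information:    {ev_with:.1f} $M")
print(f"Value of Information:   {voi:.1f} $M (before subtracting data cost)")
```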
