  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
111

Fuzzy Vulnerability Assessment Model Of Coastal Areas To Sea Level Rise

Ozyurt, Gulizar 01 October 2010 (has links) (PDF)
Climate change and the anticipated impacts of sea level rise, such as increased coastal erosion, inundation, flooding due to storm surges and saltwater intrusion into freshwater resources, will affect all countries, but especially small island states and low-lying lands along coastlines. Turkey, with 8333 km of coastline including physically, ecologically and socio-economically important low-lying deltas, should also prepare for the impacts of sea level rise as well as other impacts of climate change while participating in adaptation and mitigation efforts. Thus, a coastal vulnerability assessment of Turkey to sea level rise is needed, both as part of coastal zone management policies for sustainable development and as a guideline for allocating resources in preparing adaptation options for upcoming problems due to sea level rise. In this study, a fuzzy coastal vulnerability assessment model (FCVI) of a region to sea level rise is developed, using physical and human activity indicators of the impacts of sea level rise that rely on commonly available data. The results enable decision makers to compare and rank different regions according to their vulnerabilities to sea level rise, to prioritize the impacts of sea level rise on a region according to the region's vulnerability to each impact, and to determine the most vulnerable parameters for planning adaptation measures. The sensitivity and uncertainty analysis performed on the results of the model (FCVI) is the first application of a fuzzy uncertainty analysis model to coastal vulnerability assessments. These analyses help ensure that decision makers can interpret the results of such vulnerability assessments, which are based primarily on expert perceptions, accurately enough. This, in turn, would increase confidence in adaptation measures and accelerate the adaptation of coastal areas to climate change.
The developed coastal vulnerability assessment model is applied successfully to determine the vulnerability of the Göksu, Göcek and Amasra regions of Turkey, which have different geological, ecological and socio-economic properties. The results of the site studies show that Göksu has high vulnerability, Göcek has moderate vulnerability and Amasra has low vulnerability to sea level rise. These results are in accordance with the general literature on the impacts of sea level rise on different geomorphological coastal areas; thus the applicability of the fuzzy vulnerability assessment model (FCVI) to coastal areas is validated.
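The core fuzzy-aggregation idea behind a model like FCVI can be illustrated with a small sketch: raw indicator values are mapped to membership degrees in a "vulnerable" fuzzy set and combined into a weighted score. The indicators, thresholds and weights below are illustrative assumptions, not the actual FCVI parameter set.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 at a, peak 1 at b, 0 at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def vulnerability(memberships, weights):
    """Weighted aggregate of per-indicator 'vulnerable' membership degrees."""
    return sum(weights[k] * memberships[k] for k in weights) / sum(weights.values())

def label(score):
    return "low" if score < 0.33 else "moderate" if score < 0.66 else "high"

# Hypothetical indicators for a low-lying delta region
mems = {
    "erosion": tri(2.5, 0.0, 3.0, 6.0),      # shoreline retreat, m/yr
    "elevation": tri(-1.0, -5.0, 0.0, 2.0),  # mean elevation relative to datum, m
}
weights = {"erosion": 2.0, "elevation": 1.0}
score = vulnerability(mems, weights)
```

A real assessment would use many more indicators and expert-elicited membership functions; the point of the sketch is only that the fuzzy score supports the ranking and prioritization described above.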
112

Onboard Propellant Gauging For Spacecraft

Lal, Amit 01 1900 (has links)
Estimation of the total mission life of a spacecraft is an important issue for the communication satellite industry. For accurate determination of the remaining mission life of a satellite, it is essential to estimate the amount of propellant present in the propellant tank of the spacecraft at various stages of its mission life. Because the annual revenue from a typical communication satellite operating at full capacity is on the order of millions of dollars, premature removal of spacecraft from their orbits results in heavy losses. Various techniques, such as the bookkeeping method, the gas law method, numerical modeling techniques, and the use of capacitive sensors, have been employed in the past for accurate determination of the amount of propellant present in a spacecraft. The first half of the thesis is concerned with sensitivity analysis of the various propellant gauging techniques, that is, estimating the effects of the uncertainty in the instruments employed in the propellant gauging system on the onboard propellant estimate. This sensitivity analysis is done for three existing propellant gauging techniques: the gas injection method, the bookkeeping method and the propellant tank heating method. A comparative study of the precision with which the onboard propellant is estimated by the three techniques is carried out, and the primary source of uncertainty for each technique is identified. It is shown that all three methods are inherently indirect methods of propellant gauging, as a consequence of which the precision with which they estimate the residual propellant decreases towards the end of the mission life of the spacecraft.
The second half of the thesis explores the possibility of using a new propellant tank configuration, consisting of a truncated cone centrally mounted within a spherical propellant tank, to measure the amount of liquid propellant present within the tank. By virtue of the dominant surface tension force in the zero-g condition, the liquid propellant orients itself within the tank in a geometry that minimizes its total surface energy. The study reveals that the amount of liquid propellant present in the tank can thus be estimated by measuring the height of the propellant meniscus within the central cone. It is also observed that, unlike the gas law method, the bookkeeping method or the propellant tank heating method, where the precision of the estimated propellant fill-fraction decreases towards the end of life of the spacecraft, for the proposed new configuration the precision increases.
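The geometric relation behind such a gauge can be sketched as follows: with the meniscus inside a cone, liquid volume grows as the cube of meniscus height, so the height uniquely determines the fill level. This sketch ignores the spherical tank shell and meniscus curvature, and its dimensions are illustrative, not taken from the thesis.

```python
import math

def cone_volume(h, half_angle_rad):
    """Liquid volume in an inverted cone filled to meniscus height h
    above the apex (meniscus curvature neglected)."""
    r = h * math.tan(half_angle_rad)   # radius of the liquid surface
    return math.pi * r * r * h / 3.0

def height_from_volume(v, half_angle_rad):
    """Invert V = (pi/3) * tan^2(a) * h^3 to gauge the fill level."""
    return (3.0 * v / (math.pi * math.tan(half_angle_rad) ** 2)) ** (1.0 / 3.0)

# The gauging sensitivity dV/dh = pi * tan^2(a) * h^2 shrinks as the tank
# empties, so a fixed height-measurement error maps to a smaller volume
# error near end of life -- the precision trend the abstract describes.
v_low = cone_volume(0.05, 0.35)    # near-empty (metres, radians: illustrative)
v_high = cone_volume(0.50, 0.35)
```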
114

Parameter, State and Uncertainty Estimation for 3-dimensional Biological Ocean Models

Mattern, Jann Paul 15 August 2012 (has links)
Realistic physical-biological ocean models pose challenges to statistical techniques due to their complexity, nonlinearity and high dimensionality. In this thesis, statistical data assimilation techniques for parameter and state estimation are adapted and applied to biological models. These methods rely on quantitative measures of agreement between models and observations. Eight such measures are compared and a suitable multiscale measure is selected for data assimilation. Building on this, two data assimilation approaches, a particle filter and a computationally efficient emulator approach, are tested and contrasted. It is shown that both are suitable for state and parameter estimation. The emulator is also used to analyze the sensitivity and uncertainty of a realistic biological model. Application of the statistical procedures yields insights into the model; for example, time-dependent parameter estimates are obtained that are consistent with biological seasonal cycles and improve model predictions, as evidenced by cross-validation experiments. Estimates of model sensitivity are high with respect to physical model inputs, e.g. river runoff.
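A bootstrap particle filter of the kind tested here can be sketched in a few lines for a scalar random-walk state; the model, noise levels and particle count below are illustrative stand-ins for the far higher-dimensional ocean model of the thesis.

```python
import math
import random

def particle_filter(observations, n_particles=500, proc_std=0.1, obs_std=0.5):
    """Bootstrap particle filter for a scalar random-walk state observed
    directly with Gaussian noise. Returns the posterior-mean track."""
    random.seed(0)
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Propagate each particle through the (random-walk) dynamics.
        particles = [p + random.gauss(0.0, proc_std) for p in particles]
        # Weight particles by the Gaussian likelihood of the observation.
        weights = [math.exp(-0.5 * ((y - p) / obs_std) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Multinomial resampling to avoid weight degeneracy.
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

obs = [1.0] * 20          # a constant signal; the filter should lock onto it
est = particle_filter(obs)
```

Real applications replace the random walk with the model dynamics and the Gaussian likelihood with a measure of model-observation agreement, which is exactly why the choice among agreement measures discussed above matters.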
115

Sensitivity and Uncertainty Analysis Methods : with Applications to a Road Traffic Emission Model / Känslighets- och osäkerhetsanalysmetoder : med tillämpningar på en emissionsmodell för vägtrafik

Eriksson, Olle January 2007 (has links)
There is always a need to study the properties of complex input–output systems, properties that may be very difficult to determine. Two such properties are the output's sensitivity to changes in the inputs and the output's uncertainty if the inputs are uncertain. A system can be formulated as a model—a set of functions, equations and conditions that describe the system. We ultimately want to study and learn about the real system, but with a model that approximates the system well, we can study the model instead, which is usually easier. It is often easier to build a model as a set of combined sub-models, but good knowledge of each sub-model does not immediately lead to good knowledge of the entire model. Often, the most attractive approach to model studies is to write the model as computer software and study datasets generated by that software. Methods for sensitivity analysis (SA) and uncertainty analysis (UA) cannot be expected to be exactly the same for all models. In this thesis, we want to determine suitable SA and UA methods for a road traffic emission model, methods that can also be applied to any other model of similar structure. We examine parts of a well-known emission model and suggest a powerful data-generating tool. By studying generated datasets, we can examine properties of the model, suggest SA and UA methods and discuss the properties of these methods. We also present some of the results of applying the methods to the generated datasets.
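The workflow of generating datasets from a model and probing them with SA and UA methods can be sketched with a toy emission model; the model form, input ranges and the one-at-a-time sensitivity measure are illustrative assumptions, not the emission model studied in the thesis.

```python
import random

def emissions(traffic, factor, speed):
    """Toy road-link emission model (g/h); illustrative, not the studied model."""
    return traffic * factor * (1.0 + 0.01 * max(0.0, 90.0 - speed))

# Uncertainty analysis: sample the inputs and examine the output spread.
random.seed(1)
n = 5000
outs = sorted(
    emissions(random.uniform(500, 1500),   # traffic volume, vehicles/h
              random.uniform(0.1, 0.3),    # emission factor, g/vehicle-km
              random.uniform(50, 110))     # mean speed, km/h
    for _ in range(n)
)
p95 = outs[int(0.95 * n)]                  # an upper-percentile emission level

# Sensitivity analysis: one-at-a-time sweep of each input over its range
# with the other inputs held at mid-range values.
def oat_range(idx, lo, hi, mid=(1000.0, 0.2, 80.0), steps=50):
    vals = []
    for k in range(steps + 1):
        x = list(mid)
        x[idx] = lo + (hi - lo) * k / steps
        vals.append(emissions(*x))
    return max(vals) - min(vals)

ranges = {"traffic": oat_range(0, 500, 1500),
          "factor": oat_range(1, 0.1, 0.3),
          "speed": oat_range(2, 50, 110)}
```

Here the generated dataset (`outs`) supports UA directly, while the sweeps rank inputs by their influence on the output, the kind of property such generated datasets make cheap to study.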
116

Utvärdering av osäkerhet och variabilitet vid beräkning av riktvärden för förorenad mark / Evaluation of Uncertainty and Variability in Calculations of Soil Guideline Values

Larsson, Emelie January 2014 (has links)
In Sweden, approximately 80,000 contaminated areas have been identified. Some of these areas are in need of remediation to cope with the effects that the contaminants have on both humans and the environment. The Swedish Environmental Protection Agency (EPA) has published a methodology for performing risk assessments of contaminated soils, together with a complex model for calculating soil guideline values. The guideline value model is deterministic and calculates single guideline values for contaminants. The model does not account explicitly for uncertainty and variability in parameters but rather handles them implicitly by using safety factors and reasonable worst-case assumptions for different parameters. One method to account explicitly for uncertainty and variability in a risk assessment is to perform a probabilistic risk assessment (PRA) through Monte Carlo simulations. A benefit of this is that the parameters can be defined with probability density functions (PDFs) that account for their uncertainty and variability.
In this Master's Thesis, a PRA was conducted, followed by calculations of probabilistic guideline values for selected contaminants. The model was run for two sets of PDFs for the parameters: one collected from extensive research in published articles, and another that included the deterministic values set by the Swedish EPA for all parameters. The sets generated cumulative probability distributions (CPDs) of guideline values that, depending on the contaminant, corresponded to varying degrees with the deterministic guideline values that the Swedish EPA had calculated. In general, there was a stronger correlation between the deterministic guideline values and the CPDs for the sensitive land-use scenario than for the less sensitive one. For contaminants such as dioxin and PCB-7, a lowering of the guideline values would be required to fully protect humans and the environment, based on the results in this thesis. A case study was also conducted, based on a recent soil investigation performed by Geosigma AB. In general, there was a correlation between the deterministic site-specific guideline values and the CPDs in the case study. In addition, a health-oriented risk assessment was performed in which unexpected exposure pathways were found to govern the guideline values. For some contaminants, the exposure pathway governing the guideline values in the PRA differed from the deterministic one in 70–90 % of the simulations. Also, the contributions of the exposure pathways to the unadjusted health guideline values differed from the deterministic ones. This indicates the need to always quantify the composition of guideline values in probabilistic risk assessments.
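The probabilistic-guideline-value idea can be sketched as a Monte Carlo loop over a simplified intake equation: parameters that a deterministic model fixes become distributions, and the guideline value becomes a cumulative distribution from which a protective percentile can be read. The equation, TDI value and distributions below are illustrative, not the Swedish EPA model.

```python
import random

def guideline(tdi, body_weight, soil_intake):
    """Simplified health-based guideline value (mg contaminant / kg soil):
    tolerable daily intake (mg/kg bw/day) * body weight (kg)
    divided by soil ingestion (kg soil/day)."""
    return tdi * body_weight / soil_intake

random.seed(7)
tdi = 0.001   # illustrative TDI, not a regulatory value
vals = sorted(
    guideline(tdi,
              random.gauss(70.0, 10.0),           # adult body weight, kg
              random.lognormvariate(-9.2, 0.5))   # soil ingestion, ~1e-4 kg/day median
    for _ in range(10000)
)
p05 = vals[int(0.05 * len(vals))]   # protective lower percentile
median = vals[len(vals) // 2]
```

Reading a low percentile of the sorted values plays the role of the deterministic worst-case assumption, but with the degree of protection made explicit instead of implicit.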
117

Sensitivity and uncertainty analyses of contaminant fate and transport in a field-scale subsurface system

Wang, Jinjun 31 March 2008 (has links)
Health scientists often rely on simulation models to reconstruct groundwater contaminant exposure data for retrospective epidemiologic studies. Due to the nature of the historical reconstruction process, there are inevitably uncertainties associated with the input data and, therefore, with the final results of the simulation models, potentially adversely impacting related epidemiologic investigations. This study examines the uncertainties associated with the historically reconstructed contaminant fate and transport simulations for an epidemiologic study conducted at U.S. Marine Corps Base Camp Lejeune, North Carolina. To achieve an efficient uncertainty analysis, a sensitivity analysis was first conducted to identify the critical uncertain variables, which were then adopted in the uncertainty analysis using an improved Monte Carlo simulation (MCS) method. In particular, uncertainties associated with the historical contaminant arrival time were evaluated. To quantify these uncertainties efficiently, a procedure identified as the Pumping Schedule Optimization System (PSOpS) was developed to obtain the extreme (i.e., earliest and latest) contaminant arrival times caused by pumping schedule variations. Two improved nonlinear programming methods, Rank-and-Assign (RAA) and Improved Gradient (IG), are used in PSOpS to provide computational efficiency. Furthermore, a quantitative procedure named Pareto Dominance based Critical Realization Identification (PDCRI) was developed to screen out critical realizations for contaminant transport in the subsurface system, so that the extreme contaminant arrival times under multi-parameter uncertainties could be evaluated efficiently.
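The idea of extremizing arrival times over pumping schedules can be sketched with a toy travel-time model and a random search; PSOpS itself uses nonlinear programming (RAA and IG) rather than this brute-force stand-in, and the model and rates below are invented for illustration.

```python
import random

def arrival_years(schedule, base_v=20.0, gain=0.05, distance=800.0):
    """Toy plume model: travel velocity (m/yr) increases with the mean
    pumping rate (m3/day) of a 12-period schedule. Invented for illustration."""
    v = base_v + gain * sum(schedule) / len(schedule)
    return distance / v

# Random search over feasible pumping schedules for the extreme arrival times.
random.seed(3)
earliest = float("inf")
latest = 0.0
for _ in range(2000):
    schedule = [random.uniform(0.0, 400.0) for _ in range(12)]
    t = arrival_years(schedule)
    earliest = min(earliest, t)
    latest = max(latest, t)
```

The interval `[earliest, latest]` bounds the arrival-time uncertainty attributable to the pumping schedule alone; an optimizer replaces the random search when, as in the study, each model evaluation is expensive.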
118

Evaluation of emission uncertainties and their impacts on air quality modeling: applications to biomass burning

Tian, Di 20 November 2006 (has links)
Air pollution has changed from an urban environmental problem to a phenomenon spanning state, national and even global scales. In response, a variety of regulations, standards, and policies have been enacted worldwide. Policy-making and the development of efficient and effective control strategies require an understanding of the air quality impacts of different sources; these impacts are usually estimated using source-oriented air quality models, and the corresponding uncertainties should be addressed. This thesis evaluates emission uncertainties and their impacts on air quality modeling (Models-3/Community Multiscale Air Quality Model (CMAQ)), with special attention to biomass burning. The impacts of uncertainties in ozone precursor (mainly NOX and VOC) emissions from different sources and regions on ozone formation and emission control efficiencies are evaluated using Monte Carlo methods. Instead of running CMAQ multiple times, first- and higher-order ozone sensitivities calculated by the Higher-order Decoupled Direct Method in Three Dimensions (CMAQ-HDDM-3D) are employed to propagate emission uncertainties. Biomass burning is one of the major sources of PM2.5. The impacts of uncertainties in biomass burning emissions, including total amount, temporal and spatial characteristics, and speciation, on air quality modeling are investigated to identify emission shortcomings. This is followed by estimation of seasonal PM2.5 source contributions over the southeastern US, focusing on Georgia. Results show that prescribed forest fires are the largest individual biomass burning source. Forest fire emissions under different forest management practices, and the ensuing air quality impacts, are further studied. The forest management practices considered here include different burning seasons, fire-return intervals (FRIs), and controlling emissions during smoldering.
Finally, uncertainties in prescribed forest fire emissions are quantified by propagating the uncertainties in burned area, fuel consumption and emission factors, which are required inputs for emission estimation and are themselves quantified by various fire behavior models and methods. In summary, this thesis provides important insights regarding emission uncertainties and their impacts on air quality modeling.
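Propagating emission uncertainty through precomputed sensitivities rather than repeated CMAQ runs amounts to sampling a Taylor expansion of the response. The base concentration and sensitivity coefficients below are illustrative numbers, not CMAQ-HDDM-3D output.

```python
import random

# Illustrative base ozone level and first/second-order sensitivity
# coefficients (ppb); in the thesis these come from CMAQ-HDDM-3D.
O3_BASE, S1, S2 = 60.0, 12.0, -4.0

def ozone(e):
    """Second-order Taylor estimate of ozone for a fractional emission
    perturbation e (e = 0.5 means a 50% emission increase)."""
    return O3_BASE + S1 * e + 0.5 * S2 * e * e

# Propagate a +/-20% (1-sigma) emission uncertainty with cheap function
# evaluations instead of full model reruns.
random.seed(11)
samples = [ozone(random.gauss(0.0, 0.2)) for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)
```

The nonzero second-order term shifts the mean response away from the base case (here, below 60 ppb), which is exactly the nonlinearity a first-order-only propagation would miss.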
119

Environmental prediction and risk analysis using fuzzy numbers and data-driven models

Khan, Usman Taqdees 17 December 2015 (has links)
Dissolved oxygen (DO) is an important water quality parameter that is used to assess the health of aquatic ecosystems. Typically, physically based numerical models are used to predict DO; however, these models do not capture the complexity and uncertainty seen in highly urbanised riverine environments. To overcome these limitations, an alternative approach is proposed in this dissertation that uses a combination of data-driven methods and fuzzy numbers to improve DO prediction in urban riverine environments. A major issue in implementing fuzzy numbers is that there is no consistent, transparent and objective method to construct fuzzy numbers from observations. A new method to construct fuzzy numbers is proposed which uses the relationship between probability and possibility theory. Numerical experiments are used to demonstrate that the typical linear membership functions used are inappropriate for environmental data. A new algorithm to estimate the membership function is developed, in which a bin-size optimisation algorithm is paired with a numerical technique using the fuzzy extension principle. The developed method requires no assumptions about the underlying distribution or the selection of an arbitrary bin size, and has the flexibility to create different shapes of fuzzy numbers. The impacts of input data resolution and error value on the membership function are analysed. Two new fuzzy data-driven methods, fuzzy linear regression and a fuzzy neural network, are proposed to predict DO using real-time data. These methods use fuzzy inputs, fuzzy outputs and fuzzy model coefficients to characterise the total uncertainty. Existing methods cannot accommodate fuzzy numbers for each of these variables. The new method for fuzzy regression was compared against two existing fuzzy regression methods, Bayesian linear regression, and error-in-variables regression.
The new method was better able to predict DO due to its ability to incorporate different sources of uncertainty in each component. A number of model assessment metrics were proposed to quantify fuzzy model performance. The fuzzy linear regression methods outperformed the probability-based methods; similar results were seen when the method was used for peak flow rate prediction. An existing fuzzy neural network model was refined by the use of possibility theory based calibration of network parameters, and by the use of fuzzy rather than crisp inputs. A method to find the optimum network architecture was proposed to select the number of hidden neurons and the amount of data used for training, validation and testing. The performance of the updated fuzzy neural network was compared to the crisp results; the method demonstrated an improved ability to predict low DO compared to non-fuzzy techniques. The fuzzy data-driven methods using non-linear membership functions correctly identified the occurrence of extreme events. These predictions were used to quantify the risk using a new possibility-probability transformation. All combinations of inputs that lead to a risk of low DO were identified to create a risk tool for water resource managers. Results from this research provide new tools to predict environmental factors in a highly complex and uncertain environment using fuzzy numbers.
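The probability-possibility relationship used to build membership functions can be sketched with a histogram-based transform in which the modal bin receives possibility 1; this shows only the core transform, not the dissertation's bin-size optimisation or extension-principle machinery.

```python
def histogram(data, bins):
    """Relative-frequency histogram with uniform bin widths."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in data:
        i = min(int((x - lo) / width), bins - 1)
        counts[i] += 1
    return [c / len(data) for c in counts]

def possibility(probs):
    """Probability-to-possibility transform: pi_i = sum of all p_j <= p_i,
    so the modal bin gets possibility 1 and the tails taper off."""
    return [sum(q for q in probs if q <= p) for p in probs]

pi = possibility([0.1, 0.2, 0.4, 0.2, 0.1])   # a unimodal 5-bin histogram
```

The resulting possibility profile is already a discrete membership function, and unlike a fitted triangular shape it inherits whatever asymmetry or multimodality the data show, which is the motivation for moving beyond linear membership functions.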
120

[en] SEMI-QUANTITATIVE METHODOLOGY FOR ASSESSING THE RISK OF CO2 INJECTION FOR STORAGE IN GEOLOGICAL RESERVOIRS / [pt] METODOLOGIA SEMI-QUANTITATIVA PARA AVALIAÇÃO DO RISCO DA INJEÇÃO DE CO2 PARA ARMAZENAMENTO EM RESERVATÓRIOS GEOLÓGICOS

FERNANDA LINS GONCALVES PEREIRA 03 October 2016 (has links)
The last stage of carbon capture and sequestration (CCS) can be performed by injecting CO2 into geological reservoirs. CCS projects belong to a number of techniques for mitigating greenhouse gases. In this work, a semi-quantitative methodology to assess the risk of CO2 injection in geological reservoirs is developed. The methodology is based on the establishment and application of a risk matrix. This matrix has, in one direction, severity categories set in a qualitative way and, in the other direction, probability categories set from probabilistic analysis. The risk values of a hazard source are calculated as the product of its severities and their associated probabilities. Hazard sources are problems related to the injection of CO2 that are selected for a specific scenario analysis. The severity categories are defined by operating-level ranges of a hazard source. Several probabilistic analysis methods are investigated, and the family of mean value methods shows characteristics favoring its use with complex limit state functions. The methodology is applied in an illustrative case study. With the resulting risk values, the main hazard source and the most influential random variables are identified. Assessment of the methodology indicates that it is a powerful tool for analysts and decision makers, and it has the potential to assist in the CCS project planning phase.
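The severity-times-probability mechanics of such a risk matrix can be sketched directly; the category scales, hazard sources and classification thresholds below are illustrative assumptions, not those of the thesis, where severities are set qualitatively and probabilities come from probabilistic analysis.

```python
# Illustrative 5-point category scales for the two matrix directions.
SEVERITY = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}
PROBABILITY = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost_certain": 5}

def risk(severity, probability):
    """Risk value = severity index * probability index."""
    return SEVERITY[severity] * PROBABILITY[probability]

def classify(r):
    """Illustrative low/medium/high cut-offs on the 1-25 risk scale."""
    return "low" if r <= 4 else "medium" if r <= 12 else "high"

# Hypothetical hazard sources for a CO2 injection scenario.
hazards = {
    "caprock_leakage": ("major", "unlikely"),
    "well_integrity": ("moderate", "possible"),
    "induced_seismicity": ("catastrophic", "rare"),
}
ranked = sorted(hazards, key=lambda h: risk(*hazards[h]), reverse=True)
```

Ranking hazard sources by their risk values is what lets the methodology single out the main hazard source for a given scenario, as described above.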
