121

Assessing sheep’s wool as a filtration material for the removal of formaldehyde in the indoor environment

Wang, Jennifer, active 21st century 11 September 2014 (has links)
Formaldehyde is one of the most prevalent and toxic chemicals found indoors, where we spend ~90% of our lives. Chronic exposure to formaldehyde indoors is therefore of particular concern, especially for sensitive populations like children and infants. Unfortunately, no effective filtration control strategy exists for its removal. While research has shown that proteins in sheep's wool bind permanently to formaldehyde, the extent of wool's formaldehyde removal efficiency and effective removal capacity in active filtration settings is unknown. In this research, wool capacity experiments were designed using a plug flow reactor (PFR) and an air cleaner unit to explore the capacity of wool to remove formaldehyde under different active filtration designs. Using the measured wool capacity, filter life and annual costs were modeled in a typical 50 m³ room for a variety of theoretical filter operation lengths, air exchange rates, and source concentrations. For each case, annual filtration costs were compared to the monetary benefits derived from wool resale and from the reduction in cancer rates for different population types using the DALY human exposure metric. Wool filtration was observed to reduce formaldehyde concentrations by 60-80%, although the effective wool removal capacity was highly dependent on the fluid mechanics of the filtration unit. The air cleaner setup yielded approximately six times greater capacity than the small-scale PFR designed to mimic active filtration (670 µg versus 110 µg HCHO removed per g of wool, respectively). The outcomes of these experiments suggest that kinematic variations resulting from different wool packing densities, air flow rates, and degrees of mixing in the units influence the filtration efficiency and effective capacity of wool. The results of the cost-benefit analysis show that for the higher wool capacity conditions, cost-effectiveness is achieved in the majority of room cases when sensitive populations like children and infants are present. However, for the average population scenarios, filtration was rarely worthwhile, showing that adults benefit less from reductions in chronic formaldehyde exposure. These results suggest that implementation of active filtration would be most beneficial and cost-effective in settings like schools, nurseries, and hospitals that have a high percentage of sensitive populations.
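A minimal mass-balance sketch (not the author's model) of how a measured wool capacity could translate into concentration reduction and filter life in a well-mixed 50 m³ room; only the ~670 µg/g capacity and the 60-80% reduction come from the abstract, and the room, flow, efficiency, wool mass, and source values below are assumptions:

```python
# Minimal mass-balance sketch, not the author's model: only the ~670 ug/g
# capacity and the 60-80% concentration drop come from the abstract; the
# air exchange, flow, efficiency, wool mass and source values are assumed.

V = 50.0           # room volume, m^3
ach = 0.5          # air exchange rate, 1/h (assumed)
q_filter = 100.0   # air cleaner flow rate, m^3/h (assumed)
eta = 0.7          # single-pass removal efficiency (within the 60-80% range)
c0 = 50.0          # indoor HCHO concentration without filtration, ug/m^3 (assumed)

# Implied source strength, then steady-state concentration with filtration:
# S = c0 * ach * V ;  C_ss = S / (ach * V + q_filter * eta)
S = c0 * ach * V
c_ss = S / (ach * V + q_filter * eta)
print(f"HCHO: {c0:.0f} -> {c_ss:.1f} ug/m^3 ({100 * (1 - c_ss / c0):.0f}% reduction)")

# Filter life = wool mass * capacity / steady-state removal rate.
wool_g = 500.0                        # g of wool in the filter (assumed)
capacity = 670.0                      # ug HCHO removed per g wool (air cleaner case)
removal_rate = q_filter * eta * c_ss  # ug/h removed at steady state
life_h = wool_g * capacity / removal_rate
print(f"Filter life: {life_h:.0f} h (~{life_h / 24:.0f} days)")
```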
122

Indirekta positiva effekter till följd av transportinvesteringar - utvärdering av existerande hanteringsmetoder / Indirect positive effects of transport investments - an evaluation of existing methods for handling them

Stern Petersson, Linnea January 2008 (has links)
This thesis examines and evaluates methods for estimating the indirect socio-economic effects of transport investments. The projects underlying the comparison were carried out at the common European level as well as the national level. The projects used either quantitative analysis, in the form of multi-criteria analysis or cost-benefit analysis, or qualitative analysis to evaluate the effects. The conclusion is that these effects are difficult to estimate and that it is therefore important that the method reflects the underlying information. A verbal method that conveys the uncertainty is often preferable to one that presents exact monetary measures even though the underlying information is uncertain.
123

Valuing environmental benefits using the contingent valuation method : an econometric analysis

Kriström, Bengt January 1990 (has links)
The purpose of this study is to investigate methods for assessing the value people place on preserving our natural environments and resources. It focuses on the contingent valuation method, a method for directly asking people about their preferences, and in particular on the use of discrete response data in contingent valuation experiments. The first part of the study explores the economic theory of the total value of a natural resource, analyzing the principal components of total value: use values and non-use values. Our application is a study of the value Swedes attach to the preservation of eleven forest areas with high recreational value and unique environmental qualities. Six forests were selected on the basis of an official investigation covering virgin forests and other areas with unique environmental qualities; in addition, five virgin forests were selected. Two types of valuation questions are analyzed, the continuous and the discrete. The first type asks directly about willingness to pay, while the second suggests a price that the respondent may accept or reject. The results of the continuous question suggest an average willingness to pay of about 1,000 SEK per household for preservation of the areas. Further analysis suggests that this value depends on several characteristics of the respondent, such as income and whether or not the respondent is an altruist. Two econometric approaches are used to analyze the discrete responses: a flexible parametric approach and a non-parametric approach. In addition, a Bayesian approach is described. It is shown that the results of a contingent valuation experiment may depend to some extent on the choice of probability model. A re-sampling approach and a Monte Carlo approach are used to shed light on the design of a contingent valuation experiment with discrete responses. The econometric analysis ends with an analysis of the often observed disparity between discrete and continuous valuation questions. A cost-benefit analysis is performed in the final chapter; its purpose is to illustrate how the contingent valuation approach may be combined with opportunity cost data to improve the decision basis in the environmental policy domain. This analysis does not give strong support to a cutting alternative. Finally, the results of this investigation are compared with evidence from other studies. The main conclusion of this study is that assessing people's sentiments towards changes to our natural environments and resources can be a useful supplement to decisions about their proper husbandry. It also highlights the importance of careful statistical analysis of data gained from contingent valuation experiments.
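A hedged sketch of the discrete (referendum) valuation question described above: simulated respondents accept or reject a suggested price, and a parametric logit recovers willingness to pay. The data are simulated, and the 1,000 SEK figure is used only to seed the simulation, echoing the reported average:

```python
# Toy referendum-style contingent valuation: respondents accept or reject a
# suggested bid, and a linear-in-bid logit recovers mean WTP. Simulated data;
# the "true" WTP of 1000 SEK only echoes the thesis's reported average.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
bids = rng.choice([100, 500, 1000, 2000, 4000], size=n)  # suggested prices, SEK
wtp_true = rng.logistic(loc=1000, scale=400, size=n)     # latent WTP per household
yes = (wtp_true >= bids).astype(int)                     # accept if WTP >= bid

# Logit: P(yes) = 1 / (1 + exp(-(a + b * bid))), with b < 0 expected.
X = sm.add_constant(bids.astype(float))
fit = sm.Logit(yes, X).fit(disp=False)
a, b = fit.params

# For a linear-in-bid logit, median (= mean, by symmetry) WTP is -a/b.
print(f"Estimated mean WTP: {-a / b:.0f} SEK per household")
```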
124

A Bayesian cost-benefit approach to sample size determination and evaluation in clinical trials

Kikuchi, Takashi January 2011 (has links)
Current practice for sample size computations in clinical trials is largely based on frequentist or classical methods. These methods have the drawback of requiring a point estimate of the variance of the treatment effect and are based on arbitrary settings of type I and II errors. They also do not directly address the question of achieving the best balance between the costs of the trial and the possible benefits of a new medical treatment, and fail to consider the important fact that the number of users depends on the evidence for improvement over the current treatment. A novel Bayesian approach, Behavioral Bayes (BeBay for short; Gittins and Pezeshk, 2000a,b, 2002a,b; Pezeshk, 2003), assumes that the number of patients switching to the new treatment depends on the strength of the evidence provided by clinical trials, and takes a value between zero and the number of potential patients in the country. The better a new treatment, the more patients switch to it and the greater the resulting benefit. The model defines the optimal sample size as the sample size that maximises the expected net benefit resulting from a clinical trial. Gittins and Pezeshk use a simple form of benefit function for paired comparisons between two medical treatments and assume that the variance of the efficacy is known. The research in this thesis generalises these original conditions by introducing a logistic benefit function to take account of differences in efficacy and safety between two drugs. The model is also extended to the more general cases of unpaired comparisons and unknown variance. The expected net benefit defined by Gittins and Pezeshk is based on the efficacy of the new drug only; it does not consider the incidence of adverse reactions and their effect on patients' preferences. Here we include the costs of treating adverse reactions and calculate the total benefit in terms of how much the new drug can reduce societal expenditure. We describe how our model may be used for the design of phase III clinical trials, cluster randomised clinical trials and bridging studies. This is done in some detail, using illustrative examples based on published studies. For phase III trials we allow the possibility of unequal treatment group sizes, which often occur in practice. Bridging studies are those carried out to extend the range of applicability of an established drug, for example to new ethnic groups. Throughout, the objective of our procedures is to optimise the cost-benefit in terms of national health care. BeBay is the leading methodology for determining sample sizes on this basis. It explicitly takes account of the roles of three decision makers, namely patients and doctors, pharmaceutical companies and the health authority.
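The BeBay idea lends itself to a compact sketch: expected net benefit as a function of sample size, with patient uptake modelled as a logistic function of the strength of evidence. The benefit shape and all parameter values below are illustrative assumptions, not Gittins and Pezeshk's calibration:

```python
# Sketch of the Behavioral Bayes logic: uptake of the new treatment grows
# logistically with the expected z-statistic of the trial, and the optimal
# sample size maximises expected net benefit. All numbers are assumed.

import numpy as np

delta = 0.2          # assumed true treatment effect (standardised)
N_pop = 1_000_000    # potential patients in the country (assumed)
benefit_pp = 200.0   # net benefit per switching patient (assumed, currency units)
c_setup, c_per = 1e5, 2e3   # fixed and per-patient trial costs (assumed)

def expected_net_benefit(n: int, k: float = 1.0, z0: float = 2.0) -> float:
    """Expected net benefit of a two-arm trial with n patients per arm."""
    z = delta * np.sqrt(n / 2.0)                        # expected z-statistic
    switchers = N_pop / (1.0 + np.exp(-k * (z - z0)))   # logistic uptake
    return benefit_pp * switchers - (c_setup + c_per * 2 * n)

ns = np.arange(10, 3000, 10)
enb = np.array([expected_net_benefit(n) for n in ns])
print(f"Optimal n per arm ~ {ns[enb.argmax()]}, expected net benefit ~ {enb.max():.3g}")
```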
125

Investigation of regulatory efficiency with reference to the EU Water Framework Directive : an application to Scottish agriculture

Lago Aresti, Manuel January 2009 (has links)
The Water Framework Directive (WFD) has the stated objective of delivering good status (GS) for Europe's surface waters and groundwaters. But meeting GS is cost dependent, and in some water bodies pollution abatement costs may be high or judged disproportionate. The definition and assessment of disproportionate costs is central to the justification of time-frame derogations and/or lowering the environmental objectives (standards) for compliance at a water body. Official European guidance leaves the interpretation of disproportionate costs discretionary, so it can be interpreted and applied differently across Member States. The aim of this research is to clarify the definition of disproportionality and to convey a consistent interpretation that is fully compliant with the economic requirements of the Directive, whilst also being mindful of the principles of pollution control and welfare economics theory. On this basis, standard-setting derogations should aim to reach socially optimal decisions and be judged with reference to a combination of explicit cost and benefit curves (an application of cost-benefit analysis) and financial affordability tests. Arguably, these tools should be more influential in the development of derogation decisions across Member States, including Scotland. The WFD is expected to have extensive effects on Scottish agriculture, which is faced with the challenge of maintaining its competitiveness while protecting water resources. Focusing the analysis on the socio-economic impacts of achieving diffuse water pollution targets for the sector, a series of independent tests for the assessment of disproportionate costs are proposed and evaluated. These are: i) development of abatement cost curves for agricultural phosphorus (P) mitigation options for different farm systems; ii) a financial characterisation of farming in Scotland, investigating the impact on profits of achieving different P load reductions at farm level in order to explore issues of "affordability" and "ability to pay" in the sector; and iii) an investigation of benefits assessment using discrete choice modelling to explore public preferences for pollution control and to measure the non-market benefits of WFD water quality improvements in Scotland. Results from these tests provide benchmarks for the definition of disproportionate costs and are relevant to other aspects of the economic analysis of water use in Scotland. This study helps to clarify the nature of agricultural water use and how it leads to social trade-offs with other, non-agricultural users. Ultimately, this perspective adds to the debate over how and where water is best employed to maximise its value to society.
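Test (i) above amounts to building a least-cost abatement curve and comparing the cost of meeting a load-reduction target with estimated benefits. A toy sketch in which the measures, their costs, the target, and the benefit figure are all invented for illustration:

```python
# Toy least-cost abatement curve for phosphorus mitigation, followed by a
# simple disproportionality check. Measures, costs, target and benefit are
# invented for illustration; they are not the thesis's estimates.

measures = [  # (name, P reduction in kg/yr, annual cost in GBP)
    ("Fencing off watercourses", 120, 3000),
    ("Buffer strips",            200, 8000),
    ("Reduced fertiliser rates", 150, 9000),
    ("Constructed wetland",      300, 30000),
]

target = 400.0     # required P load reduction, kg/yr (assumed)
benefit = 45000.0  # annual non-market benefit of meeting GS (assumed)

# Adopt measures in order of cost per kg P removed until the target is met.
measures.sort(key=lambda m: m[2] / m[1])
removed, cost = 0.0, 0.0
for name, reduction, annual_cost in measures:
    if removed >= target:
        break
    removed += reduction
    cost += annual_cost
    print(f"adopt {name:26s} cum. reduction {removed:5.0f} kg, cum. cost {cost:6.0f}")

print("disproportionate" if cost > benefit else "proportionate",
      f"(cost {cost:.0f} vs benefit {benefit:.0f})")
```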
126

The Valuation of River Ecosystem Services

Jiang, Wei 09 November 2016 (has links)
No description available.
127

Risk assessment of natural hazards : Data availability and applicability for loss quantification

Grahn, Tonje January 2017 (has links)
Quantitative risk assessments are a fundamental part of economic analysis and of natural hazard risk management models. They increase the objectivity and transparency of risk assessments and guide policymakers in making efficient decisions when spending public resources on risk reduction. Managing hazard risks calls for an understanding of the relationships between hazard exposure and the vulnerability of humans and assets. The purpose of this thesis is to identify and estimate causal relationships between hazards, exposure and vulnerability, and to evaluate the applicability of systematically collected data sets for producing reliable and generalizable quantitative information for decision support. Several causal relationships have been established. For example, the extent of lake flood damage to residential buildings depends on the duration of the flood, the distance to the waterfront, the age of the house and, in some cases, the water level. Results also show that homeowners' private initiatives to reduce risk, prior to or during a flood, reduced their probability of suffering building damage by as much as 40 percent. Further, a causal relationship has been established between the number of people exposed to quick clay landslides and landslide fatalities. Even though several relationships were identified between flood exposure and vulnerability, the effects can explain only small parts of the total variation in damages, especially at object level. The availability of damage data in Sweden is generally low; the most comprehensive damage data sets are held by private insurance companies and are not publicly available. Data scarcity is a barrier to quantitative natural hazard risk assessment in Sweden. More effort should therefore be made to collect data systematically for modelling and validating standardized approaches to quantitative damage estimation. / Natural hazard damages have increased worldwide, with impacts caused by hydrological and meteorological hazards increasing the most. An analysis of insurance payments shows that flood damages have been increasing in Sweden as well. With climate change and growing populations we can expect this trend to continue unless efforts are made to reduce risk and adapt communities to the threats. Economic analysis and quantitative risk assessment of natural hazards are fundamental parts of a risk management process that can support policymakers' decisions on efficient risk reduction. However, developing reliable damage estimation models requires knowledge of the relationships between hazard exposure and the vulnerability of exposed objects and persons. This thesis has established causal relationships between residential exposure and flood damage on the basis of insurance data. I also found that private damage-reducing actions decreased the probability of damage to buildings by almost 40 percent. Further, a causal relationship has been established between the number of people exposed to quick clay landslides and fatalities. Even though several relationships have been identified between flood exposure and vulnerability, the effects can explain only small parts of the total variation in damages, especially at object level, and more effort is needed to develop quantitative models for risk assessment purposes.
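The damage relationships described above are of the kind a logistic regression can capture. A hedged sketch on simulated data, where only the roughly 40 percent mitigation effect echoes the text and all other coefficients are invented:

```python
# Sketch of a flood damage probability model: logit of "building damaged" on
# flood duration, distance to waterfront, building age and a private-
# mitigation dummy. Simulated data; only the ~40% mitigation effect (odds
# ratio exp(-0.8) ~ 0.45) loosely echoes the thesis's finding.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
duration = rng.exponential(5, n)    # flood duration, days
distance = rng.uniform(10, 500, n)  # distance to waterfront, m
age = rng.uniform(0, 100, n)        # building age, years
mitig = rng.integers(0, 2, n)       # 1 = private damage-reducing action taken

logit = -1.0 + 0.15 * duration - 0.004 * distance + 0.01 * age - 0.8 * mitig
p = 1 / (1 + np.exp(-logit))
damaged = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([duration, distance, age, mitig]))
fit = sm.Logit(damaged, X).fit(disp=False)
print(fit.params)  # coefficients: const, duration, distance, age, mitigation
```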
128

Analyse de sensibilité de modèles spatialisés : application à l'analyse coût-bénéfice de projets de prévention du risque d'inondation / Variance-based sensitivity analysis for spatially distributed models : application to cost-benefit analysis of flood risk management plans
Keywords: Spatially distributed model; Sensitivity analysis; Uncertainty; Scale; Geostatistics; CBA; Flood; Damage

Saint-Geours, Nathalie 29 November 2012 (has links)
Variance-based global sensitivity analysis is used to study how the variability of the output of a numerical model can be apportioned to different sources of uncertainty in its inputs. It is an essential component of model building, as it helps to identify the model inputs that account for most of the model output variance. However, this approach is seldom applied in the Earth and Environmental Sciences, partly because most of the numerical models developed in this field include spatially distributed inputs or outputs. Our research work aims to show how global sensitivity analysis can be adapted to such spatial models, and more precisely how to cope with two issues: i) the presence of spatial auto-correlation in the model inputs, and ii) scaling issues. We base our research on a detailed study of the numerical code NOE, a spatial model for cost-benefit analysis of flood risk management plans. We first investigate how variance-based sensitivity indices can be computed for spatially distributed model inputs. We focus on the "map labelling" approach, which makes it possible to handle any complex spatial structure of uncertainty in the model inputs and to assess its effect on the model output. Next, we explore how scaling issues interact with the sensitivity analysis of a spatial model. We define "block sensitivity indices" and "site sensitivity indices" to account for the role of the spatial support of the model output. We establish the properties of these sensitivity indices under some specific conditions; in particular, we show that the relative contribution of an uncertain spatially distributed model input to the variance of the model output increases with its correlation length and decreases with the size of the spatial support considered for model output aggregation. By applying our results to the NOE modelling chain, we also draw a number of lessons for better dealing with uncertainties in flood damage modelling and cost-benefit analysis of flood risk management plans.
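The stated aggregation result, that the sensitivity index of a spatially distributed input falls as the spatial support of the output grows, can be reproduced in a toy additive model with an auto-correlated input field. This is an illustration under stated assumptions, not the NOE code:

```python
# Toy demonstration of "block sensitivity indices": for an additive model
# y(s) = Z(s) + W, with Z a spatially auto-correlated field and W a scalar
# uncertain input, the Sobol index of Z for the block-averaged output falls
# as the aggregation support grows. Not the NOE model.

import numpy as np

rng = np.random.default_rng(2)
m, n_mc = 256, 4000   # sites on a 1-D transect, Monte Carlo sample size
corr_len = 10.0       # correlation length of Z, in sites (assumed)

# Exponential covariance, sampled via Cholesky ("map labelling" style draws).
dist = np.abs(np.subtract.outer(np.arange(m), np.arange(m)))
L = np.linalg.cholesky(np.exp(-dist / corr_len) + 1e-10 * np.eye(m))
Z = rng.standard_normal((n_mc, m)) @ L.T  # n_mc correlated spatial fields
W = rng.standard_normal((n_mc, 1))        # scalar uncertain input
Y = Z + W                                 # additive model, site by site

for block in (1, 8, 64, 256):
    Yb = Y[:, :block].mean(axis=1)  # output aggregated over a block of sites
    Zb = Z[:, :block].mean(axis=1)
    S_Z = Zb.var() / Yb.var()       # first-order index, exact for additive model
    print(f"block size {block:3d}: Sobol index of Z = {S_Z:.2f}")
```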
129

Efficiency and acceptability of pricing policies and transport investments in distorted economies

Westin, Jonas January 2012 (has links)
This thesis contains five papers studying the economic efficiency and political acceptability of road pricing policies and transport investments in distorted economies. Interactions between the transport market and other distorted markets, such as the labor market, can have a large impact on the welfare effect of a road pricing policy or a transport investment; many road pricing studies therefore try to incorporate effects from other distorted markets in the analysis. Paper I analyzes how the economic efficiency of a road toll in a distorted economy depends on assumptions about the initial tax system. In the road pricing literature, the welfare effect of a road toll is often found to depend on revenue use. Using a simple general equilibrium model, Paper I shows that the relative efficiency of marginal revenue recycling policies depends more on assumptions regarding inefficiencies in the initial tax system than on the road toll per se. Paper II studies the effects on welfare, equity and labor supply of a road toll in a commuting population with heterogeneous values of time and endogenous labor supply. When explicitly taking into account that commuters have different values of time, the road toll can increase total labor supply even when the revenues are not recycled back to the commuters. The analysis stresses the importance of recognizing traveler heterogeneity when analyzing congestion pricing. Road pricing policies are often characterized by conflicting interests between different stakeholders and different geographical areas. Papers III and IV study the economic efficiency and political acceptability of pricing and investment policies in different institutional and geographical settings. The main contribution of these papers is to explain how political constraints can lead to inefficient tolling strategies; they add to the existing literature on the political acceptability of road pricing by analyzing the conflict and potential trade-off between political acceptability and economic efficiency. A further difficulty when assessing the welfare effect of a future transport policy is that many of the factors and parameters needed for the analysis are uncertain. Paper V studies the climate benefit of an investment in high speed rail by calculating the magnitude of the annual traffic emission reduction required to compensate for the annualized embedded emissions from the construction of the line. The paper finds that traffic volumes of more than 10 million annual one-way trips are usually required to balance the annualized emissions from construction, and that most of the traffic diverted from other transport modes must come from aviation.
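Paper V's break-even logic reduces to simple arithmetic. In the sketch below every number is an assumption chosen for illustration; only the order of magnitude of the answer (around ten million annual one-way trips) echoes the paper's finding:

```python
# Back-of-the-envelope version of Paper V's break-even test. All inputs are
# assumptions; only the ~10 million trips order of magnitude echoes the paper.

embedded = 1.5e10  # kg CO2e embedded in construction of the line (assumed)
lifetime = 60.0    # annualisation period, years (assumed)
saving = 25.0      # kg CO2e saved per one-way trip diverted from air (assumed)

annualised = embedded / lifetime   # annualised embedded emissions, kg CO2e/yr
trips = annualised / saving        # diverted trips needed to break even
print(f"Annualised embedded emissions: {annualised / 1e6:.0f} kt CO2e per year")
print(f"Break-even diverted traffic:   {trips / 1e6:.0f} million one-way trips per year")
```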
130

[en] Optimum level of international reserves for emerging economies / [pt] Nível ótimo de reservas internacionais para economias emergentes

Salomao, Juliana Terreiro 30 September 2008 (has links)
[en] Over the past 20 years, economies have increased their stocks of international reserves at a rapid pace. Global reserves went from approximately one trillion dollars in 1990 to over five trillion dollars in 2006. This trend can also be observed in Brazil, especially over the past two years, when the stock of reserves increased from about 60 billion dollars at the end of 2005 to more than 180 billion dollars at the end of 2007. In this study, we carry out a cost-benefit analysis of international reserves, taking into account their role in mitigating both the probability of a crisis and the cost of a crisis once it happens. Our results show that higher reserves, represented by the ratio of reserves to short-term external debt, are significant in decreasing the cost and probability of a crisis. Furthermore, we find that the levels of reserves accumulated by the majority of the emerging economies analyzed are optimal for reasonable values of the cost of a crisis and the cost of holding reserves. However, the Brazilian case is an exception: the reserves accumulated over the past two years seem excessive and cannot be explained by the estimated model.
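The trade-off estimated above can be stylised as minimising expected cost over the reserve level, with both the crisis probability and the crisis loss falling in the reserves-to-short-term-debt ratio; the coefficients and cost parameters below are illustrative assumptions, not the thesis's estimates:

```python
# Stylised optimal-reserves calculation: expected cost = holding cost of
# reserves + crisis probability * crisis loss, where both the probability
# and the loss fall with the reserves / short-term debt ratio. All parameter
# values are assumptions for illustration.

import numpy as np

std_debt = 60.0   # short-term external debt, USD bn (assumed)
hold_cost = 0.02  # annual net cost of holding reserves, as a spread (assumed)
gdp = 1000.0      # GDP, USD bn (assumed)

def expected_cost(R: float) -> float:
    ratio = R / std_debt
    p_crisis = 1 / (1 + np.exp(-(1.0 - 1.2 * ratio)))   # falls with the ratio
    crisis_loss = gdp * max(0.10 - 0.02 * ratio, 0.02)  # reserves mitigate loss
    return hold_cost * R + p_crisis * crisis_loss

grid = np.arange(0.0, 300.0, 1.0)
costs = [expected_cost(R) for R in grid]
R_opt = grid[int(np.argmin(costs))]
print(f"Optimal reserves ~ USD {R_opt:.0f} bn "
      f"(reserves / short-term debt = {R_opt / std_debt:.2f})")
```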
