
A Theoretical and Practical Analysis of the Effect of Drilling Fluid on Rebar Bond Strength

Costello, Kelly 08 November 2018 (has links)
Drilled shafts are large cylindrical cast-in-place concrete structural elements that are often favored for their cost-effectiveness. These elements, however, require strict quality control during construction to ensure a stable excavation. Drilling fluid is often used in construction to attain this stability. Drilling fluid, or slurry, can be ground water or salt water, but is typically made from a mixture of water and mineral or polymer powder, forming a viscous fluid slightly denser than ground water. During concreting, the drilling fluid is displaced by the heavier concrete, which is tremie-placed at the base of the excavation from the center of the reinforcement cage. While concrete used for drilled shafts should be highly fluid, it does not follow an ideal, uniform flow. Rather, the concrete builds up inside the reinforcement cage to a sufficient height before pressing out radially into the annular cover region. This flow pattern has been shown to trap slurry around or near the steel reinforcement and affect reinforcement bond strength. Presently there are no specifications addressing slurry effects on reinforcing bar bond strength from the American Concrete Institute (ACI) or the American Association of State Highway and Transportation Officials (AASHTO). This dissertation analyzes longitudinal reinforcing bar concrete bond strength data recorded from 268 specimens constructed under tremie-placed concreting conditions in varying drilling fluids. The reinforcement used for testing was No. 8 deformed rebar. Based on the results of this analysis, the dissertation recommends adding a slurry modification factor to current bond strength and development length specifications.
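The recommended modification can be illustrated with a short sketch. It uses an ACI-style simplified tension development length expression for No. 7 and larger bars, ld = fy·ψt·ψe / (20·λ·√f'c) · db, and scales it by a slurry modification factor; the factor's value below is a hypothetical placeholder, since the abstract does not state the recommended number.

```python
import math

def development_length_in(db_in, fy_psi=60000.0, fc_psi=4000.0,
                          psi_t=1.0, psi_e=1.0, lam=1.0, k_slurry=1.0):
    """Tension development length (inches), simplified ACI-style expression
    for No. 7 and larger bars: ld = fy*psi_t*psi_e / (20*lam*sqrt(f'c)) * db,
    scaled by a hypothetical slurry modification factor k_slurry."""
    ld = fy_psi * psi_t * psi_e / (20.0 * lam * math.sqrt(fc_psi)) * db_in
    return k_slurry * ld

# No. 8 bar (db = 1.0 in), 60 ksi steel, 4 ksi normal-weight concrete
base = development_length_in(1.0)                     # length without slurry effects
modified = development_length_in(1.0, k_slurry=1.25)  # illustrative 25% increase
```

The point of the factor is that it leaves the existing development length formula untouched and simply lengthens the required embedment where bond strength is degraded by trapped slurry.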

Application of Monitored Natural Attenuation and Risk Assessment at a Chlorinated-compound Contaminated Site for Risk Management

Tsai, Wei-anne 03 August 2009 (has links)
Contamination by dense non-aqueous phase liquids (DNAPLs), e.g., trichloroethylene (TCE), in soil and groundwater has become an issue of great concern in many industrialized countries. In this study, a chlorinated-compound spill site was selected as the case study site to evaluate the possible risk to site workers and local residents posed by the contaminated soil and groundwater. The contaminants of concern at this site were TCE and 1,1-dichloroethylene (1,1-DCE). The detected concentrations of TCE and 1,1-DCE exceeded the control standards of 0.05 and 0.07 mg/L, respectively. In this study, the Risk-Based Corrective Action (RBCA) protocol developed by the American Society for Testing and Materials (ASTM) and the health and risk assessment methods for soil and groundwater contaminated sites developed by the Taiwan Environmental Protection Administration were applied for risk calculation and quantification. Monte Carlo analysis using @RISK software was applied for uncertainty analysis to calculate the cumulative risk at the 95% level of the probability distribution. Moreover, a natural attenuation model (BIOCHLOR) was used to evaluate the effectiveness of natural attenuation mechanisms on the chlorinated compounds. Results from this study show that the occurrence of natural attenuation of the chlorinated compounds was confirmed through anaerobic biodegradation processes. The calculated cumulative risk at 95% cumulative probability via the ingestion route was 2.61 × 10⁻⁵ through the Monte Carlo analysis. The calculated cumulative risks at 95% cumulative probability via the inhalation route and the ambient (outdoor) vapor inhalation diffusion channel were 1.461 × 10⁻⁵ and 2.17 × 10⁻⁶, respectively. Because the calculated risk levels were higher than the target cancer risk of 1 × 10⁻⁶ described in Taiwan's "Soil and Groundwater Remediation Act", appropriate remedial actions are required to lower the risk below the target level.
Results also show that the calculated hazard index (HI) values for the contaminated site are lower than the acceptable level (HI < 1) described in the "Soil and Groundwater Remediation Act." To meet the target cancer risk level of 1 × 10⁻⁶, TCE-contaminated groundwater needs to be remediated to below the site-specific target level (SSTL) for the confined-space inhalation exposure route, which is 6.91 × 10⁻² mg/L. Based on the results of the risk assessment, it is important for decision makers to incorporate remedial activities, including institutional controls, engineering controls, and remediation programs, derived from the RBCA results. This study provides a streamlined process and guidelines for developing a risk-based decision-making strategy for contaminated sites in Taiwan.
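The Monte Carlo step described above can be sketched without the commercial @RISK package, using only standard-library sampling. The RBCA-style intake equation is standard, but every distribution and parameter value below (concentration distribution, intake rate, body weight, and the slope factor) is an illustrative assumption, not a figure from the study.

```python
import random

def ingestion_cancer_risk(conc_mgL, intake_Lday, ef_days, ed_years,
                          bw_kg, at_days, slope_factor):
    """Lifetime excess cancer risk via groundwater ingestion:
    CDI = C*IR*EF*ED / (BW*AT); risk = CDI * SF (RBCA-style intake equation)."""
    cdi = conc_mgL * intake_Lday * ef_days * ed_years / (bw_kg * at_days)
    return cdi * slope_factor

random.seed(0)
risks = []
for _ in range(100_000):
    risks.append(ingestion_cancer_risk(
        conc_mgL=random.lognormvariate(-3.0, 0.5),   # illustrative TCE concentration
        intake_Lday=random.triangular(1.0, 3.0, 2.0),
        ef_days=350.0, ed_years=30.0,
        bw_kg=max(random.gauss(60.0, 8.0), 30.0),
        at_days=70 * 365.0,
        slope_factor=4.6e-2))                        # assumed TCE oral slope factor

risks.sort()
risk_95 = risks[int(0.95 * len(risks))]              # risk at 95% cumulative probability
```

Reading the 95th percentile off the sorted sample is exactly what "cumulative risk at 95% cumulative probability" means in the abstract; comparing `risk_95` against the 1 × 10⁻⁶ target then drives the remediation decision.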

Uncertainty analysis and inversion of geothermal conductive models using random simulation methods

Jokinen, J. (Jarkko) 31 March 2000 (has links)
Knowledge of the thermal conditions in the lithosphere is based on theoretical models of heat transfer constrained by geological and geophysical data. The present dissertation focuses on the uncertainties of calculated temperature and heat flow density results and on how they depend on the uncertainties of the thermal properties of rocks and of the relevant boundary conditions. Due to the high number of variables in typical models, the random simulation technique was chosen as the analysis tool. The random simulation technique was further applied in inverse Monte Carlo solutions of geothermal models. In addition to modelling technique development, new measurements of the thermal conductivity and diffusivity of middle and lower crustal rocks at elevated pressure and temperature were carried out. The uncertainty analysis found that a temperature uncertainty of 50 K at the Moho level, which lies at a depth of 50 km in the layered model, is produced by an uncertainty of only 0.5 W m⁻¹ K⁻¹ in thermal conductivity values or 0.2 orders of magnitude in heat production rate (mW m⁻³). Similar uncertainties are obtained in Moho temperature when the lower boundary condition varies by ±115 K in temperature (nominal value 1373 K) or ±1.7 mW m⁻² in mantle heat-flow density (nominal value 13.2 mW m⁻²). The temperature and pressure dependencies of thermal conductivity are minor in comparison to these effects. The inversion results indicated that the Monte Carlo technique is a powerful tool in geothermal modelling. When only surface heat-flow density data are used as the fitting target, temperatures at a depth of 200 km can be inverted with an uncertainty of 120-170 K. When petrological temperature-depth (pressure) data on kimberlite-hosted mantle xenoliths are also used as a fitting target, the uncertainty is reduced to 60-130 K. The inversion does not remove the ambiguity of the models completely, but it significantly reduces the uncertainty of the temperature results.
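A minimal random-simulation sketch of this kind of uncertainty propagation: sample the thermal properties, solve a steady-state conductive geotherm, and look at the spread of Moho temperatures. The layered model is collapsed here to a single uniform 50 km crust, and the conductivity and heat production distributions are assumed values (only the nominal mantle heat flow, 13.2 mW m⁻², comes from the abstract), so the numbers are illustrative only.

```python
import random

def moho_temperature(k, A, q_mantle, z=50e3, T_surf=283.0):
    """Steady-state 1-D conductive geotherm for a uniform crustal column:
    surface heat flow q0 = q_mantle + A*z, and
    T(z) = T_surf + (q0/k)*z - (A/(2*k))*z**2.
    k: conductivity (W m-1 K-1), A: heat production (W m-3),
    q_mantle: mantle heat-flow density (W m-2)."""
    q0 = q_mantle + A * z
    return T_surf + (q0 / k) * z - (A / (2.0 * k)) * z ** 2

random.seed(1)
samples = []
for _ in range(20_000):
    k = max(random.gauss(2.5, 0.5), 0.5)      # +/-0.5 W m-1 K-1, clamped physical
    A = max(random.gauss(1.0e-6, 2.0e-7), 0.0)
    q = random.gauss(13.2e-3, 1.7e-3)         # nominal 13.2 +/- 1.7 mW m-2
    samples.append(moho_temperature(k, A, q))

mean_T = sum(samples) / len(samples)
std_T = (sum((t - mean_T) ** 2 for t in samples) / len(samples)) ** 0.5
```

The standard deviation of the sampled Moho temperatures plays the role of the temperature uncertainty quoted in the abstract; an inverse Monte Carlo run would instead keep only those samples whose predicted surface heat flow fits the observations.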

Mine energy budget forecasting : the value of statistical models in predicting consumption profiles for management systems / Jean Greyling

Greyling, Jean January 2014 (has links)
The mining industry in South Africa has long been a crucial contributor to the Gross Domestic Product (GDP), with mining activity dating back to the 18th century. In 2010 the industry's direct contribution to GDP was 10%, with a further 19.8% contributed indirectly. During the last decade, global financial uncertainty pushed commodity prices to record levels: gold soared to a high of $1,900/ounce in September 2011, before declining dismally to a low of $1,200/ounce in July 2013. Executives in these markets have reacted strongly by reducing operational costs and focusing on better production efficiencies. One such cost for mining in South Africa is the operational expenditure (OPEX) associated with electrical energy, which has grown steadily on the back of higher-than-inflation price escalations. Companies in the Energy Intensive User Group (EIUG) saw energy unit prices (c/kWh) push energy's share of OPEX from 7% in 2008 to 20%. More accurate energy budget forecasting models are therefore required to predict the energy unit price escalations (c/kWh) and required units (kWh) at mines or new projects, and their impact on OPEX. Research on statistical models for energy forecasting in the mining industry indicated that the historically low unit price, and its insignificant impact on OPEX, meant that accurate forecasting had never been required, so little information is available. AngloGold Ashanti (AGA), however, approached Deloitte in 2011 to conduct a study on such a statistical model to forecast energy loads at one of its operations. The model selected for the project was Monte Carlo analysis, a rational choice given its common use in energy forecasting at process-utility level in other industries.
For evaluation purposes a second, regression-based model was selected: it is well known within the statistical fraternity and provides a high-level comparison to the Monte Carlo model. Finally, both were compared to an internal model used within AGA. Investigation of the variables that influence the energy requirement of a typical deep-level mine indicated, via a process of statistical elimination, that tonnes broken and year are the best variables for a mine energy model for conventional mining methods. Mines plan on a tonnage profile over the Life of Mine (LOM), so the variables were known for the evaluation and were used in both the Monte Carlo analysis (driven by tonnes) and the regression analysis (driven by years). The models were run out to 2040 and compared with the mine energy department's model for future evaluations, along with current actuals measured monthly. The best comparison against current actuals over the past 21 months came from the mine energy department's model, with the lowest error at 6%, against 11% for the regression model and 20% for the Monte Carlo model. Combined with the EIUG's unit price path studies for different unit-cost scenarios, this gave the Net Present Value (NPV) reduction attributable to energy under each model. A financial analysis using the Capital Asset Pricing Model (CAPM) and the Security Market Line (SML) indicated that the rate of return required by investors in AGA shares is 11.92%. Using this value, the NPV analysis showed that the mine energy model has the best (lowest) NPV impact and that the regression model was far out of line with expectations. Investors that fund large capital projects require a higher return as the risk associated with their money increases.
The models discussed in this research all work on an extrapolation principle; if investors are satisfied with a 6% error over the historical two years, notwithstanding the outlook deviations, then the work done is significant and makes a contribution. This statement is made because no clear evidence of any similar or applicable statistical model pertaining to deep-level mining could be found in the literature. Mining has been taking place since the 18th century; shallow ore resources are depleted, and most mining companies therefore look towards deeper deposits. The research indicates that, to some extent, there exists an opportunity and some rationale for predicting energy requirements in deep-level mining applications, especially considering the legislative and operational cost implications for mining houses in the South African economy and government's requirement that industry ensure sustainable work and job creation in alignment with the National Growth Path (NGP). These models should provide an energy outlook guideline, not exact values, and must be considered along with the impact on financial figures. / MBA, North-West University, Potchefstroom Campus, 2014
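The financial comparison described above can be sketched as follows. Only the 11.92% required rate of return is taken from the text; the CAPM inputs and the escalating energy cost profile are invented placeholders used to show the mechanics of discounting each model's forecast.

```python
def capm(risk_free, beta, market_premium):
    """CAPM / Security Market Line required return: r = rf + beta * (rm - rf)."""
    return risk_free + beta * market_premium

def npv(cashflows, rate):
    """Discount annual cash flows (years 1..n) at the required rate of return."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows, start=1))

rate = 0.1192  # required return for AGA investors, as reported in the text

# Hypothetical annual energy cost (a negative cash flow) escalating above
# inflation out to 2040; each forecasting model would supply its own profile.
energy_costs = [-120e6 * 1.08 ** t for t in range(26)]
energy_npv_impact = npv(energy_costs, rate)  # more negative = larger NPV reduction
```

Running `npv` on the cost profile produced by each of the three models, all at the same CAPM-derived discount rate, reproduces the comparison the study uses to rank them.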

Finite Element based Parametric Studies of a Truck Cab subjected to the Swedish Pendulum Test

Engström, Henrik, Raine, Jens January 2007 (has links)
Scania has a policy of attaining a high crashworthiness standard, and its trucks have to conform to Swedish cab safety standards. The main objective of this thesis is to clarify which parameter variations, present during the second part of the Swedish cab crashworthiness test on a Scania R-series cab, have a significant effect on the intrusion response. An LS-DYNA FE-model of the test case is analysed, with parameter variations introduced through the probabilistic analysis tool LS-OPT.

Examples of analysed variations are sheet thickness and material variations, such as the stress-strain curves of the structural components, but variations in the test setup, such as the pendulum velocity and angle of approach on impact, are also taken into account. The effect of including component forming in the analysis is investigated, with the material parameter variations implemented prior to forming. An additional objective is to analyse the influence of simulation- and model-dependent variations and weigh their effect on intrusion against the physical variations stated above.

A submodel is created to speed up the simulations, since the numerous parameter variations yield a large number of different designs, resulting in multiple analyses.

Important structural component sensitivities are taken from the results and should be used as a pointer to where to focus attention when trying to increase the robustness of the cab. The results also show that the placement of the pendulum in the y direction (sideways, seen from the driver's perspective) is the most significant physical parameter variation in the Swedish pendulum test. It is concluded that, to achieve a fair comparison of structural performance from repeated crash testing, this pendulum variation must be kept to a minimum.

Simulation- and model-dependent parameters in general showed large effects on the intrusion. It is concluded that further investigations of individual simulation- or model-dependent parameters should be performed to establish which description to use.

Mapping material effects from the forming simulation into the crash model gave a slightly stiffer response compared to the mean pre-stretch approximations currently used by Scania. This is nevertheless a significant result, considering that Scania's approximations also included bake-hardening effects from the painting process.
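The probabilistic workflow in the abstract — sample the parameter variations, run the FE model, and rank the sensitivities — can be sketched with a toy surrogate standing in for the LS-DYNA model. The response function, the variation magnitudes, and the parameter names are all invented for illustration; only the finding that pendulum y placement dominates is mirrored here.

```python
import random

random.seed(3)

def intrusion_mm(t_sheet, v_pend, y_offset):
    """Surrogate for the FE intrusion response (illustrative only): thinner
    sheets, faster pendulum, and larger pendulum y-offset all raise intrusion."""
    return (120.0 - 40.0 * (t_sheet - 1.5) + 2.0 * (v_pend - 8.0)
            + 1.5 * y_offset + random.gauss(0.0, 2.0))

# Sample the design space as a probabilistic tool like LS-OPT would.
runs = []
for _ in range(200):
    t = random.gauss(1.5, 0.05)      # sheet thickness (mm)
    v = random.gauss(8.0, 0.1)       # pendulum velocity (m/s)
    y = random.uniform(-50.0, 50.0)  # pendulum y placement (mm)
    runs.append((t, v, y, intrusion_mm(t, v, y)))

def corr(xs, ys):
    """Pearson correlation as a simple linear sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs) ** 0.5
    syy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sxy / (sxx * syy)

resp = [r[3] for r in runs]
sens = {name: abs(corr([r[i] for r in runs], resp))
        for i, name in enumerate(["thickness", "velocity", "y_offset"])}
```

Ranking the entries of `sens` is the one-number-per-parameter summary that tells the designer where robustness effort pays off; in a real study the correlation would be replaced by the tool's metamodel-based sensitivities.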

A Case Study of Risk Management for a Groundwater Site Contaminated by Organic Chlorine Solvents

Lin, Sang-Feng 02 January 2012 (has links)
Soil and groundwater pollution in Taiwan has increased recently due to the leakage of petroleum products and organic chemicals. Because Taiwan is small and densely populated, such pollution can reach people through pathways such as water, soil, air, crops, and fish, affecting human health and posing risk directly or indirectly. This study focuses on chlorinated compounds at domestic sites and uses risk assessment methods to analyse the necessity and urgency of remediating a contaminated site and its downstream influence, and then, based on the results, formulates management strategies and projects to control the site's risk level and impact. Site-related information was collected and, following the health and risk assessment methods for soil and groundwater contaminated sites and the CRYSTAL BALL and BIOCHLOR software adopted by the Environmental Protection Bureau, the site, contaminated with the chlorinated compound trichloroethylene (TCE), was evaluated to determine whether it affects the health of nearby residents; hydrogeological survey data from the site were used to carry out a second-tier health risk assessment. First, the commercial software CRYSTAL BALL was used to perform uncertainty and sensitivity analyses. These analyses not only probe how variation across the full parameter set affects the risk value, but also repeat the analysis for different parameter combinations. The results confirm that the parameters with the largest effect on risk are those governing pollutant transport, consistent with previous assessment studies. Other parameters, such as receptor age, population group, exposure route, exposure time, and water consumption, have less influence on the variation in risk. The study finds that the TCE concentration, and with it the risk value, changes as groundwater moves downstream; the receptor's distance from the contamination source therefore has a large influence. A greater distance allows more attenuation of the TCE plume and results in a lower cancer risk for the receptor; conversely, a shorter distance results in a higher cancer risk.
Subsequently, the BIOCHLOR screening software of the U.S. Environmental Protection Agency was used to determine whether the site has potential for anaerobic reductive dechlorination and to estimate the possibility of natural attenuation. The actual site conditions score only three points in the weighted total, which, in the absence of more accurate data at this stage, does not demonstrate that the chlorinated compounds can biodegrade in the anaerobic environment; other remediation approaches are therefore recommended. Based on the risk assessment results, the more important sensitive parameters were selected for the site, and a remediation approach suitable for this case was worked out against the project schedule. The investigation found that residents do extract groundwater for agricultural irrigation, but do not drink it directly. To avoid the worst case from a risk standpoint, both regulatory and technical measures are planned. First, as administrative control on the regulatory side, ways of effectively prohibiting residents from extracting groundwater are considered, for instance prohibiting or limiting the installation of wells, providing substitute drinking water, posting notice signs, and monitoring groundwater quality regularly. Second, to prevent the contaminant plume from spreading, technical measures are adopted, for example remediation technologies (including soil gas extraction and air injection) and containment technologies (including hydraulic containment, permeable reactive barriers, and natural biological remediation) to manage possible events at the site effectively and to prepare precautions in advance.
Good communication with local residents is also maintained, so that residents understand the remediation work and their resistance is reduced, helping the remediation progress and achieving the risk management objective.

Computation reduction for statistical analysis of the effect of nano-CMOS variability on integrated circuits

Xie, Zheng January 2012 (has links)
The intrinsic atomistic variability of nano-scale integrated circuit (IC) technology must be taken into account when analysing circuit designs to predict likely yield. These ‘atomistic’ variabilities are random in nature and are so great that new circuit analysis techniques are needed which adopt a statistical treatment of the variability of device performances. Monte Carlo (MC) based statistical techniques aim to do this by analysing many randomized copies of the circuit. The randomization can take into account correlation between parameters due to both intra-die and inter-die effects. A major problem is the computational cost of carrying out sufficient analyses to produce statistically reliable results. The use of principal components analysis (PCA) and ‘Statistical Behavioural Circuit Blocks (SBCB)’ is investigated as a means of reducing the dimensionality of the analysis, and this is combined with an implementation of ‘Statistical Blockade (SB)’ to achieve significant reduction in the computational costs. The purpose of SBCBs is to model the most important aspects of the device’s or circuit building block’s behaviour, to an acceptable accuracy, with a relatively small number of parameters. The SB algorithm applies Extreme Value Theory (EVT) to circuit analysis by eliminating randomised parameter vectors that are considered unlikely to produce ‘rare event’ circuits. These circuits are needed for circuit yield failure predictions and occur on the ‘tails’ of Gaussian-like probability distributions for circuit performances. Versions of the circuit analysis program ‘SPICE’ with a Python harness called RandomSPICE are used to produce SBCBs by generating and statistically analysing randomized transistor-level versions of the sub-blocks for which behavioural models are required. The statistical analysis of circuits employing these sub-blocks is achieved by a new MATLAB harness called RandomLA. 
The computational time savings that may be achieved are illustrated by the statistical analysis of representative circuits. A computation time reduction of 98.7% is achieved for a commonly used asynchronous circuit element. Quasi-Monte Carlo (QMC) analysis with low-discrepancy sequences (LDS) is introduced for further computation reduction. QMC analysis using SBCB behavioural models with SB is evaluated by applying it to more complex examples and comparing the results with those of transistor-level simulations. The analysis of SRAM arrays is taken as a case study for VLSI circuits containing up to 1536 transistors, modelled with parameters appropriate to 35 nm technology. Significantly faster statistical analysis is shown to be possible when the aim is to obtain predictions of fabrication yield. Savings of up to 99.85% in computation time were obtained with larger circuits.
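The Statistical Blockade idea — train a cheap classifier on a small pilot run, then block samples unlikely to land in the distribution tail so that only candidate rare events reach full simulation — can be sketched as follows. The `simulate` surrogate stands in for an expensive SPICE run and is purely illustrative, as is the single mismatch parameter.

```python
import random

random.seed(7)

def simulate(mismatch_mv):
    """Stand-in for an expensive SPICE run: a delay metric that degrades
    with device mismatch (illustrative surrogate, not a real circuit)."""
    return 100.0 + 4.0 * abs(mismatch_mv) + random.gauss(0.0, 1.0)

# Stage 1: small pilot Monte Carlo run to train a cheap blockade filter.
pilot = []
for _ in range(500):
    x = random.gauss(0.0, 10.0)
    pilot.append((x, simulate(x)))
pilot.sort(key=lambda p: p[1])
tail_cut = pilot[int(0.99 * len(pilot))][1]            # 99th-percentile response
tail_inputs = [abs(x) for x, y in pilot if y >= tail_cut]
# Relax the classifier threshold slightly so borderline points are not blocked.
threshold = 0.97 * min(tail_inputs)

# Stage 2: large run in which the filter blocks likely-typical samples,
# so only rare-event candidates are simulated in full.
n_total, simulated, tail_delays = 50_000, 0, []
for _ in range(n_total):
    x = random.gauss(0.0, 10.0)
    if abs(x) >= threshold:
        simulated += 1
        tail_delays.append(simulate(x))

fraction_simulated = simulated / n_total  # most samples never reach "SPICE"
```

The computational saving is exactly `1 - fraction_simulated`: blocked parameter vectors cost only a classifier evaluation, while the retained ones still populate the Gaussian-like tail needed for yield-failure prediction.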

Monte Carlo analysis of scattered radiation in time-of-flight positron emission tomography

Muzic, Raymond Frank, Jr. January 1991 (has links)
No description available.
