  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Quantifying the Seismic Response of Underground Structures via Seismic Full Waveform Inversion: Experiences from Case Studies and Synthetic Benchmarks

Zhang, Fengjiao January 2013 (has links)
Seismic full waveform inversion (waveform tomography) is a method for reconstructing the underground velocity field at high resolution from seismic data. The method was first introduced during the 1980s and became computationally feasible during the late 1990s, when it was implemented in the frequency domain. This work presents three case studies and one synthetic benchmark of full waveform inversion applications. Two of the case studies focus on time-lapse cross-well and 2D reflection seismic data sets acquired at the Ketzin CO2 geological storage site, as part of the CO2SINK and CO2MAN projects. The results show that waveform tomography is more effective than traveltime tomography for monitoring the CO2 injection at the Ketzin site with the cross-well geometry. For the surface data sets we find it difficult to recover the true value of the velocity anomaly caused by the injection using waveform inversion, but it is possible to qualitatively locate the distribution of the injected CO2. The results agree well with expectations based on conventional 2D CDP processing and more extensive 3D CDP processing in the area. A further investigation studied the feasibility and efficiency of seismic full waveform inversion for time-lapse monitoring of onshore CO2 geological storage sites, using a reflection seismic geometry with synthetic data sets. The results show that waveform inversion may be a good complement to standard CDP processing when monitoring CO2 injection; the choice of method and inversion strategy depends strongly on the goals of the time-lapse monitoring. The last case study applies the full waveform inversion method to two crooked profiles at the Forsmark site in east-central Sweden. The main goal of this study was to help determine whether the observed reflections are due mainly to fluid-filled fracture zones or to mafic sills.
One main difficulty here is that the profiles have a crooked-line geometry, which corresponds to a 3D seismic geometry, while a 2D inversion method is being used. This is partly handled by a 3D-to-2D coordinate projection method from traveltime inversion. The results show that these reflections are primarily due to zones of lower velocity, consistent with their being generated at water-filled fracture zones.
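The principle behind full waveform inversion can be illustrated, in grossly simplified form, as minimising a least-squares misfit between observed and modelled waveforms. The toy model below (a single reflector, a Gaussian arrival, and a scan over one constant velocity) is a hypothetical sketch, not the frequency-domain method used in the thesis, which inverts a full velocity field using adjoint-state gradients.

```python
import numpy as np

def forward(v, depth=1000.0, t=np.linspace(0.0, 2.0, 801)):
    """Toy forward model: a Gaussian pulse centred on the two-way
    travel time of a single reflector at `depth` metres."""
    t0 = 2.0 * depth / v  # two-way travel time (s)
    return np.exp(-((t - t0) ** 2) / (2.0 * 0.05 ** 2))

def misfit(v, d_obs):
    """Least-squares data misfit, the objective FWI minimises."""
    return 0.5 * np.sum((forward(v) - d_obs) ** 2)

# "Observed" data generated with a true velocity of 2500 m/s.
d_obs = forward(2500.0)

# Recover the velocity by scanning candidates; real FWI replaces this
# scan with gradient (adjoint-state) updates of a gridded model.
candidates = np.linspace(2000.0, 3000.0, 501)
v_best = candidates[np.argmin([misfit(v, d_obs) for v in candidates])]
```

With an accurate starting model the misfit has a clear minimum at the true velocity; cycle skipping, the classic FWI pitfall, appears when the predicted and observed arrivals are more than half a period apart.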

An Experimental Investigation of the Fire Characteristics of the University of Waterloo Burn House Structure

Klinck, Amanda January 2006 (has links)
This thesis reports on the procedure, results and analysis of four full scale fire tests that were performed at the University of Waterloo's Live Fire Research Facility. The purpose of these tests was to investigate the thermal characteristics of one room of the Burn House structure. Comparisons were made of Burn House experimental data to previous residential fire studies undertaken by researchers from the University of Waterloo. This analysis showed similarities in growth rate characteristics, illustrating that fire behaviour in the Burn House is typical of residential structure fire behaviour. The Burn House experimental data was also compared to predictions from a fire model, CFAST. Recommendations were made for future work in relation to further investigation of the fire characteristics of the Burn House.
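The growth-rate characteristics mentioned above are commonly summarised with the standard t-squared design fire, Q(t) = αt². The snippet below is a generic illustration using the widely tabulated NFPA growth coefficients, not values measured in the Burn House tests.

```python
# t-squared design fire: heat release rate Q(t) = alpha * t**2 (kW),
# with alpha from the commonly tabulated NFPA growth categories
# (each reaches ~1055 kW at 600, 300, 150 and 75 s respectively).
ALPHA = {
    "slow": 0.00293,      # kW/s^2
    "medium": 0.01172,
    "fast": 0.0469,
    "ultrafast": 0.1876,
}

def hrr(t_seconds, category="medium"):
    """Heat release rate (kW) at time t for a t-squared design fire."""
    return ALPHA[category] * t_seconds ** 2

def time_to(q_kw, category="medium"):
    """Time (s) for the design fire to reach a target heat release rate."""
    return (q_kw / ALPHA[category]) ** 0.5

# A medium-growth fire reaches about 1055 kW (~1 MW) at 300 s.
q_300 = hrr(300.0)
```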

INFORMATION THEORETIC CRITERIA FOR IMAGE QUALITY ASSESSMENT BASED ON NATURAL SCENE STATISTICS

Zhang, Di January 2009 (has links)
Measurement of visual quality is crucial for various image and video processing applications, including image acquisition, media transmission, video compression, and image/video restoration. The goal of image quality assessment (QA) is to develop a computable quality metric that properly evaluates image quality. The primary criterion is consistency with human judgment; computational complexity and resource limitations are also concerns in a successful QA design. Many methods have been proposed. Initially, quality measurements were taken directly from simple distance measures of mathematical signal fidelity, such as mean squared error or Minkowski distance. Later, QA was extended to colour spaces and the Fourier domain, in which images are better represented, and some existing methods also consider the adaptive ability of human vision. Unfortunately, the Video Quality Experts Group found that none of the more sophisticated metrics showed any great advantage over other existing metrics. This thesis proposes a general approach to the QA problem based on evaluating image information entropy. An information-theoretic model of the human visual system is proposed, and an information-theoretic solution is presented to derive the proper settings. The quality metric is validated against five subjective databases from different research labs, and the key requirements for a successful quality metric are investigated. In testing, our quality metric exhibits excellent consistency with human judgments and compatibility across the different databases. In addition to the full-reference quality metric, blind quality assessment metrics are also proposed. To predict quality without a reference image, two concepts are introduced that quantitatively describe inter-scale dependency within a multi-resolution framework.
Building on the success of the full-reference quality metric, several blind quality metrics are proposed for the five types of distortion in the subjective databases. Our blind metrics outperform existing blind metrics and are also able to handle some distortions that had not previously been investigated.
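The simple fidelity measures named above (mean squared error, Minkowski distance) and the histogram entropy underlying an information-theoretic view can be sketched as follows. This is a generic illustration of those textbook quantities, not the metric developed in the thesis.

```python
import numpy as np

def mse(x, y):
    """Mean squared error, the classic signal-fidelity measure."""
    return float(np.mean((x.astype(float) - y.astype(float)) ** 2))

def minkowski(x, y, p=3):
    """Minkowski distance of order p between two images."""
    return float(np.mean(np.abs(x.astype(float) - y.astype(float)) ** p) ** (1.0 / p))

def histogram_entropy(img, bins=256):
    """Shannon entropy (bits) of the grey-level histogram, a crude
    stand-in for image information content."""
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Synthetic reference and degraded images for demonstration.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64))
degraded = np.clip(ref + rng.normal(0.0, 10.0, ref.shape), 0, 255).astype(int)
```

As the thesis notes, such distance measures correlate poorly with perceived quality, which is what motivates perceptual and information-theoretic alternatives.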

Performance Analysis and Implementation of Full Adder Cells Using 0.18 um CMOS Technology

Tesanovic, Goran January 2003 (has links)
0.18 µm CMOS technology is increasingly used in the design and implementation of full adder cells. Hence, there is a need for a better understanding of the effects of different cell designs on cell performance, including power dissipation and time delay. This thesis contributes to a better understanding of the behaviour of single-bit full adder cells when low power-delay products are essential. Thirty-one single-bit full adder cells were implemented in the Cadence tool suite and simulated using 0.18 µm CMOS technology to obtain a comprehensive study of cell performance with respect to time (time delay) and power consumption (power dissipation). The simulation method used for the performance measurements was carefully devised to achieve measurements as accurate as possible for both quantities. The method combines a simple measurement technique for obtaining accurate time delays and power dissipation of a cell with a transistor-resizing technique that allows systematic resizing of transistors to achieve a minimal power-delay product. The original transistor-sizing technique is extended in this thesis, for the purpose of the performance measurements, to include both resizing the transistors in the critical path and resizing transistors at the global level, thereby efficiently obtaining the minimal power-delay product for every cell. The result of this performance study is extensive knowledge of full adder cell behaviour with respect to time and power, including the limitations of 0.18 µm CMOS technology when used for full adder cells. Furthermore, the study identified the full adder cell designs that demonstrated the best power-delay products.
In general, the combined performance simulation method in this thesis, which couples time-delay simulation with critical-path transistor resizing, provides the most accurate measurements and as such can be used in future performance analyses of single-bit full adder cells.

Mould Resistance of Full Scale Wood Frame Wall Assemblies

Black, Christopher January 2006 (has links)
The primary objective of this study was to investigate the mould growth resistance of different types of wood products, including the sheathing and framing within full scale wall assemblies. Secondary objectives were to investigate the difference in mould growth resistance between borate-treated and untreated wood products, and to provide information about mould growth under different temperature and humidity conditions for treated and untreated products.

The broader aim is to better understand mould growth and to examine the effects of varying high-moisture conditions on wood products and the mould growth that may result. Importantly, this is examined on full scale wall assemblies; to date, mould growth studies have only been performed in the laboratory on small material samples. This study recreates the conditions that evidently cause mould growth on full scale wall assemblies. Tests were performed in a climate chamber on three full scale wall assemblies. The original scope included an examination of both the sheathing and framing components; however, this study focuses mainly on the sheathing.

The results indicate that the relative humidity conditions needed for mould growth on wood are higher than originally believed (i.e., significantly greater than 80% RH). During the first eight weeks of the first test, the relative humidity at the surface of the sheathing was held constant at 95%, and little mould growth was observed on the untreated sheathing (mould growth index of 3 or less), with little or no mould growth on the treated sheathing (mould growth index of 1 or less). The second and third tests demonstrated that the presence of liquid water greatly accelerated the time to germination, the amount of mould growth (up to a mould growth index of 6), and the rate of mould growth.

All three tests clearly showed that borate treatment reduced the amount of mould growth; however, the concentration of the borate treatment and the types of materials treated affect the resistance to mould growth. There was also some evidence to suggest that borate treatment of the plywood increased the time to germination significantly, from a few weeks to 16 weeks in this study, but once mould growth was initiated, its rate was similar to that of the untreated plywood. Two mathematical models of mould growth were examined: Viitanen's model and WUFI-Bio (Sedlbauer). Viitanen's model predicted the time to germination and the growth rate well for untreated plywood, while WUFI-Bio predicted the time to germination but not the growth rate. Both models were found to err on the side of caution in predicting mould growth.

Recommendations include improvements to the test method and procedures, and suggestions for future work.
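Viitanen's model mentioned above estimates the time to mould germination from temperature and relative humidity. The coefficients below are the commonly cited Hukka and Viitanen (1999) regression values for pine and spruce sapwood, reproduced here purely as an illustration; they are assumptions of this sketch, not parameters taken from this study.

```python
import math

def germination_time_weeks(T_celsius, RH_percent, W=0, SQ=0):
    """Estimated time to mould germination (weeks) on wooden material.
    W: wood species (0 = pine, 1 = spruce); SQ: surface quality
    (0 = resawn/kiln-dried, 1 = original timber surface).
    Regression after Hukka & Viitanen (1999); illustrative only."""
    return math.exp(
        -0.68 * math.log(T_celsius)
        - 13.9 * math.log(RH_percent)
        + 0.14 * W
        - 0.33 * SQ
        + 66.02
    )

# At 20 degC and 95 % RH the regression predicts roughly two weeks
# to germination; higher humidity shortens the predicted time.
t_95 = germination_time_weeks(20.0, 95.0)
```

The observed behaviour in this study (little growth after eight weeks at 95% RH) is consistent with the abstract's conclusion that such models err on the side of caution.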

Effects of a New Accounting Model at Swedish Universities and University Colleges: A Multiple-Case Study

Holm, Maj-Len January 2010 (has links)
The accounting of indirect costs at universities and university colleges has been criticised by the Swedish National Audit Office (Riksrevisionen) and by external funders for many years. In November 2007, the Association of Swedish Higher Education Institutions (SUHF) recommended that the country's universities and university colleges introduce a new accounting model that SUHF had developed, and several institutions have since adopted it. But how has it worked out? The purpose of this study was to find out, and to describe the effects of the model, particularly with regard to internal governance and control. The study was conducted as a multiple-case study, in the form of an evaluation at six higher education institutions. Data were collected through telephone interviews with financial officers at the central administrative level. Management control instruments and positive theory, in particular institutional theory, as well as earlier research and reports on full-cost models and strategic management accounting, served as a theoretical lens in the analysis of the answers. The results show that the institutions have obtained a more transparent and accurate accounting of indirect costs, which was one of the aims of the model. The implementation phase entailed extra work for administrative staff, difficulties in understanding the model, especially among researchers, and also some resistance. No change in external funders' willingness to support projects has been noted. The model has become a new management control instrument, above all for heads of department, who have had to take greater financial responsibility than before. No effect of the model on the university leadership's strategic governance and control has yet been observed.

Evaluation of the Performance of Bridge Steel Pedestals under Low Seismic Loads

Hite, Monique C. 09 April 2007 (has links)
Many bridges are damaged by collisions from over-height vehicles, with significant impact on the transportation network. To reduce the likelihood of such impacts, steel pedestals have been used as a cost-effective, efficient means of increasing bridge clearance heights. However, the steel pedestals installed on more than 50 bridges in Georgia have been designed with no consideration of seismic loads and may behave similarly to high-type steel bearings. Past earthquakes have revealed the susceptibility of high-type bearings to damage, resulting in the collapse of several bridges. Although Georgia lies in a region of low-to-moderate seismicity, earthquake design loads for steel pedestals should not be ignored. In this study, the potential vulnerabilities of steel pedestals, which have limited strength and deformation capacity and lack adequate anchor-bolt connection details, are assessed experimentally and analytically. Full-scale reversed cyclic quasi-static tests are conducted on a 40' bridge specimen rehabilitated with 19" and 33" steel pedestals to determine the modes of deformation and the mechanisms that can lead to failure. The inelastic force-deformation hysteretic behavior of the steel pedestals obtained from the experimental tests is used to calibrate an analytical bridge model developed in OpenSees. The analytical model is idealized from a multi-span continuous bridge in Georgia that has been rehabilitated with steel pedestals, and it is subjected to a suite of ground motions to evaluate the performance of the steel pedestals and the overall bridge system. Recommendations are made to the Georgia Department of Transportation (GDOT) for the design and construction of steel pedestals.
The results of this research are useful for Georgia and other states in low-to-moderate seismic zones that are considering steel pedestals to elevate bridges and thereby reduce the likelihood of over-height vehicle collisions.
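The inelastic force-deformation hysteresis used to calibrate such a model can be illustrated with the simplest possible rule, an elastic-perfectly-plastic spring. This sketch is generic (stiffness and yield values are made up) and is not the calibrated OpenSees pedestal model from the study.

```python
def epp_response(displacements, k=10.0, fy=25.0):
    """Force history of an elastic-perfectly-plastic spring under an
    imposed displacement history: elastic stiffness k, yield force fy."""
    force, u_prev, forces = 0.0, 0.0, []
    for u in displacements:
        trial = force + k * (u - u_prev)   # elastic predictor
        force = max(-fy, min(fy, trial))   # plastic corrector: cap at +/- fy
        u_prev = u
        forces.append(force)
    return forces

# One reversed cycle: push past yield, unload, push the other way.
history = [0.0, 1.0, 2.0, 4.0, 2.0, 0.0, -4.0, 0.0]
f = epp_response(history)
```

Plotting force against displacement for such a history traces the familiar hysteresis loop whose enclosed area is the energy dissipated per cycle.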

Development of BEMS Diagnostic and Intelligent Expert Technology for Air-conditioning Systems

Dai, Chi-fu 11 June 2012 (has links)
When central HVAC systems are in commercial operation, the operational parameters, including chilled water supply temperature, return water temperature, chilled water flow rate, and power consumption, are the key factors affecting system energy efficiency. In Taiwan, however, regression equations for chillers under local weather conditions are still lacking, and operation has to rely on manual, experience-based adjustment. This is also a major shortcoming in implementing testing, adjusting and balancing (TAB), and addressing it is a promising way to renovate green buildings and make them more intelligent. In this study, theoretical analysis and experimental investigation are applied simultaneously. For cases selected from the BeeUp program, actual operational data, including COP, are fitted to an experimental model to facilitate TAB engineering via the BEMS and thereby improve system efficiency. Through the execution of this project, it was found that exhibition buildings with a 10-hour cooling load can achieve 7% energy savings. The results obtained in this project, including those for the thermal energy storage HVAC system and the heat pump system, can be widely adopted to obtain significant energy conservation.
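The chiller COP mentioned above follows directly from the listed parameters: cooling capacity computed from chilled-water flow and the return/supply temperature difference, divided by electric power input. The sketch below uses illustrative values, not data from the project.

```python
def cooling_capacity_kw(flow_kg_s, t_return_c, t_supply_c, cp=4.186):
    """Chilled-water cooling capacity (kW): Q = m_dot * cp * (T_return - T_supply),
    with cp the specific heat of water in kJ/(kg K)."""
    return flow_kg_s * cp * (t_return_c - t_supply_c)

def chiller_cop(flow_kg_s, t_return_c, t_supply_c, power_kw):
    """Coefficient of performance = cooling delivered / electric power input."""
    return cooling_capacity_kw(flow_kg_s, t_return_c, t_supply_c) / power_kw

# Example: 20 kg/s of chilled water warmed from 7 degC supply to
# 12 degC return, with 80 kW of compressor power.
q = cooling_capacity_kw(20.0, 12.0, 7.0)
cop = chiller_cop(20.0, 12.0, 7.0, 80.0)
```

Fitting measured COP against such operating parameters is what yields the regression equations the abstract says are missing for local weather conditions.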
