
The Impulse-Radiating Antenna

Rosenlind, Johanna. January 2009.
As interest in intentional electromagnetic interference (IEMI) increases, so does the need for a suitable antenna that can endure these demanding conditions. Ultrawideband (UWB) technology provides an elegant way of generating high-voltage UWB pulses which can be used for IEMI. One UWB antenna, invented solely for the purpose of radiating pulses, is the impulse-radiating antenna (IRA). In this master's thesis, a suitable IRA geometry is proposed and modelled for a high-voltage application of 90 kV.

Micropumps for extreme pressures

Svensson, Stefan. January 2009.
The objective of this thesis was to improve a paraffin-actuated micropump design to enable pumping against extreme pressures (above 100 bar). This was accomplished by first studying the membrane activation using video capture. The micropump has been improved to withstand pressures high enough to enable use in a high-performance liquid chromatography (HPLC) system. It has been shown to pump against back pressures of up to 150 bar with a positive net flow, compared with the previously recorded maximum back pressure of 50 bar. Pumping against high back pressures was made possible by an increased understanding of the sealing of the membranes, which resulted in a new design that was manufactured and characterised. Without clamping, the pump managed back pressures of 10 bar before starting to leak at a bond in the flow channel. With supporting clamping, the manageable back pressure increased tenfold. When the individual valves were measured, they were able to hold pressures above 200 bar. Although the valves were below their maximum limit, the pressure could not be increased further due to a limitation in the equipment, i.e. the risk of damaging the connections. When examined after being pressurised at extreme pressures (above 100 bar) several times, the membrane showed no signs of fatigue or damage. A new behaviour of the valves was also discovered: above certain pressures, some designs self-sealed, i.e. they held the pressure after the voltage was turned off. For these valves the pressure had to be released by other means.

Simulation of Reactor Transient and Design Criteria of Sodium-cooled Fast Reactors

Gottfridsson, Filip. January 2010.
The need for energy is growing in the world and the market for nuclear power is once more expanding. Some issues of the current light-water reactors can be solved by the next generation of nuclear power, Generation IV, where sodium-cooled reactors are one of the candidates. Phénix was a French prototype sodium-cooled reactor that is generally regarded as a success. It did, however, encounter a previously unobserved phenomenon, A.U.R.N., in which a negative reactivity transient followed by oscillating behavior forced an automatic emergency shutdown of the reactor. This phenomenon led to considerable downtime and remains unexplained. The most probable cause of the transients is radial movement of the core, referred to as core-flowering.

This study has investigated the available documentation of the A.U.R.N. events. A simplified model of core-flowering was also created in order to simulate how radial expansion affects the reactivity of a sodium-cooled core. Serpent, a Monte Carlo-based simulation code, was chosen as the calculation tool. A model of the Phénix core was successfully created and partly validated; it has k_eff = 1.00298 and a neutron flux of (8.43 ± 0.02) × 10^15 neutrons/cm² at the normal state. The simulations show that an expansion of the core radius decreases the reactivity. A linear approximation of the results gave the relation Δk_eff/Δr ≈ −60 pcm/mm, where Δr is the radial core expansion. This value corresponds remarkably well to the roughly −60 pcm/mm obtained from the dedicated core-flowering experiments performed in Phénix by the CEA.

Core-flowering can recreate signals similar to those registered during the A.U.R.N. events, although the absence of any trace of core movement in Phénix speaks against this explanation. However, if core-flowering is the sought answer, it can be avoided by design. The equipment that registered the A.U.R.N. events has proved to be insensitive to noise, yet the high amplitude of the transients and their rapidity have led some researchers to believe that the events are a combination of interference in the equipment of Phénix and a mechanical phenomenon. Regardless, the origin of A.U.R.N. appears to be bound to some parameter specific to Phénix, since the transients have occurred only in this reactor. A safety analysis made by an expert committee appointed by the CEA showed that the A.U.R.N. events are not a threat to the safety of Phénix. However, the origin of these negative transients must be found before construction of a commercial-scale sodium-cooled fast reactor can begin; further research is therefore needed.
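As a quick illustration of the reported core-flowering coefficient, the sketch below applies the linear approximation from the abstract to a few radial expansions; the nominal k_eff and the −60 pcm/mm slope are the thesis values, while the expansion steps are hypothetical examples.

```python
# Linear core-flowering model: a uniform radial expansion of the core
# reduces reactivity by roughly 60 pcm per millimetre (1 pcm = 1e-5 in k_eff).

K_EFF_NOMINAL = 1.00298      # k_eff of the Serpent model at the normal state
SLOPE_PCM_PER_MM = -60.0     # reported reactivity change per mm of expansion

def k_eff_after_expansion(dr_mm: float) -> float:
    """Estimated k_eff after a uniform radial core expansion of dr_mm millimetres."""
    return K_EFF_NOMINAL + SLOPE_PCM_PER_MM * dr_mm * 1e-5

for dr_mm in (0.0, 0.5, 1.0, 2.0):   # hypothetical expansions
    print(f"expansion {dr_mm:3.1f} mm -> k_eff = {k_eff_after_expansion(dr_mm):.5f}")
```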

Coal and Oil: The Dark Monarchs of Global Energy: Understanding Supply and Extraction Patterns and their Importance for Future Production

Höök, Mikael. January 2010.
The formation of modern society has been dominated by coal and oil, and together these two fossil fuels account for nearly two-thirds of all primary energy used by mankind. This makes their future production a key question for social development, and this thesis attempts to answer whether it is possible to rely on an assumption of ever-increasing production of coal and oil.

Both coal and oil are finite resources, created over long time scales by geological processes, so it is impossible to extract more fossil fuels than are geologically available. In other words, there are limits to growth imposed by nature. The concept of depletion and exhaustion of recoverable resources is fundamental to the future extraction of coal and oil. Historical experience shows that peaking is a well-established phenomenon in the production of various natural resources. Coal and oil are no exceptions, and historical data show that easily exploitable resources are exhausted while more challenging deposits are left for the future. For oil, depletion can also be tied directly to the physical laws governing fluid flow in reservoirs. Understanding and predicting the behaviour of individual fields, in particular giant fields, is essential for understanding future production. Based on comprehensive databases of reserve and production data for hundreds of oilfields, typical patterns were found. Alternatively, depletion can manifest itself indirectly through various mechanisms; this has been studied for coal.

Over 60% of global crude oil production is derived from only around 330 giant oilfields, many of which are becoming increasingly mature. The annual decline in existing oil production has been determined to be around 6%, and it is unrealistic that this will be offset by new field developments, additional discoveries or unconventional oil. This implies that the peak of the oil age is here. For coal a similar picture emerges, where 90% of global coal production originates from only six countries. Some of them, such as the USA, show signs of increasing maturity and exhaustion of the recoverable amounts. However, there is greater uncertainty about the recoverable reserves, and coal production may reach a global maximum somewhere between 2030 and 2060. This analysis shows that the global production peaks of both oil and coal can be expected comparatively soon, with significant consequences for the global energy supply, society, economy and environment. The results of this thesis indicate that these challenges should not be taken lightly.
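To make the quoted ~6% annual decline concrete, here is a minimal sketch of compound decline applied to existing production; the 6% rate comes from the abstract, while the baseline production figure is a hypothetical example.

```python
# Compound annual decline of production from existing oil fields.
# The ~6% rate is quoted in the abstract; the 70 Mb/d baseline is illustrative.

DECLINE_RATE = 0.06     # annual decline of production from existing fields
baseline_mbpd = 70.0    # assumed production from existing fields, Mb/d

for year in (0, 5, 10):
    remaining = baseline_mbpd * (1.0 - DECLINE_RATE) ** year
    print(f"after {year:2d} years: {remaining:5.1f} Mb/d of existing production left")
```

At this rate roughly half of existing production disappears within about a decade, which is why the abstract argues it cannot realistically be offset by new field developments alone.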

Depletion and decline curve analysis in crude oil production

Höök, Mikael. January 2009.
Oil is the black blood that runs through the veins of the modern global energy system. While being the dominant source of energy, oil has also brought wealth and power to the western world. The future supply of oil is uncertain, and is even expected to decrease due to the limitations imposed by peak oil. Energy is fundamental to all parts of society. The enormous growth and development of society over the last two hundred years has been driven by a rapid increase in the extraction of fossil fuels, and in the foreseeable future the majority of energy will still come from fossil fuels. Consequently, reliable methods for forecasting their production, especially of crude oil, are crucial.

Forecasting crude oil production can be done in many different ways, but in order to provide realistic outlooks one must be mindful of the physical laws that affect the extraction of hydrocarbons from a reservoir. Decline curve analysis is a long-established tool for developing future outlooks for oil production from an individual well or an entire oilfield. Depletion plays a fundamental role in the extraction of finite resources and is one of the driving mechanisms of oil flow within a reservoir; the depletion rate can also be connected to decline curves. Consequently, depletion analysis is a useful tool for analysing and forecasting crude oil production. Based on comprehensive databases of reserve and production data for hundreds of oil fields, it has been possible to identify typical behaviours and properties. Using a combination of depletion and decline rate analysis gives a better tool for describing future oil production on a field-by-field level. Reliable and reasonable forecasts are essential for planning and necessary in order to understand likely future world oil production.
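The abstract does not reproduce the decline-curve equations, but decline curve analysis conventionally builds on the classical Arps relations; a minimal sketch under that assumption, with hypothetical field parameters:

```python
import math

def arps_rate(qi: float, di: float, b: float, t: float) -> float:
    """Production rate q(t) under the classical Arps decline relations.

    qi: initial rate; di: initial decline rate (1/year); t: time (years);
    b: decline exponent (b = 0 exponential, 0 < b < 1 hyperbolic, b = 1 harmonic).
    """
    if b == 0.0:
        return qi * math.exp(-di * t)                # exponential decline
    return qi / (1.0 + b * di * t) ** (1.0 / b)      # hyperbolic/harmonic decline

# Hypothetical field: initial rate 100 kb/d, initial decline 10 %/year
for b in (0.0, 0.5, 1.0):
    print(f"b = {b:.1f}: rate after 10 years = {arps_rate(100.0, 0.10, b, 10.0):5.1f} kb/d")
```

The choice of b matters for long-range forecasts: exponential decline is the most pessimistic, while harmonic decline keeps the tail of production much fatter.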

Modern Stereo Correspondence Algorithms: Investigation and Evaluation

Olofsson, Anders. January 2010.
Many different approaches have been taken towards solving the stereo correspondence problem, and great progress has been made within the field during the last decade. This is mainly thanks to newly evolved global optimization techniques and better ways to compute pixel dissimilarity between views. The most successful algorithms are based on approaches that explicitly model smoothness assumptions made about the physical world, with image segmentation and plane fitting being two frequently used techniques.

Within the project, a survey of state-of-the-art stereo algorithms was conducted and the theory behind them is explained. Techniques found interesting were implemented for experimental trials, and an algorithm aiming for state-of-the-art performance was implemented and evaluated. For several cases, state-of-the-art performance was reached.

To keep down the computational complexity, an algorithm relying on local winner-take-all optimization, image segmentation and plane fitting was compared against minimizing a global energy function formulated at the pixel level. Experiments show that the local approach can in several cases match the global approach, but that problems sometimes arise, especially when large areas that lack texture are present. Such problematic areas are better handled by the explicit modeling of smoothness in global energy minimization.

Lastly, disparity estimation for image sequences was explored, and some ideas on how to use temporal information were implemented and tried. These ideas mainly relied on motion detection to determine which parts of a sequence of frames are static. Stereo correspondence for sequences is a rather new research field, and there is still much work to be done.
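For concreteness, a minimal sketch of the local winner-take-all matching that the abstract contrasts with global energy minimization: each pixel picks the disparity with the lowest window-aggregated absolute-difference cost. The SAD cost and the parameter values are illustrative choices, not necessarily those used in the thesis.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wta_disparity(left: np.ndarray, right: np.ndarray,
                  max_disp: int = 32, radius: int = 3) -> np.ndarray:
    """Local winner-take-all stereo on a rectified grayscale image pair.

    For each candidate disparity d, shift the right image, compute absolute
    pixel differences, box-aggregate them over a (2*radius+1)^2 window, and
    finally take the per-pixel disparity with the lowest cost.
    """
    h, w = left.shape
    cost = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, :w - d])       # pixel dissimilarity
        cost[d, :, d:] = uniform_filter(diff, size=2 * radius + 1)  # SAD window
    return np.argmin(cost, axis=0).astype(np.float32)       # winner-take-all
```

The failure mode mentioned in the abstract follows directly from this formulation: in textureless regions many disparities produce nearly identical costs, so the argmin becomes essentially arbitrary, whereas a global smoothness term can propagate information from textured neighbours.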

Spectral Mammography with X-Ray Optics and a Photon-Counting Detector

Fredenberg, Erik. January 2009.
Early detection is vital to successfully treating breast cancer, and mammography screening is the most efficient and widespread method to reach this goal. Imaging low-contrast targets while minimizing the radiation exposure to a large population is, however, a major challenge, so optimizing the image quality per unit radiation dose is essential. In this thesis, two optimization schemes with respect to x-ray photon energy have been investigated: filtering the incident spectrum with refractive x-ray optics (spectral shaping), and utilizing the transmitted spectrum with energy-resolved photon-counting detectors (spectral imaging).

Two types of x-ray lenses were experimentally characterized and modeled using ray tracing, field propagation, and geometrical optics. Spectral shaping reduced dose by approximately 20% compared to an absorption-filtered reference system with the same signal-to-noise ratio, scan time, and spatial resolution. In addition, a focusing pre-object collimator based on the same type of optics reduced the divergence of the radiation and improved photon economy by about 50%.

A photon-counting silicon detector was investigated in terms of energy resolution and its feasibility for spectral imaging. Contrast-enhanced tumor imaging with a system based on the detector was characterized and optimized with a model that took anatomical noise into account. An improvement in an ideal-observer detectability index by a factor of 2 to 8 over conventional absorption imaging was found for different levels of anatomical noise and breast density, and the increased conspicuity was confirmed by experiment. Further, the model was extended to include imaging of unenhanced lesions. The detectability of microcalcifications increased by no more than a few percent, whereas the ability to detect large tumors might improve on the order of 50% despite the low attenuation difference between glandular and cancerous tissue. It is clear that including anatomical noise and the imaging task in spectral optimization may yield completely different results than an analysis based solely on quantum noise.
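The ideal-observer detectability index mentioned in the abstract can be illustrated, under quantum noise alone, by the standard prewhitening form for independent energy bins; the thesis model additionally includes anatomical noise, which this sketch omits, and the bin values are hypothetical.

```python
import numpy as np

def detectability_index(signal_diff: np.ndarray, variance: np.ndarray) -> float:
    """Ideal-observer d' for independent energy bins with optimal weighting.

    signal_diff: expected lesion-vs-background signal difference per bin.
    variance:    per-bin noise variance (quantum noise only; the thesis model
                 also accounts for anatomical noise, omitted here).
    With independent bins, d'^2 = sum_i dS_i^2 / var_i.
    """
    return float(np.sqrt(np.sum(signal_diff ** 2 / variance)))

# Hypothetical two-bin example (arbitrary units): low- and high-energy bins
print(f"d' = {detectability_index(np.array([12.0, 4.0]), np.array([100.0, 64.0])):.2f}")
```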
