  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
351

An iterative reconstruction algorithm for quantitative tissue decomposition using DECT / En iterativ rekonstruktions algoritm för kvantitativ vävnadsklassificering via DECT

Grandell, Oscar January 2012 (has links)
The introduction of dual energy CT (DECT) in medical healthcare has made it possible to extract more information about the scanned objects, which in turn has the potential to improve the accuracy of radiation therapy dose planning. One problem that remains before successful material decomposition can be achieved, however, is the presence of beam hardening and scatter artifacts in a scan. Methods currently in clinical use for removing beam hardening often bias the CT numbers, which limits the possibility of an appropriate tissue decomposition. Here a method for successful decomposition, as well as removal of the beam hardening artifact, is presented. The method uses effective linear attenuation coefficients for five base materials (water, protein, adipose, cortical bone and marrow) to perform the decomposition on reconstructed simulated data. This is done inside an iterative loop, together with the polychromatic x-ray spectra, to remove the beam hardening artifact.
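The iterative removal of beam hardening can be illustrated with a minimal one-dimensional sketch (not the thesis's actual algorithm; the two-bin spectrum and water attenuation values are invented for illustration): a polychromatic forward model is simulated, and a fixed-point loop corrects the bias in a naive monochromatic estimate.

```python
import numpy as np

# Hypothetical two-bin x-ray spectrum: bin weights and water attenuation
# coefficients (1/cm) per energy bin (values invented for illustration).
weights = np.array([0.6, 0.4])
mu_water = np.array([0.25, 0.18])

def poly_projection(t):
    """Polychromatic log-projection of a water slab of thickness t (cm)."""
    return -np.log(np.sum(weights * np.exp(-mu_water * t)))

true_t = 10.0
measured = poly_projection(true_t)

# Effective attenuation in the thin-object limit; a monochromatic model
# using it misestimates thickness because of beam hardening.
mu_eff = float(np.sum(weights * mu_water))
t_est = measured / mu_eff
# Fixed-point loop: re-simulate the polychromatic projection at the current
# estimate and correct the residual -- the essence of iterative
# beam-hardening removal.
for _ in range(50):
    t_est += (measured - poly_projection(t_est)) / mu_eff

print(round(t_est, 4))  # 10.0
```

The same loop structure carries over to full reconstruction: forward-project the current material estimate through the polychromatic model, compare with the measurement, and correct.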
352

Natural gas recovery from hydrates in a silica sand matrix

Haligva, Cef 05 1900 (has links)
This thesis studies methane hydrate crystal formation and decomposition at 1.0, 4.0 and 7.0 °C in a new apparatus. Hydrate was formed in the interstitial space of a variable-volume bed of silica sand particles with an average diameter of 329 μm (150 to 630 μm range). The initial pressure inside the reactor was 8.0 MPa for all formation experiments. Three bed sizes were employed in order to observe the effect of silica sand bed size on the rate of methane consumption (formation) and release (decomposition). The temperature at various locations inside the silica sand bed was measured with thermocouples during the formation and decomposition experiments. For the decomposition experiments, two methods were employed to dissociate the hydrate: thermal stimulation and depressurization. More than 74.0% conversion of water to hydrate was achieved in all formation experiments at 4.0 °C and 1.0 °C starting from a pressure of 8.0 MPa. Dissociation was found to occur in two stages when thermal stimulation was employed, whereas three stages were observed during depressurization. In both cases the first stage was strongly affected by the bed size, whereas the later stages were not.
353

Essays on Unions, Wages and Performance: Evidence from Latin America

Rios, Fernando 13 August 2013 (has links)
Unions are among the most important institutions in labor markets, capable of affecting both workers (wages) and employers (performance). Despite the relevance unions have had worldwide, most of the literature has concentrated on their economic effects in the U.S. and other developed countries, with few studies examining what unions do in developing countries. Because developing countries differ markedly from developed ones in economic development, legal settings and institutions, conclusions reached in the broader literature might not carry over to the developing-country context. This dissertation aims to fill this gap by studying the economic effects of unions on wages and performance in selected developing countries in Latin America: Argentina, Bolivia, Chile, Mexico, Panama and Uruguay. The first essay focuses on the impact of unions on the wage distribution in Bolivia and Chile, using the novel Recentered Influence Function decomposition. Although the two countries have considerably different levels of economic development and institutions, the estimates indicate that unions have similar effects, increasing wages and reducing wage inequality at the top of the distribution. These results resemble those found when replicating the methodology on U.S. data, suggesting that the common economic and political forces that govern the role of unions as collective bargaining units transcend other contextual differences in these countries. The second essay analyzes the impact of unions on the economic performance of manufacturing establishments in Argentina, Bolivia, Chile, Mexico, Panama and Uruguay. Using an augmented Cobb-Douglas production function, the essay finds that unions have a positive, but small, effect on productivity, with the exception of Argentina.
Analyses of alternative measures of performance show that, in most cases, the positive productivity effects barely offset the higher union compensation; that unions show no relationship with sales growth; and that unionized establishments usually reduce investment in capital and R&D. While no single narrative can explain all observed effects across countries, the results are a step toward understanding the role of unions in economic performance in developing countries.
354

Probabilistic Robust Design For Dynamic Systems Using Metamodelling

Seecharan, Turuna Saraswati January 2007 (has links)
Designers use simulations to observe the behaviour of a system and to make design decisions that improve dynamic performance. For complex dynamic systems, however, these simulations are often time-consuming, and robust design requires numerous simulations as a range of design variables is investigated. Furthermore, the optimum set is required to meet specifications at particular instances in time. In this thesis, the dynamic response of a system is discretized into time instances and recorded in a matrix: each column corresponds to a discrete time instance and each row to the response for a particular design variable set. Singular Value Decomposition (SVD) is then used to separate this matrix into two factors, one containing information in parameter-space and the other containing information in time-space. Metamodels are used to efficiently and accurately calculate the response at an arbitrary set of design variables at any time. This efficiency is especially useful in Monte Carlo simulation, where responses are required for a very large sample of design variable sets. The work is then extended to the case where the normalized sensitivities, along with the first and second moments of the response, are required at specific times. Finally, the procedure for calculating the metamodel at specific times is shown, together with how this metamodel is used in parameter design or integrated design to find the optimum parameters given specifications at specific time steps. In conclusion, this research shows that SVD and metamodelling can be used to apply probabilistic robust design tools where specifications at certain times are required for the optimum performance of a system.
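The matrix-splitting idea can be sketched as follows. A hypothetical analytic response family stands in for the simulated dynamic system, and simple linear interpolation plays the role of the metamodel; the specific response and mode count are invented for illustration.

```python
import numpy as np

# Hypothetical response matrix: rows = design-variable samples, columns =
# discrete time instances of a damped oscillation y(t; d) = exp(-d t) sin(t).
t = np.linspace(0.0, 10.0, 200)
designs = np.linspace(0.1, 1.0, 30)
Y = np.array([np.exp(-d * t) * np.sin(t) for d in designs])   # (30, 200)

# SVD splits Y into a parameter-space factor and a time-space factor.
U, s, Vt = np.linalg.svd(Y, full_matrices=False)
k = 8                                   # retain the dominant modes
coeffs = U[:, :k] * s[:k]               # parameter-space coefficients (30, k)
modes = Vt[:k, :]                       # time-space basis vectors    (k, 200)

# Metamodel: interpolate each coefficient over the design variable, then
# reconstruct the full time history at an unseen design point.
d_new = 0.37
c_new = np.array([np.interp(d_new, designs, coeffs[:, j]) for j in range(k)])
y_pred = c_new @ modes
y_true = np.exp(-d_new * t) * np.sin(t)
print(float(np.max(np.abs(y_pred - y_true))) < 1e-2)  # True
```

Because only k coefficients (not the full simulation) are evaluated per sample, a Monte Carlo loop over many design points becomes cheap.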
355

Three material decomposition in dual energy CT for brachytherapy using the iterative image reconstruction algorithm DIRA : Performance of the method for an anthropomorphic phantom

Westin, Robin January 2013 (has links)
Brachytherapy is radiation therapy performed by placing a radiation source near or inside a tumor. Doses calculated with the current water-based brachytherapy dose formalism (TG-43) and with new model-based dose calculation algorithms can differ by more than a factor of 10. There is therefore a need for voxel-by-voxel cross-section assignment: ideally, both the tissue composition and the mass density of every voxel should be known for individual patients. A method for determining tissue composition via three-material decomposition (3MD) from dual energy CT scans was developed at Linköping University. The method, named DIRA, is a model-based iterative reconstruction algorithm that utilizes two photon energies for image reconstruction and 3MD for quantitative tissue classification of the reconstructed volumetric dataset. This thesis investigated the accuracy of the 3MD method applied to prostate tissue in an anthropomorphic phantom when using two different approximations of soft tissues in DIRA. The distributions of CT numbers for soft tissues in a contemporary dual energy CT scanner were also determined, and it was investigated whether these distributions can be used for tissue classification of soft tissues via thresholding. The relative errors of the mass energy absorption coefficient (MEAC) and the linear attenuation coefficient (LAC) of the approximated mixture, as functions of photon energy, were less than 6% in the energy region from 1 keV to 1 MeV. This showed that DIRA performed well for the selected anthropomorphic phantom and that it was relatively insensitive to the choice of base materials for the approximation of soft tissues. The distributions of CT numbers of liver, muscle and kidney tissues overlapped; for example, a voxel containing muscle could be misclassified as liver in 42 cases out of 100.
This suggests that pure thresholding is insufficient as a method for tissue classification of soft tissues and that more advanced methods should be used.
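The core of a three-material decomposition can be illustrated as a small linear system (a simplified stand-in for DIRA's 3MD step, with attenuation values invented for illustration): two measured attenuation coefficients, one per DECT energy, plus the constraint that the volume fractions sum to one determine three base-material fractions.

```python
import numpy as np

# Hypothetical effective linear attenuation coefficients (1/cm) of three
# base materials at the two DECT energies (values invented for illustration).
#                   water  protein adipose
mu_low = np.array([0.227, 0.260, 0.190])    # low-energy spectrum
mu_high = np.array([0.184, 0.195, 0.165])   # high-energy spectrum

def decompose(meas_low, meas_high):
    """Solve the 3x3 system: two attenuation equations plus the
    constraint that the volume fractions sum to one."""
    A = np.vstack([mu_low, mu_high, np.ones(3)])
    b = np.array([meas_low, meas_high, 1.0])
    return np.linalg.solve(A, b)

# A voxel that is truly 70% water, 20% protein, 10% adipose:
w = np.array([0.7, 0.2, 0.1])
fracs = decompose(w @ mu_low, w @ mu_high)
print(np.round(fracs, 3))  # [0.7 0.2 0.1]
```

In practice the two base materials must be spectrally distinct enough for the system to be well conditioned, which is one reason the choice of base-material triplet matters.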
356

A Contour-based Separation of Vertically Attached Traffic Signs

Zhao, Ping January 2007 (has links)
This report presents an algorithm for locating the cut points for, and separating, vertically attached traffic signs in Sweden. The algorithm provides several digital image processing features: a binary image, which represents the visual object and its complex rectangular background with ones and zeros respectively; improved cross-correlation, which measures the similarity of 2D objects and filters traffic sign candidates; simplified shape decomposition, which iteratively smooths the contour of the visual object to reduce white noise; flipping point detection, which locates black noise candidates; and a chasm filling algorithm, which eliminates black noise, determines the final cut points and separates the originally attached traffic signs into individual ones. At each step, the intermediate results as well as the efficiency in practice are presented to show the advantages and disadvantages of the developed algorithm. The report concentrates on contour-based recognition of Swedish traffic signs; the general shapes covered are the upward triangle, downward triangle, circle, rectangle and octagon. Finally, a demonstration program shows how the algorithm works in a real-time environment.
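The cross-correlation filtering step can be sketched as a normalized cross-correlation score between binary masks (a generic illustration; the masks and sizes below are invented, not the report's actual templates):

```python
import numpy as np

def shape_similarity(a, b):
    """Normalized cross-correlation of two equal-size binary images,
    used as a similarity score for filtering shape candidates."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Hypothetical 7x7 binary masks: a crude filled triangle and a square.
tri = np.tril(np.ones((7, 7)))[:, ::-1]
sq = np.zeros((7, 7))
sq[1:6, 1:6] = 1.0

print(round(shape_similarity(tri, tri), 6))   # 1.0 (identical shapes)
print(shape_similarity(tri, sq) < 1.0)        # True (different shapes)
```

A candidate region whose score against every template falls below a threshold would be rejected before the more expensive separation steps.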
357

Limit Values and Factors influencing Limit Values of Spruce

Zhang, Liming January 2011 (has links)
We collected data on the decomposition of spruce litter to determine the limit values of mass loss and to identify the chemical and climate factors that influence them. Our data contained 28 spruce sequences, mainly from Sweden with a small part from other locations. We chose mean annual temperature (MAT) and mean annual precipitation (MAP) as climate factors, and water solubles, lignin, N, P, K, Ca, Mg and Mn as chemical factors. We estimated the limit values by fitting a nonlinear model to mass loss over time, and identified the influential factors using a linear mixed model. We conclude that the linear mixed model is a proper and efficient approach for determining the factors, that P and MAP are the significant factors, and that species is a good random effect for explaining the variance within groups.
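The limit-value estimation can be sketched with one common asymptotic form, in which accumulated mass loss approaches a plateau m at initial rate k. The litter-bag numbers below are invented for illustration, and the thesis's actual model specification may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def limit_model(t, m, k):
    """Asymptotic decomposition model: accumulated mass loss approaches
    the limit value m (%); k is the initial decomposition rate."""
    return m * (1.0 - np.exp(-k * t / m))

# Hypothetical litter-bag data: time (years) vs. accumulated mass loss (%).
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
loss = np.array([14.0, 24.5, 39.0, 47.5, 52.0, 55.0])

(m_hat, k_hat), _ = curve_fit(limit_model, t, loss, p0=(60.0, 30.0))
print(50.0 < m_hat < 70.0)  # True: estimated limit value near 60% mass loss
```

Fitting this curve per sequence yields one estimated limit value per site, which can then be regressed on the chemical and climate factors in the mixed model.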
359

A Methodology for Aeroelastic Constraint Analysis in a Conceptual Design Environment

De Baets, Peter Wilfried Gaston 12 April 2004 (has links)
The research examines how the Bi-Level Integrated System Synthesis (BLISS) decomposition technique can be adapted to serve as a conceptual aeroelastic design tool. The study describes a comprehensive solution of the coupled aeroelastic problem cast in this decomposition format, and its implementation in an integrated framework. The method is supported by application details of a proof-of-concept high speed vehicle. Physics-based codes, a finite element method and an aerodynamic panel method, are used to model the high-definition geometric characteristics of the vehicle, and a synthesis and sizing code was added to referee the conflicts that arise between the two disciplines. The novelty of this research lies in four points. First is the use of physics-based tools at the conceptual design phase to calculate the aeroelastic properties. Second is the projection of flutter and divergence velocity constraint lines onto a power loading versus wing loading graph; mapping such constraints into a format familiar to designers is a valuable tool for fast examination of the design space. Third is the improvement of the aeroelastic assessment given the time allotted: until recently, because of extensive computational and time requirements, aeroelasticity was assessed only at the preliminary design phase, and this research illustrates a scheme whereby, for the first time, it can be assessed at the early design formulation stages. Fourth, the assessment made it possible to verify the impact of changing velocity, altitude, and angle of attack, and to identify robust design space with respect to these three mission properties. Applying the method to the quiet supersonic business jet gave a delta-shaped wing for the supersonic speed regime, while a subsonic case resulted in a high aspect ratio wing. The scaling approach allowed iso-flutter and iso-divergence lines to be plotted.
The main effects of velocity, altitude, and angle of attack on these iso-lines are also discussed, as is the identification of robust design space. Response surface surrogate models allowed the system optimization to converge, although questions remain about the accuracy of these quadratic models. Future improvements include the addition of more disciplines and more detailed models.
360

Algorithm Development for Large-Scale Multiple Antenna Wireless Systems in Cloud Computing Environment

Chao, Wen-Yuen 31 July 2012 (has links)
Currently, the size of the data we have to deal with keeps growing, which increases the computing time and computing power required to process it. One way to circumvent this difficulty is to divide the data into several small blocks and process those blocks on several computers, which requires a tool for the decomposition-coordination procedure. The alternating direction method of multipliers (ADMM) is a powerful algorithm for this purpose and has been widely used in distributed optimization. With ADMM, a big global optimization problem can be decomposed into several small local optimization problems, and the algorithm has been used in several recent distributed systems such as cloud systems and distributed antenna systems. In this thesis, we apply ADMM to a distributed antenna system. For the uplink, we develop a distributed demodulation algorithm in which multiple base stations collaborate for data detection. For the downlink, we develop a distributed beamforming design algorithm in which multiple base stations collaborate to form beams that mitigate inter-cell interference. Finally, simulations are conducted to verify the efficiency of our designs.
