  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Testing of micro-fluidic systems for Raman spectroscopic measurements on biological cells

Berger, Malin January 2018 (has links)
Pulmonary Arterial Hypertension (PAH) is a condition that can affect people as a consequence of infections or diseases such as lung disease, high blood pressure or pneumonia. When afflicted by these diseases, low oxygen content in the lung tissue causes the pulmonary arterial smooth muscle cells (PASMC) in the walls of the pulmonary arteries to swell chronically. As a result, the arteries are constantly narrowed. In many cases this can be fatal: as the arteries become clogged, the heart is forced to pump more blood to the lungs, causing an enlargement of the right heart chamber which may eventually lead to heart failure. This irreversible swelling of the PASMC is the cause of PAH. To find a treatment for this incurable disease, the mechanisms of the vasoconstriction need to be investigated.

Spectroscopy is the study of the interactions between light and matter, and is a tool that can be used to gain knowledge about the swelling of the PASMC. In particular, Raman spectroscopy, which targets the inelastic interactions, can be used, since it registers dynamic changes in cells.

To simulate an oxygen-deprived environment, a micro-fluidic system designed for use in cellular experiments has been developed. Tests of the prototypes showed strong Raman signals from the polymeric material of the system itself, which overshadowed the signals from the observed sample. The objective of the experiments presented in this report was to test whether the signals from the micro-fluidic system could be eliminated by adding spacing between the polymer and the sample.

The experiment was conducted by collecting data from baker's yeast samples prepared in the micro-fluidic system at different z-distances. From this, the optimal spacing between the polymer of the micro-fluidic system and the sample could be determined. The experiment concluded that the sample needed to be placed 1.54 mm further from the micro-fluidic system in order to test human lung tissue at 2 mW laser power.
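The selection of the optimal z-distance can be sketched as picking the offset that maximizes the ratio of sample signal to polymer background. The thesis does not give its raw data, so all intensities and offsets below are invented for illustration:

```python
# Hypothetical sketch: choose the z-offset that maximizes the ratio of the
# sample's Raman signal to the polymer background. All numbers are made up.
z_offsets_mm = [0.0, 0.5, 1.0, 1.5, 2.0]      # spacing between polymer and sample
sample_signal = [120, 260, 480, 610, 590]      # counts from the yeast sample (assumed)
polymer_signal = [900, 520, 210, 60, 65]       # counts from the micro-fluidic polymer (assumed)

# signal-to-background ratio at each spacing
ratios = [s / p for s, p in zip(sample_signal, polymer_signal)]
best = z_offsets_mm[ratios.index(max(ratios))]
print(best)  # offset giving the cleanest sample spectrum -> 1.5
```

In this toy data the background drops much faster than the sample signal, so the ratio peaks at the second-largest spacing, mirroring the qualitative finding of the thesis.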
12

Simulation of LiDAR data for forestry applications / Simulering av LiDAR data i skogsbrukssyfte

Öhman, Nikanor January 2018 (has links)
In forestry it is important to have accurate information about the forest. LiDAR (laser scanning) can be used to scan vast areas of forest and, from the data, extract information about the trees. The purpose of this thesis is to develop a simulator for LiDAR data. The simulator is tested on a method for tree localization (Holmgren and Lindberg 2013) to see how parameters like tree density and laser frequency affect the accuracy of the localization. First a simulator that uses simply shaped trees (cones) is written. Later, a tree model based on real laser data is created using histogram density estimation. Ray-tracing is used to simulate the LiDAR data the trees give rise to: each laser ray is followed to see where it is reflected. The tree localization method is tested on the data and we report the following findings: 1) The percentage of correctly located trees decreases with increasing tree density. 2) Larger trees yield an increase in false trees found by the localization method. 3) Higher laser pulse density decreases the number of false trees. 4) The minimum radius at which the localization method starts fitting ellipsoids greatly affects the number of false trees; a smaller radius yields more false trees.
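The cone-tree idea can be illustrated with a much-simplified version of the ray-tracing step: for a vertical (nadir) ray, the height of the hit on a cone falls off linearly with horizontal distance from the apex. This is only a sketch under that nadir assumption, not the thesis's simulator:

```python
import math

def nadir_return_height(x, y, trees, ground=0.0):
    """Height of the first surface hit by a vertical (nadir) LiDAR ray.

    Each tree is a cone (x0, y0, apex_height, base_radius) standing on the
    ground; for a vertical ray the hit height falls off linearly with the
    horizontal distance to the apex. Simplified illustration only.
    """
    best = ground
    for x0, y0, h, r in trees:
        d = math.hypot(x - x0, y - y0)
        if d < r:                          # ray falls inside the cone's footprint
            best = max(best, h * (1.0 - d / r))
    return best

trees = [(0.0, 0.0, 20.0, 3.0)]            # one 20 m cone-shaped tree, 3 m base radius
print(nadir_return_height(0.0, 0.0, trees))  # 20.0 -> apex hit
print(nadir_return_height(1.5, 0.0, trees))  # 10.0 -> halfway down the flank
print(nadir_return_height(5.0, 0.0, trees))  # 0.0  -> ground return
```

A full simulator would trace rays at arbitrary scan angles and model partial returns; the linear fall-off above is what makes a cone a convenient first tree model.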
13

Fatigue analysis - system parameters optimization

Markgren, Hanna January 2018 (has links)
For a mechanical system exposed to repeated cyclic loads, fatigue is one of the most common causes of failure. However, fatigue failure calculations are not well developed: they are often made with standard loads and simplified cases. The fatigue life is the time from start of use until the system fails due to fatigue, and some building blocks for calculating it do exist. The aim of this project was to put these building blocks together in a workflow that can be used to calculate the fatigue life. The workflow was built so that it should be easy to follow for any type of mechanical system. The start of the workflow is the load history of the system, which is then converted into a stress history used for the fatigue life calculations. Finally, the workflow was tested with two test cases. In Algoryx Momentum the model for each case was set up, and the load history was extracted for each time step of the simulation. Converting the load history to a stress history requires FEM calculations; these were not part of this project, so the constants for converting loads to stress were given. With the stress history in place it was possible to calculate the fatigue life. The result from both test cases was that every step of the workflow could be followed and the workflow used to calculate the fatigue life. The second test also showed that, with an optimization, the system was improved, resulting in a longer lifetime. To conclude, the workflow seems to work as expected and is quite easy to follow. The result given by using the workflow is the fatigue life, which was the target of the project. However, to fully evaluate the workflow and understand how well its results can be trusted, a comparison with empirical data would be needed. Still, the tests indicate that the workflow gives reasonable results when calculating fatigue life.
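The last step of such a workflow — stress history to fatigue life — is commonly done with an S-N curve and the Palmgren-Miner damage sum. The abstract does not state which model the thesis used, so the Basquin constants and cycle counts below are purely illustrative:

```python
# Illustrative sketch (not the thesis's exact method): estimate fatigue life
# from counted stress cycles with a Basquin S-N curve and Miner's rule.

def cycles_to_failure(stress_amplitude, C=1e12, m=3.0):
    """Basquin-type S-N curve: N = C / S^m (C and m are assumed constants)."""
    return C / stress_amplitude ** m

def life_in_blocks(cycle_counts):
    """cycle_counts: list of (stress amplitude in MPa, cycles per load block).

    Miner's rule: damage per block D = sum(n_i / N_i); failure when the
    accumulated damage reaches 1, i.e. after 1/D repetitions of the block.
    """
    damage_per_block = sum(n / cycles_to_failure(s) for s, n in cycle_counts)
    return 1.0 / damage_per_block

# one block of a (made-up) counted stress history
history = [(100.0, 200), (150.0, 50), (200.0, 5)]
print(life_in_blocks(history))  # about 2446 repetitions of the block
```

In the real workflow the cycle counts would come from rainflow counting of the simulated stress history, and C and m from material data.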
14

Radiation exposure to personnel during fluoroscopic procedures : Strålningsmiljö för personal under genomlysningsarbete

Eriksson, Olof January 2018 (has links)
There are about 17 million X-ray procedures performed in Sweden every year. Various methods are used to determine the risk for the patient and for the staff. The objective of this project was to map the X-ray scatter coming out of a patient as a result of interactions between the radiation and body tissue during certain medical procedures that involve fluoroscopy. Fluoroscopy is a type of X-ray imaging that generates a moving picture, allowing an operator to view in-body procedures live. A successfully created map of the radiation field can serve as a tool for risk analysis concerning the radiation dose to which the medical staff is exposed; this parameter is later described as the effective dose (E), a tool for assessing the risk of developing lethal cancer due to radiation exposure. This report also investigates the radiation that reaches the eye lens of the staff, since the maximum recommended dose for the eye lens was recently lowered after the eye lens was found to be more sensitive to radiation than previously known. Data was collected from radiation exposure situations, and it was concluded that distance is a good protector against radiation, which agrees well with the theory discussed in the report. Another theory discussed in the paper states that positions behind the X-ray tube are exposed to the highest amount of radiation; this was also confirmed. The measured data from investigating protective equipment showed that the equipment in place was effective. / The author changed last name to Folkunger due to marriage shortly after this publication.
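The finding that distance protects well follows from the inverse-square law: treating the patient as a point-like scatter source, the dose rate falls with the square of the distance. The dose rates below are invented, only the scaling is the point:

```python
# Inverse-square law sketch: dose rate from a point-like scatter source.
# The reference value at 1 m is illustrative, not a measured figure.

def dose_rate_at(distance_m, rate_at_1m_uSv_h):
    """Dose rate at a given distance, assuming a point-like scatter source."""
    return rate_at_1m_uSv_h / distance_m ** 2

print(dose_rate_at(1.0, 400.0))  # 400.0 uSv/h at 1 m
print(dose_rate_at(2.0, 400.0))  # 100.0 uSv/h: doubling distance quarters the dose
```

Real scatter fields are anisotropic (hence the higher exposure behind the X-ray tube), so a measured map like the one in this report refines this idealized picture.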
15

Volume calculation of the thyroid gland from SPECT images based on Monte Carlo simulations

Palmqvist, Niklas January 2018 (has links)
In this study the volume determination that is part of the dose planning for patients with thyrotoxicosis was investigated. The aim was to find an accurate method to determine the active volume with single photon emission computed tomography (SPECT), which in several studies has shown better results than the currently used method, planar scintigraphy (PS). This was implemented on the Xeleris 3.1 by General Electric (GE) at the University Hospital of Umeå (NUS). The examination time with SPECT is required not to be significantly longer than with PS; despite the relatively short examination time, the volume determination should be more accurate. This was analyzed against a true value of the volume, obtained with Monte Carlo simulations of digital anthropomorphic phantoms. It is also important that the developed method is user friendly. The study included ten patients with thyrotoxicosis, from which relative activity uptakes were measured. These uptakes were specified in 22 digital XCAT phantoms that were randomly sampled with respect to phantom mass. The Monte Carlo program "simulating medical imaging nuclear detectors" (simind) was used to simulate the SPECT system. The projection images were reconstructed with ordered subset expectation maximization (OSEM) and Butterworth filtered; unfiltered images were used to compare volume calculations with filtered ones. The volume of the thyroid was segmented using threshold values applied to all voxels in the image sets, and the optimization of the thresholds was conducted by numerical calculations. The results of this study show that the best choice of intensity threshold is 21.1(24)% of the maximum voxel value for all phantoms. The threshold is valid for OSEM iteration number five and unfiltered image sets. Butterworth-filtered images were less suitable than unfiltered images when the thyroid volume was calculated from SPECT simulations of phantoms.
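The threshold segmentation itself is straightforward: keep every voxel above a fixed fraction of the maximum value and sum the voxel volumes. A minimal sketch on a toy 3-D image, using the 21.1% fraction reported above (the voxel size here is an assumption):

```python
import numpy as np

def thyroid_volume_ml(image, threshold_fraction=0.211, voxel_volume_ml=0.1):
    """Segment by thresholding at a fraction of the maximum voxel value
    (21.1% per the study) and sum the voxel volumes.
    The 0.1 ml voxel volume is illustrative, not from the thesis."""
    mask = image >= threshold_fraction * image.max()
    return mask.sum() * voxel_volume_ml

# toy "SPECT" volume: a bright 3x3x3 core inside a dim background
img = np.full((10, 10, 10), 10.0)
img[4:7, 4:7, 4:7] = 100.0
print(thyroid_volume_ml(img))  # 27 voxels x 0.1 ml = 2.7 ml
```

The hard part of the study is not this step but choosing the fraction: the optimal value depends on the reconstruction (OSEM iterations, filtering), which is why it was tuned against Monte Carlo ground truth.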
16

Using Boosted Decision Trees in the Search for Heavy Neutral Higgs Bosons in the ATLAS Experiment

El Faham, Hesham January 2018 (has links)
A search for heavy neutral Higgs bosons in the τ_μ τ_had channel is presented. The analysis was performed using approximately 32 fb^−1 of 13 TeV proton-proton collision data with the ATLAS detector and improves upon earlier ATLAS searches through the use of Boosted Decision Trees (BDT).
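The abstract gives no detail of the BDT configuration, so as a purely illustrative stand-in, here is a from-scratch AdaBoost of decision stumps on one invented feature (think of it as a di-tau mass proxy), showing how boosting separates signal (+1) from background (-1):

```python
import math

def train_stumps(xs, ys, rounds=5):
    """AdaBoost with threshold stumps on a 1-D feature; labels ys in {-1, +1}.
    Toy sketch, unrelated to the actual ATLAS analysis code."""
    n = len(xs)
    w = [1.0 / n] * n                       # per-event weights
    model = []                              # list of (threshold, direction, alpha)
    for _ in range(rounds):
        best = None
        for t in sorted(set(xs)):           # try every data value as a cut
            for s in (+1, -1):              # s=+1 predicts signal when x >= t
                err = sum(wi for xi, yi, wi in zip(xs, ys, w)
                          if (s if xi >= t else -s) != yi)
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)   # stump weight
        model.append((t, s, alpha))
        # boost: up-weight misclassified events, then renormalize
        w = [wi * math.exp(-alpha * yi * (s if xi >= t else -s))
             for xi, yi, wi in zip(xs, ys, w)]
        z = sum(w)
        w = [wi / z for wi in w]
    return model

def predict(model, x):
    score = sum(alpha * (s if x >= t else -s) for t, s, alpha in model)
    return 1 if score >= 0 else -1

# invented feature values: signal clusters high, background low
xs = [0.2, 0.4, 0.5, 0.9, 1.1, 1.3]
ys = [-1, -1, -1, +1, +1, +1]
model = train_stumps(xs, ys)
print([predict(model, x) for x in (0.3, 1.2)])  # [-1, 1]
```

A real analysis would use many kinematic variables and a production library (e.g. TMVA or XGBoost); the weighted re-training loop above is the core idea those tools share.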
17

Optimizing fuel cell channel geometry to favour water transport / Optimera bränslecellkanalgeometri för att gynna vattentransport

Linder, Tom January 2017 (has links)
No description available.
18

Dyonic supersymmetric solutions in supergravity

Rødland, Lukas January 2017 (has links)
No description available.
19

Analysis of the rigid multibody tire

Yruretagoyena Conde, Ruben January 2018 (has links)
A brief description of multibody system mechanics is given as the first step of this thesis, presenting the dynamical equations that are the main tool for analyzing the two-body tire model. After establishing the basic theory of multibody systems, all the physical components involved in the tire-terrain interaction are defined, together with some of the models developed to describe a tire rolling on non-deformable and on deformable terrain. The two-body tire model is then defined and described in general terms. Using the tools developed over the course of the thesis, the dynamic equations are solved for the particular case of the two-body tire model rolling on a flat rigid surface. After solving the equations, some inconsistencies with respect to reality are found, and an adjustment to the model is proposed to bring it into agreement with real tires.
20

Globulettes : a new class of very small and dense interstellar clouds

Grenman, Tiia January 2006 (has links)
The space between stars is not empty, but filled with a thin gas and microscopic dust grains, together forming the so-called interstellar medium. Matter is concentrated into clouds of very different sizes, ranging from giant molecular cloud complexes to small, dark, isolated cloudlets called globules. In bright emission regions surrounding young massive stars, one can find many tiny, isolated and cold objects appearing as dark spots against the background nebulosity. These objects are much smaller and less massive than normal globules. Such small clouds are the topic of the present Licentiate thesis, where they have been baptised globulettes. The analysis is based on H-alpha images of the Rosette Nebula and IC 1805 Nebula, collected with the Nordic Optical Telescope in the years 1999 and 2000. In total 151 globulettes in these two regions were catalogued, measured and analysed. Positions, orientations, sizes, masses, densities and pressures were derived, as well as their present condition with regard to gravitational stability. From these data, their origins and possible evolutionary history were discussed. Most globulettes are sharp-edged and well isolated from their surroundings. The size distributions are quite similar in the two studied nebulae. The masses and densities were derived from the extinction of light and the measured shapes of the objects. In a few cases the masses have been estimated earlier by another team, from radio emission of CO gas, and our values are in line with their estimates for these particular globulettes. A majority of the objects have masses < 20 Jupiter masses, and the mass distribution drops rapidly towards higher values. Very few objects have masses above 100 Jupiter masses, which we define as the lower mass limit for normal globules. However, there is no smooth overlap between the two types of clouds, which makes us conclude that globulettes represent a distinct, new class of objects.
The column density profile of a typical globulette was found to be rather uniform in the central parts but flatter at the periphery than expected from a sphere of constant volume density. The virial theorem, including only the kinetic and gravitational energy, indicates that all 133 globulettes are expanding or disrupting. However, other forces, such as outer gas and radiation pressures, can help to confine the globulettes. Our results show that about half of these objects are gravitationally bound and even unstable against contraction, which opens some evolutionary scenarios not expected in the first place. Some massive globulettes could therefore collapse to form stars with very low masses, for instance so-called brown dwarfs, while the low-mass globulettes could contract to free-floating planets. Globulettes might have been formed either by the fragmentation of larger filaments, or by the disintegration of large molecular clouds originally hosting compact and small cores. At a later stage even the confined globulettes might disrupt or evaporate from the action of external radiation and gas flows. However, preliminary calculations of their lifetimes show that some might survive for a relatively long time, even longer than their estimated contraction time. No evidence of embedded infrared-emitting sources was found in independent IR studies, but one cannot exclude that globulettes already host low-mass brown dwarfs or planets. / Approved; 2006; 20070109 (haneit)
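The simple virial check described above (kinetic versus gravitational energy only) can be written out explicitly: a uniform, isothermal sphere is bound when twice its thermal kinetic energy is less than the magnitude of its gravitational energy. The cloud parameters below are order-of-magnitude illustrations, not values from the thesis:

```python
# Order-of-magnitude virial check for a globulette: bound if 2*E_kin < |E_grav|,
# with E_kin = (3/2) N k_B T for N gas particles and E_grav = -(3/5) G M^2 / R
# for a uniform sphere. Temperatures, masses and radii below are illustrative.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.381e-23        # Boltzmann constant, J/K
m_H = 1.673e-27        # hydrogen mass, kg
M_JUP = 1.898e27       # Jupiter mass, kg
AU = 1.496e11          # astronomical unit, m

def is_bound(mass_mjup, radius_kau, temp_K=10.0, mu=2.3):
    """True if 2*E_kin < |E_grav| for a uniform molecular sphere
    (mu = mean molecular weight, assumed 2.3 for molecular gas)."""
    M = mass_mjup * M_JUP
    R = radius_kau * 1e3 * AU
    e_kin = 1.5 * (M / (mu * m_H)) * k_B * temp_K
    e_grav = 0.6 * G * M * M / R
    return 2 * e_kin < e_grav

print(is_bound(100.0, 0.3))  # a massive, compact globulette: bound
print(is_bound(1.0, 0.3))    # a light one at the same size: not bound
```

This simple balance is why only the more massive, compact globulettes come out gravitationally bound, while the low-mass ones need external pressure to be confined.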
