
Approximate recursive algorithm for finding MAP of binary Markov random fields

Altaye, Endale Berhane January 2010 (has links)
The purpose of this study was to develop a recursive algorithm for computing a maximum a posteriori (MAP) estimate of a binary Markov random field (MRF) using the MAP-MRF framework. We also discuss how to include an approximation in the recursive scheme, so that the algorithm remains computationally feasible for larger problems as well. In particular, we discuss how our algorithm can be used in an image analysis setting. We consider a situation where an unobserved latent field is assumed to follow a Markov random field prior model, a Gaussian noise-corrupted version of the latent field is observed, and we estimate the unobserved field by the MAP estimator.
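As an illustration of the MAP-MRF setting described in the abstract (a binary latent field with an Ising-type prior, observed through Gaussian noise), the sketch below uses iterated conditional modes (ICM), a standard local approximation to the MAP estimate. It is not the recursive algorithm developed in the thesis, and the prior strength `beta` and noise level `sigma` are illustrative assumptions.

```python
import numpy as np

def icm_map(y, beta=1.0, sigma=1.0, n_sweeps=10):
    """Approximate the MAP estimate of a binary (+/-1) Markov random field
    from a Gaussian noise-corrupted observation y, using iterated
    conditional modes: each pixel is repeatedly set to the value that
    maximises its local conditional posterior given its neighbours."""
    x = np.where(y > 0, 1, -1)  # initialise at the pixelwise ML estimate
    rows, cols = y.shape
    for _ in range(n_sweeps):
        for i in range(rows):
            for j in range(cols):
                # sum over the 4-neighbourhood (Ising prior interaction)
                s = 0.0
                if i > 0:        s += x[i - 1, j]
                if i < rows - 1: s += x[i + 1, j]
                if j > 0:        s += x[i, j - 1]
                if j < cols - 1: s += x[i, j + 1]
                # log-posterior for x_ij = +1 minus that for x_ij = -1:
                # prior term 2*beta*s, likelihood term 2*y/sigma^2
                delta = 2 * beta * s + 2 * y[i, j] / sigma**2
                x[i, j] = 1 if delta > 0 else -1
    return x
```

ICM is greedy and only finds a local optimum, which is one motivation for the recursive schemes the abstract refers to.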

A simulation of the development and screening of Cancer Mammae

Riksheim, Marianne January 2009 (has links)
Based on the tumor growth model of Harald Weedon-Fekjær et al.'s paper "Breast cancer tumor growth estimated through mammography screening data", a simulation of breast cancer occurrence and tumor growth in a large population of women was made. The simulation was made realistic by starting tumor growth according to a Poisson process, including a distribution for clinical detection size and a screening test sensitivity function, and using individual growth rates based on the estimates of Weedon-Fekjær et al. After running the full simulation, parts of the simulation outcomes were compared to known breast cancer data, and the model was found to give realistic and expected results. The simulation was then used to look for other results of interest, such as the expected reduction in time to tumor detection due to screening, and the size distribution of tumors before and after screening. For the age group 50-69 years, it was found that screening every year allows a reduction in time to detection of 19.1 months, while screening every two, three, five and ten years allows for reductions of, respectively, 9.5, 8.3, 6.1 and 3.4 months. These results assume that tumors are actually found at screening, i.e. that the tumors are not found clinically before they are found on screening. When clinical findings are included, different results are obtained: for the age group 50-69 years, screening every year then allows a reduction of 15 months, while screening every two, three, five and ten years allows for reductions of, respectively, 4.8, 1.4, -5.6 and -21.5 months. Negative numbers indicate that a tumor is found earlier clinically than at screening.
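The simulation ingredients named in the abstract (Poisson-process onset, exponential tumor growth, a clinical detection-size distribution, and a size-dependent screening sensitivity) can be sketched as below. Every parameter value here is a made-up placeholder for illustration, not an estimate from Weedon-Fekjær et al. or from the thesis.

```python
import numpy as np

def simulate_detection(n_women, years, screen_interval,
                       onset_rate=0.002, growth_rate=0.8,
                       clin_size=25.0, seed=0):
    """Toy version of the simulation described in the abstract.
    Returns, for each woman whose tumor starts within the study period,
    the time (years) from tumor onset to detection, where detection is
    either clinical (at a lognormally distributed size) or at a periodic
    screen with a logistic, size-dependent sensitivity."""
    rng = np.random.default_rng(seed)
    screens = np.arange(screen_interval, years + 1e-9, screen_interval)
    times_to_detection = []
    for _ in range(n_women):
        t = rng.exponential(1.0 / onset_rate)   # onset time (Poisson process)
        if t > years:
            continue
        # tumor size in mm at time u >= t: starts at 1 mm, grows exponentially
        size_at = lambda u: np.exp(growth_rate * (u - t))
        # clinical detection size drawn lognormally around clin_size mm
        s_clin = rng.lognormal(np.log(clin_size), 0.5)
        t_clin = t + np.log(s_clin) / growth_rate
        t_det = t_clin
        for ts in screens:
            if ts <= t or ts >= t_det:
                continue
            # screening sensitivity rises with tumor size (logistic in mm)
            sens = 1.0 / (1.0 + np.exp(-(size_at(ts) - 10.0) / 3.0))
            if rng.random() < sens:
                t_det = ts
                break
        times_to_detection.append(t_det - t)
    return np.array(times_to_detection)
```

Comparing the mean of `simulate_detection(..., screen_interval=1.0)` with a run whose interval exceeds the study length reproduces, qualitatively, the abstract's "reduction in time to detection due to screening".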

Estimating Time-Continuous Gene Expression Profiles Using the Linear Mixed Effects Framework

Page, Christian Magnus January 2012 (has links)
With the first generation of microarray experiments there were discussions and important arguments about how the samples should be treated, including what kind of transformation and normalization procedures the data should be subjected to before the actual analysis. With the new generation of microarray experiments, genome-wide association studies, and more complicated experimental designs, new questions and standards arise. An important question is the appropriate use of controls, and what effect these controls have on the assessed behaviour of the genes.

The main objective of this thesis was to analyse a data set and experimental procedure from an experiment done at IKM (NTNU, 2009), where the effect of a gastrin treatment was measured on a set of genes, along with an unstimulated control sample, over a time interval. The experiment was replicated once, giving two independent experiments. A natural extension of this is then: what is the gain in precision from adding additional replicas to the experiment? And what additional information is given by an unstimulated control sample?

Our approach was to use the Linear Mixed Effects (LME) framework to fit a regression curve to each gene, both for the treated time series itself and for the treated series adjusted by the unstimulated control sample. Each replication was assumed to have a random offset from the common mean. The mean was modelled using a basis expansion in the Legendre polynomials, thus allowing it to vary as a smooth function over the time interval.

A computer simulation showed that an increase in the number of independent time series sampled would decrease the error in the estimated expression profile, even when this causes the number of time points (measurements) within each time series to decrease. The analysis of the data showed that not using an unstimulated control gave many false positive results; however, always using such a control will also cause an increase in both false negative and false positive results, due to the increase in stochasticity. Nevertheless, an unstimulated control sample gives the researcher increased control when assessing the effect of the treatment.
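A simplified version of the modelling idea, a common mean expanded in Legendre polynomials plus a per-replicate offset, can be sketched with ordinary least squares. The thesis treats the offsets as random effects within the LME framework; this sketch fits them as fixed replicate-specific intercepts, which is a deliberate simplification.

```python
import numpy as np

def fit_profile(times, values, replicate, degree=3):
    """Fit a smooth expression profile over time: the common mean is a
    Legendre polynomial expansion, and each replicate beyond the first
    gets an offset column. Returns a callable evaluating the mean curve."""
    t = np.asarray(times, dtype=float)
    # map the observed time interval onto [-1, 1], the Legendre domain
    u = 2 * (t - t.min()) / (t.max() - t.min()) - 1
    basis = np.polynomial.legendre.legvander(u, degree)   # shape (n, degree+1)
    reps = np.unique(replicate)
    if len(reps) > 1:
        # indicator column per replicate beyond the first (its offset)
        offsets = np.column_stack(
            [(replicate == r).astype(float) for r in reps[1:]])
        X = np.hstack([basis, offsets])
    else:
        X = basis
    coef, *_ = np.linalg.lstsq(X, np.asarray(values, float), rcond=None)
    mean_coef = coef[:degree + 1]

    def mean_curve(t_new):
        u_new = 2 * (np.asarray(t_new, float) - t.min()) / (t.max() - t.min()) - 1
        return np.polynomial.legendre.legval(u_new, mean_coef)

    return mean_curve
```

Because the Legendre polynomials up to `degree` span all polynomials of that degree, a smooth trend of that order is recovered exactly in the noise-free case.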

Extreme Value Analysis & Application of the ACER Method on Electricity Prices

Anda, Torgeir January 2012 (has links)
In this thesis we have explored the very high prices that sometimes occur in the Nord Pool electricity market Elspot. By applying AR-GARCH time series models, extreme value theory, and ACER estimation techniques, we have sought to estimate the probabilities of threshold exceedances for electricity prices. Of particular interest was the heavy-tailed Fréchet distribution, which was the asymptotic distribution assumed in the ACER estimation. We have found that with extreme value theory we are better equipped to deal with the very high quantiles of the time series we have analyzed. We have also described a method that can assess the probability of exceeding a selected level in the electricity price.
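The empirical starting point of the ACER approach, the average conditional exceedance rate at a given threshold, can be sketched as follows. This is only the bare empirical estimate: the full method additionally fits a parametric tail (of the Fréchet type assumed in the abstract) to these rates across thresholds and extrapolates to extreme levels, which is not shown here.

```python
import numpy as np

def acer(x, eta, k=2):
    """Empirical average conditional exceedance rate of order k: the rate
    at which the series x exceeds the threshold eta, given that the k-1
    preceding values did not exceed it. For k=1 this reduces to the plain
    exceedance rate; higher k discounts clustered exceedances."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    exceed = x > eta
    if k == 1:
        return exceed.mean()
    count = 0
    for j in range(k - 1, n):
        # an exceedance at j counts only if the k-1 previous values did
        # not exceed the threshold
        if exceed[j] and not exceed[j - k + 1:j].any():
            count += 1
    return count / (n - k + 1)
```

For independent data the k=2 rate is close to the k=1 rate times the probability of a non-exceedance, while for clustered series (such as GARCH-type price spikes) it drops further, which is what makes the conditioning informative.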
