  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Analyzing Survey Response Time and Response Rate for Colorectal Cancer Patients Using Logistic and Poisson Regression / Analys av svarstid och svarsfrekvens för patienter med kolorektal cancer med hjälp av regression

Möller, Anna, Lagerros, Martina January 2023 (has links)
Cancer is a highly prevalent disease worldwide, claiming millions of lives each year. In cancer research it is customary to conduct surveys in which patients self-report and assess their symptoms and overall health. In such research it is essential that patients respond promptly to questionnaires, to avoid recall bias, and that a representative patient sample responds, to avoid sampling bias. This report investigates the factors that affect response rate and response time using logistic regression and Poisson regression. The study focuses on a dataset of patients with colorectal cancer, with the response rate of patients with pancreatic cancer serving as a reference. By analyzing variables such as gender, age, place of residence, and the method of survey notification, the conclusion is that patients over the age of 80 who received their survey login codes on paper are the least responsive and most underrepresented subgroup of the sample. The Poisson-regression analysis of response time indicates that the notification channel has the greatest impact.
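The modeling approach the abstract describes — logistic regression for whether a patient responds, Poisson regression for how long the response takes — can be sketched as follows. This is an illustrative sketch on simulated data: the covariates, effect sizes, and Newton-Raphson fitting loop are assumptions for illustration, not the thesis's actual patient data or code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
over_80 = rng.integers(0, 2, n)     # 1 = patient over 80 (hypothetical covariate)
paper = rng.integers(0, 2, n)       # 1 = login code sent on paper

# hypothetical true model: both factors lower the response probability
lin = 1.0 - 0.8 * over_80 - 1.2 * paper
responded = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)
X = np.column_stack([np.ones(n), over_80, paper])

# logistic regression for response rate, fitted by Newton-Raphson
beta = np.zeros(3)
for _ in range(25):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += np.linalg.solve((X * (p * (1 - p))[:, None]).T @ X,
                            X.T @ (responded - p))

# Poisson regression for response time in days (paper notification slower)
days = rng.poisson(np.exp(1.5 + 0.4 * paper))
gamma = np.zeros(3)
for _ in range(25):
    mu = np.exp(X @ gamma)
    gamma += np.linalg.solve((X * mu[:, None]).T @ X, X.T @ (days - mu))
```

With these simulated effects, the fitted coefficients for age over 80 and paper notification come out negative in the logistic model (lower response odds) and the paper coefficient comes out positive in the Poisson model (longer response time), mirroring the direction of the thesis's conclusions.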

Evaluating volatility forecasts, A study in the performance of volatility forecasting methods / Utvärdering av volatilitetsprognoser, En undersökning av kvaliteten av metoder för volatilitetsprognostisering

Verhage, Billy January 2023 (has links)
In this thesis, the foundations of evaluating the performance of volatility forecasting methods are explored, and a mathematical framework is created to determine overall forecasting performance based on observed daily returns across multiple financial instruments. Multiple volatility responses are investigated, and theoretical corrections are derived under the assumption that the log returns follow a normal distribution. Performance measures that are independent of the long-term volatility profile are explored and tested. Well-established volatility forecasting methods, such as moving-average and GARCH(p,q) models, are implemented and validated on multiple volatility responses. The obtained results reveal no significant difference in performance between the moving-average and GARCH(1,1) volatility forecasts. However, the observed non-zero bias and a separate analysis of the distribution of the log returns reveal that the theoretically derived corrections are insufficient when the log returns are not normally distributed. Furthermore, absolute performance depends strongly on the evaluation period considered, suggesting that comparisons between periods should not be made. This study is limited by the fact that the bootstrapped confidence regions are ill-suited for determining significant performance differences between forecasting methods. In future work, statistical significance can be gained by bootstrapping the difference in performance measures. Furthermore, a more in-depth analysis is needed to determine more appropriate theoretical corrections for the volatility responses based on the observed distribution of the log returns. This would increase overall forecasting performance and improve the overall quality of the evaluation framework.
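The core comparison the abstract describes — a moving-average volatility forecast against a GARCH(1,1) recursion, each scored against a realized-volatility proxy — can be sketched on simulated returns. The GARCH parameters, window length, and the use of absolute returns as the proxy are assumptions for illustration, not the thesis's actual evaluation framework.

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, beta = 1e-6, 0.08, 0.90   # hypothetical GARCH(1,1) parameters
n = 2000
r = np.empty(n)
sig2 = np.empty(n)
sig2[0] = omega / (1 - alpha - beta)    # unconditional variance
r[0] = rng.normal(0.0, np.sqrt(sig2[0]))
for t in range(1, n):
    # sig2[t] is the one-step-ahead variance given information up to t-1
    sig2[t] = omega + alpha * r[t - 1] ** 2 + beta * sig2[t - 1]
    r[t] = rng.normal(0.0, np.sqrt(sig2[t]))

# one-step-ahead forecasts: rolling moving average vs. the GARCH recursion
window = 30
ma = np.array([np.sqrt(np.mean(r[t - window:t] ** 2)) for t in range(window, n)])
garch = np.sqrt(sig2[window:])

proxy = np.abs(r[window:])              # noisy realized-volatility proxy
mse_ma = np.mean((ma - proxy) ** 2)
mse_garch = np.mean((garch - proxy) ** 2)
```

As the abstract notes, the noisiness of any volatility proxy is exactly why such point comparisons need a careful statistical treatment, e.g. bootstrapping the difference in performance measures rather than each measure separately.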

Linear Eigenvalue Problems in Quantum Chemistry / Linjära egenvärdesproblem inom kvantkemi

van de Linde, Storm January 2023 (has links)
In this thesis, a method to calculate eigenpairs is implemented for the Multipsi library. While the standard implementations use the Davidson method with Rayleigh-Ritz extraction to calculate the eigenpairs with the lowest eigenvalues, the new method uses the harmonic Davidson method with harmonic Rayleigh-Ritz extraction to calculate eigenpairs with eigenvalues near a chosen target. This is done for Configuration Interaction calculations and for multiconfigurational methods. The calculations suggest that the new addition to the Multipsi library is worth investigating further, as convergence was improved for difficult systems with many near-degeneracies.
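The difference between the two extractions can be illustrated on a small dense problem: standard Rayleigh-Ritz takes the eigenvalues of the projected matrix, while harmonic Rayleigh-Ritz solves a small generalized eigenproblem built around a target shift tau, which favors interior eigenvalues near the target. The matrix, subspace, and target below are toy assumptions, unrelated to Multipsi's implementation.

```python
import numpy as np
import scipy.linalg as sla

rng = np.random.default_rng(2)
n = 200
A = np.diag(np.linspace(0.0, 10.0, n))   # symmetric toy matrix, spectrum in [0, 10]
tau = 5.0                                # target: interior eigenvalues near 5

V, _ = np.linalg.qr(rng.normal(size=(n, 20)))   # orthonormal trial subspace

# standard Rayleigh-Ritz: eigenvalues of the projected matrix V^T A V
ritz = np.linalg.eigvalsh(V.T @ A @ V)

# harmonic Rayleigh-Ritz w.r.t. tau: with W = (A - tau*I) V, solve
#   (W^T W) y = (theta - tau) (W^T V) y,  harmonic Ritz values = tau + (theta - tau)
W = (A - tau * np.eye(n)) @ V
shifts = sla.eigvals(W.T @ W, W.T @ V)
harmonic = tau + np.sort(shifts.real)
```

For a symmetric matrix the standard Ritz values always lie inside the spectral interval but tend to approximate the extreme eigenvalues first; the harmonic variant biases the extraction toward the chosen target, which is the behavior the harmonic Davidson method exploits.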

Numerical Algorithms for Optimization Problems in Genetical Analysis

Mishchenko, Kateryna January 2008 (has links)
The focus of this thesis is on numerical algorithms for the efficient solution of the QTL analysis problem in genetics.

Firstly, we consider QTL mapping problems where a standard least-squares model is used for computing the model fit. We develop optimization methods for the local problems in a hybrid global-local optimization scheme for determining the optimal set of QTL locations. Here, the local problems have constant bound constraints and may be non-convex and/or flat in one or more directions. We propose an enhanced quasi-Newton method and also implement several schemes for constrained optimization. The algorithms are adapted to the QTL optimization problems. We show that it is possible to use the new schemes to solve problems with up to 6 QTLs efficiently and accurately, and that the work is reduced by up to two orders of magnitude compared to using only global optimization.

Secondly, we study numerical methods for QTL mapping where variance component estimation and a REML model are used. This results in a non-linear optimization problem for computing the model fit for each set of QTL locations. Here, we compare different optimization schemes and adapt them to the specifics of the problem. The results show that our version of the active set method is efficient and robust, which is not the case for methods used earlier. We also study the matrix operations performed inside the optimization loop, and develop more efficient algorithms for the REML computations. We develop a scheme for reducing the number of objective function evaluations, and we accelerate the computation of the derivatives of the log-likelihood by introducing an efficient scheme for computing the inverse of the variance-covariance matrix and other components of those derivatives.
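The local subproblems described above — smooth, box-constrained, and possibly nearly flat in some directions — are exactly the setting for a bound-constrained quasi-Newton solver. A minimal sketch with SciPy's L-BFGS-B follows; the objective is a hypothetical stand-in for a QTL model-fit criterion (including one nearly flat direction), not the thesis's actual least-squares objective or enhanced method.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # hypothetical stand-in for a local QTL model-fit criterion:
    # smooth, box-constrained, nearly flat in the third coordinate
    return (x[0] - 0.3) ** 2 + 0.5 * (x[1] - 0.7) ** 2 + 1e-4 * x[2] ** 2

bounds = [(0.0, 1.0)] * 3   # QTL positions scaled to [0, 1] per chromosome
res = minimize(objective, x0=np.array([0.9, 0.1, 0.5]),
               method="L-BFGS-B", bounds=bounds)
```

In a hybrid global-local scheme, a solver like this would be called many times from candidate points proposed by the global search, which is why efficiency of the local step dominates the overall cost.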

Modern Stereo Correspondence Algorithms : Investigation and Evaluation

Olofsson, Anders January 2010 (has links)
Many different approaches have been taken towards solving the stereo correspondence problem, and great progress has been made within the field during the last decade. This is mainly thanks to newly evolved global optimization techniques and better ways to compute pixel dissimilarity between views. The most successful algorithms are based on approaches that explicitly model smoothness assumptions made about the physical world, with image segmentation and plane fitting being two frequently used techniques.

Within the project, a survey of state-of-the-art stereo algorithms was conducted and the theory behind them is explained. Techniques found interesting were implemented for experimental trials, and an algorithm aiming for state-of-the-art performance was implemented and evaluated. For several cases, state-of-the-art performance was reached.

To keep down the computational complexity, an algorithm relying on local winner-take-all optimization, image segmentation and plane fitting was compared against minimizing a global energy function formulated at the pixel level. Experiments show that the local approach can match the global approach in several cases, but that problems sometimes arise, especially when large areas that lack texture are present. Such problematic areas are better handled by the explicit modeling of smoothness in global energy minimization.

Lastly, disparity estimation for image sequences was explored, and some ideas on how to use temporal information were implemented and tried. The ideas mainly relied on motion detection to determine which parts of a sequence of frames are static. Stereo correspondence for sequences is a rather new research field, and there is still a lot of work to be done.
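The local winner-take-all baseline mentioned above can be written in a few lines: build a SAD cost volume over candidate disparities, aggregate it with a box filter, and take the per-pixel argmin. The window size, fill cost, and border handling below are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def wta_disparity(left, right, max_disp, radius=2):
    """Local winner-take-all stereo: per pixel, pick the disparity whose
    box-filtered sum-of-absolute-differences (SAD) cost is smallest."""
    h, w = left.shape
    cost = np.full((max_disp + 1, h, w), 255.0)  # large cost where undefined
    for d in range(max_disp + 1):
        diff = np.abs(left[:, d:] - right[:, :w - d])
        cost[d, :, d:] = uniform_filter(diff, size=2 * radius + 1)
    return np.argmin(cost, axis=0)

# demo: a right view shifted by 3 pixels should yield disparity 3
rng = np.random.default_rng(6)
right_img = rng.random((40, 60))
left_img = np.empty_like(right_img)
left_img[:, 3:] = right_img[:, :-3]   # true disparity of 3 everywhere
left_img[:, :3] = rng.random((40, 3))
disp = wta_disparity(left_img, right_img, max_disp=8)
```

On richly textured input like this random image the local argmin is unambiguous; the abstract's observation is precisely that this breaks down in large textureless areas, where global smoothness terms are needed to disambiguate.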

Fractal sets and dimensions

Leifsson, Patrik January 2006 (has links)
Fractal analysis is an important tool when we need to study geometrical objects less regular than ordinary ones, e.g. a set with a non-integer dimension value. It has developed intensively over the last 30 years, which hints at its young age as a branch of mathematics.

In this thesis we take a look at some basic measure theory needed to introduce certain definitions of fractal dimensions, which can be used to measure a set's fractal degree. These definitions are compared, and we investigate when they coincide. With these tools different fractals are studied and compared.

A key idea in this thesis has been to sum up the different names and definitions referring to similar concepts.
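One of the dimension definitions such a thesis typically covers, the box-counting dimension, can be estimated numerically: count the occupied boxes at shrinking scales and fit the slope in log-log space. The sketch below does this for the Sierpinski triangle generated by the chaos game; the point count and scale range are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
# generate points on the Sierpinski triangle with the chaos game:
# repeatedly jump halfway toward a randomly chosen triangle vertex
verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
p = np.array([0.1, 0.1])
pts = np.empty((200_000, 2))
for i in range(len(pts)):
    p = (p + verts[rng.integers(3)]) / 2
    pts[i] = p

# box counting: occupied grid cells at scales 1/4 ... 1/128,
# then a least-squares slope of log N(eps) against log(1/eps)
scales = 2 ** np.arange(2, 8)
counts = [len(np.unique(np.floor(pts * s).astype(int), axis=0)) for s in scales]
slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
# theoretical box-counting dimension of the Sierpinski triangle: log 3 / log 2 ≈ 1.585
```

The fitted slope lands close to log 3 / log 2, the value at which the box-counting and Hausdorff dimensions of this self-similar set coincide.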

How useful are intraday data in Risk Management? : An application of high frequency stock returns of three Nordic Banks to the VaR and ES calculation

Somnicki, Emil, Ostrowski, Krzysztof January 2010 (has links)
The work is focused on the Value at Risk and Expected Shortfall calculation. We assume the returns to be based on two pillars: white noise and stochastic volatility. We assume that the white noise follows the NIG distribution and that the volatility is modeled using nGARCH, NIG-GARCH, tGARCH or a non-parametric method. We apply the models to the stocks of three banks on the Nordic market. We consider daily and intraday returns with frequencies of 5, 10, 20 and 30 minutes, and calculate the one-step-ahead VaR and ES for both. We use the Kupiec test and the Markov test to assess the correctness of the models. We also provide a new concept for improving the daily VaR calculation by using high-frequency returns. The results show that intraday data can be used for the one-step-ahead VaR and ES calculation. Comparing the VaR for the end of the following trading day calculated from daily returns with the one computed from high-frequency returns shows that using intraday data can improve the VaR outcomes.
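The building blocks the abstract names — VaR, ES, and the Kupiec unconditional-coverage test on VaR exceedances — can be sketched on simulated heavy-tailed returns. The historical (empirical-quantile) estimator and Student-t returns below are simplifying assumptions; the thesis itself uses NIG innovations and GARCH-family volatility models.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=1000) * 0.01  # heavy-tailed stand-in returns

alpha = 0.01
# historical one-day VaR and ES at the 99% level (losses reported as positives)
var99 = -np.quantile(returns, alpha)
es99 = -returns[returns <= -var99].mean()

# Kupiec test: is the exceedance frequency consistent with alpha?
exceed = int(np.sum(returns < -var99))
n = len(returns)
pi = exceed / n
lr = -2 * (np.log((1 - alpha) ** (n - exceed) * alpha ** exceed)
           - np.log((1 - pi) ** (n - exceed) * pi ** exceed))
p_value = 1 - stats.chi2.cdf(lr, df=1)
```

In-sample the historical VaR passes its own Kupiec test by construction; the thesis's interesting case is out-of-sample backtesting, where the one-step-ahead forecasts from daily versus intraday data can be compared on equal terms.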

Revision Moment for the Retail Decision-Making System

Juszczuk, Agnieszka Beata, Tkacheva, Evgeniya January 2010 (has links)
In this work we address the problems of loan-origination decision-making systems. In accordance with the basic principles of the loan origination process, we consider the main rules of estimating a client's parameters, a change-point problem for the given data, and a disorder-moment detection problem for real-time observations. In the first part of the work the main principles of parameter estimation are given, and the change-point problem is considered for the given sample in discrete and continuous time using the maximum likelihood method. In the second part, the disorder-moment detection problem for real-time observations is treated as a disorder problem for a non-homogeneous Poisson process. The corresponding optimal stopping problem is reduced to a free-boundary problem with a complete analytical solution for the case when the intensity of defaults increases. Thereafter a scheme for real-time detection of the disorder moment is given.
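The offline, discrete-time half of this setup — maximum-likelihood change-point detection for Poisson-distributed counts — can be sketched in a few lines: for each candidate split, the MLE intensities are the segment means, and the estimated disorder moment maximizes the total log-likelihood. The intensities, jump time, and count data below are simulated assumptions, not the thesis's credit data.

```python
import numpy as np

rng = np.random.default_rng(5)
# hypothetical daily default counts: intensity jumps from 2 to 5 at day 60
counts = np.concatenate([rng.poisson(2.0, 60), rng.poisson(5.0, 40)])

def poisson_changepoint_mle(x):
    """Maximum-likelihood estimate of the change point of a Poisson sample:
    for each split k, plug in the segment means as intensities and keep
    the k maximizing the log-likelihood (up to the constant sum of log x!)."""
    n = len(x)
    best_k, best_ll = None, -np.inf
    for k in range(1, n):
        ll = 0.0
        for seg in (x[:k], x[k:]):
            lam = max(seg.mean(), 1e-12)   # guard against an all-zero segment
            ll += np.sum(seg * np.log(lam) - lam)
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k

k_hat = poisson_changepoint_mle(counts)
```

The sequential, real-time version the thesis studies replaces this retrospective scan with an optimal stopping rule for the non-homogeneous Poisson process, obtained from the free-boundary problem described above.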
