271

Numerical Investigation on Spherical Harmonic Synthesis and Analysis

Bärlund, Johnny January 2015 (has links)
In this thesis work the accuracy of spherical harmonic synthesis and analysis is investigated by simulated numerical studies. The main idea is to investigate the loss of accuracy in the geopotential coefficients by the following testing method. We start with a synthesis calculation, using the coefficients (EGM2008), to calculate geoid heights on a regular grid. Those geoid heights are then used in an analysis calculation to obtain a new set of coefficients, which are in turn used to derive a new set of geoid heights. The difference between these two sets of geoid heights is analyzed to assess the accuracy of the synthesis and analysis calculations. The tests are conducted with both point values and area means in the blocks of the grid. The area means are constructed in several different ways and are also compared, in separate tests, with the mean value of 10000 point values. Numerical results from this investigation show that there are significant systematic errors in the geoid heights computed by spherical harmonic synthesis and analysis, sometimes reaching several meters. These large errors are most common in the polar regions and in the mid-latitude regions.
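A minimal sketch of this round-trip test is given below, assuming the pyshtools library (the thesis does not specify its software) and a random stand-in coefficient set instead of EGM2008.

```python
# Synthesis -> analysis -> synthesis round trip; pyshtools is an assumed
# dependency, and the random coefficients stand in for EGM2008.
import numpy as np
import pyshtools as pysh

lmax = 180  # illustrative truncation degree (EGM2008 extends much higher)

# Stand-in coefficients with a flat power spectrum; a real test would load EGM2008.
coeffs = pysh.SHCoeffs.from_random(np.ones(lmax + 1))

grid1 = coeffs.expand(grid='DH')    # synthesis: coefficients -> values on a regular grid
coeffs2 = grid1.expand()            # analysis: grid values -> new coefficient set
grid2 = coeffs2.expand(grid='DH')   # second synthesis: new coefficients -> new grid

# The difference between the two grids measures the accumulated loss of accuracy.
diff = grid1.data - grid2.data
print('max abs difference:', np.abs(diff).max())
```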
272

Road Estimation Using GPS Traces and Real Time Kinematic Data

Ghanbarynamin, Samira 29 April 2022 (has links)
Advanced Driver Assistance Systems (ADAS) are becoming a central concern of today's automotive industry. The new generation of ADAS aims at capturing more detail and achieving higher accuracy. To achieve this objective, the research and development arms of the automotive industry intend to utilize the Global Positioning System (GPS) by integrating it with other existing tools in ADAS. Several driving assistance systems use a digital map as a primary or secondary sensor. The traditional techniques of digital map generation are expensive, time consuming and require extensive manual effort; keeping maps frequently updated is therefore an issue. Furthermore, the existing commercial digital maps are not highly accurate. This Master thesis presents several algorithms for automatically converting raw Universal Serial Bus (USB)-GPS and Real Time Kinematic (RTK) GPS traces into a routable road network. The traces are gathered by driving 20 times on a highway. The work begins by pruning the raw GPS traces using four different algorithms; this first step aims to minimize the number of outliers. After the traces are smoothed, they tend to consolidate into smooth paths, so a Trace Merging algorithm is applied to merge all 20 trips and estimate the road network. Finally, a Non-Uniform Rational B-Spline (NURBS) curve is fitted as an approximating curve to smooth the road shape and further reduce the effect of noisy data. Since the RTK-GPS receiver provides highly accurate data, the curve derived from its traces is the most faithful representation of the road shape; it is therefore used as ground truth to compare the results of each pruning algorithm applied to the USB-GPS data. Lastly, the results of this work are demonstrated and a quality evaluation is carried out for all methods.
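As an illustration of the final smoothing step, the sketch below fits a smoothing B-spline with scipy as a stand-in for the NURBS approximation used in the thesis; the trace coordinates are hypothetical.

```python
# Smooth a merged GPS trace with a B-spline approximation (a stand-in for
# the NURBS curve described in the thesis); coordinates are hypothetical.
import numpy as np
from scipy.interpolate import splprep, splev

# Merged trace: easting/northing in metres (made-up sample points).
trace = np.array([[0.0, 0.0], [10.2, 1.1], [20.5, 1.9], [30.1, 3.2], [40.3, 3.9]])

# Fit a cubic smoothing spline; a larger s tolerates more noise in the raw trace.
tck, u = splprep([trace[:, 0], trace[:, 1]], s=5.0, k=3)

# Evaluate the smoothed road centreline on a dense parameter grid.
u_fine = np.linspace(0, 1, 200)
x_s, y_s = splev(u_fine, tck)
print(x_s[:3], y_s[:3])
```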
273

Hierarchical Approximation Methods for Option Pricing and Stochastic Reaction Networks

Ben Hammouda, Chiheb 22 July 2020 (has links)
In biochemically reactive systems with small copy numbers of one or more reactant molecules, stochastic effects dominate the dynamics. In the first part of this thesis, we design novel efficient simulation techniques for a reliable and fast estimation of various statistical quantities for stochastic biological and chemical systems under the framework of Stochastic Reaction Networks. In the first work, we propose a novel hybrid multilevel Monte Carlo (MLMC) estimator for systems characterized by having simultaneously fast and slow timescales. Our hybrid multilevel estimator uses a novel split-step implicit tau-leap scheme at the coarse levels, where the explicit tau-leap method is not applicable due to numerical instability issues. In a second work, we address another challenge present in this context, the high kurtosis phenomenon observed at the deep levels of the MLMC estimator. We propose a novel approach that combines the MLMC method with a pathwise-dependent importance sampling technique for simulating the coupled paths. Our theoretical estimates and numerical analysis show that our method improves the robustness and complexity of the multilevel estimator, with a negligible additional cost. In the second part of this thesis, we design novel methods for pricing financial derivatives. Option pricing is usually challenging due to: 1) the high dimensionality of the input space, and 2) the low regularity of the integrand with respect to the input parameters. We address these challenges by developing different techniques for smoothing the integrand to uncover the available regularity. We then approximate the resulting integrals using hierarchical quadrature methods combined with Brownian bridge construction and Richardson extrapolation. In the first work, we apply our approach to efficiently price options under the rough Bergomi model. This model exhibits several numerical and theoretical challenges, which make classical pricing methods either inapplicable or computationally expensive. In a second work, we design a numerical smoothing technique for cases where analytic smoothing is impossible. Our analysis shows that adaptive sparse-grid quadrature combined with numerical smoothing outperforms the Monte Carlo approach. Furthermore, our numerical smoothing improves the robustness and the complexity of the MLMC estimator, particularly when estimating density functions.
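For illustration, a minimal sketch of the telescoping MLMC estimator is shown below; the `sampler` interface and the toy level-difference distribution are hypothetical stand-ins for the coupled tau-leap paths used in the thesis.

```python
# Telescoping-sum MLMC estimate: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
# `sampler(level, n)` is a hypothetical user-supplied function returning n
# samples of the coupled level difference (and of P_0 at level 0).
import numpy as np

def mlmc_estimate(sampler, samples_per_level):
    estimate = 0.0
    for level, n in enumerate(samples_per_level):
        diffs = sampler(level, n)   # coupled fine/coarse differences at this level
        estimate += np.mean(diffs)
    return estimate

# Toy usage: a dummy sampler whose level differences shrink geometrically.
rng = np.random.default_rng(0)
dummy = lambda level, n: rng.normal(loc=2.0 ** -level, scale=2.0 ** -level, size=n)
print(mlmc_estimate(dummy, samples_per_level=[4000, 2000, 1000, 500, 250]))
```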
274

Generalizace vrstevnic v rovinatých územích / Simplification of contour lines in flat areas

Čelonk, Marek January 2021 (has links)
The diploma thesis is focused on the cartographic generalization of contour lines derived from dense point clouds in flat territories, where the original contour lines tend to oscillate. The main aim is to propose, develop and test a new algorithm for contour simplification that preserves the given vertical error and reflects cartographic rules. Three methods designed for large-scale maps (1 : 10 000 and larger) are presented: the weighted average, a modified Douglas-Peucker, and a potential-based approach. The most promising method is based on the repeated simplification of contour line segments by calculating the generalization potential of their vertices. The algorithm, implemented in Python 2.7 using the Arcpy library, was tested on DMR5G data, and the simplified contour lines were compared with the result created by a professional cartographer. The achieved results are presented on the attached topographic maps. Keywords: contours, cartographic generalization, digital cartography, vertical buffer, smoothing, GIS
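For reference, a minimal sketch of the classic Douglas-Peucker simplification that the thesis modifies is given below; the tolerance and the contour coordinates are purely illustrative.

```python
# Classic Douglas-Peucker line simplification; the tolerance plays the role
# of the allowed planimetric error. Input coordinates are illustrative only.
import numpy as np

def douglas_peucker(points, tolerance):
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.hypot(chord[0], chord[1]) or 1.0
    # Perpendicular distance of every vertex from the chord start-end.
    dists = np.abs(chord[0] * (points[:, 1] - start[1])
                   - chord[1] * (points[:, 0] - start[0])) / norm
    idx = int(np.argmax(dists))
    if dists[idx] <= tolerance:
        return np.vstack([start, end])              # all vertices within tolerance
    left = douglas_peucker(points[: idx + 1], tolerance)
    right = douglas_peucker(points[idx:], tolerance)
    return np.vstack([left[:-1], right])            # drop the duplicated split vertex

# Example: simplify an oscillating contour segment with a 0.5 unit tolerance.
contour = [(0, 0), (1, 0.2), (2, -0.1), (3, 0.4), (4, 0)]
print(douglas_peucker(contour, 0.5))
```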
275

Investigating Surface Oxide Composition and Formation on Metallic Vibenite® Alloys

Monie, Emil, Säfström, Nils, Deng, Yiping, Möllerberg, Axel January 2022 (has links)
Oxide formation on metallic surfaces is a common phenomenon which occurs naturally or intentionally. Depending on the oxide and its effects, it can be viewed as either a nuisance or a convenience. Formed oxides may also smooth the surface of a metallic alloy, since a portion of the surface in contact with oxygen is converted into oxide via the metal-oxygen interaction, leaving a smoother surface underneath the formed oxide. It was found that oxide formation was most significant when metallic Vibenite® alloys were treated at 1000°C for a minimum of 3 hours with an oxygen flow into the oven of 10 L/min. This indicates the importance of a minimum temperature as well as an increased oxygen pressure within the oven in which the samples are treated, which concurs with various studies referred to in the report. The oxides were also identified, with partial success, using analysis methods such as XPS, XRD and Raman spectroscopy, with supporting evidence from simulated Thermo-Calc approximations. The post-treatment surfaces of the samples, after having their oxide layers removed, were confirmed to have undergone surface smoothing using the optical analysis method VSI. The results of this report indicate that the oxide formation technique is valid for surface smoothing and strongly suggest further study of material-optimised heat treatments for different metallic alloys with the purpose of surface refinement.
276

Forecasting Monthly Swedish Air Traveler Volumes

Becker, Mark, Jarvis, Peter January 2023 (has links)
In this paper we conduct an out-of-sample forecasting exercise for monthly Swedish air traveler volumes. The models considered are multiplicative seasonal ARIMA, neural network autoregression, exponential smoothing, the Prophet model and a Random Walk as a benchmark. We divide the out-of-sample data into three evaluation periods: pre-COVID-19, during COVID-19 and post-COVID-19, and calculate the MAE, MAPE and RMSE for each model in each period. The results show that for the pre-COVID-19 period all models produce accurate forecasts relative to the Random Walk benchmark. For the period during COVID-19, no model outperforms the Random Walk, with only exponential smoothing performing as well as it. For the post-COVID-19 period, the best performing models are the Random Walk, SARIMA and exponential smoothing, all of which show similar performance.
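As a hedged sketch of one of the compared benchmarks, the code below fits a seasonal exponential smoothing model with statsmodels and computes the three error metrics; the monthly series is simulated rather than the actual Swedish traveler data.

```python
# Seasonal exponential smoothing benchmark with MAE/RMSE/MAPE evaluation;
# the monthly series below is simulated, not the Swedish traveler volumes.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

idx = pd.date_range('2015-01-01', periods=96, freq='MS')
rng = np.random.default_rng(1)
trend = 1000 + 5 * np.arange(96)
season = 100 * np.sin(2 * np.pi * np.arange(96) / 12)
y = pd.Series(trend + season + rng.normal(0, 20, 96), index=idx)

train, test = y[:-12], y[-12:]
fit = ExponentialSmoothing(train, trend='add', seasonal='add', seasonal_periods=12).fit()
forecast = fit.forecast(12)

err = test.values - forecast.values
mae = np.mean(np.abs(err))
rmse = np.sqrt(np.mean(err ** 2))
mape = np.mean(np.abs(err / test.values)) * 100
print(f'MAE={mae:.1f}  RMSE={rmse:.1f}  MAPE={mape:.2f}%')
```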
277

Size Function Based Mesh Relaxation

Howlett, John David 18 March 2005 (has links) (PDF)
This thesis addresses the problem of relaxing a finite element mesh to more closely match a size function. The main contributions include new methods for performing size function based mesh relaxation, as well as an algorithm for measuring the performance of size function based mesh relaxation methods.
278

[en] A NOVEL SEMIPARAMETRIC STRUCTURAL MODEL FOR ELECTRICITY FORWARD CURVES / [pt] MODELO ESTRUTURAL SEMI-PARAMÉTRICO PARA CURVAS FORWARD DE ELETRICIDADE

MARINA DIETZE MONTEIRO 23 February 2021 (has links)
Hedging against spot price volatility becomes increasingly important in deregulated power markets, so being able to model electricity forward prices is crucial in a competitive environment. Electricity differs from other commodities due to its limited storability and transportability. Furthermore, its derivatives are associated with a delivery period during which electricity is continuously delivered, which is why power forwards are often referred to as swaps. These peculiarities make the modeling of electricity contract prices a non-trivial task, where traditional models must be adapted to address the characteristics mentioned. In this context, we propose a novel semiparametric structural model to compute a continuous daily forward curve of electricity through a maximum smoothness criterion. In addition, elementary forward contracts can be represented by any parametric structure for seasonality or even for exogenous variables. Our framework acknowledges the overlapping swaps and allows an analysis of arbitrage opportunities observed in power markets. The smooth forward curve is computed by a hierarchical optimization problem able to handle scarce data sets from low-liquidity markets. PCA results corroborate our framework's capability to explain a high percentage of the variance with only a few factors.
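A much-simplified discrete illustration of the maximum smoothness idea is sketched below: a daily forward curve minimises its squared second differences subject to reproducing the average price of each swap over its delivery window. The contract data are hypothetical and the single quadratic program is not the hierarchical formulation proposed in the thesis.

```python
# Simplified maximum-smoothness forward curve: minimise the squared second
# differences of the daily curve f subject to each swap's delivery-period
# average matching its observed price; solved via the KKT linear system.
# Contract windows and prices below are hypothetical.
import numpy as np

n_days = 90
swaps = [(0, 30, 52.0), (0, 60, 50.0), (30, 90, 51.5)]  # (start day, end day, swap price)

# Second-difference operator D penalising curvature of the daily curve.
D = np.zeros((n_days - 2, n_days))
for i in range(n_days - 2):
    D[i, i:i + 3] = [1.0, -2.0, 1.0]

# Averaging constraints A f = p, one row per (possibly overlapping) swap.
A = np.zeros((len(swaps), n_days))
p = np.zeros(len(swaps))
for k, (a, b, price) in enumerate(swaps):
    A[k, a:b] = 1.0 / (b - a)
    p[k] = price

# KKT system for: minimise f' D'D f  subject to  A f = p.
KKT = np.block([[2.0 * D.T @ D, A.T], [A, np.zeros((len(swaps), len(swaps)))]])
rhs = np.concatenate([np.zeros(n_days), p])
forward_curve = np.linalg.solve(KKT, rhs)[:n_days]

# Sanity check: the delivery-period averages reproduce the swap prices.
print(A @ forward_curve)
```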
279

Manipulering av bildhastighet och dess känslomässiga påverkan på tittarupplevelse vid olika format / Manipulation of frame rate and its emotional effect on viewer perception in different formats

O'Grady, William, Währme, Emil January 2023 (has links)
Frame rate is a fundamental element in creating the illusion of movement in video-based media. For almost a century, film has been produced at a standard frame rate of 24 frames per second, originally established due to technical limitations. This number lives on for films today, despite many technological innovations and other video-based media formats straying from this standard. With contemporary video technology, content can not only be recorded at higher frame rates; frames can also be artificially interpolated. So-called Frame Interpolation technology now comes as a pre-installed feature on most televisions. As a consequence, a debate has formed on how video-based media should be presented, not least when frames are artificially generated outside the creators' control. This study therefore aims to explore how manipulation of a video clip's frame rate influences the viewer experience, and thereby whether the use of Frame Interpolation technology in televisions is justified. A study was conducted in which participants were shown video clips at their original frame rate and compared them with artificially manipulated copies. The results showed that there is no single frame rate preferred by all participants and that some participants did not perceive any difference at all. They also show that artificial manipulation of frame rate is generally not appreciated, and that the criticism tends to be directed at the wrong content. It is therefore discussed whether television manufacturers should reconsider the use of Frame Interpolation technology. Lastly, we note that the accuracy of these results is limited by the scope of the study; further exploration of the subject is suggested to build on the results found here and in earlier papers.
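For readers unfamiliar with the technique under debate, the sketch below shows the simplest possible form of frame interpolation, plain blending of two frames; televisions use far more sophisticated motion-compensated methods, and the frames here are hypothetical.

```python
# Naive frame interpolation by blending two consecutive frames; real TV
# "motion smoothing" uses motion-compensated interpolation, so this only
# illustrates the idea of artificially generated in-between frames.
import numpy as np

def blend_frames(frame_a, frame_b, alpha=0.5):
    """Return an intermediate frame as a weighted average of two frames."""
    mix = (1 - alpha) * frame_a.astype(float) + alpha * frame_b.astype(float)
    return mix.astype(np.uint8)

# Two hypothetical 4x4 grayscale frames.
f0 = np.zeros((4, 4), dtype=np.uint8)
f1 = np.full((4, 4), 200, dtype=np.uint8)

# Inserting the blended frame between f0 and f1 doubles 24 fps footage to 48 fps.
middle = blend_frames(f0, f1)
print(middle)
```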
280

Variance Change Point Detection under A Smoothly-changing Mean Trend with Application to Liver Procurement

Gao, Zhenguo 23 February 2018 (has links)
Literature on change point analysis mostly requires a sudden change in the data distribution, either in a few parameters or in the distribution as a whole. We are interested in the scenario where the variance of the data may make a significant jump while the mean changes in a smooth fashion. It is motivated by a liver procurement experiment with organ surface temperature monitoring. Blindly applying existing change point analysis methods to this example can yield erratic change point estimates, since the smoothly changing mean violates the sudden-change assumption. In this dissertation, we propose a penalized weighted least squares approach with an iterative estimation procedure that naturally integrates variance change point detection and smooth mean function estimation. Given the variance components, the mean function is estimated by smoothing splines as the minimizer of the penalized weighted least squares. Given the mean function, we propose a likelihood ratio test statistic for identifying the variance change point. The null distribution of the test statistic is derived together with the rates of convergence of all the parameter estimates. Simulations show excellent performance of the proposed method. Application analysis offers numerical support to the non-invasive organ viability assessment by surface temperature monitoring. The method above can only yield the variance change point of temperature at a single point on the organ surface at a time. In practice, an organ is often transplanted as a whole or in part, so it is generally of more interest to study the variance change point for a chunk of the organ. With this motivation, we extend our method to study the variance change point for a chunk of the organ surface. Now the variances become functions on a 2D space of locations (longitude and latitude) and the mean is a function on a 3D space of location and time. We model the variance functions by thin-plate splines and the mean function by the tensor product of thin-plate splines and cubic splines. However, the additional dimensions in these functions incur serious computational problems, since the sample size, as the product of the number of locations and the number of sampling time points, becomes too large to run the standard multi-dimensional spline models. To overcome the computational hurdle, we introduce a multi-stage subsampling strategy into our modified iterative algorithm. The strategy involves several down-sampling or subsampling steps informed by preliminary statistical measures. We carry out extensive simulations to show that the new method can efficiently cut down the computational cost and make a practically unsolvable problem solvable in reasonable time with satisfactory parameter estimates. Application of the new method to the liver surface temperature monitoring data shows its effectiveness in providing accurate status change information for a portion of or the whole organ. / Ph. D. / Viability evaluation is the key issue in organ transplant operations. The donated organ must be viable at the time of being transplanted to the recipient. Nowadays, viability can be assessed by analyzing the temperature data monitored on the organ surface. In my dissertation, I have developed two new statistical methods to evaluate the viability status of a prepared organ by studying the organ surface temperature. The first method can be used to detect the change of viability status at a spot on the organ surface.
The second method can be used to detect the change of viability condition for selected organ chunks. In practice, combining these two methods can effectively provide accurate viability status change information for a portion of or the whole organ.
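A rough sketch of the alternating idea described above follows: estimate a smooth mean with a smoothing spline, then scan the residuals for a single variance change point with a Gaussian likelihood-ratio statistic. This is a simplified illustration rather than the penalized weighted least squares procedure of the dissertation, and the temperature series is simulated.

```python
# Smooth-mean estimation followed by a likelihood-ratio scan for a variance
# change point; a simplified stand-in for the dissertation's iterative
# penalized weighted least squares procedure. Data are simulated.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 400)
true_mean = 30 + 2 * np.sin(4 * np.pi * t)      # smoothly changing mean
sigma = np.where(t < 0.6, 0.2, 0.8)             # variance jumps at t = 0.6
y = true_mean + rng.normal(0, sigma)

# Step 1: smoothing-spline estimate of the mean, then residuals.
mean_hat = UnivariateSpline(t, y, s=len(t) * 0.3)(t)
resid = y - mean_hat

# Step 2: Gaussian profile negative log-likelihood (up to constants).
def nll(r):
    return 0.5 * len(r) * np.log(max(np.var(r), 1e-12))

full = nll(resid)
best_tau, best_stat = None, -np.inf
for k in range(20, len(t) - 20):                # keep both segments non-trivial
    stat = full - (nll(resid[:k]) + nll(resid[k:]))
    if stat > best_stat:
        best_tau, best_stat = t[k], stat
print(f'estimated variance change point near t = {best_tau:.3f}')
```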
