Hierarchical Approximation Methods for Option Pricing and Stochastic Reaction Networks

Ben Hammouda, Chiheb 22 July 2020
In biochemically reactive systems with small copy numbers of one or more reactant molecules, stochastic effects dominate the dynamics. In the first part of this thesis, we design novel efficient simulation techniques for reliable and fast estimation of various statistical quantities for stochastic biological and chemical systems under the framework of Stochastic Reaction Networks. In the first work, we propose a novel hybrid multilevel Monte Carlo (MLMC) estimator for systems characterized by simultaneously fast and slow timescales. Our hybrid multilevel estimator uses a novel split-step implicit tau-leap scheme at the coarse levels, where the explicit tau-leap method is not applicable due to numerical instability. In a second work, we address another challenge in this context, the high-kurtosis phenomenon observed at the deep levels of the MLMC estimator. We propose a novel approach that combines the MLMC method with a pathwise-dependent importance sampling technique for simulating the coupled paths. Our theoretical estimates and numerical analysis show that this method improves the robustness and complexity of the multilevel estimator at a negligible additional cost. In the second part of this thesis, we design novel methods for pricing financial derivatives. Option pricing is usually challenging due to (1) the high dimensionality of the input space and (2) the low regularity of the integrand with respect to the input parameters. We address these challenges by developing techniques that smooth the integrand to uncover the available regularity, and then approximate the resulting integrals using hierarchical quadrature methods combined with Brownian bridge construction and Richardson extrapolation. In the first work, we apply our approach to efficiently price options under the rough Bergomi model. This model exhibits several numerical and theoretical challenges that render classical numerical pricing methods either inapplicable or computationally expensive. In a second work, we design a numerical smoothing technique for cases where analytic smoothing is impossible. Our analysis shows that adaptive sparse-grid quadrature combined with numerical smoothing outperforms the Monte Carlo approach. Furthermore, our numerical smoothing improves the robustness and complexity of the MLMC estimator, particularly when estimating density functions.
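The telescoping idea behind an MLMC estimator can be made concrete with a minimal sketch. The example below is a hedged illustration rather than the thesis's method: it estimates a discounted European call price under geometric Brownian motion with an Euler-Maruyama scheme, whereas the thesis uses a split-step implicit tau-leap scheme and importance sampling; all parameter values (S0, r, sigma, sample sizes, level counts) are illustrative.

```python
import numpy as np

def euler_gbm_pair(n_paths, n_steps, T=1.0, S0=100.0, r=0.05, sigma=0.2, rng=None):
    """Coupled fine/coarse Euler-Maruyama paths of geometric Brownian motion.

    The fine path takes n_steps steps; the coarse path takes n_steps // 2
    steps driven by the same Brownian increments (the MLMC coupling).
    """
    if rng is None:
        rng = np.random.default_rng()
    dt = T / n_steps
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    fine = np.full(n_paths, S0)
    coarse = np.full(n_paths, S0)
    for k in range(n_steps):
        fine = fine + r * fine * dt + sigma * fine * dW[:, k]
        if k % 2 == 1:  # one coarse step per two fine steps
            dWc = dW[:, k - 1] + dW[:, k]
            coarse = coarse + r * coarse * (2 * dt) + sigma * coarse * dWc
    return fine, coarse

def mlmc_estimate(payoff, n_levels=5, n_samples=20_000, n0=2):
    """Telescoping MLMC estimate of E[payoff(S_T)]."""
    rng = np.random.default_rng(0)
    estimate = 0.0
    for level in range(n_levels):
        n_steps = n0 * 2 ** level
        fine, coarse = euler_gbm_pair(n_samples, n_steps, rng=rng)
        if level == 0:
            estimate += payoff(fine).mean()                     # E[P_0]
        else:
            estimate += (payoff(fine) - payoff(coarse)).mean()  # E[P_l - P_{l-1}]
    return estimate

call = lambda s: np.exp(-0.05 * 1.0) * np.maximum(s - 100.0, 0.0)  # discounted payoff
print("MLMC price estimate:", round(mlmc_estimate(call), 3))
```

Each level adds a correction term E[P_l - P_{l-1}] computed from coarse/fine paths driven by the same Brownian increments; in a production estimator the per-level sample sizes would be tuned to the level variances rather than held fixed as here.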

Generalizace vrstevnic v rovinatých územích / Simplification of contour lines in flat areas

Čelonk, Marek January 2021
The diploma thesis focuses on the cartographic generalization of contour lines derived from dense point clouds in flat territories, where the original contour lines tend to oscillate. The main aim is to propose, develop and test a new algorithm for contour simplification that preserves the given vertical error and reflects cartographic rules. Three methods designed for large-scale maps (1 : 10 000 and larger) are presented: the weighted average, a modified Douglas-Peucker, and a potential-based approach. The most promising method is based on the repeated simplification of contour line segments by calculating the generalization potential of their vertices. The algorithm, implemented in Python 2.7 using the Arcpy library, was tested on DMR5G data, and the simplified contour lines were compared with the result created by a professional cartographer. The achieved results are presented on the attached topographic maps. Keywords: contours, cartographic generalization, digital cartography, vertical buffer, smoothing, GIS
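One of the three methods, the modified Douglas-Peucker, builds on the classic recursive algorithm sketched below. This is only the textbook form with an illustrative tolerance and made-up vertices; the thesis's modifications for the vertical error bound and cartographic rules are not reproduced here.

```python
import numpy as np

def perpendicular_distance(pt, start, end):
    """Distance from pt to the line through start and end (2D)."""
    dx, dy = end - start
    norm = np.hypot(dx, dy)
    if norm == 0.0:
        return np.hypot(*(pt - start))
    # |cross product| / base length = height of the triangle
    return abs(dx * (pt[1] - start[1]) - dy * (pt[0] - start[0])) / norm

def douglas_peucker(points, tolerance):
    """Classic recursive line simplification; points is an (n, 2) array."""
    start, end = points[0], points[-1]
    dists = [perpendicular_distance(p, start, end) for p in points[1:-1]]
    if not dists or max(dists) <= tolerance:
        return np.array([start, end])      # whole span is flat enough
    split = int(np.argmax(dists)) + 1      # keep the farthest vertex
    left = douglas_peucker(points[: split + 1], tolerance)
    right = douglas_peucker(points[split:], tolerance)
    return np.vstack([left[:-1], right])   # avoid duplicating the split point

contour = np.array([[0, 0], [1, 0.1], [2, -0.1], [3, 5], [4, 6], [5, 7]], float)
print(douglas_peucker(contour, tolerance=0.5))
```

The oscillating vertices within the tolerance band are dropped while the vertex farthest from each chord is kept, which is exactly the behavior the thesis constrains further with a vertical buffer.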

Investigating Surface Oxide Composition and Formation on Metallic Vibenite® Alloys

Monie, Emil, Säfström, Nils, Deng, Yiping, Möllerberg, Axel January 2022
Oxide formation on metallic surfaces is a common phenomenon that occurs naturally or intentionally. Depending on its effects, a given metallic oxide can be viewed as either a nuisance or a convenience. Formed oxides may also smooth the surfaces of metallic alloys, since a portion of the surface in contact with oxygen is converted into oxide via the metal-oxygen interaction, leaving a smoother surface underneath the formed oxide. It was found that oxide formation was most significant when metallic Vibenite® alloys were treated at 1000 °C for a minimum of 3 hours with an oxygen flow into the oven of 10 L/min. This indicates the importance of a minimum temperature as well as an increased oxygen pressure within the oven in which the samples are treated, which concurs with various studies referred to in the report. The oxides were also somewhat successfully identified using analysis methods such as XPS, XRD and Raman spectroscopy, with supporting evidence from simulated Thermo-Calc approximations. The post-treatment surfaces of the samples, after having their oxide layers removed, were confirmed by optical VSI analysis to have undergone surface smoothing. The results of this report indicate that oxide formation is a valid technique for surface smoothing and strongly suggest further study of material-optimised heat treatments for different metallic alloys with the purpose of surface refinement.

Forecasting Monthly Swedish Air Traveler Volumes

Becker, Mark, Jarvis, Peter January 2023
In this paper we conduct an out-of-sample forecasting exercise for monthly Swedish air traveler volumes. The models considered are multiplicative seasonal ARIMA, neural network autoregression, exponential smoothing, the Prophet model, and a random walk as a benchmark. We divide the out-of-sample data into three evaluation periods: pre-COVID-19, during COVID-19 and post-COVID-19, and calculate the MAE, MAPE and RMSE of each model in each of these periods. The results show that for the pre-COVID-19 period all models produce accurate forecasts compared to the random walk. For the period during COVID-19, no model outperforms the random walk, and only exponential smoothing performs as well as it. For the post-COVID-19 period, the best performing models are the random walk, SARIMA and exponential smoothing, all with similar performance.
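The three accuracy measures are standard, and a minimal sketch shows how they would be computed for a naive random-walk forecast, where each month is forecast by the previous month's actual value. The monthly volumes below are made-up illustrative numbers, not the paper's data.

```python
import numpy as np

def mae(actual, forecast):
    return np.mean(np.abs(actual - forecast))

def mape(actual, forecast):
    # Percentage error; assumes no zero actuals (traveler volumes are positive)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def rmse(actual, forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

# Illustrative monthly volumes and the corresponding random-walk forecast
actual = np.array([310.0, 295.0, 340.0, 365.0, 330.0, 355.0])
random_walk = np.array([300.0, 310.0, 295.0, 340.0, 365.0, 330.0])
for name, fn in [("MAE", mae), ("MAPE", mape), ("RMSE", rmse)]:
    print(name, round(fn(actual, random_walk), 2))
```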

Size Function Based Mesh Relaxation

Howlett, John David 18 March 2005
This thesis addresses the problem of relaxing a finite element mesh to more closely match a size function. The main contributions include new methods for performing size function based mesh relaxation, as well as an algorithm for measuring the performance of size function based mesh relaxation methods.
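As a hedged illustration of the general idea, the sketch below relaxes a 1D node set so that edge lengths track a prescribed size function; the thesis's relaxation methods and its performance measure operate on finite element meshes and are not reproduced here. The weighting scheme (inverse size at edge midpoints) is an assumption chosen for the sketch.

```python
import numpy as np

def relax_1d(nodes, size_fn, n_iters=200):
    """Relax interior nodes so edge lengths follow size_fn.

    Weighted Laplacian smoothing: each interior node moves toward the
    neighbour average weighted by 1 / h at the edge midpoints, which
    shrinks edges where the size function is small.
    """
    x = nodes.copy()
    for _ in range(n_iters):
        mid = 0.5 * (x[:-1] + x[1:])   # edge midpoints
        w = 1.0 / size_fn(mid)         # small target size -> large weight
        # new position of node i balances the weights of its two edges
        x[1:-1] = (w[:-1] * x[:-2] + w[1:] * x[2:]) / (w[:-1] + w[1:])
    return x

# Target: fine mesh near x = 0, coarser toward x = 1
h = lambda x: 0.05 + 0.5 * x
mesh = relax_1d(np.linspace(0.0, 1.0, 21), h)
print(np.round(np.diff(mesh), 3))  # spacings grow with x
```

At the fixed point of this iteration each node balances spacing/size on its two sides, so edge lengths become proportional to the size function, which is the equidistribution property a size-function-based relaxation aims for.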

[en] A NOVEL SEMIPARAMETRIC STRUCTURAL MODEL FOR ELECTRICITY FORWARD CURVES / [pt] MODELO ESTRUTURAL SEMI-PARAMÉTRICO PARA CURVAS FORWARD DE ELETRICIDADE

MARINA DIETZE MONTEIRO 23 February 2021
Hedging against spot-price volatility becomes increasingly important in deregulated power markets, so being able to model electricity forward prices is crucial in a competitive environment. Electricity differs from other commodities due to its limited storability and transportability. Furthermore, its derivatives are associated with a delivery period during which electricity is delivered continuously, which is why power forwards are often referred to as swaps. These peculiarities make the modeling of electricity contract prices a non-trivial task in which traditional models must be adapted to address these characteristics. In this context, we propose a novel semiparametric structural model that computes a continuous daily forward curve of electricity by a maximum-smoothness criterion. In addition, elementary forward contracts can be represented by any parametric structure for seasonality or even for exogenous variables. Our framework acknowledges the overlapping swaps and allows an analysis of arbitrage opportunities observed in power markets. The smooth forward curve is computed by a hierarchical optimization problem able to handle scarce data sets from low-liquidity markets. PCA results corroborate our framework's capability to explain a high percentage of variance with only a few factors.
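A maximum-smoothness construction can be sketched as a small equality-constrained quadratic program: minimize the squared second differences of a discretized daily forward curve subject to each (possibly overlapping) swap reproducing its quoted delivery-period average. The sketch below uses made-up contracts and a plain KKT solve; it is a generic illustration under those assumptions, not the thesis's hierarchical formulation or its seasonality structure.

```python
import numpy as np

def max_smooth_curve(n_days, swaps, prices, ridge=1e-8):
    """Daily forward curve f minimizing the sum of squared 2nd differences,
    s.t. mean(f[a:b]) == price for each swap (a, b).  Solved via a KKT system."""
    # Second-difference operator D: (n_days - 2) x n_days
    D = np.zeros((n_days - 2, n_days))
    for i in range(n_days - 2):
        D[i, i : i + 3] = [1.0, -2.0, 1.0]
    H = 2.0 * D.T @ D + ridge * np.eye(n_days)  # tiny ridge keeps KKT nonsingular
    # Constraint matrix: one row per swap, averaging over its delivery days
    A = np.zeros((len(swaps), n_days))
    for row, (a, b) in enumerate(swaps):
        A[row, a:b] = 1.0 / (b - a)
    # KKT system for: min (1/2) f' H f  subject to  A f = prices
    kkt = np.block([[H, A.T], [A, np.zeros((len(swaps), len(swaps)))]])
    rhs = np.concatenate([np.zeros(n_days), prices])
    sol = np.linalg.solve(kkt, rhs)
    return sol[:n_days]

# Two overlapping hypothetical swaps: days 0-30 at 42.0 and days 15-45 at 47.0
f = max_smooth_curve(60, swaps=[(0, 30), (15, 45)], prices=np.array([42.0, 47.0]))
print(f[:5].round(2), f.mean().round(2))
```

Because each constraint row averages the curve over its delivery window, overlapping swaps are handled naturally, which is the property that lets such a formulation expose arbitrage between inconsistent quotes.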

Manipulering av bildhastighet och dess känslomässiga påverkan på tittarupplevelse vid olika format / Manipulation of frame rate and its emotional effect on viewer perception in different formats

O'Grady, William, Währme, Emil January 2023
Frame rate is a fundamental element in creating the illusion of movement in video-based media. For almost a century, film has been produced at a standard frame rate of 24 frames per second, originally established due to technical limitations. This number lives on for films today, despite many technological innovations and other video-based media formats straying from the standard. With contemporary video technology, content can not only be recorded at higher frame rates; frames can also be artificially interpolated. So-called frame interpolation technology now comes as a pre-installed feature on most televisions. As a consequence, a debate has formed on how video-based media should be presented, not least when it is artificially manipulated outside the creators' control. This study therefore aims to explore how manipulating a video clip's frame rate influences the viewer experience, and thereby whether the use of frame interpolation technology in televisions is justified. A study was conducted in which participants were shown video clips in their original frame rate and compared them with artificially manipulated copies. The results showed that no single frame rate is preferred by all participants and that some participants did not perceive any difference at all. They also showed that artificial manipulation of frame rate is generally not appreciated, and that criticism of its use tends to be directed at the wrong content. We then discuss whether television manufacturers should reconsider the use of frame interpolation technology. Lastly, we note that the accuracy of these results is limited by the study's scope, and suggest further research to corroborate the conclusions of this and earlier studies.

Variance Change Point Detection under A Smoothly-changing Mean Trend with Application to Liver Procurement

Gao, Zhenguo 23 February 2018
Literature on change point analysis mostly requires a sudden change in the data distribution, either in a few parameters or in the distribution as a whole. We are interested in the scenario where the variance of the data makes a significant jump while the mean changes smoothly. It is motivated by a liver procurement experiment with organ surface temperature monitoring. Blindly applying existing change point analysis methods to this example can yield erratic change point estimates, since the smoothly-changing mean violates the sudden-change assumption. In this dissertation, we propose a penalized weighted least squares approach with an iterative estimation procedure that naturally integrates variance change point detection and smooth mean function estimation. Given the variance components, the mean function is estimated by smoothing splines as the minimizer of the penalized weighted least squares. Given the mean function, we propose a likelihood ratio test statistic for identifying the variance change point. The null distribution of the test statistic is derived, together with the rates of convergence of all the parameter estimates. Simulations show excellent performance of the proposed method. The application analysis offers numerical support for non-invasive organ viability assessment by surface temperature monitoring. The method above can only yield the variance change point of temperature at a single point on the organ surface at a time. In practice, an organ is often transplanted as a whole or in part, so it is generally of more interest to study the variance change point for a chunk of the organ. With this motivation, we extend our method to study the variance change point for a chunk of the organ surface. Now the variances become functions on a 2D space of locations (longitude and latitude) and the mean is a function on a 3D space of location and time. We model the variance functions by thin-plate splines and the mean function by the tensor product of thin-plate splines and cubic splines. However, the additional dimensions in these functions incur serious computational problems, since the sample size, as the product of the number of locations and the number of sampling time points, becomes too large to run standard multi-dimensional spline models. To overcome this computational hurdle, we introduce a multi-stage subsampling strategy into our modified iterative algorithm. The strategy involves several down-sampling or subsampling steps guided by preliminary statistical measures. We carry out extensive simulations to show that the new method can efficiently cut down the computational cost and make a practically unsolvable problem solvable with reasonable time and satisfactory parameter estimates. Application of the new method to the liver surface temperature monitoring data shows its effectiveness in providing accurate status change information for a portion of or the whole organ. / Ph. D.
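The core iteration, smoothing the mean and then locating the variance jump in the residuals, can be caricatured in a few lines. The sketch below uses an off-the-shelf smoothing spline and a profile Gaussian likelihood scan on simulated data; the penalized weighted least squares estimator, the derived null distribution of the test, and the 2D/3D spline extensions of the dissertation are not reproduced, and the smoothing level is an illustrative assumption.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
n = 400
t = np.linspace(0.0, 1.0, n)
mean = np.sin(2 * np.pi * t)             # smoothly-changing mean
sigma = np.where(t < 0.6, 0.1, 0.4)      # noise s.d. jumps at t = 0.6
y = mean + rng.normal(0.0, sigma)

# Step 1: estimate the smooth mean with a smoothing spline, then remove it
spline = UnivariateSpline(t, y, s=n * 0.05)  # smoothing level is illustrative
resid = y - spline(t)

# Step 2: profile Gaussian log-likelihood over candidate change points k,
# with variance s1 before k and s2 after, each at its maximum-likelihood value
best_k, best_ll = None, -np.inf
for k in range(20, n - 20):              # keep both segments non-trivial
    s1 = np.mean(resid[:k] ** 2)
    s2 = np.mean(resid[k:] ** 2)
    ll = -0.5 * (k * np.log(s1) + (n - k) * np.log(s2))
    if ll > best_ll:
        best_k, best_ll = k, ll

print("estimated change point:", round(t[best_k], 3), "(true: 0.6)")
```

In the dissertation's procedure the two steps alternate (the variance estimates re-weight the mean fit), whereas this sketch performs a single pass for clarity.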

Topologieoptimierung im Creo-Umfeld mit ProTopCI / Topology optimization in the Creo environment with ProTopCI

Simmler, Urs 22 July 2016
Wikipedia describes topology optimization as a computer-based calculation method by which a favorable basic shape (topology) can be determined for components under mechanical load. The use of 3D printing processes is revolutionizing component design, because designs are no longer constrained by the manufacturing process. Optimal lattice structures within components are also becoming increasingly important. In the Creo environment, these new challenges can be solved elegantly with ProTopCI (from CAESS; PTC Partner Advantage, Silver). The talk (with a live demonstration) highlights the new possibilities of this innovative solution: model creation in Creo Simulate (FEM mode) covering various load cases, contacts, bolted connections, CAD geometry and the regions to be optimized; technological boundary conditions that account for the manufacturing process; innovative generation and optimization of lattice structures; and smoothing and export of the optimized geometry.

產能規劃結合客戶價值分析之效益研究-以季節性產品製造商為例 / The Benefit Analysis of Combining Capacity Planning with Customer Value : A Case Study of Seasonal Product Manufacturers

蔡欣妤, Tsai, Shin Yu Unknown Date
After the rise of China's economy, manufacturers in Taiwan face fiercer competition than before. To secure their position and competitiveness in the market, Taiwanese manufacturers have to build on and extend their existing competitive advantages so they can respond quickly to a complicated and volatile economic environment; Taiwan's economic freedom and well-developed manufacturing capacity are two key factors in lifting the competitiveness of its manufacturing sector. Capacity planning is easily influenced by customers, which gives rise to the issue of capacity smoothing, and the problem is most pronounced for seasonal product manufacturers. Moreover, shorter product life cycles and rapidly changing customer preferences increase market uncertainty, making capacity utilization ever more susceptible to customer demand. Understanding existing customer information and factoring it into the capacity smoothing strategy is therefore an important issue for Taiwanese seasonal product manufacturers seeking to develop an advantage. However, current applications of customer analysis mostly focus on sales, and research and case studies combining it with capacity planning strategy are scarce; building such combined analyses and case discussions would support future research on the competitiveness of Taiwanese manufacturing. This study therefore examines the effect of customer analysis on capacity smoothing strategy. Using a case study of Rechi Precision, the world's fourth largest air conditioner compressor manufacturer, we discuss how the company introduced customer analysis into its capacity planning strategy from 2011 to 2014. Triangulating three data sources (internal sales data, interviews with Rechi's professional consultants, and research and academic material from third-party institutions), we assess whether the approach helps increase sales in the low season and whether it can increase output or support other capacity allocation strategies in the peak season. Validating the customer-informed capacity smoothing strategy against practical data, the study finds that it indeed narrowed the gap between the low and peak seasons, and that a customer evaluation mechanism can be developed for peak-season capacity allocation. Introducing customer analysis and the customer evaluation mechanism into the capacity planning steps lets customer information flow from the sales department to the production department, resolving the conflict between their differing goals, enabling more efficient internal communication and cooperation, and allowing the company to respond quickly to customer needs and build competitiveness.
