  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Counterparty Credit Risk on the Blockchain / Motpartsrisk på blockkedjan

Starlander, Isak January 2017 (has links)
Counterparty credit risk is present in trades of financial obligations. This master thesis investigates the up-and-coming technology blockchain and how it could be used to mitigate counterparty credit risk. The study intends to cover the essentials of the expected-loss model, along with an introduction to the blockchain technology. After modelling a simple smart contract and using historical financial data, it was evident that there is a possible opportunity to reduce counterparty credit risk with the use of blockchain. From the market study of this thesis, it is clear that the current financial market needs more education about blockchain technology.
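The expected-loss model mentioned above can be sketched with its standard decomposition, EL = PD × LGD × EAD. The function below is a minimal illustration; the numeric inputs are invented and not taken from the thesis.

```python
# Sketch of the standard expected-loss decomposition: expected loss equals
# probability of default times loss given default times exposure at default.
# All input figures below are illustrative, not data from the thesis.

def expected_loss(pd_: float, lgd: float, ead: float) -> float:
    """Expected loss = PD * LGD * EAD."""
    return pd_ * lgd * ead

# Example: 2% default probability, 60% loss given default, 1,000,000 exposure.
el = expected_loss(0.02, 0.60, 1_000_000)
print(el)
```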

Toxicity Levels of Stock Markets : Observing Information Asymmetry in a Multi-Market Setting / Aktiemarknaders Toxicity-Nivåer : Observering av Informationsasymmetri i en Flermarknadsmiljö

Molander, Lukas, Yape, Shih Jung January 2017 (has links)
The presence of toxic order flow and predatory HFT strategies in a multi-market setting is scarcely researched in the academic world. This thesis studies the toxicity levels of a set of markets by examining unconsolidated quote data and firm-specific trade data. A method for deducing the markets' toxicity levels is presented along with proxies for toxic order flow, namely: changes in spread and quoted volume following a trade in a given market. We find both signs of toxicity and different toxicity levels between the markets. However, the results lack statistical significance, but they show that this field is of great interest for further research. Also, the methods proposed for deducing the toxicity levels are rudimentary but could serve well as a premise for further development.
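The spread-and-volume proxy described above can be sketched as follows. The quote records and trade timestamp are invented for illustration; a widening spread and shrinking quoted volume after a trade would be read as a sign of toxic order flow.

```python
# Toxicity proxy sketch: change in quoted spread and quoted volume around a
# trade. Quotes are (time, bid, ask, quoted_volume); all values are invented.
quotes = [
    (1.0, 99.98, 100.02, 500),
    (2.0, 99.97, 100.03, 400),   # last quote before the trade
    (3.0, 99.95, 100.06, 250),   # first quote after the trade
    (4.0, 99.96, 100.05, 300),
]
trade_time = 2.5

before = max((q for q in quotes if q[0] < trade_time), key=lambda q: q[0])
after = min((q for q in quotes if q[0] > trade_time), key=lambda q: q[0])

# Positive spread change and negative volume change suggest toxicity.
spread_change = (after[2] - after[1]) - (before[2] - before[1])
volume_change = after[3] - before[3]
print(round(spread_change, 4), volume_change)
```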

The Impact of Quantum Computing on the Financial Sector : Exploring the Current Performance and Prospects of Quantum Computing for Financial Applications through Mean-Variance Optimization

Fahlkvist, Ante, Kheiltash, Alfred January 2023 (has links)
Many important tasks in finance often rely on complex and time-consuming computations. The rapid development of quantum technology has raised the question of whether quantum computing can be used to solve these tasks more efficiently than classical computing. This thesis studies the potential use of quantum computing in finance by solving differently-sized problem instances of the mean-variance portfolio selection model using commercially available quantum resources. The experiments employ gate-based quantum computers and quantum annealing, the two main technologies for realizing a quantum computer. To solve the mean-variance optimization problem on gate-based quantum computers, the model was formulated as a quadratic unconstrained binary optimization (QUBO) problem, which was then used as input to quantum resources available on the largest quantum computing as a service (QCaaS) platforms, IBM Quantum Lab, Microsoft Azure Quantum and Amazon Braket. To solve the problem using quantum annealing, a hybrid quantum-classical solver available on the service D-Wave Leap was employed, which takes as input the mean-variance model’s constrained quadratic form. The problem instances were also solved classically on the model’s QUBO form, where the results acted as benchmarks for the performances of the quantum resources. The results were evaluated based on three performance metrics: time-to-solve, solution quality, and cost-to-solve. The findings indicate that gate-based quantum computers are not yet mature enough to consistently find optimal solutions, with the computation times being long and costly as well. Moreover, the use of gate-based quantum computers was not trouble-free, with the majority of quantum computers failing to even complete the jobs. Quantum annealing, on the other hand, demonstrated greater maturity, with the hybrid solver being capable of fast and accurate optimization, even for very large problem instances. 
The results from using the hybrid solver justify further research into quantum annealing, to better understand the capabilities and limitations of the technology. The results also indicate that quantum annealing has reached a level of maturity where it has the potential to make a significant impact on financial institutions, creating value that cannot be obtained by using classical computing.
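The QUBO reformulation mentioned above can be sketched classically: binary variables mark which assets are held, the budget constraint becomes a quadratic penalty, and a tiny instance is solved by brute force. All figures are invented; the thesis submitted such QUBOs to quantum services rather than enumerating bit strings.

```python
import itertools
import numpy as np

# Illustrative QUBO form of a tiny mean-variance selection problem, solved
# here by classical brute force. Binary x_i = 1 if asset i is held; we
# minimise  q * x' Sigma x - mu' x  plus a penalty enforcing exactly k assets.
mu = np.array([0.10, 0.08, 0.12])            # expected returns (invented)
Sigma = np.array([[0.05, 0.01, 0.00],
                  [0.01, 0.04, 0.01],
                  [0.00, 0.01, 0.06]])       # covariance matrix (invented)
q, k, P = 0.5, 2, 10.0                       # risk aversion, budget, penalty
n = len(mu)

# The penalty (sum_i x_i - k)^2 expands, using x_i^2 = x_i for binaries,
# into P * (J - 2k I) with J the all-ones matrix; the constant P*k^2 is dropped.
Q = q * Sigma - np.diag(mu) + P * (np.ones((n, n)) - 2 * k * np.eye(n))

def energy(bits):
    x = np.array(bits)
    return x @ Q @ x

best = min(itertools.product([0, 1], repeat=n), key=energy)
print(best)  # lowest-energy bit string; it selects exactly k = 2 assets
```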

Predicting Patent Data using Wavelet Regression and Bayesian Machine Learning / Modellering av Patentdata med Wavelet Regression och Bayesiansk Maskininlärning

Martinsen, Mattias January 2023 (has links)
Patents are a fundamental part of scientific and engineering work, ensuring protection of inventions owned by individuals or organizations. Patents are usually made public 18 months after being filed at a patent office, which means that currently available public patent data only provides information about the past. Regression models applied to discrete time series can be used as a prediction tool to counteract this, building an 18-month-long bridge into the future and beyond. While linear models are popular for their simplicity, Bayesian networks have statistical properties that can produce high forecasting quality. Improvements are also made by using signal processing, as patent data is naturally stochastic. This thesis implements wavelet-based signal processing and PCA to increase stability and reduce overfitting. A multiple linear regression model and a Bayesian network model are then designed and applied to the transformed data. When evaluated on each data set, the Bayesian model both performs better and exhibits greater stability and consistency in its predictions. As expected, the linear model is both smaller and faster to evaluate and train. Despite an increase in complexity and slower evaluation times, the Bayesian model is conclusively superior to the linear model. Future work should focus on the signal processing method and additional layers in the Bayesian network.
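The smoothing-then-regression pipeline sketched above can be illustrated under simplifying assumptions: a one-level Haar wavelet transform denoises a synthetic counts series, and an ordinary least-squares line is then fit and extrapolated 18 steps ahead. The thesis used a richer wavelet basis, PCA, and a Bayesian network; everything below is a toy stand-in.

```python
import numpy as np

# Synthetic "patent counts" series: a linear trend plus Gaussian noise.
rng = np.random.default_rng(0)
t = np.arange(32)
series = 100 + 2.0 * t + rng.normal(0, 5, size=t.size)

# One-level Haar decomposition: pairwise averages are the approximation
# coefficients, pairwise differences the detail. Zeroing the detail
# coefficients denoises the series.
approx = (series[0::2] + series[1::2]) / 2.0
smoothed = np.repeat(approx, 2)   # reconstruction with detail set to zero

# OLS trend fit on the smoothed series; extrapolate 18 "months" ahead.
slope, intercept = np.polyfit(t, smoothed, 1)
forecast = slope * (t[-1] + 18) + intercept
print(round(slope, 2))
```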

Polypharmacy Side Effect Prediction with Graph Convolutional Neural Network based on Heterogeneous Structural and Biological Data / Förutsägning av biverkningar från polyfarmaci med grafiska faltningsneuronnät baserat på heterogen strukturell och biologisk data

Diaz Boada, Juan Sebastian January 2020 (has links)
The prediction of polypharmacy side effects is crucial to reduce the mortality and morbidity of patients suffering from complex diseases. However, its experimental prediction is unfeasible due to the many possible drug combinations, leaving in silico tools as the most promising way of addressing this problem. This thesis improves the performance and robustness of a state-of-the-art graph convolutional network designed to predict polypharmacy side effects, by feeding it with complexity properties of the drug-protein network. The modifications also involve the creation of a direct pipeline to reproduce the results and test it with different datasets.
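A single graph-convolution layer of the kind such networks stack can be sketched in plain NumPy. The adjacency matrix, node features, and weights below are toy stand-ins, not the drug-protein data from the thesis.

```python
import numpy as np

# One graph-convolution layer, H' = ReLU(D^-1/2 (A + I) D^-1/2 H W),
# on a toy 4-node graph. All matrices are illustrative stand-ins.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # toy adjacency matrix
H = np.eye(4)                               # one-hot node features
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))                 # weights (random, untrained)

A_hat = A + np.eye(4)                       # add self-loops
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
H_next = np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)
print(H_next.shape)
```

Stacking such layers and adding a decoder over node-pair embeddings is, in broad strokes, how side-effect links are scored in this family of models.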

CAE Sun Simulation - Thermo Structural Coupling / CAE solsimulering – Termo-strukturell koppling

Staffas, Kristin January 2020 (has links)
Deformation of materials due to heat exposure can be a challenge in vehicle design. Sun exposure and heat build-up in the vehicle cabin contribute to accelerated aging of the materials used. To understand how this process affects durability, numerical simulation is an important tool. This thesis work investigates whether it is possible, with weather data and knowledge of how a car is used, to simulate the deformation process of the trim parts due to heat exposure from the sun. To answer this question, two simulation methods have been compared. The first method only considers the aging of the material over time; it is used today at Volvo Cars but has been found to be insufficient in a number of aspects. The second method also includes the effects of the reversible deformations; it has not previously been tested in a similar context but was considered interesting as it models reality more closely. In addition, the effects of modeling the car according to a virtual customer behavior, rather than as stationary, were investigated. The main simulation tools used are TAITherm and Abaqus. The results show that modeling the vehicle with an imagined user routine yields a lower mean temperature, as the car is cooled down when driven. The comparison between the two methods shows that including the effects of the reversible deformations contributes to a better model of the aging of the trim parts due to heat exposure from the sun.

Implementation of Forward and Reverse Mode Automatic Differentiation for GNU Octave Applications

Kang, Yixiu 04 April 2003 (has links)
No description available.

High-performance implementation of H(div)-conforming elements for incompressible flows

Wik, Niklas January 2022 (has links)
In this thesis, evaluation of H(div)-conforming finite elements is implemented in a high-performance setting and used to solve the incompressible Navier-Stokes equation, obtaining an exactly point-wise divergence-free velocity field. In particular, the anisotropic Raviart-Thomas tensor-product polynomial space is considered, where the finite element operators are evaluated with quadrature in a matrix-free fashion using sum-factorization on tensor-product meshes. The implementation includes evaluation over elements and faces in two- and three-dimensional space, supporting non-conforming meshes with hanging nodes, and using the contravariant Piola transformation to preserve normal components on element boundaries. In terms of throughput, the implementation achieves up to an order of magnitude faster evaluation of finite element operators compared to a matrix-based evaluation. Correctness is demonstrated with optimal convergence rates for various polynomial degrees, as well as exactly divergence-free solutions for the velocity field.
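The sum-factorization idea mentioned above can be illustrated in two dimensions: applying a 1D evaluation matrix dimension-by-dimension gives the same result as the full Kronecker-product operator, at far lower cost. The matrices below are random stand-ins, not the Raviart-Thomas bases from the thesis.

```python
import numpy as np

# Sum-factorization sketch on a 2D tensor-product basis: evaluating a field
# at quadrature points via 1D operators instead of one large matrix.
rng = np.random.default_rng(1)
p, q = 3, 4                     # 1D basis size, 1D quadrature point count
B = rng.normal(size=(q, p))     # 1D evaluation matrix (random stand-in)
u = rng.normal(size=(p, p))     # 2D coefficient tensor (random stand-in)

# Naive evaluation: build the full operator as a Kronecker product,
# O(p^2 q^2) work per application.
full = np.kron(B, B) @ u.ravel()

# Sum-factorized evaluation: apply B along each dimension in turn,
# O(p q (p + q)) work. For row-major vectorization,
# (B kron B) vec(u) == vec(B u B^T).
fast = B @ u @ B.T

print(np.allclose(full, fast.ravel()))
```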

Tour expansion in snow removal problem

Tarasova, Anna January 2022 (has links)
The process of removing snow from the streets of cities in an optimal way can pose quite a challenge. In order to optimize the path of the snow-removing vehicle, the city can be translated into a graph with nodes as crossings and links as roads. Once the city is modelled as a graph, all nodes with degree one can be eliminated and their snow removal time added to the closest node. An optimization problem can then be solved in order to find a vehicle path in this reduced graph. The purpose of this thesis is to give an algorithm that expands the reduced graph back into the original one and then dictates the proper vehicle path in this reconstructed graph. The algorithm is constructed by reversing the node elimination process, piecing together the original graph and traversing it to get information about what to do on the eliminated links and nodes. The obtained algorithm is presented in this thesis.
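The degree-one elimination step described above can be sketched as follows. The graph representation, service times, and elimination log are illustrative choices, not the thesis's own data structures; replaying the log in reverse is what would drive the tour expansion.

```python
# Sketch of degree-one node elimination on an undirected graph given as an
# adjacency dict, folding each leaf's service (snow removal) time into its
# single neighbour. The log records eliminations for later re-expansion.

def eliminate_degree_one(adj, service):
    """Repeatedly remove degree-1 nodes; returns the reduced graph, the
    updated service times, and the elimination log (node, neighbour)."""
    adj = {u: set(vs) for u, vs in adj.items()}
    service = dict(service)
    log = []
    changed = True
    while changed:
        changed = False
        for u in list(adj):
            if len(adj[u]) == 1:
                (v,) = adj[u]              # the leaf's single neighbour
                service[v] += service[u]   # fold service time into neighbour
                adj[v].discard(u)
                del adj[u], service[u]
                log.append((u, v))
                changed = True
    return adj, service, log

# Path graph a-b-c-d: leaves peel off one by one until only d remains.
adj = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
service = {"a": 1, "b": 2, "c": 3, "d": 4}
reduced, times, log = eliminate_degree_one(adj, service)
print(sorted(reduced), times)
```

Reversing `log` reinstates the eliminated nodes one at a time, which is the direction the thesis's expansion algorithm works in.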

Using maximal feasible subset of constraints to accelerate a logic-based Benders decomposition scheme for a multiprocessor scheduling problem

Grgic, Alexander, Andersson, Filip January 2022 (has links)
Logic-based Benders decomposition (LBBD) is a strategy for solving discrete optimisation problems. In LBBD, the optimisation problem is divided into a master problem and a subproblem, and each part is solved separately. LBBD methods that combine mixed-integer programming and constraint programming have been successfully applied to solve large-scale scheduling and resource allocation problems. Such combinations typically solve an assignment-type master problem and a scheduling-type subproblem. However, a challenge with LBBD methods that have feasibility subproblems is that they do not provide a feasible solution until an optimal solution is found. In this thesis, we show that feasible solutions can be obtained by finding and combining feasible parts of an infeasible master problem assignment. We use these insights to develop an acceleration technique for LBBD that solves a series of subproblems, according to algorithms for constructing a maximal feasible subset of constraints (MaFS). Using a multiprocessor scheduling problem as a benchmark, we study the computational impact of using this technique. We evaluate three variants of LBBD schemes: the first uses MaFS, the second uses irreducible infeasible subsets of constraints (IIS), and the third combines MaFS with IIS. Computational tests were performed on an instance set of multiprocessor scheduling problems. In total, 83 instances were tested, and their number of tasks varied between 2,794 and 10,661. The results showed that when applying our acceleration technique in the decomposition scheme, the pessimistic bounds were strong, but the convergence was slow. The decomposition scheme combining our acceleration technique with the IIS-based acceleration technique showed potential to accelerate the method.
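The maximal-feasible-subset idea underlying the acceleration technique can be sketched with a greedy "addition filter": tentatively add each constraint and keep it only if the growing subset stays feasible. The abstract feasibility oracle and the toy interval constraints below are illustrative; a real LBBD scheme would call a CP or MIP feasibility check instead.

```python
# Greedy construction of a maximal feasible subset (MaFS) of constraints.
# "Maximal" means no excluded constraint can be added back without
# breaking feasibility; it need not be the largest such subset.

def maximal_feasible_subset(constraints, is_feasible):
    kept = []
    for c in constraints:
        if is_feasible(kept + [c]):   # keep c only if subset stays feasible
            kept.append(c)
    return kept

# Toy constraints: intervals on a line; a set is feasible when the
# intervals have a nonempty common intersection.
intervals = [(0, 5), (3, 8), (6, 10), (4, 7)]

def is_feasible(subset):
    if not subset:
        return True
    lo = max(a for a, _ in subset)
    hi = min(b for _, b in subset)
    return lo <= hi

mafs = maximal_feasible_subset(intervals, is_feasible)
print(mafs)
```

Each pass costs one feasibility check per constraint, which is why such filters are attractive as a cheap by-product of subproblem solves.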
