731

Lärares utmaningar i arbetet med att leda och stimulera särskilt matematiskt begåvade elever : En kvalitativ intervjustudie ur ett implementeringsteoretiskt perspektiv / Teachers' challenges in leading and stimulating especially mathematically gifted students: A qualitative interview study from an implementation theory perspective

Huttu, Johanna, Kärrlander, Pontus, Viebke, Henrik January 2022 (has links)
According to the Swedish curriculum for compulsory school, every teacher has an obligation to differentiate students' education to promote their development and learning needs. The study carried out in this paper is a qualitative interview study with Swedish teachers who have experience teaching particularly gifted students in mathematics. The study is based on implementation theory, and its aim is to present an overview of the challenges teachers encounter when trying to implement the decisions made regarding the education of students gifted in mathematics. A further aim was to see whether there were differences, and what those differences were, between the challenges the Swedish teachers mentioned and the challenges acknowledged in previous international research. A deductive approach is used, rooting the analysis in key concepts drawn from previous research. The key concepts, or challenges, we found were mathematical competence, student interests and motivations, appropriate challenge for students, inclusion, identification, prioritizing, time, materials, lack of education, and collegial cooperation. We found plenty of similarities to previous research in the area, but could also see tendencies in our study that have not been discussed in the previous research presented. For example, we noticed that teachers in our study mentioned the importance of experience. We also noticed that some of the key challenges from the previous research did not seem to be as prominent in this study.
732

Robust portfolio optimization with Expected Shortfall / Robust portföljoptimering med ES

Isaksson, Daniel January 2016 (has links)
This thesis project studies robust portfolio optimization with Expected Shortfall applied to a reference portfolio consisting of Swedish linear assets with stocks and a bond index. Specifically, the classical robust optimization definition, focusing on uncertainties in parameters, is extended to also include uncertainty in the log-return distribution. My contribution to the robust optimization community is to study portfolio optimization with Expected Shortfall with log-returns modeled either by elliptical distributions or by a normal copula with asymmetric marginal distributions. The robust optimization problem is solved with worst-case parameters from box and ellipsoidal uncertainty sets constructed from historical data, and may be used when an investor has a more conservative view on the market than history suggests. With elliptically distributed log-returns, the optimization problem is equivalent to Markowitz mean-variance optimization, connected through the risk aversion coefficient. The results show that the optimal holding vector is almost independent of the elliptical distribution used to model log-returns, while Expected Shortfall depends strongly on the elliptical distribution, with higher Expected Shortfall resulting from fatter distribution tails. To model the tails of the log-returns asymmetrically, generalized Pareto distributions are used together with a normal copula to capture multivariate dependence. In this case, the optimization problem is not equivalent to Markowitz mean-variance optimization and the advantages of using Expected Shortfall as risk measure are utilized. With the asymmetric log-return model there is a noticeable difference in the optimal holding vector compared to the elliptical model. Furthermore, the Expected Shortfall increases, which follows from better modeled distribution tails. The general conclusion of this thesis project is that portfolio optimization with Expected Shortfall is an important problem, advantageous over the Markowitz mean-variance optimization problem when log-returns are modeled with asymmetric distributions. The major drawback of portfolio optimization with Expected Shortfall is that it is a simulation-based optimization problem introducing statistical uncertainty, and if the log-returns are drawn from a copula the simulation process involves more steps, which can potentially make the program slower than drawing from an elliptical distribution. Thus, portfolio optimization with Expected Shortfall is appropriate to employ when trades are made on a daily basis. / Examensarbetet behandlar robust portföljoptimering med Expected Shortfall tillämpad på en referensportfölj bestående av svenska linjära tillgångar med aktier och ett obligationsindex. Specifikt så utvidgas den klassiska definitionen av robust optimering, som fokuserar på parameterosäkerhet, till att även inkludera osäkerhet i log-avkastningsfördelningen. Mitt bidrag till den robusta optimeringslitteraturen är att studera portföljoptimering med Expected Shortfall med log-avkastningar modellerade med antingen elliptiska fördelningar eller med en normal-copula med asymmetriska marginalfördelningar. Det robusta optimeringsproblemet löses med värsta tänkbara parametrar från box- och ellipsoidosäkerhetsmängder konstruerade från historiska data och kan användas när investeraren har en mer konservativ syn på marknaden än vad den historiska datan föreslår.
Med elliptiskt fördelade log-avkastningar är optimeringsproblemet ekvivalent med Markowitz väntevärde-varians-optimering, kopplade genom riskaversionskoefficienten. Resultaten visar att den optimala viktvektorn är nästan oberoende av vilken elliptisk fördelning som används för att modellera log-avkastningar, medan Expected Shortfall är starkt beroende av den elliptiska fördelningen, med högre Expected Shortfall som resultat av fetare fördelningssvansar. För att modellera svansarna till log-avkastningsfördelningen asymmetriskt används generaliserade Paretofördelningar tillsammans med en normal-copula för att fånga det multivariata beroendet. I det här fallet är optimeringsproblemet inte ekvivalent med Markowitz väntevärde-varians-optimering och fördelarna med att använda Expected Shortfall som riskmått utnyttjas. Med den asymmetriska log-avkastningsmodellen uppstår märkbara skillnader i den optimala viktvektorn jämfört med den elliptiska fördelningsmodellen. Därutöver ökar Expected Shortfall, vilket följer av bättre modellerade fördelningssvansar. De generella slutsatserna i examensarbetet är att portföljoptimering med Expected Shortfall är ett viktigt problem som är fördelaktigt jämfört med Markowitz väntevärde-varians-optimering när log-avkastningar modelleras med asymmetriska fördelningar. Den största nackdelen med portföljoptimering med Expected Shortfall är att det är ett simuleringsbaserat optimeringsproblem som introducerar statistisk osäkerhet, och om log-avkastningarna dras från en copula så involverar simuleringsprocessen fler steg, vilket potentiellt kan göra programmet långsammare än att dra från en elliptisk fördelning. Därför är portföljoptimering med Expected Shortfall lämplig att använda när handel sker på daglig basis.
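To make the simulation-based optimization concrete, here is a minimal sketch, not code from the thesis: the asset parameters, the Student's t prior, and the return floor are all invented for illustration. It estimates Expected Shortfall by Monte Carlo under an elliptical log-return model and minimizes it over long-only portfolio weights:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical market: simulate daily log-returns from a multivariate
# Student's t distribution (an elliptical model with fat tails).
mu = np.array([5e-4, 3e-4, 1e-4])
cov = np.array([[1e-4, 2e-5, 5e-6],
                [2e-5, 8e-5, 4e-6],
                [5e-6, 4e-6, 2e-5]])
nu = 5                                    # degrees of freedom
n = 100_000
z = rng.multivariate_normal(np.zeros(3), cov, size=n)
chi = rng.chisquare(nu, size=n) / nu
returns = mu + z / np.sqrt(chi)[:, None]  # t-distributed samples

def expected_shortfall(w, returns, alpha=0.95):
    """ES of portfolio losses at level alpha, estimated by simulation."""
    losses = -returns @ w
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

# Minimize ES subject to full investment and a minimum expected return.
cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: returns.mean(axis=0) @ w - 2e-4})
w0 = np.ones(3) / 3
res = minimize(expected_shortfall, w0, args=(returns,),
               bounds=[(0, 1)] * 3, constraints=cons)
print("optimal weights:", res.x, "ES:", res.fun)
```

In practice one would prefer the Rockafellar and Uryasev representation, which turns ES minimization into a smooth linear program; swapping the t-sample for draws from a normal copula with generalized Pareto tails changes only the simulation step, not the optimizer.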
733

Analysis of Copula Opinion Pooling with Applications to Quantitative Portfolio Management

Bredeby, Rickard January 2015 (has links)
In 2005 Attilio Meucci presented his article Beyond Black-Litterman: Views on Non-Normal Markets, which introduces the copula opinion pooling approach using generic non-normal market assumptions. Copulas and opinion pooling are used to express views on the market, yielding a posterior market distribution that smoothly blends an arbitrarily distributed market prior with arbitrarily chosen views. This thesis explains how to use the method in practice and investigates its performance in different investment situations. The method is tested on three portfolios, each illustrating a different feature. The conclusions that can be drawn are, for example, that the method can be used in many different investment situations in many different ways, that implementation and calculations take only a few seconds for a large data set, and that the method could be useful for portfolio managers working with mathematical methods. The presented examples together with the method generate reasonable results.
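As a rough illustration of the mechanics, the following sketch is our own minimal one-view construction in the spirit of Meucci's approach, not the thesis's implementation; the prior, the view distribution, and the confidence level are invented. It blends the empirical CDF of a view variable with the view's CDF and maps the prior scenarios through the pooled inverse CDF:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical prior: 50,000 joint scenarios for two assets (any
# non-normal prior works; a skewed one is used for illustration).
n = 50_000
prior = np.column_stack([
    stats.skewnorm.rvs(4, scale=0.02, size=n, random_state=rng),
    stats.norm.rvs(scale=0.015, size=n, random_state=rng),
])

# One view: the first asset's return ~ N(1%, 0.5%), 60% confidence.
view_axis = np.array([1.0, 0.0])
conf = 0.6
v = prior @ view_axis                    # view variable per scenario
grades = stats.rankdata(v) / (n + 1)     # copula grades of the prior

# Opinion pooling: posterior view values come from inverting the pooled
# CDF, F_pool = (1-conf)*F_prior + conf*F_view, at the prior grades.
v_sorted = np.sort(v)
grid = np.linspace(v_sorted[0], v_sorted[-1], 2001)
F_prior = np.searchsorted(v_sorted, grid) / n
F_pool = (1 - conf) * F_prior + conf * stats.norm.cdf(grid, 0.01, 0.005)
v_post = np.interp(grades, F_pool, grid)

# Re-attach the posterior view values along the view direction, keeping
# the prior copula (dependence structure) intact.
posterior = prior + np.outer(v_post - v, view_axis)
print("prior mean:", prior.mean(axis=0))
print("posterior mean:", posterior.mean(axis=0))
```

The point of the construction is visible in the last step: only the marginal of the view variable is reshaped toward the view, while the dependence structure of the simulated prior is preserved.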
734

Alternative Methods for Value-at-Risk Estimation : A Study from a Regulatory Perspective Focused on the Swedish Market / Alternativa metoder för beräkning av Value-at-Risk : En studie från ett regelverksperspektiv med fokus på den svenska marknaden

Sjöwall, Fredrik January 2014 (has links)
The importance of sound financial risk management has become increasingly emphasised in recent years, especially with the financial crisis of 2007-08. The Basel Committee sets the international standards and regulations for banks and financial institutions, and in particular under market risk, they prescribe the internal application of the measure Value-at-Risk. However, the most established non-parametric Value-at-Risk model, historical simulation, has been criticised for some of its unrealistic assumptions. This thesis investigates alternative approaches for estimating non-parametric Value-at-Risk, by examining and comparing the capability of three counterbalancing weighting methodologies for historical simulation: an exponentially decreasing time weighting approach, a volatility updating method and, lastly, a more general weighting approach that enables the specification of central moments of a return distribution. With real financial data, the models are evaluated from a performance-based perspective, in terms of accuracy and capital efficiency, but also in terms of their regulatory suitability, with a particular focus on the Swedish market. The empirical study shows that the capability of historical simulation is improved significantly, from both performance perspectives, by the implementation of a weighting methodology. Furthermore, the results predominantly indicate that the volatility updating model with a 500-day historical observation window is the most adequate weighting methodology, in all incorporated aspects. The findings of this paper offer significant input both to existing research on Value-at-Risk as well as to the quality of the internal market risk management of banks and financial institutions. / Betydelsen av sund finansiell riskhantering har blivit alltmer betonad på senare år, i synnerhet i och med finanskrisen 2007-08. Baselkommittén fastställer internationella normer och regler för banker och finansiella institutioner, och särskilt under marknadsrisk föreskriver de intern tillämpning av måttet Value-at-Risk. Däremot har den mest etablerade icke-parametriska Value-at-Risk-modellen, historisk simulering, kritiserats för några av dess orealistiska antaganden. Denna avhandling undersöker alternativa metoder för att beräkna icke-parametrisk Value-at-Risk, genom att granska och jämföra prestationsförmågan hos tre motverkande viktningsmetoder för historisk simulering: en exponentiellt avtagande tidsviktningsteknik, en volatilitetsuppdateringsmetod, och slutligen ett mer generellt tillvägagångssätt för viktning som möjliggör specifikation av en avkastningsfördelnings centralmoment. Modellerna utvärderas med verklig finansiell data ur ett prestationsbaserat perspektiv, utifrån precision och kapitaleffektivitet, men också med avseende på deras lämplighet i förhållande till existerande regelverk, med särskilt fokus på den svenska marknaden. Den empiriska studien visar att prestandan hos historisk simulering förbättras avsevärt, från båda prestationsperspektiven, genom införandet av en viktningsmetod. Dessutom pekar resultaten i huvudsak på att volatilitetsuppdateringsmodellen med ett 500-dagars observationsfönster är den mest användbara viktningsmetoden i alla berörda aspekter. Slutsatserna i denna uppsats bidrar i väsentlig grad både till befintlig forskning om Value-at-Risk, liksom till kvaliteten på bankers och finansiella institutioners interna hantering av marknadsrisk.
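For a sense of how such a weighting scheme modifies plain historical simulation, here is a minimal sketch of the exponentially decreasing time-weighting approach; the decay factor, window length, and simulated returns are illustrative choices, not the thesis's calibration:

```python
import numpy as np

def weighted_hs_var(returns, alpha=0.99, lam=0.98):
    """Value-at-Risk by age-weighted historical simulation.

    Scenario weights decay exponentially with age: w_i ~ lam**i,
    newest observation first, normalized to sum to one.
    """
    losses = -np.asarray(returns)[::-1]       # newest first
    n = len(losses)
    w = lam ** np.arange(n) * (1 - lam) / (1 - lam ** n)
    order = np.argsort(losses)[::-1]          # largest losses first
    cum = np.cumsum(w[order])
    # Smallest loss whose weighted tail probability reaches 1 - alpha.
    return losses[order][np.searchsorted(cum, 1 - alpha)]

# 500-day window of made-up daily returns with fat tails.
rng = np.random.default_rng(2)
window = rng.standard_t(4, size=500) * 0.01
print("99% one-day VaR:", weighted_hs_var(window))
```

The volatility-updating variant would instead rescale each historical return by the ratio of current to past estimated volatility before taking the empirical quantile.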
735

Do hedge funds yield greater risk-adjusted rates of return than mutual funds? A quantitative study comparing hedge funds to mutual funds and hedge fund strategies / Avkastar hedgefonder högre riskjusterad avkastning än aktiefonder? En kvantitativ studie som jämför hedgefonder med aktiefonder och investeringsstrategier

Börjesson, Oscar, HaQ, Sebastian Rezwanul January 2014 (has links)
In recent times, the popularity of hedge funds has undoubtedly increased. Opinions are divided on whether hedge funds generate absolute rates of return and whether they provide a strong alternative investment to mutual funds. This thesis aims to examine whether hedge funds with different investment strategies create absolute returns and whether certain investment strategies outperform others. It compares hedge funds' risk-adjusted rates of return with those of mutual funds, to see whether certain investment strategies are more lucrative in terms of excess returns relative to corresponding indices. An econometric approach was applied to search for significant differences in the risk-adjusted returns of hedge funds in contrast to mutual funds. Our results show that Swedish hedge funds do not generate risk-adjusted returns as high as those of Swedish mutual funds. In regard to the best-performing hedge fund strategy, the results are inconclusive. Also, we do not find any evidence that hedge funds violate the efficient market hypothesis. / Hedgefonder har den senaste tiden ökat i popularitet. Samtidigt finns det delade meningar om huruvida hedgefonder genererar absolutavkastning och om de fungerar som bra alternativ till traditionella fonder. Denna uppsats syftar till att undersöka huruvida hedgefonder skapar absolutavkastning samt om det finns investeringsstrategier som presterar bättre än andra. Uppsatsen jämför hedgefonders riskjusterade avkastning med traditionella fonders, för att på så sätt se om en viss investeringsstrategi är mer lukrativ i termer av överavkastning i förhållande till motsvarande index. Vi har använt ekonometriska metoder för att söka efter statistiskt signifikanta skillnader mellan avkastningen för hedgefonder och traditionella fonder. Våra resultat visar att svenska hedgefonder inte genererar högre riskjusterad avkastning än svenska aktiefonder. Våra resultat visar inga signifikanta skillnader vad gäller avkastning mellan olika strategier. Slutligen finner vi heller inga bevis för att hedgefonder går emot den effektiva marknadshypotesen.
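A simplified version of such an econometric comparison might look as follows; the fund returns below are simulated stand-ins, not the Swedish fund data used in the study, and an annualized Sharpe ratio plus a Welch t-test stand in for whatever exact specification the authors used:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Hypothetical monthly net returns for two fund groups (made-up data):
# 10 years x 15 funds per group.
hedge = rng.normal(0.004, 0.02, size=(120, 15))
mutual = rng.normal(0.006, 0.04, size=(120, 15))
rf = 0.001                                  # monthly risk-free rate

def sharpe(r, rf):
    """Annualized Sharpe ratio per fund from monthly returns."""
    excess = r - rf
    return np.sqrt(12) * excess.mean(axis=0) / excess.std(axis=0, ddof=1)

s_h, s_m = sharpe(hedge, rf), sharpe(mutual, rf)
# Welch two-sample t-test: do the groups differ in risk-adjusted return?
t, p = stats.ttest_ind(s_h, s_m, equal_var=False)
print(f"hedge {s_h.mean():.2f}, mutual {s_m.mean():.2f}, p = {p:.3f}")
```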
736

Crosscap States in Integrable Spin Chains / Crosscaptillstånd i integrabla spinnkedjor

Ekman, Christopher January 2022 (has links)
We consider integrable boundary states in the Heisenberg model. We begin by reviewing the algebraic Bethe Ansatz as well as integrable boundary states in spin chains. Then a new class of integrable states, introduced last year by Caetano and Komatsu, is described and expanded on. We call these states the crosscap states; in them, each spin is entangled with its antipodal spin. We present a novel proof of the integrability of both a crosscap state known in the literature and one that was not previously known. We then use the machinery of the algebraic Bethe Ansatz to derive the overlaps between the crosscap states and off-shell Bethe states in terms of scalar products and other known overlaps. / Vi undersöker integrabla gränstillstånd i Heisenbergmodellen. Vi börjar med att gå igenom den algebraiska Betheansatsen och integrabla gränstillstånd i spinnkedjor. Sedan beskrivs och utvidgas en ny klass av integrabla tillstånd som introducerades förra året av Caetano och Komatsu. Vi kallar dessa tillstånd crosscap-tillstånd. I dessa tillstånd är varje spinn intrasslat med sin antipodala motsvarighet. Vidare presenterar vi ett nytt bevis av integrerbarheten hos både ett tidigare känt och ett nytt crosscap-tillstånd. Sedan använder vi den algebraiska Betheansatsens maskineri för att härleda överlappen mellan crosscap-tillstånden och off-shell Bethe-tillstånd i termer av skalärprodukter och andra kända överlapp.
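For orientation, the antipodal entanglement that defines these states can be written down compactly. The expression below is a schematic spin-1/2 crosscap state on a chain of even length L in our own notation; conventions and normalization in the thesis may differ:

```latex
% Schematic crosscap state on a periodic chain of even length L:
% site j is maximally entangled with its antipode j + L/2.
\[
  |\mathcal{C}\rangle \;=\; \bigotimes_{j=1}^{L/2}
  \bigl( |{\uparrow}\rangle_{j}\, |{\uparrow}\rangle_{j+L/2}
       + |{\downarrow}\rangle_{j}\, |{\downarrow}\rangle_{j+L/2} \bigr)
\]
```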
737

Detecting anomalies in data streams driven by a jump-diffusion process / Anomalidetektion i dataströmmar för hopp-diffusionsprocesser

Paulin, Carl January 2021 (has links)
Jump-diffusion processes often model financial time series, as they can simulate the random jumps that such series frequently exhibit. These jumps can be seen as anomalies and are essential for financial analysis and model building, making them vital to detect. The realized variation, realized bipower variation, and realized semi-variation were tested to see whether they can be used to detect jumps in a jump-diffusion process and whether anomaly detection algorithms can use them as features to improve their accuracy. The algorithms tested were Isolation Forest, Robust Random Cut Forest, and the Isolation Forest Algorithm for Streaming Data, where the latter two use streaming data. This was done by generating a Merton jump-diffusion process with a varying jump rate and testing each algorithm with each of the features. The performance of each algorithm was measured using the F1-score to compare the differences between features and algorithms. It was found that the algorithms improved when using the features; Isolation Forest saw improvement from using one or more of the named features. Among the streaming algorithms, Robust Random Cut Forest performed best for every jump rate except the lowest. Using a combination of the features gave the highest F1-score for both streaming algorithms. These results show that one can use these features to extract jumps, as anomaly scores, and to improve the accuracy of the algorithms, both in a batch and a stream setting. / Hopp-diffusionsprocesser används regelbundet för att modellera finansiella tidsserier eftersom de kan simulera de slumpmässiga hopp som ofta uppstår. Dessa hopp kan ses som anomalier och är viktiga för finansiell analys och modellbygge, vilket gör dem mycket viktiga att hitta. Den realiserade variationen, den realiserade bipower-variationen och den realiserade semi-variationen är mått på en tidsserie som kan användas för att hitta hopp i hopp-diffusionsprocesser. De används här för att testa om anomalidetektionsalgoritmer kan använda måtten som funktioner för att förbättra sin förmåga att detektera hopp. Algoritmerna som testades var Isolation Forest, Robust Random Cut Forest och Isolation Forest-algoritmen för strömmande data, där de två sistnämnda använder strömmande data. Detta gjordes genom att generera data från en Merton-hopp-diffusionsprocess med varierande hoppfrekvens, där de olika algoritmerna testades med varje funktion samt med kombinationer av funktioner. Prestationen hos varje algoritm beräknades med hjälp av F1-värdet för att kunna jämföra algoritmerna och funktionerna med varandra. Det visade sig att funktionerna kan användas för att extrahera hopp från hopp-diffusionsprocesser och även användas som indikatorer för när hopp har skett. Algoritmerna fick även högre F1-värden när de använde funktionerna. Isolation Forest fick ett förbättrat F1-värde genom att använda en eller flera av funktionerna och hade ett högre F1-värde än att bara använda funktionerna för att detektera hopp. Robust Random Cut Forest hade högst F1-värde av de två algoritmer som använde strömmande data, och båda fick högst F1-värde när en kombination av alla funktioner användes. Resultatet visar att dessa funktioner fungerar för att extrahera hopp från hopprocesser, att de kan användas för att detektera hopp, och att algoritmernas förmåga att detektera hoppen ökade med hjälp av funktionerna.
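A small sketch of the feature construction follows; the parameters and the detection threshold are invented, and the thesis feeds these features to Isolation Forest-type detectors rather than the simple median rule used here:

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a Merton jump-diffusion log-return series (made-up parameters):
# diffusion increments plus a compound Poisson jump component.
n, dt = 2_000, 1 / 252
mu, sigma, jump_rate, jump_std = 0.05, 0.2, 5.0, 0.03
diffusion = ((mu - sigma**2 / 2) * dt
             + sigma * np.sqrt(dt) * rng.standard_normal(n))
jumps = rng.normal(0, jump_std, n) * (rng.random(n) < jump_rate * dt)
r = diffusion + jumps

def realized_variation(r):
    return np.cumsum(r**2)

def realized_bipower_variation(r):
    # Bipower variation is robust to (finite-activity) jumps: products of
    # adjacent absolute returns suppress the jump contribution.
    return (np.pi / 2) * np.cumsum(np.abs(r[1:]) * np.abs(r[:-1]))

# The spread RV - BV estimates the jump part of quadratic variation; a
# large one-step increase in the spread serves as a jump anomaly score.
rv, bv = realized_variation(r)[1:], realized_bipower_variation(r)
score = np.diff(rv - bv, prepend=0.0)
flagged = np.where(score > 5 * np.median(np.abs(score)))[0]
print("flagged steps:", flagged[:10])
print("true jumps at:", np.where(jumps != 0)[0][:10])
```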
738

Homogenization of reaction-diffusion problems with nonlinear drift in thin structures

Raveendran, Vishnu January 2022 (has links)
We study the question of periodic homogenization of a variably scaled reaction-diffusion equation with non-linear drift of polynomial type. The non-linear drift was derived as the hydrodynamic limit of a totally asymmetric simple exclusion process (TASEP) for a population of interacting particles crossing a domain with obstacles. We consider three different geometries: (i) a bounded domain crossed by a finitely thin flat composite layer; (ii) a bounded domain crossed by an infinitely thin flat composite layer; (iii) an unbounded composite domain. For the thin-layer cases, we consider our reaction-diffusion problem endowed with slow or moderate drift. Using energy-type estimates as well as concepts like thin-layer convergence and two-scale convergence, we derive homogenized evolution equations and the corresponding effective model parameters. Special attention is paid to the derivation of the effective transmission conditions across the separating limit interfaces. As a special scaling, the problem with large drift is treated separately for an unbounded composite domain. Because of the imposed large drift, this nonlinearity is expected to explode in the limit of a vanishing scaling parameter. To deal with this special case, we employ two-scale formal homogenization asymptotics with drift to derive the corresponding upscaled model equations as well as the structure of the effective transport tensors. Finally, we use Schauder's fixed point theorem as well as monotonicity arguments to study the weak solvability of the upscaled model posed in the unbounded domain. This study aims to contribute the theoretical understanding needed when designing thin composite materials that are resistant to slow, moderate, and high velocity impacts.
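As a point of reference, a variably scaled reaction-diffusion problem with polynomial nonlinear drift can be written schematically as below; this is generic notation of ours, not the thesis's exact scaling or boundary setup:

```latex
% Generic form: scaled reaction-diffusion equation with a polynomial
% nonlinear drift B, posed in a medium with eps-periodic microstructure;
% homogenization studies the limit eps -> 0.
\[
  \partial_t u^{\varepsilon}
  \;=\; \nabla \cdot \bigl( D^{\varepsilon}(x)\, \nabla u^{\varepsilon} \bigr)
  \;-\; \nabla \cdot \bigl( b^{\varepsilon}(x)\, B(u^{\varepsilon}) \bigr)
  \;+\; R(u^{\varepsilon}),
  \qquad B(u) = u(1 - u),
\]
% where B(u) = u(1-u) is a TASEP-type polynomial drift of the kind
% arising as a hydrodynamic limit.
```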
739

Cost optimization in the cloud : An analysis on how to apply an optimization framework to the procurement of cloud contracts at Spotify

Ekholm, Harald, Englund, Daniel January 2020 (has links)
In the modern era of IT, cloud computing is becoming the new standard. Companies have gone from owning their own data centers to procuring virtualized computational resources as a service. This technology opens up for elasticity and cost savings; computational resources have gone from being a capital expenditure to an operational expenditure. Vendors such as Google, Amazon, and Microsoft offer these services globally with different provisioning alternatives. In this thesis, we focus on providing a cost optimization algorithm for Spotify on the Google Cloud Platform. To achieve this, we construct an algorithm that breaks the problem up into four parts. Firstly, we generate trajectories of monthly active users. Secondly, we split these trajectories up by region and redistribute monthly active users to better describe the actual Google Cloud Platform footprint. Thirdly, we calculate usage-per-monthly-active-user quotas from a representative week of usage and use these to translate the redistributed monthly-active-user trajectories into usage. Lastly, we apply an optimization algorithm to these trajectories and obtain an objective value. The results are then evaluated using statistical methods to determine their reliability. The final model solves the problem to optimality and provides statistically reliable results. As a consequence, we can give recommendations to Spotify on how to minimize its cloud cost while accounting for the uncertainty in demand.
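The four-step structure lends itself to a compact sketch. Everything below is illustrative: the region shares, growth model, prices, and commitment discount are invented, and the real problem is solved with an optimization algorithm rather than this grid search:

```python
import numpy as np

rng = np.random.default_rng(5)

# 1) Generate monthly-active-user (MAU) trajectories, e.g. geometric
#    growth with random noise over a 36-month horizon.
months, n_paths = 36, 1_000
growth = rng.normal(1.02, 0.01, size=(n_paths, months))
mau = 100e6 * np.cumprod(growth, axis=1)

# 2) Split each trajectory across regions with assumed footprint shares.
shares = {"europe-west": 0.5, "us-central": 0.3, "asia-east": 0.2}
regional = {region: s * mau for region, s in shares.items()}

# 3) Translate MAU into compute usage via a usage-per-MAU quota
#    estimated from a representative week (a fixed made-up constant).
core_hours_per_mau = 0.8
usage = {region: core_hours_per_mau * m for region, m in regional.items()}

# 4) Choose a committed-use level per region: commitments are discounted
#    but paid even when idle; the remainder is bought on demand.
on_demand, discount = 0.03, 0.6          # $/core-hour, commitment factor

def cost(commit, u):
    committed = commit * months * on_demand * discount
    overflow = np.maximum(u - commit, 0).sum(axis=1).mean() * on_demand
    return committed + overflow

for region, u in usage.items():
    grid = np.linspace(u.min(), u.max(), 200)
    best = min(grid, key=lambda c: cost(c, u))
    print(region, f"optimal commitment ~ {best:,.0f} core-hours/month")
```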
740

Homogenization of Partial Differential Equations using Multiscale Convergence Methods

Johnsen, Pernilla January 2021 (has links)
The focus of this thesis is the theory of periodic homogenization of partial differential equations and some applicable concepts of convergence. More precisely, we study parabolic problems exhibiting both spatial and temporal microscopic oscillations and a vanishing volumetric heat capacity type of coefficient. We also consider a hyperbolic-parabolic problem with two spatial microscopic scales. The tools used are evolution settings of multiscale and very weak multiscale convergence, which are extensions of, or closely related to, the classical method of two-scale convergence. The novelty of the research in the thesis is the homogenization results and, for the studied parabolic problems, adapted compactness results of multiscale convergence type.
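For readers new to the area, the classical notion that these evolution settings extend is two-scale convergence, whose standard definition reads:

```latex
% Two-scale convergence: a bounded sequence (u_eps) in L^2(Omega)
% two-scale converges to u_0 in L^2(Omega x Y) if, for every smooth
% test function v(x, y) that is Y-periodic in y,
\[
  \lim_{\varepsilon \to 0} \int_{\Omega}
  u_{\varepsilon}(x)\, v\!\Bigl(x, \frac{x}{\varepsilon}\Bigr)\, dx
  \;=\; \int_{\Omega}\!\int_{Y} u_0(x, y)\, v(x, y)\, dy\, dx,
\]
% where Y denotes the unit periodicity cell.
```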
