171

Classification of Financial Instruments / Klassifikation av finansiella instrument

Lindberg, Andreas January 2019 (has links)
In this thesis a general framework and accompanying guidelines for classifying financial instruments within the fair value hierarchy (part of IFRS 13) are presented. IFRS 13 introduces a broad and loosely defined regulation of how to classify a financial instrument, which leaves room for misinterpretation and uncertainty. The thesis investigates the pricing of financial instruments and the behaviour of the market data used as inputs in the pricing models, to give better insight into what counts as significant market data, how it is used, and how it is approximated. The instruments investigated are autocalls, swaps, European options, and Asian options. The result is presented as general recommendations for how to classify the specified instruments, with clearer borders drawn between the levels of the hierarchy. The methods and deductions introduced in the thesis could also be applied to the classification of closely related financial instruments, but this was left outside the scope of the thesis due to time restrictions.

Keywords: IFRS, financial instruments, classification, fair value, fair value hierarchy, autocall, swap, European option, Asian option, implied volatility, correlation, market activity, interest rates.
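The abstract describes European options priced from market-data inputs whose observability determines the instrument's level in the fair value hierarchy. As an illustration only (this is not code from the thesis), a standard Black-Scholes call pricer shows how the volatility input enters the price; all numerical inputs below are hypothetical:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call; sigma is the volatility
    # input whose observability drives the hierarchy classification
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# hypothetical inputs: spot 100, strike 100, 1 year, 2% rate, 20% vol
price = bs_call(100.0, 100.0, 1.0, 0.02, 0.20)
```

If the volatility input is quoted in an active market, the valuation leans toward Level 2; if it must be approximated, the classification question the thesis addresses arises.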
172

Functional data analysis with applications in finance

Benko, Michal 26 January 2007 (has links)
In many fields of applied statistics the object of interest depends on a continuous parameter. Typical examples in finance are implied volatility functions, yield curves, and risk-neutral densities. Due to market conventions and further technical reasons, these objects are observable only on a discrete grid, e.g. the grid of strikes and maturities for which trades have been settled at a given time point. By collecting these functions for several time points (e.g. days) or for different underlyings, a sample of functions is obtained: a functional data set. The first topic considered in this thesis concerns strategies for recovering the functional objects (e.g. the implied volatility function) from the observed data using nonparametric smoothing methods. Besides the standard smoothing methods, a procedure combining nonparametric smoothing with no-arbitrage theory results is proposed for implied volatility smoothing. The second part of the thesis is devoted to functional data analysis (FDA) and its connection to problems in the empirical analysis of financial markets. The theoretical part focuses on functional principal component analysis, the functional counterpart of the well-known multivariate dimension-reduction technique. A comprehensive overview of the existing methods is given, and an estimation method based on the dual problem as well as two-sample inference based on functional principal component analysis are discussed. The FDA techniques are applied to the analysis of implied volatility and yield curve dynamics. In addition, the implementation of the FDA techniques together with an FDA library for the statistical environment XploRe is presented.
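The functional principal component analysis described above can be sketched on discretized curves: each day's curve is sampled on a common grid and the leading eigenvectors of the empirical covariance operator give the functional principal components. The sketch below uses synthetic smile-shaped curves, not the thesis's data or its XploRe implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
grid = np.linspace(-1.0, 1.0, 50)          # moneyness grid (hypothetical)
mean_curve = 0.2 + 0.1 * grid ** 2          # smile-shaped mean curve
scores = rng.normal(size=(200, 2))          # 200 "days", 2 latent factors
modes = np.vstack([grid ** 2, grid])        # curvature and slope modes
curves = mean_curve + scores @ (0.05 * modes)

centered = curves - curves.mean(axis=0)
cov = centered.T @ centered / len(curves)   # discretized covariance operator
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
pcs = eigvecs[:, ::-1][:, :2]               # leading functional PCs
explained = eigvals[::-1][:2] / eigvals.sum()
```

With two generating modes and no noise, the top two components explain essentially all variation; with real smile data, the decay of `explained` tells how many factors drive the dynamics.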
173

Equity derivatives markets

Detlefsen, Kai 19 October 2007 (has links)
Since the ideas of arbitrage-free pricing were born, finance has changed radically, both in theory and in practice. Derivatives markets have evolved, and options nowadays serve both as underlyings and as hedging instruments. In this thesis, we consider several markets for equity derivatives. We start with a statistical analysis of the markets for European options and variance swaps, because these products are the main hedging instruments for more complex claims. We then consider different option pricing models and their calibration to observed price surfaces. Finally, we investigate the connection between option prices and the fundamental economic concept of risk aversion via the empirical pricing kernel.
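The empirical pricing kernel mentioned above is the ratio of a risk-neutral density (backed out from option prices) to a physical density (estimated from index returns). A toy sketch with stand-in Gaussian densities, purely illustrative and not the thesis's estimates:

```python
import numpy as np

x = np.linspace(-0.3, 0.3, 601)             # log-return grid (hypothetical)

def gauss(x, mu, sigma):
    # Gaussian density, a stand-in for the estimated densities
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

q = gauss(x, -0.01, 0.08)                   # risk-neutral density (assumed)
p = gauss(x, 0.05, 0.08)                    # physical density (assumed)
kernel = q / p                              # empirical pricing kernel m(x)
```

With the risk-neutral mean below the physical mean, the kernel declines in the return, the shape classical risk aversion predicts; empirical pricing kernels often deviate from this monotone form, which is what makes them interesting to study.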
174

The Predictive Power of Implied Volatility in Option Pricing / Den Prediktiva Kraften av Implicit Volatilitet vid Optionsprissättning

Berglund, Lovisa January 2023 (has links)
During the last few years, financial derivatives have been growing in trading volume. There seems to be high demand for and supply of derivatives on the market, and one common derivative is the option contract. Option contracts are frequently the subject of studies, and many different pricing models have been created for options. One popular model is the Black-Scholes model, which prices European call options. This project looks more closely at the Black-Scholes model and its assumption that volatility remains constant. The project tries to establish what predictive power the implied volatility has for the sign of the price changes in the option's daily closing price. The implied volatility is defined as the value of volatility that, when used in an option pricing formula, yields the current market price, and it goes against the assumption of constant volatility. The model consists of a dependent variable, a binary indicator for the sign of the price change: 1 if positive and 0 if negative. The independent variables are implied volatility, volume, price of the underlying, and VIX. The models used for testing are logistic regression, CART, random forest, and XGBoost. The research question is: can the sign of option price jumps be predicted with the implied volatility? The answer is that there are indications that the implied volatility has predictive power for the sign of the price changes in the option, even though the results are not conclusive across all models.
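The definition of implied volatility given above (the volatility that makes the pricing formula match the market price) can be made concrete by inverting Black-Scholes numerically. A minimal bisection sketch with hypothetical inputs; this is not code from the thesis:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    # Black-Scholes European call price
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d1 - sigma * sqrt(T))

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    # bisection: find sigma with bs_call(..., sigma) == observed price;
    # valid because the call price is strictly increasing in sigma
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# round-trip check with hypothetical market data (spot 100, ATM, 1y, 2% rate)
sigma = implied_vol(bs_call(100, 100, 1.0, 0.02, 0.25), 100, 100, 1.0, 0.02)
```

The recovered `sigma` feeds the classifier as one feature alongside volume, underlying price, and VIX.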
175

Evaluation of Warming function vs Single scenario / Utvärdering av Warming function mot Single scenario

Liljedahl, Ida, Rondahl, Ebba January 2022 (has links)
The private sector has an increasingly important role in limiting the temperature rise to below the crucial 1.5°C of the Paris Agreement. In order to assess how well a portfolio is aligned with the goals of the Paris Agreement, benchmarks need to be used. The creation of benchmarks is complex, and varying methodologies with associated opinions exist. The purpose of this thesis is to examine two different approaches for creating benchmarks: Single scenario and Warming function. The usability of the benchmarks is examined from the perspective of a financial institution. The methodology is based on a literature review of earlier research. Using empirical data, the different methods are tested in a smaller case study and on a larger data set. The results give insight into the advantages and disadvantages of the methods. A Single scenario approach allows for a more granular analysis, but it is criticized for being misleading, as the result is easily manipulated. The Warming function is less granular, as almost all companies in the empirical data are assessed against the same benchmark, creating unfair results for companies active in sectors with different demands on emission reduction. In order to gain sectoral granularity, the Warming function methodology needs to be investigated further. By lowering the demands on the final benchmarks, the adaptation to different sectors can be increased in the Warming function approach.
176

An Examination of the Common Law Obligation of Good Faith in the Performance and Enforcement of Commercial Contracts in Australia

Dixon, William Michael January 2005 (has links)
This examination of the common law obligation of good faith in the performance and enforcement of commercial contracts in Australia seeks to achieve a number of objectives. First, to chart the historical development of the implied good faith obligation. Second, to identify a number of issues that remain unresolved at the Australian lower court level. Third, to consider five doctrinal approaches that could be adopted by the High Court when ultimately confronted by the competing claims and tensions that have proven divisive in the courts below. Fourth, to assess each approach against three identified benchmarks. The essential thesis is that good faith should be implied, as a matter of law, in commercial contracts that are relational in nature, with an additional call for the High Court to explicitly recognise that the underlying basis of the implied good faith obligation is the reasonable expectations of the contractual parties. This approach is the one that satisfies all three benchmarks and provides the most satisfactory resolution of the issues that presently bedevil the commercial good faith debate in Australia.
177

Incomprehension or resistance? : the Markan disciples and the narrative logic of Mark 4:1-8:30

Blakley, J. Ted January 2008 (has links)
The characterization of the Markan disciples has been and continues to be the object of much scholarly reflection and speculation. For many, the Markan author's presentation of Jesus' disciples holds a key, if not the key, to unlocking the purpose and function of the gospel as a whole. Commentators differ as to whether the Markan disciples ultimately serve a pedagogical or polemical function, yet they are generally agreed that the disciples in Mark come off rather badly, especially when compared to their literary counterparts in Matthew, Luke, and John. This narrative-critical study considers the characterization of the Markan disciples within the Sea Crossing movement (Mark 4:1-8:30). While commentators have, on the whole, interpreted the disciples' negative characterization in this movement in terms of lack of faith and/or incomprehension, neither of these, nor a combination of the two, fully accounts for the severity of language leveled against the disciples by the narrator (6:52) and Jesus (8:17-18). Taking as its starting point an argument by Jeffrey B. Gibson (1986) that the harshness of Jesus' rebuke in Mark 8:14-21 is occasioned not by the disciples' lack of faith or incomprehension but by their active resistance to his Gentile mission, this investigation uncovers additional examples of the disciples' resistance to Gentile mission, offering a better account of their negative portrayal within the Sea Crossing movement and helping explain many of their other failures. In short, this study argues that in Mark 4:1-8:26, the disciples are characterized as resistant to Jesus' Gentile mission and to their participation in that mission, the chief consequence being that they are rendered incapable of recognizing Jesus' vocational identity as Israel's Messiah (Thesis A). 
This leads to a secondary thesis, namely, that in Mark 8:27-30, Peter's recognition of Jesus' messianic identity indicates that the disciples have finally come to accept Jesus' Gentile mission and their participation in it (Thesis B). Chapter One: Introduction: offers a selective review of scholarly treatments of the Markan disciples, which shows that few scholars attribute resistance, let alone purposeful resistance, to the disciples. Chapter Two: The Rhetoric of Repetition: introduces the methodological tools, concepts, and perspectives employed in the study. It includes a section on narrative criticism, which focuses upon the story-as-discoursed and the implied author and reader, and a section on Construction Grammar, a branch of cognitive linguistics founded by Charles Fillmore and further developed by Paul Danove, which focuses upon semantic and narrative frames and case frame analysis. Chapter Three: The Sea Crossing Movement, Mark 4:1-8:30: addresses the question of Markan structure and argues that Mark 4:1-8:30 comprises a single, unified, narrative movement, whose action and plot is oriented to the Sea of Galilee and whose most distinctive feature is the network of sea crossings that transport Jesus and his disciples back and forth between Jewish and Gentile geopolitical spaces. Following William Freedman, Chapter Four: The Literary Motif: introduces two criteria (frequency and avoidability) for determining objectively what constitutes a literary motif and provides the methodological basis and starting point for the analyses performed in chapters five and six. Chapter Five: The Sea Crossing Motif: establishes and then carries out a lengthy narrative analysis of the Sea Crossing motif, which is oriented around Mark's use of ‎θάλασσα (thalassa) and πλοῖον (ploion), and Chapter Six: The Loaves Motif: does the same for The Loaves motif, oriented around Mark's use of ἄρτος (artos). 
Finally, Chapter Seven: The Narrative Logic of the Disciples' (In)comprehension: draws together all the narrative, linguistic, and exegetical insights of the previous chapters and offers a single coherent reading of the Sea Crossing movement that establishes Theses A and B.
178

The meaning of 'Organ of State' in South African law

Mdumbe, Moses Fanyana 30 June 2003 (has links)
'Organ of state' as a constitutional concept was first introduced by the 1993 Constitution, in which it was defined as including any statutory body or functionary. In their interpretation of this notion, the courts and academic writers invoked the tests developed at common law in order to determine its meaning. The commentators, on the one hand, used a variety of tests. The courts, on the other hand, subscribed to what has come to be known as the 'control test'. The 1996 Constitution followed with a comprehensive definition of 'organ of state'. This notion is also employed in other laws by direct reference or by incorporation of the definition in section 239 with slight adjustments. Regrettably, the limited approach developed by the courts in their interpretation of the notion of 'organ of state' for the purposes of the 1993 Constitution has spilled over into the interpretation of the concept under the 1996 Constitution. The question is whether this is justifiable. The constitutional definition of 'organ of state' makes it clear that some institutions and functionaries are organs of state on the basis of what they are, and others by virtue of the functions they are engaged in. Therefore strict adherence to the control test or any other test could unjustifiably limit the application of the Constitution. / Jurisprudence / LL.M.
179

Testing the afforestation reservation price of small forest landowners in New Zealand

Rodenberg Ballweg, Julie January 2013 (has links)
The estimation of afforestation reservation prices for small landowners in New Zealand has not been the subject of much research, despite its importance in predicting future land use. Reservation prices for planting represent the minimum payment a landowner must receive before converting land from agriculture to forest. A survey was conducted of 728 landowners from every region of New Zealand who own between 20 and 200 hectares of forest as well as other unplanted land used for agriculture; it asked about forest land, forest landowner demographics, ownership objectives, silviculture, and reservation prices. In this study, reservation price strategies were investigated by offering hypothetical annual and one-time payments for converting land from agriculture to forestry. From this survey, the average one-time payment a landowner would be willing to accept to convert a hectare of land from agriculture to forestry was $3,554, and the average annual payment to convert a hectare was $360. The key factors influencing the reservation price were: whether or not the landowner lived on the property, whether one of the ownership objectives was income from carbon, the primary agricultural enterprise, and total household income. An implied discount rate was calculated for each landowner; excluding those who would not accept any payment, the average after-tax discount rate was 9.7%. Small landowners indicated that their primary reason for owning plantation forest was income from timber, with very few landowners using their forest land for recreation. The median farm size was 400 hectares and the median forest plantation was 37 hectares. Planting of radiata pine peaked in 1994 and 1995, with more radiata pine planted in 1994 than in all the years from 2000 to 2009. Most landowners are performing some type of silviculture in their forests. Ninety percent of landowners are pruning in the current rotation, while only 61% plan to prune in the future. 
Only 26% of landowners have engaged in any commercial harvesting in the past ten years, but as their current rotation matures, 71% plan to replant on the same site. A majority of respondents thought the situation for forest landowners was getting better. Understanding the reservation price strategies of landowners is important for predicting future land use patterns and recognizing how close landowners are to converting land. The ownership objectives of landowners and the replanting decisions they make are critical for future timber supply. The results of this study can assist in the development of forest establishment incentive programmes. Better information about landowner characteristics will result in enhanced decision-making for the timber industry and the government in New Zealand.
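The relationship between the one-time and annual payments implies a discount rate. A back-of-the-envelope sketch, assuming the annual payment is valued as a perpetuity (the abstract does not state the exact formula used per landowner, so this is illustrative only):

```python
# average survey payments per hectare, from the abstract
one_time = 3554.0   # one-off payment a landowner would accept
annual = 360.0      # annual payment a landowner would accept

# perpetuity valuation: one_time = annual / r  =>  r = annual / one_time
implied_rate = annual / one_time   # about 0.101, i.e. roughly 10.1%
```

This aggregate figure of roughly 10.1% is close to, but not identical with, the reported 9.7% average, which was computed landowner by landowner before averaging.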
180

動態隱含波動度模型:以台指選擇權為例 / Dynamic Implied Volatility Functions in Taiwan Options Market

陳鴻隆, Chen, Hung Lung Unknown Date (has links)
This thesis proposes a dynamic implied volatility function model to address the shortcomings of standard implied volatility functions, which cannot adjust the volatility curve as time passes and cannot capture the time-series properties of the data. The model is a two-stage implied volatility function model that fits the time-invariant and time-variant parts of the implied volatility function separately. The time-variant part is fitted with an asymmetric GARCH(1,1) process to describe the time-series behaviour of implied volatility. This process estimates separately the effects of positive and negative returns of the underlying asset on the at-the-money implied volatility and, through the at-the-money implied volatility process, incorporates into the implied volatility function both the information contained in historical at-the-money implied volatilities and the correlation between the underlying asset's returns and its volatility, so that the function reflects the most recent market information and gains explanatory and predictive power. Following Pena et al. (1999), the time-invariant part takes an asymmetric quadratic functional form to fit the empirically observed volatility smile; it introduces relative moneyness as a variable to capture the interaction between moneyness, time to maturity, and at-the-money implied volatility, and uses relative implied volatility as the dependent variable. Beyond containing more information and flexibility than models proposed in the earlier literature, the model reproduces phenomena commonly found empirically: the implied volatility function shifts with the level of volatility, the curvature of the dependence of implied volatility on moneyness increases as options near maturity, the implied volatility function is an asymmetric curve, and volatility is highly correlated with the asset price.

The explanatory and predictive power of the model is tested for statistical and economic significance using statistical measures and the profitability of trading strategies, with earlier implied volatility function models from the literature summarized for comparison. The empirical study uses 5-minute transaction prices of Taiwan stock index options. For the in-sample period January 1, 2003 to June 30, 2006, with explanatory power and AIC as the criteria for in-sample fit, the proposed model has the highest adjusted R-squared and the lowest AIC. For the out-of-sample period July 1, 2006 to December 31, 2006, statistical errors and a delta-hedged trading strategy are used to test whether the forecasting ability is statistically and economically significant. The model performs very well in forecasting next-period implied volatilities and next-period option prices: its forecast errors are roughly one fifth of those of other implied volatility function models, and the frequency of correctly predicted directions of change exceeds that of incorrect predictions and is above 50%. Such good performance may be due to the ability of the GARCH model to simultaneously capture the correlation of volatility with spot returns and the path dependence in volatility. For economic significance, delta-hedged portfolios are examined: without transaction costs the proposed model generates significant profits, showing that once the effect of underlying price movements on the option is removed, the accuracy of the volatility forecasts can indeed be captured by the delta portfolio; once transaction costs are taken into account, no model earns excess returns. Finally, the conclusions are unchanged when non-synchronous trading is accounted for and when 30-minute and 60-minute data frequencies and different portfolio trading strategies are considered.
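The asymmetric GARCH(1,1) component described in the abstract can be sketched as a GJR-style recursion, in which a negative shock raises next-period variance more than a positive shock of the same size. Parameter values and data below are illustrative assumptions, not estimates from the thesis:

```python
import numpy as np

def gjr_garch_variance(returns, omega=1e-6, alpha=0.05, gamma=0.10, beta=0.85):
    # conditional variance recursion; gamma adds extra weight to
    # squared returns that follow a negative shock (the asymmetry)
    h = np.empty(len(returns))
    h[0] = np.var(returns)           # initialize at the sample variance
    for t in range(1, len(returns)):
        neg = 1.0 if returns[t - 1] < 0 else 0.0
        h[t] = omega + (alpha + gamma * neg) * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, size=500)  # synthetic return series
h = gjr_garch_variance(r)
```

In the thesis's setting the same idea is applied to the at-the-money implied volatility process rather than raw returns, so that the fitted variance path carries the correlation between returns and volatility into the implied volatility function.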
