31 |
Etiska och traditionella fonders avkastning : En jämförande studie mellan etiska och traditionella fonder. Choudhury, Jenny, Pektas, Mete. January 2012 (has links)
Purpose: The purpose of the thesis is to investigate whether the returns of ethical equity funds and traditional equity funds are equivalent. The thesis further aims to clarify how ethical funds are defined from a theoretical perspective, based on current research. Method: The study is quantitative in nature and was carried out using fund data collected from each major bank and from Morningstar. The quantitative material consists of the funds' annual returns. The study period runs from December 2008 to December 2012. Theory: Beta, the Sharpe ratio, and Modern Portfolio Theory. Conclusion: The study finds no major differences in returns between the ethical and the traditional funds. The traditional fund group had marginally better returns.
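The abstract evaluates the funds with Beta and the Sharpe ratio. As a rough illustration only (the annual return figures and the risk-free rate below are invented, not taken from the study), these two measures are typically computed from return series like this:

```python
import numpy as np

def sharpe_ratio(returns, risk_free_rate):
    """Sharpe ratio: mean excess return divided by the volatility of excess returns."""
    excess = np.asarray(returns) - risk_free_rate
    return excess.mean() / excess.std(ddof=1)

def beta(fund_returns, market_returns):
    """CAPM beta: covariance of fund and market returns over the market variance."""
    fund = np.asarray(fund_returns)
    market = np.asarray(market_returns)
    return np.cov(fund, market, ddof=1)[0, 1] / market.var(ddof=1)

# Hypothetical annual returns for one fund and a market index, 2009-2012.
ethical_fund = np.array([0.21, 0.08, -0.12, 0.10])
market_index = np.array([0.25, 0.10, -0.14, 0.12])
print(sharpe_ratio(ethical_fund, risk_free_rate=0.02))
print(beta(ethical_fund, market_index))
```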
|
32 |
Estimation de l’écart type du délai de bout-en-bout par méthodes passives / Passive measurement in Software Defined Networks. Nguyen, Huu-Nghi. 09 March 2017 (has links)
Depuis l'avènement du réseau Internet, le volume de données échangées sur les réseaux a crû de manière exponentielle. Le matériel présent sur les réseaux est devenu très hétérogène, dû entre autres à la multiplication des "middleboxes" (parefeux, routeurs NAT, serveurs VPN, proxy, etc.). Les algorithmes exécutés sur les équipements réseaux (routage, “spanning tree”, etc.) sont souvent complexes, parfois fermés et propriétaires et les interfaces de supervision peuvent être très différentes d'un constructeur/équipement à un autre. Ces différents facteurs rendent la compréhension et le fonctionnement du réseau complexe. Cela a motivé la définition d'un nouveau paradigme réseaux afin de simplifier la conception et la gestion des réseaux : le SDN (“Software-defined Networking”). Il introduit la notion de contrôleur, qui est un équipement qui a pour rôle de contrôler les équipements du plan de données. Le concept SDN sépare donc le plan de données chargés de l'acheminement des paquets, qui est opéré par des équipements nommés virtual switches dans la terminologie SDN, et le plan contrôle, en charge de toutes les décisions, et qui est donc effectué par le contrôleur SDN. Pour permettre au contrôleur de prendre ses décisions, il doit disposer d'une vue globale du réseau. En plus de la topologie et de la capacité des liens, des critères de performances comme le délai, le taux de pertes, la bande passante disponible, peuvent être pris en compte. Cette connaissance peut permettre par exemple un routage multi-classes, ou/et garantir des niveaux de qualité de service. Les contributions de cette thèse portent sur la proposition d'algorithmes permettant à une entité centralisée, et en particulier à un contrôleur dans un cadre SDN, d'obtenir des estimations fiables du délai de bout-en-bout pour les flux traversant le réseau. Les méthodes proposées sont passives, c'est-à-dire qu'elles ne génèrent aucun trafic supplémentaire. Nous nous intéressons tout particulièrement à la moyenne et l'écart type du délai. Il apparaît que le premier moment peut être obtenu assez facilement. Au contraire, la corrélation qui apparaît dans les temps d'attentes des noeuds du réseau rend l'estimation de l'écart type beaucoup plus complexe. Nous montrons que les méthodes développées sont capables de capturer les corrélations des délais dans les différents noeuds et d'offrir des estimations précises de l'écart type. Ces résultats sont validés par simulations où nous considérons un large éventail de scénarios permettant de valider nos algorithmes dans différents contextes d'utilisation / Since the early beginning of Internet, the amount of data exchanged over the networks has exponentially grown. The devices deployed on the networks are very heterogeneous, because of the growing presence of middleboxes (e.g., firewalls, NAT routers, VPN servers, proxy). The algorithms run on the networking devices (e.g., routing, spanning tree) are often complex, closed, and proprietary while the interfaces to access these devices typically vary from one manufacturer to the other. All these factors tend to hinder the understanding and the management of networks. Therefore a new paradigm has been introduced to ease the design and the management of networks, namely, the SDN (Software-defined Networking). In particular, SDN defines a new entity, the controller that is in charge of controlling the devices belonging to the data plane. 
Thus, in an SDN network, the data plane, which is handled by networking devices called virtual switches, and the control plane, which takes the decisions and is executed by the controller, are separated. In order to let the controller take its decisions, it must have a global view of the network. This includes the topology of the network and its link capacities, along with other possible performance metrics such as delays, loss rates, and available bandwidths. This knowledge can enable multi-class routing, or help guarantee levels of Quality of Service. The contributions of this thesis are new algorithms that allow a centralized entity, such as the controller in an SDN network, to accurately estimate the end-to-end delay for a given flow in its network. The proposed methods are passive in the sense that they do not require any additional traffic to be generated. More precisely, we study the expectation and the standard deviation of the delay. We show how the first moment can be computed easily. On the other hand, estimating the standard deviation is much more complex because of the correlations existing between the different waiting times. We show that the proposed methods are able to capture these correlations between delays and thus provide accurate estimations of the standard deviation of the end-to-end delay. Simulations that cover a large range of possible scenarios validate these results.
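The abstract notes that the mean end-to-end delay is easy to obtain, while the correlations between per-hop waiting times make the standard deviation much harder. A minimal numerical sketch of that difficulty (the hop delays and the shared congestion factor below are invented, and this is not the estimator developed in the thesis): the variance of the end-to-end delay is the sum of the per-hop variances plus twice the pairwise covariances, so ignoring the covariance terms underestimates the standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-hop delays for 3 hops; a shared congestion factor makes them correlated.
n_packets = 100_000
congestion = rng.exponential(1.0, n_packets)
hop_delays = np.column_stack([
    rng.exponential(1.0, n_packets) + 0.5 * congestion,  # hop 1
    rng.exponential(2.0, n_packets) + 0.5 * congestion,  # hop 2
    rng.exponential(1.5, n_packets) + 0.5 * congestion,  # hop 3
])
end_to_end = hop_delays.sum(axis=1)

# Summing per-hop variances (as if hops were independent) underestimates the true variance;
# adding the pairwise covariances recovers it.
var_independent = hop_delays.var(axis=0, ddof=1).sum()
cov = np.cov(hop_delays, rowvar=False, ddof=1)
var_with_cov = cov.sum()  # sum of all entries = variances + 2 * covariances
print(np.sqrt(var_independent), np.sqrt(var_with_cov), end_to_end.std(ddof=1))
```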
|
33 |
Análise da volatilidade de séries financeiras segundo a modelagem da família GARCH. Macêdo, Guilherme Ribeiro de. January 2009 (has links)
O conhecimento do risco de ativos financeiros é de fundamental importância para gestão ativa de carteiras, determinação de preços de opções e análise de sensibilidade de retornos. O risco é medido através da variância estatística e há na literatura diversos modelos econométricos que servem a esta finalidade. Esta pesquisa contempla o estudo de modelos determinísticos de volatilidade, mais especificamente os modelos GARCH simétricos e assimétricos. O período de análise foi dividido em dois: de janeiro de 2000 à fevereiro de 2008 e à outubro de 2008. Tal procedimento foi adotado procurando identificar a influência da crise econômica originada nos EUA nos modelos de volatilidade. O setor escolhido para o estudo foi o mercado de petróleo e foram escolhidas as nove maiores empresas do setor de acordo com a capacidade produtiva e reservas de petróleo. Além destas, foram modeladas também as commodities negociadas na Bolsa de Valores de Nova York: o barril de petróleo do tipo Brent e WTI. A escolha deste setor deve-se a sua grande importância econômica e estratégica para todas as nações. Os resultados encontrados mostraram que não houve um padrão de modelo de volatilidade para todos os ativos estudados e para a grande maioria dos ativos, há presença de assimetria nos retornos, sendo o modelo GJR (1,1) o que mais prevaleceu, segundo a modelagem pelo método da máxima verossimilhança. Houve aderência, em 81% dos casos, dos ativos a um determinado modelo de volatilidade, alterando apenas, como eram esperados, os coeficientes de reatividade e persistência. Com relação a estes, percebe-se que a crise aumentou os coeficientes de reatividade para alguns ativos. Ao se compararem as volatilidades estimadas de curto prazo, percebe-se que o agravamento da crise introduziu uma elevação média de 265,4% em relação ao período anterior, indicando um aumento substancial de risco. Para a volatilidade de longo prazo, o aumento médio foi de 7,9%, sugerindo que os choques reativos introduzidos com a crise, tendem a ser dissipados ao longo do tempo. / Knowledge of the risk of financial assets is of fundamental importance for active portfolio management, option pricing, and sensitivity analysis of returns. Risk is measured through the statistical variance, and the literature offers several econometric models for this purpose. This research studies deterministic volatility models, more specifically symmetric and asymmetric GARCH models. The period of analysis was divided in two: from January 2000 to February 2008, and to October 2008. This procedure was adopted in order to identify the influence of the economic crisis originated in the USA on the volatility models. The sector chosen for the study was the oil market, and the nine largest companies of the sector were selected according to productive capacity and oil reserves. In addition, the commodities traded on the New York Stock Exchange were also modeled: the Brent and WTI oil barrels. This sector was chosen because of its great economic and strategic importance for all nations. The results showed that there was no single volatility model for all the assets studied and that, for the large majority of the assets, asymmetry is present in the returns, the GJR(1,1) model being the one that prevailed most often, according to maximum likelihood estimation.
In 81% of the cases the assets adhered to a given volatility model, changing only, as expected, the reactivity and persistence coefficients. Regarding these, the crisis increased the reactivity coefficients for some assets. Comparing the estimated short-term volatilities, the worsening of the crisis introduced an average increase of 265.4% relative to the previous period, indicating a substantial increase in risk. For the long-term volatility, the average increase was 7.9%, suggesting that the reactive shocks introduced by the crisis tend to dissipate over time.
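The abstract reports that the GJR(1,1) specification prevailed under maximum likelihood estimation. A minimal sketch of the GJR-GARCH(1,1) conditional variance recursion and its Gaussian negative log-likelihood follows; the return series and parameter values are purely illustrative and not taken from the thesis:

```python
import numpy as np

def gjr_garch_variance(returns, omega, alpha, gamma, beta):
    """Conditional variance recursion of GJR-GARCH(1,1):
    sigma2_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * sigma2_{t-1}
    """
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()  # a common initialization choice
    for t in range(1, len(r)):
        leverage = gamma if r[t - 1] < 0 else 0.0
        sigma2[t] = omega + (alpha + leverage) * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

def gaussian_neg_loglik(returns, sigma2):
    """Negative log-likelihood assuming conditionally normal returns."""
    r = np.asarray(returns, dtype=float)
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)

# Illustrative daily returns and parameter values only.
rng = np.random.default_rng(1)
r = 0.01 * rng.standard_normal(1000)
s2 = gjr_garch_variance(r, omega=1e-6, alpha=0.05, gamma=0.08, beta=0.90)
print(gaussian_neg_loglik(r, s2))
```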
|
35 |
Predicting Glass Sponge (Porifera, Hexactinellida) Distributions in the North Pacific Ocean and Spatially Quantifying Model Uncertainty. Davidson, Fiona. 07 January 2020 (has links)
Predictions of species’ ranges from distribution modeling are often used to inform marine management and conservation efforts, but few studies justify the model selected or quantify the uncertainty of the model predictions in a spatial manner. This thesis employs a multi-model, multi-area SDM analysis to develop higher certainty in the predictions where similarities exist across models and areas. Partial dependence plots and variable importance rankings were shown to be useful in producing further certainty in the results. The modeling indicated that glass sponges (Hexactinellida) are most likely to exist within the North Pacific Ocean where alkalinity is greater than 2.2 μmol l⁻¹ and dissolved oxygen is lower than 2 ml l⁻¹. Silicate was also found to be an important environmental predictor. All areas, except Hecate Strait, indicated that high glass sponge probability of presence coincided with silicate values of 150 μmol l⁻¹ and over, although lower values in Hecate Strait confirmed that sponges can exist in areas with silicate values as low as 40 μmol l⁻¹. Three methods of showing the spatial uncertainty of model predictions were presented: the standard error (SE) of a binomial GLM, the standard deviation of predictions made from 200 bootstrapped GLM models, and the standard deviation of eight commonly used SDM algorithms. Certain areas with few input data points or extreme ranges of predictor variables were highlighted by these methods as having high uncertainty. Such areas should be treated cautiously regardless of the overall accuracy of the model as indicated by accuracy metrics (AUC, TSS), and they could be targeted for future data collection. The uncertainty metrics produced by the multi-model SE differed from those of the GLM SE and the bootstrapped GLM. The uncertainty was lowest where the models predicted a low probability of presence and highest where they predicted a high probability of presence and where these predictions differed slightly, indicating high confidence in the areas where the models predicted the sponges would not exist.
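One of the uncertainty measures described above is the standard deviation of predictions from 200 bootstrapped binomial GLMs. A minimal sketch of that idea, assuming invented presence/absence data and two made-up environmental predictors (this is not the thesis's dataset or model set):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

def with_intercept(m):
    """Prepend a column of ones (explicit intercept) to a design matrix."""
    return np.column_stack([np.ones(len(m)), m])

# Invented presence/absence data with two hypothetical predictors (silicate, dissolved oxygen).
n = 500
X = np.column_stack([rng.uniform(20, 200, n), rng.uniform(0.5, 6.0, n)])
logit = -3.0 + 0.03 * X[:, 0] - 0.8 * X[:, 1]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# A small prediction "map": silicate varies, oxygen held fixed.
grid = np.column_stack([np.linspace(20, 200, 50), np.full(50, 2.0)])

# Refit the binomial GLM on 200 bootstrap resamples and keep each model's predictions.
n_boot = 200
preds = np.empty((n_boot, len(grid)))
for b in range(n_boot):
    idx = rng.integers(0, n, n)  # resample rows with replacement
    fit = sm.GLM(y[idx], with_intercept(X[idx]),
                 family=sm.families.Binomial()).fit()
    preds[b] = fit.predict(with_intercept(grid))

uncertainty = preds.std(axis=0)  # per-cell standard deviation of the bootstrapped predictions
print(uncertainty.min(), uncertainty.max())
```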
|
36 |
Analysis of Monthly Suspended Sediment Load in Rivers and Streams Using Linear Regression and Similar Precipitation Data. Echiejile, Faith. 18 August 2021 (has links)
No description available.
|
37 |
Diverzifikace portfolia prostřednictvím investic do burzovních indexů / Portfolio Diversification through Investment in Stock Indices. Křižka, Adam. January 2020 (has links)
The diploma thesis focuses on proposing suitable stock exchange indices for portfolio diversification. The essence and functioning of financial markets and investment funds are presented. Stock exchange indices are analyzed according to suitable indicators and compared with the market. Suitable indices are verified by means of correlation analysis and subsequently recommended for diversifying the portfolios of investment funds managed through the investment company.
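The abstract mentions verifying candidate indices by correlation analysis. A minimal sketch, assuming invented monthly returns for an existing portfolio and three hypothetical indices; low or negative correlation is read as the signal of a useful diversifier:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical monthly returns: an existing fund portfolio and three candidate indices.
months = 60
portfolio = rng.normal(0.006, 0.04, months)
candidates = {
    "index_A": 0.8 * portfolio + rng.normal(0, 0.02, months),   # moves with the portfolio
    "index_B": rng.normal(0.004, 0.05, months),                 # largely unrelated
    "index_C": -0.3 * portfolio + rng.normal(0, 0.03, months),  # tends to move against it
}

# Candidates with low (or negative) correlation to the portfolio diversify it best.
for name, returns in candidates.items():
    corr = np.corrcoef(portfolio, returns)[0, 1]
    print(f"{name}: correlation with portfolio = {corr:+.2f}")
```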
|
38 |
Evaluation regarding the US fund market : A comparison between different US fund risk classes and their performance. Sjöstrand, Victor, Svensson Kanstedt, Albert. January 2021 (has links)
The intent of this thesis is to investigate how the performance of US equity funds differs with their standard deviation. To accomplish this, we collected daily data for 99 US equity funds for the period 2011-2020 and divided the funds into three risk classification groups based on their standard deviation for the year 2011. The collected data were used to perform a CAPM regression and to calculate returns on a three-, five-, and ten-year basis. The regression results and the returns were then presented as average values for the different risk classification groups. We then compared the average outcomes of the three risk classifications with each other and with the S&P 500 index. Our results showed that the S&P 500 outperformed the average returns of the three risk classification groups for every time period. We also noticed that the difference between the average returns and the index grew over time. We did not find any large differences in performance between our risk classifications. Our regression analysis resulted in many negative alpha values, indicating that the S&P 500, as many previous studies claim, outperforms actively managed mutual funds. The conclusion is therefore that we could not show any evidence of a major difference in performance between our risk groups, and also that it is difficult for fund managers to outperform the index.
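The abstract describes a CAPM regression whose alpha values were mostly negative. A minimal sketch of a CAPM regression for a single fund against an index; the return series and the flat daily risk-free rate are invented, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily returns for one fund and the index, plus a flat daily risk-free rate.
days = 2500
market = rng.normal(0.0004, 0.011, days)
fund = 0.0001 + 0.9 * market + rng.normal(0, 0.004, days)
risk_free = 0.00005

# CAPM regression: (R_fund - R_f) = alpha + beta * (R_market - R_f) + error
excess_fund = fund - risk_free
excess_market = market - risk_free
beta, alpha = np.polyfit(excess_market, excess_fund, 1)  # slope first, then intercept
print(f"alpha per day: {alpha:.6f}, beta: {beta:.3f}")
```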
|
39 |
Sound Absorption and Sound Power Measurements in Reverberation Chambers Using Energy Density Methods. Nutter, David B. 28 August 2006 (PDF)
Measurements in a reverberation chamber use spatially averaged squared pressure to calculate sound absorption, sound power, and other acoustic quantities. While a reverberation chamber provides an approximation of a diffuse sound field, variations in the measurements introduce uncertainty in the results. Room qualification procedures require a sufficient number of source-receiver locations to obtain suitable measurements. The total acoustic energy density provides greater spatial uniformity than squared pressure, so fewer source-receiver positions are required to produce similar or better accuracy in measurement results. This paper explores the possibility of using energy density in place of squared pressure, using methods outlined in current ISO standards, by describing several experimental and analytical results.
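The abstract contrasts spatially averaged squared pressure with total acoustic energy density. A minimal sketch of how total energy density is commonly computed from pressure and particle velocity samples, w = p²/(2ρ₀c²) + ρ₀|u|²/2; the sample values are illustrative and not from the paper's measurements:

```python
import numpy as np

RHO0 = 1.21   # air density, kg/m^3 (approximate)
C = 343.0     # speed of sound in air, m/s (approximate)

def total_energy_density(pressure, velocity):
    """Total acoustic energy density: potential term p^2 / (2 rho c^2)
    plus kinetic term rho |u|^2 / 2, averaged over the samples."""
    p = np.asarray(pressure)
    u = np.asarray(velocity)  # shape (n_samples, 3): particle velocity components
    potential = p ** 2 / (2 * RHO0 * C ** 2)
    kinetic = 0.5 * RHO0 * np.sum(u ** 2, axis=1)
    return np.mean(potential + kinetic)

# Illustrative samples only (one measurement position).
rng = np.random.default_rng(5)
p = rng.normal(0.0, 0.5, 4800)                     # Pa
u = rng.normal(0.0, 0.5 / (RHO0 * C), (4800, 3))   # m/s
print(total_energy_density(p, u))
```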
|
40 |
混合線性模式的估計 (Estimation of Mixed Linear Models). 于國欽 (YU, GUO-GIN). Unknown Date (has links)
Generally speaking, the linear models we encounter take the form

Y = Xβ + ε,

where Y is an N × 1 vector whose elements are the observed values of the dependent variable, X is an N × P matrix whose elements are known constants, β is a P × 1 vector whose elements are the population parameters, and ε is an N × 1 vector whose elements are the random errors. A geometric argument yields the estimator of the parameters,

β̂ = (X′X)⁻X′Y,

and in the same way we can obtain the standard errors associated with β, construct confidence intervals for the βi's, and carry out the various related hypothesis tests. Suppose instead that the model is changed to

Y = Xα + ZU, (ii)

where Y is an N × 1 vector whose elements are the observed values of the dependent variable, X is an N × P matrix of known constants, Z is an N × γ matrix of known constants, α is a P × 1 vector of unknown parameters (the fixed effects), and U is a γ × 1 vector that contains both random effects and random errors. Because the random vector U in (ii) includes both random effects and random errors, if these two parts are decomposed, model (ii) can be rewritten accordingly (the rewritten form is omitted in the original). This thesis discusses how known principles can be applied to estimate β and U, and in particular how the principle of limits can be used to estimate the vectors β and U.
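The abstract gives the estimator β̂ = (X′X)⁻X′Y and the mixed model Y = Xα + ZU. A minimal numerical sketch of the fixed-effects estimator using a generalized inverse (here the Moore-Penrose pseudoinverse) on invented data; this is not the estimation procedure developed in the thesis:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical design: N = 100 observations, P = 3 fixed-effect parameters.
N, P = 100, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, P - 1))])
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.3, size=N)

# beta_hat = (X'X)^- X'y, with the pseudoinverse playing the role of the generalized inverse.
beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y
print(beta_hat)

# For the mixed model Y = X alpha + Z U, the random part enters through Z;
# a one-way grouping structure like this is a common example of such a Z.
groups = rng.integers(0, 5, N)
Z = np.eye(5)[groups]          # N x 5 indicator matrix for 5 hypothetical groups
print(Z.sum(axis=0))           # group sizes
```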
|