291 |
A Psychometric Study of Principal Component Analysis of Categorical Data
Murakami, Takashi 03 1900 (has links)
Grant-in-Aid for Scientific Research; Research Category: Grant-in-Aid for Scientific Research (C)(2); Project Number: 09610114; Principal Investigator: Takashi Murakami; Project Period: FY1997-1998
|
292 |
Essays on corporate risk, U.S. business cycles, international spillovers of stock returns, and dual listing
Ivaschenko, Iryna January 2003 (has links)
This thesis consists of four self-contained essays on various topics in finance. The first essay, The Information Content of The Systematic Risk Structure of Corporate Yields for Future Real Activity: An Exploratory Empirical Investigation, constructs a proxy for the systematic component of the risk structure of corporate yields (or systematic risk structure), and tests how well it predicts real economic activity in the United States. It finds that the systematic risk structure predicts the growth rate of industrial production 3 to 18 months into the future even when other leading indicators are controlled for, outperforming other models. A regime-switching estimation also shows that the systematic risk structure is very successful in identifying and capturing different growth regimes of industrial production. The second essay, How Much Leverage is Too Much, or Does Corporate Risk Determine the Severity of a Recession?, investigates whether financial conditions of the U.S. corporate sector can explain the probability and severity of recessions. It proposes a measure of corporate vulnerability, the Corporate Vulnerability Index (CVI), constructed as the default probability for the entire corporate sector. It finds that the CVI is a significant predictor of the probability of a recession 4 to 6 quarters ahead, even when controlling for other leading indicators, and that an increase in the CVI is also associated with a rise in the probability of a more severe and lengthy recession 3 to 6 quarters ahead. The third essay, Asian Flu or Wall Street Virus? Tech and Non-Tech Spillovers in the United States and Asia (with Jorge A. Chan-Lau), using TGARCH models, finds that U.S. stock markets have been the major source of price and volatility spillovers to stock markets in the Asia-Pacific region during three different periods: the pre-LTCM crisis period, the “tech bubble” period, and the “stock market correction” period. 
Hong Kong SAR, Japan, and Singapore were sources of spillovers within the region and affected the United States during the latter period. There is also evidence of structural breaks in the stock price and volatility dynamics induced during the “tech bubble” period. The fourth essay, Coping with Financial Spillovers from the United States: The Effect of U.S. Corporate Scandals on Canadian Stock Prices, investigates the effect of U.S. corporate scandals on stock prices of Canadian firms interlisted in the United States. It finds that firms interlisted during the pre-Enron period enjoyed increases in post-listing equilibrium prices, while firms interlisted during the post-Enron period experienced declines in post-listing equilibrium prices, relative to a model-based benchmark. Analyzing the entire universe of Canadian firms, it finds that interlisted firms, regardless of their listing time, were perceived as increasingly risky by Canadian investors after Enron’s bankruptcy. / Diss. Stockholm : Handelshögskolan, 2003
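The TGARCH volatility dynamics underlying the third essay's spillover analysis can be sketched minimally as follows. This is an illustrative threshold-GARCH(1,1) filter with assumed parameter values, not the essay's estimated model:

```python
import numpy as np

def tgarch_variance(returns, omega, alpha, gamma, beta):
    """Filter conditional variances with a threshold-GARCH(1,1) recursion.

    h_t = omega + alpha*e_{t-1}^2 + gamma*e_{t-1}^2*1[e_{t-1}<0] + beta*h_{t-1},
    so a negative shock raises next-period variance by an extra gamma term
    (the asymmetry, or leverage effect, that TGARCH adds to plain GARCH).
    """
    e = np.asarray(returns, dtype=float)
    h = np.empty_like(e)
    h[0] = e.var()  # common initialization: unconditional sample variance
    for t in range(1, len(e)):
        leverage = gamma * e[t - 1] ** 2 * (e[t - 1] < 0.0)
        h[t] = omega + alpha * e[t - 1] ** 2 + leverage + beta * h[t - 1]
    return h

rng = np.random.default_rng(0)
r = rng.standard_normal(500) * 0.01      # placeholder daily returns
h = tgarch_variance(r, omega=1e-6, alpha=0.05, gamma=0.10, beta=0.85)
```

Spillover studies then let the variance (and mean) equation of one market depend on lagged shocks from another; the single-market recursion above is the building block.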
|
293 |
[en] A COMPUTATIONAL APPROACH TO THE STRUCTURE AND DYNAMICS OF HUMAN SERUM ALBUMIN: EFFECTS OF THE HEME / [pt] UMA ABORDAGEM COMPUTACIONAL DA ESTRUTURA E DINÂMICA DA ALBUMINA SÉRICA HUMANA: EFEITOS DO HEME
TEOBALDO RICARDO CUYA GUIZADO 18 July 2018 (has links)
[pt] As doenças transmitidas pelo sangue, assim como a necessidade de bancos de sangue para um pronto auxílio em casos de acidentes, têm estimulado esforços para desenvolver substitutos do sangue. A albumina sérica humana (HSA, do inglês Human Serum Albumin) é a proteína mais abundante no plasma sanguíneo. A molécula heme é a transportadora de oxigênio no sangue. Portanto, um estudo detalhado da interação HSA/heme seria útil em pesquisas que visam tornar o complexo HSA-heme um substituto do sangue. Nesta tese, foram usadas técnicas de dinâmica molecular e ferramentas estatísticas para estudar o sistema HSA-heme em solvente explícito. Tanto o ligante quanto a proteína foram também estudados separadamente em meio aquoso. Dentre outros resultados, nosso estudo revelou a organização da água circundante, os efeitos da ligação do heme na HSA, os mecanismos moleculares da ligação do heme, os movimentos coletivos da proteína livre e ligada, assim como os aminoácidos que atuam como dobradiças moleculares na mudança conformacional que sofre a proteína ao ligar o heme. / [en] Diseases transmitted through the blood, as well as the need for blood banks to help in case of accidents, have stimulated efforts to develop blood substitutes. Human serum albumin (HSA) is the most abundant protein in blood plasma. The heme molecule is the carrier of oxygen in the blood. Therefore, a detailed study of the HSA/heme interaction could give useful insights to research aimed at converting the HSA-heme complex into a blood substitute. In this thesis, molecular dynamics techniques and statistical tools were applied to study the HSA-heme system in explicit solvent. Both the ligand and the protein were also studied separately in aqueous medium. Among other results, our study reveals the organization of the surrounding water, the effects of the heme upon its binding to HSA, the molecular mechanisms of heme binding, the collective motions of the protein with and without the heme, as well as the amino acids that act as molecular hinges in the conformational change between the free and bound forms of the protein.
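The collective motions mentioned in this abstract are commonly extracted by principal component analysis of the trajectory covariance ("essential dynamics"). The sketch below runs that analysis on a synthetic trajectory with one planted collective mode; it is a minimal illustration of the technique, not the thesis's actual MD pipeline:

```python
import numpy as np

# Hypothetical "trajectory": n_frames snapshots of n_atoms 3-D coordinates,
# flattened to 3*n_atoms columns. Real studies read frames from an MD engine;
# here we synthesize one dominant collective motion plus thermal-like noise.
rng = np.random.default_rng(1)
n_frames, n_atoms = 200, 30
mode = rng.standard_normal(3 * n_atoms)              # one collective direction
amplitude = np.sin(np.linspace(0, 6 * np.pi, n_frames))
X = np.outer(amplitude, mode) + 0.05 * rng.standard_normal((n_frames, 3 * n_atoms))

# Essential dynamics: diagonalize the covariance of the centered coordinates;
# the leading eigenvectors are the dominant collective motions.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n_frames - 1)
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # descending order
explained = eigvals / eigvals.sum()                  # variance share per mode
```

In a real HSA-heme analysis the same eigendecomposition would be applied to the aligned Cα coordinates of the free and bound trajectories, and the leading modes compared.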
|
294 |
Inference for stationary functional time series: dimension reduction and regression
Kidzinski, Lukasz 24 October 2014 (has links)
Continuous progress in data storage and collection techniques makes it possible to observe and record processes almost continuously. Examples include climate data, financial transaction values, pollution level patterns, etc. Analyzing such processes requires appropriate statistical tools; a well-known technique is functional data analysis (FDA).

The main objective of this doctoral project is to analyze temporal dependence in FDA. Such dependence arises, for example, when the data come from a continuous-time process that has been cut into segments, such as days. We are then in the setting of functional time series.

The first part of the thesis concerns functional linear regression, an extension of multivariate regression. We derive a data-driven method for choosing the dimension of the estimator. In contrast to existing results, this method requires no unverifiable assumptions.

In the second part, we analyze dynamic functional linear models (DFLM), extending the well-established linear models to a framework of temporal dependence. We obtain estimators and statistical tests by methods of harmonic analysis, drawing on ideas of Brillinger, who studied these models in a vector-space context. / Doctorat en Sciences / info:eu-repo/semantics/nonPublished
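A minimal sketch of functional linear regression via principal-component truncation, in the spirit of the first part of the thesis. The 90%-variance rule used here to choose the dimension is a simple stand-in, not the data-driven criterion the thesis develops, and the curves are simulated:

```python
import numpy as np

# Scalar-on-function regression: responses y on curves observed on a grid.
# Curves are expanded in their principal-component basis and the regression
# is run on the leading scores.
rng = np.random.default_rng(2)
n, grid = 300, np.linspace(0, 1, 50)
curves = (rng.standard_normal((n, 1)) * np.sin(2 * np.pi * grid)
          + rng.standard_normal((n, 1)) * np.cos(2 * np.pi * grid)
          + 0.1 * rng.standard_normal((n, grid.size)))
beta_fun = np.sin(2 * np.pi * grid)                  # true functional slope
y = curves @ beta_fun / grid.size + 0.05 * rng.standard_normal(n)

Xc = curves - curves.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
var_share = np.cumsum(s ** 2) / np.sum(s ** 2)
d = int(np.searchsorted(var_share, 0.90)) + 1        # chosen truncation dimension
scores = Xc @ Vt[:d].T                               # leading FPCA scores
coef, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
y_hat = scores @ coef + y.mean()
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

The choice of `d` is exactly the tuning problem the thesis addresses: too small a basis biases the estimator, too large a basis inflates its variance.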
|
295 |
Determinants of financial market development : the role of institutions
Madheu, Violet 10 1900 (has links)
This study aims to determine the main drivers of financial market development, with a specific interest in the relationship between the stock and bank credit markets, as proxies of financial market development, and the role of institutional quality, in ten African countries for the period 2009 to 2017. A number of econometric techniques were applied in the study, including the Generalized Method of Moments (GMM) for dynamic panel data, the autoregressive distributed lag (ARDL) bound testing approach to cointegration, the vector error correction model (VECM), and Granger causality tests. We further developed a composite index for both financial market development and institutional quality using Principal Components Analysis (PCA). The results demonstrate that institutional quality, as well as infrastructure development, economic growth, and inflation, are the main determinants of financial market development in our sample of ten African countries. Findings from the ARDL bound testing approach confirm the existence of a long-run association between institutional quality and financial market development. Although financial market development has no effect on economic growth, institutional quality was found to have a positive and highly significant effect on economic growth. Furthermore, employing the Granger causality test, we found uni-directional Granger causality between financial market development and institutional quality, implying that financial market development is a significant causal factor for institutional quality. In consideration of these findings, policy formulation by governments should be designed towards enhancing financial and institutional quality development; this can be achieved by effective enforcement of the law to encourage compliance, while simultaneously eliminating corruption and other institutional hindrances to development / Lolu cwaningo luhlose ukuveza izinhlaka ezingabaphembeleli abasemqoka
ekuthuthukisweni kwezimakethe zezimali, kugxilwe kakhulu kubudlelwano obuphakathi
kwesitoko kanye nezimakethe zamabhangi ahlinzekana ngezikweletu, njengabancedisi
abathuthukisa izimakethe zezimali, kanye nendima emayelana nezinga leziko, emazweni
ase-Afrika ayishumi esikhathini esiphakathi kuka 2009 ukufikela ku 2017. Inani lezindlela
zokulinganisa izinga lomnotho ezinjenge-General Methods of Moments (GMM) model
yedatha yephaneli eguquguqukayo, i-autoregressive distribution lag (ARDL) bound
testing approach to cointegration, i-vector error correction model (VECM), Kanye negranger causality tests zisetshenzisiwe kucwaningo. Siqhubekele phambili nokwakha
inkomba ehlangene yazo zombili izinhlaka; ukuthuthukiswa kwezimakethe zezimali
Kanye nezinga leziko ngokusebenzisa uhlelo lwe-Principal Components Analysis (PCA).
Imiphumela ikhombisile ukuthi izinga leziko, Kanye nokuthuthukiswa kwengqalasizinda,
ukuhluma komnotho, Kanye nezinga lamandla email yizinkomba ezisemqoka
zokuthuthukiswa kwezimakethe zezimali kusampuli yethu elula yamazwe ase-Afrika
ayishumi. Ulwazi olutholakele ku-ARDL bound testing approach luqinisekisa ubukhona
kobudlelwano besikhathi eside obuphakathi kwezinga leziko kanye nokuthuthukiswa
kwezimakethe zezimali. Yize ukuthuthukiswa kwemakethe yezimali kungenawo
umthelela kwezokuhluma komnotho, izinga leziko lona liye latholakala ukuthi linomthelela
omuhle nosemqoka kakhulu ekukhuleni komnotho. Ngaphezu kwalokho, uma
sisebenzisa uhlelo lweGranger causality test, sifumene i-uni-directional granger causality
phakathi kwemakethe yezimali Kanye nezinga leziko, lokhu kuchaza ukuthi
ukuthuthukiswa kwezimakethe zezimali kuyimbangela esemqoka yezinga leziko. Uma
kubhekwa lolu lwazi olutholakele, imigomo eyakhwa uhulumeni kufanele yakhiwe
ngenhloso yokuqinisa ukuthuthukiswa kwezinga lezimali Kanye nezinga leziko, kanti
lokhu kungafinyelelwa ngokuqinisa kahle umthetho ukukhuthaza ukulandelwa
komthetho, kanti ngakolunye uhlangothi kuncishiswe izinga lenkohlakalo Kanye nezinye
izihibhe eziphazamiso ukuthuthukiswa kweziko. / Maikaelelo a thutopatlisiso ke go swetsa ka ditsamaisi tse dikgolo tsa tlhabololo ya mebaraka ya ditšhelete, ka kgatlhego e rileng mo kamanong magareng ga mebaraka ya setoko le ya sekoloto sa dibanka, jaaka kemedi ya tlhabololo ya mebaraka ya ditšhelete,
le seabe sa boleng jwa ditheo, mo dinageng di le lesome tsa Aforika mo pakeng ya 2009 go ya go 2017. Go dirisitswe dithekeniki di le mmalwa tsa ikonometiriki di tshwana le sekao sa General Methods of Moments (GMM) sa data ya phanele e anameng, molebo wa tekeletso e kopanyang ya autoregressive distribution lag (ARDL), sekao sa vector error correction (VECM) le diteko tsa sesusumetsi tsa Granger. Gape re tlhamile tshupane ya dikarolo ya tlhabololo ya mmaraka wa ditšhelete le boleng jwa ditheo re dirisa Tokololo ya Dikarolo tse Dikgolo (Principal Components Analysis (PCA)). Dipholo di bontsha gore boleng jwa ditheo, gammogo le tlhabololo ya mafaratlhatlha, kgolo ya ikonomi le infoleišene ke diswetsi tsa tlhabololo ya mebaraka ya ditšhelete mo sampoleng ya rona ya dinaga di le lesome tsa Aforika. Diphitlhelelo go tswa mo molebong wa teko e kopanyang ya ARDL di tlhomamisa go nna teng ga kamano ya paka e telele magareng ga boleng jwa ditheo le tlhabololo ya mebaraka ya ditšhelete. Le fa tlhabololo ya mebaraka ya ditšhelete e sa ame kgolo ya ikonomi ka gope, boleng jwa ditheo bo fitlhetswe bo na le ditlamorago tse di siameng e bile di le botlhokwa mo kgolong ya ikonomi. Mo godimo ga moo, ka go dirisa teko ya Granger ya sesusumetsi, re fitlhetse go
na le sesusumetsi sa ntlha e le nngwe sa Granger magareng ga lhabololo ya mebaraka ya ditšhelete le boleng jwa ditheo, mo go rayang gore tlhabololo ya mebaraka ya ditšhelete ke ntlha e e botlhokwa ya sesusumetsi sa boleng jwa ditheo. Fa go lebelelwa
diphitlhelelo tseno, go dirwa ga dipholisi ke dipuso go tshwanetse ga dirwa gore go tokafatse tlhabololo ya boleng jwa ditšhelete le ditheo, mme seno se ka fitlhelelwa ka tiragatso e e bokgoni ya molao go rotloetsa kobamelo mme go ntse go fedisiwa bobodu le dikgoreletsi tse dingwe tsa tlhabololo mo ditheong. / Business Management / M. Com. (Business Management (Finance))
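The PCA-based composite index described in the abstract can be sketched as follows. The indicator names and data are illustrative assumptions, not the study's actual series; the index is the first principal component of the standardized indicators:

```python
import numpy as np

# Composite institutional-quality index: standardize several indicators and
# weight them by the loadings of the first principal component.
rng = np.random.default_rng(3)
n = 90                                            # e.g. 10 countries x 9 years
quality = rng.standard_normal(n)                  # latent institutional quality
indicators = np.column_stack([
    quality + 0.3 * rng.standard_normal(n),       # e.g. rule of law (assumed)
    quality + 0.3 * rng.standard_normal(n),       # e.g. control of corruption (assumed)
    quality + 0.3 * rng.standard_normal(n),       # e.g. regulatory quality (assumed)
])
Z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
w = eigvecs[:, -1]                                # loadings of the first PC
index = Z @ w                                     # composite index per observation
corr = abs(np.corrcoef(index, quality)[0, 1])     # recovery of the latent factor
```

The same construction applied to financial-depth indicators would yield the study's financial market development index.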
|
296 |
3D CBCT analysis of the frontal sinus and its relationship to forensic identification
Krus, Bianaca S. January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The positive identification of human remains that are decomposed, burnt, or otherwise disfigured can prove especially challenging in forensic anthropology, resulting in the need for specialized methods of analysis. Due to the unique morphological characteristics of the frontal sinus, a positive identification can be made in cases of unknown human remains, even when remains are highly cremated or decomposed. This study retrospectively reviews 3D CBCT images of a total of 43 Caucasian patients between the ages of 20 and 38 from the Indiana University School of Dentistry to quantify frontal sinus differences between adult males and females and investigate the usefulness of frontal sinus morphology for forensic identification. Digit codes with six sections and eleven-digit numbers were created to classify each individual sinus. It was shown that 3D CBCT images of the frontal sinus could be used to make a positive forensic identification. Metric measurements displayed a high degree of variability between sinuses and no two digit codes were identical. However, it was also shown that there were almost no quantifiable and significant sexually dimorphic differences between male and female frontal sinuses. This study confirms that sex determination should not be a primary goal of frontal sinus analysis and highlights the importance of creating a standard method of frontal sinus evaluation based on metric measurements.
|
297 |
Spectroscopic and chemometric analysis of automotive clear coat paints by micro Fourier transform infrared spectroscopy
Osborne Jr., James D. January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Clear coats have been part of automotive paint finishes for several decades. Originally a layer of paint with no pigment, they have evolved into a protective layer important to the appearance and longevity of the vehicle's finish. These clear coats have been studied previously using infrared spectroscopy and other spectroscopic techniques. Previous studies focused on either all the layers of an automobile finish or on chemometric analysis of clear coats using other analytical techniques. For this study, chemometric analysis was performed on preprocessed spectra averaged from five separate samples. Samples were analyzed on a Thermo-Nicolet Nexus 670 connected to a Continuμm™ FT-IR microscope. Two unsupervised chemometric techniques, Agglomerative Hierarchical Clustering (AHC) and Principal Component Analysis (PCA), were used to evaluate the data set. Discriminant analysis, a supervised technique, was evaluated using several known qualifiers; these included cluster group from AHC, make, model, and year. Although discriminant analysis confirmed the AHC and PCA results, no correlation to make, model, or year was indicated.
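A minimal illustration of the unsupervised screening step on synthetic "spectra". The band positions and the two-formulation setup are assumptions for demonstration, not the study's data; the point is that PCA separates chemically distinct groups without using labels:

```python
import numpy as np

# Two hypothetical clear-coat formulations, each a mixture of two Gaussian
# absorption bands plus noise, mean-centered and projected onto PC1.
rng = np.random.default_rng(4)
wavenumbers = np.linspace(600, 4000, 400)
peak_a = np.exp(-0.5 * ((wavenumbers - 1730) / 30) ** 2)  # carbonyl-like band (assumed)
peak_b = np.exp(-0.5 * ((wavenumbers - 1510) / 30) ** 2)  # aromatic-like band (assumed)
group = np.repeat([0, 1], 10)                             # 10 samples per formulation
spectra = (np.where(group[:, None] == 0, 1.0, 0.2) * peak_a
           + np.where(group[:, None] == 0, 0.2, 1.0) * peak_b
           + 0.02 * rng.standard_normal((20, wavenumbers.size)))

Xc = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]                                 # scores on the first component
labels = (pc1 > np.median(pc1)).astype(int)      # a crude two-group split
```

AHC would then merge samples by distance in the score space; the median split above is just the simplest possible two-cluster stand-in.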
|
298 |
Modelle zur Beschreibung der Verkehrssicherheit innerörtlicher Hauptverkehrsstraßennetze unter besonderer Berücksichtigung der Umfeldnutzung
Aurich, Allan 17 May 2013 (has links)
In der Arbeit wird eine Methodik einer zusammenhängenden Analyse und modellhaften Beschreibung der Verkehrssicherheit in städtischen Hauptstraßennetzen am Beispiel der Stadt Dresden entwickelt. Die dabei gewonnenen Modelle dienen der Abschätzung von Erwartungswerten von Unfallhäufigkeiten mit und ohne Personenschaden unter Berücksichtigung der Verkehrsbeteiligungsart.
Die Grundlage bilden multivariate Regressionsmodelle auf Basis verallgemeinerter linearer Modelle (GLM). Die Verwendung verallgemeinerter Regressionsmodelle erlaubt eine Berücksichtigung von Verteilungen, die besser geeignet sind, den Unfallentstehungsprozess wiederzugeben, als die häufig verwendete Normalverteilung. Im konkreten Fall werden hierzu die Poisson-Verteilung sowie die negative Binomialverteilung verwendet.
Um Effekte im Hauptverkehrsstraßennetz möglichst trennscharf abbilden zu können, werden vier grundsätzliche Netzelemente differenziert und das Netz entsprechend zerlegt. Unterschieden werden neben Streckenabschnitten und Hauptverkehrsknotenpunkten auch Annäherungsbereiche und Anschlussknotenpunkte. Die Kollektive der Knotenpunkte werden ferner in signalisierte und nicht-signalisierte unterteilt. Es werden zunächst Modelle unterschiedlicher Unfallkollektive getrennt für alle Kollektive der vier Netzelemente berechnet. Anschließend werden verschiedene Vorgehensweisen für eine Zusammenfassung zu Netzmodellen entwickelt.
Neben der Verwendung verkehrstechnischer und infrastruktureller Größen als erklärende Variable werden in der Arbeit auch Kenngrößen zur Beschreibung der Umfeldnutzung ermittelt und im Rahmen der Regression einbezogen. Die Quantifizierung der Umfeldnutzung erfolgt mit Hilfe von Korrelations-, Kontingenz- und von Hauptkomponentenanalysen (PCA).
Im Ergebnis werden Modelle präsentiert, die eine multivariate Quantifizierung erwarteter Unfallhäufigkeiten in Hauptverkehrsstraßennetzen erlauben. Die vorgestellte Methodik bildet eine mögliche Grundlage für eine differenzierte Sicherheitsbewertung verkehrsplanerischer Variantenabschätzungen. / A methodology is developed in order to predict the number of accidents within an urban main road network. The analysis was carried out by surveying the road network of Dresden. The resulting models allow the calculation of individual expectancy values for accidents with and without injury involving different traffic modes.
The statistical modelling process is based on generalized linear models (GLM). These were chosen due to their ability to take into account certain non-normal distributions. In the specific case of accident counts, both the Poisson distribution and the negative binomial distribution are more suitable for reproducing the origination process than the normal distribution. Thus they were chosen as underlying distributions for the subsequent regressions.
In order to differentiate overlaying influences, the main road network is separated into four basic elements: major intersections, road sections, minor intersections and approaches. Furthermore the major and minor intersections are additionally subdivided into signalised and non-signalised intersections. Separate models are calculated for different accident collectives for the various types of elements. Afterwards several methodologies for calculating aggregated network models are developed and analysed.
Apart from traffic-related and infrastructural attributes, environmental parameters are derived taking into account the adjacent building structure as well as the surrounding land-use, and incorporated as explanatory variables within the regression. The environmental variables are derived from statistical analyses including correlation matrices, contingency tables and principal components analyses (PCA).
As a result, a set of models is introduced which allows a multivariate calculation of expected accident counts for urban main road networks. The methodology developed can serve as a basis for a differentiated safety assessment of varying scenarios within a traffic planning process.
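The GLM machinery described above can be sketched with a hand-rolled Poisson regression fitted by iteratively reweighted least squares, the standard fitting scheme behind such accident models. The data and coefficients below are simulated, not estimates from the thesis; a negative binomial variant would additionally estimate a dispersion parameter:

```python
import numpy as np

def poisson_irls(X, y, n_iter=25):
    """Fit a log-link Poisson regression by iteratively reweighted least squares.
    X: (n, p) design matrix including an intercept column; y: accident counts."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)              # expected counts under current fit
        W = mu                             # Poisson working weights
        z = X @ beta + (y - mu) / mu       # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, WX.T @ z)
    return beta

# Synthetic section-level data: expected accidents grow with traffic volume.
rng = np.random.default_rng(5)
log_adt = rng.uniform(7, 10, 400)          # log average daily traffic (assumed range)
X = np.column_stack([np.ones(400), log_adt])
true_beta = np.array([-5.0, 0.7])          # illustrative coefficients
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = poisson_irls(X, y)
```

Environmental and land-use covariates, including PCA-derived scores, would simply enter as additional columns of `X`.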
|
299 |
Three essays in asset pricing and climate finance
N'Dri, Kouadio Stéphane 08 1900 (has links)
Cette thèse, divisée en trois chapitres, contribue à la vaste et récente littérature sur l'évaluation des actifs et la finance climatique. Le premier chapitre contribue à la littérature sur la finance climatique tandis que les deux derniers contribuent à la littérature sur l'évaluation des actifs.
Le premier chapitre analyse comment les politiques environnementales visant à réduire les émissions de carbone affectent les prix des actifs et la consommation des ménages. En utilisant de nouvelles données, je propose une mesure des émissions de carbone du point de vue du consommateur et une mesure du risque de croissance de la consommation de carbone. Les mesures sont basées sur des informations sur la consommation totale et l'empreinte carbone de chaque bien et service. Pour analyser les effets des politiques environnementales, un modèle de risques de long terme est développé dans lequel la croissance de la consommation comprend deux composantes: le taux de croissance de la consommation de carbone et le taux de croissance de la part de la consommation de carbone dans la consommation totale. Ce chapitre soutient que le risque de long terme de la croissance de la consommation provient principalement de la croissance de la consommation de carbone découlant des politiques et des actions visant à réduire les émissions, telles que l'Accord de Paris et la Conférence des Nations Unies sur le changement climatique (COP26). Mon modèle aide à détecter le risque de long terme dans la consommation provenant des politiques climatiques tout en résolvant simultanément les énigmes de la prime de risque et de la volatilité, et en expliquant la coupe transversale des actifs. La décomposition de la consommation pourrait conduire à identifier les postes de consommation les plus polluants et à construire une stratégie d'investissement minimisant ou maximisant un critère environnemental de long terme.
Le deuxième chapitre (co-écrit avec René Garcia et Caio Almeida) étudie le rôle des facteurs non linéaires indépendants dans la valorisation des actifs. Alors que la majorité des facteurs d'actualisation stochastique (SDF) les plus utilisés qui expliquent la coupe transversale des rendements boursiers sont obtenus à partir des composantes principales linéaires, nous montrons dans ce deuxième chapitre que le fait de permettre la substitution de certaines composantes principales linéaires par des facteurs non linéaires indépendants améliore systématiquement la capacité des facteurs d'actualisation stochastique de valoriser la coupe transversale des actifs. Nous utilisons les 25 portefeuilles de Fama-French, cinquante portefeuilles d'anomalies et cinquante anomalies plus les termes d'interaction basés sur les caractéristiques pour tester l'efficacité des facteurs dynamiques non linéaires. Le SDF estimé à l'aide d'un mélange de facteurs non linéaires et linéaires surpasse ceux qui utilisent uniquement des facteurs linéaires ou des rendements caractéristiques bruts en termes de performance mesurée par le R-carré hors échantillon. De plus, le modèle hybride - utilisant à la fois des composantes principales non linéaires et linéaires - nécessite moins de facteurs de risque pour atteindre les performances hors échantillon les plus élevées par rapport à un modèle utilisant uniquement des facteurs linéaires. Nous nous inspirons des idées de Brillinger pour l'analyse, et ces travaux étendent les modèles linéaires, déjà bien établis, à un cadre non linéaire.
Le dernier chapitre étudie la prévisibilité du rendement des anomalies à travers les déciles à l'aide d'un ensemble de quarante-huit variables d'anomalie construites à partir des caractéristiques de titres individuels. Après avoir construit les portefeuilles déciles, cet article étudie leur prévisibilité en utilisant leurs propres informations passées et d'autres prédicteurs bien connus. Les analyses révèlent que les rendements des portefeuilles déciles sont persistants et prévisibles par le ratio de la valeur comptable sur la valeur de marché de l'entreprise, la variance des actions, le rendement des dividendes, le ratio des prix sur les dividendes, le taux de rendement à long terme, le rendement des obligations d'entreprise, le TED Spread et l'indice VIX. De plus, une stratégie consistant à prendre une position longue sur le décile avec le rendement attendu le plus élevé et à prendre une position courte sur le décile avec le rendement attendu le plus bas chaque mois donne des rendements moyens et un rendement par risque bien meilleurs que la stratégie traditionnelle fondée sur les déciles extrêmes pour quarante-cinq des quarante-huit anomalies. / This thesis, divided into three chapters, contributes to the vast and recent literature on asset pricing, and climate finance. The first chapter contributes to the climate finance literature while the last two contribute to the asset pricing literature.
The first chapter analyzes how environmental policies that aim to reduce carbon emissions affect asset prices and household consumption. Using novel data, I propose a measure of carbon emissions from a consumer point of view and a carbon consumption growth risk measure. The measures are based on information on aggregate consumption and the carbon footprint for each good and service. To analyze the effects of environmental policies, a long-run risks model is developed where consumption growth is decomposed into two components: the growth rate of carbon consumption and the growth rate of the share of carbon consumption out of total consumption. This paper argues that the long-run risk in consumption growth comes mainly from the carbon consumption growth arising from policies and actions to curb emissions, such as the Paris Agreement and the U.N. Climate Change Conference (COP26). My model helps to detect long-run risk in consumption from climate policies while simultaneously solving the equity premium and volatility puzzles, and explaining the cross-section of assets. The decomposition of consumption could lead to identifying the most polluting consumption items and to constructing an investment strategy that minimizes or maximizes a long-term environmental criterion.
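The chapter's decomposition is, at the level of growth rates, an accounting identity that can be checked numerically: with s_t the carbon share of consumption, C_t = C^carbon_t / s_t, so log consumption growth equals carbon-consumption growth minus the growth of the share. The series below are simulated purely to verify the algebra:

```python
import numpy as np

# Simulated carbon consumption and carbon share (both positive; the share
# stays in (0, 1) via a logistic transform). Values are placeholders.
rng = np.random.default_rng(6)
T = 100
carbon = np.exp(np.cumsum(0.01 + 0.02 * rng.standard_normal(T)))      # C^carbon_t
share = 1.0 / (1.0 + np.exp(-np.cumsum(0.01 * rng.standard_normal(T))))  # s_t
total = carbon / share                                                 # C_t

g_total = np.diff(np.log(total))     # consumption growth
g_carbon = np.diff(np.log(carbon))   # carbon-consumption growth
g_share = np.diff(np.log(share))     # growth of the carbon share
recomposed = g_carbon - g_share      # should reproduce g_total exactly
```

In the chapter, the long-run risk component is then loaded onto `g_carbon`, the piece driven by emissions-reduction policy.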
The second chapter (co-authored with René Garcia, and Caio Almeida) studies the role of truly independent nonlinear factors in asset pricing. While the most successful stochastic discount factor (SDF) models that price well the cross-section of stock returns are obtained from regularized linear principal components of characteristic-based returns, we show that allowing for substitution of some linear principal components by independent nonlinear factors consistently improves the SDF's ability to price this cross-section. We use the Fama-French 25 ME/BM-sorted portfolios, fifty anomaly portfolios, and fifty anomalies plus characteristic-based interaction terms to test the effectiveness of the nonlinear dynamic factors. The SDF estimated using a mixture of nonlinear and linear factors outperforms the ones using solely linear factors or raw characteristic returns in terms of out-of-sample R-squared pricing performance. Moreover, the hybrid model --using both nonlinear and linear principal components-- requires fewer risk factors to achieve the highest out-of-sample performance compared to a model using only linear factors.
The last chapter studies anomaly return predictability across deciles using a set of forty-eight anomaly variables built using individual stock characteristics. After constructing the decile portfolios, this paper studies their predictability using their own past information, and other well-known predictors. The analyses reveal that decile portfolio returns are persistent and predictable by book-to-market, stock variance, dividend yield, dividend price ratio, long-term rate of return, corporate bond return, TED Spread, and VIX index. Moreover, a strategy consisting of going long on the decile with the highest expected return and going short on the decile with the lowest expected return each month gives better mean returns and Sharpe ratios than the traditional strategy based on extreme deciles for forty-five out of forty-eight anomalies.
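The long-short construction in the last chapter can be sketched as follows. Returns and forecasts are simulated stand-ins for the anomaly data; the comparison is between picking extremes by model-expected return each month and the static decile 10 minus decile 1 benchmark:

```python
import numpy as np

# Simulated monthly decile returns and (correlated) model forecasts.
rng = np.random.default_rng(7)
months, deciles = 240, 10
expected = rng.standard_normal((months, deciles)) * 0.02      # model forecasts
realized = expected + 0.03 * rng.standard_normal((months, deciles))

long_ix = expected.argmax(axis=1)      # decile with highest expected return
short_ix = expected.argmin(axis=1)     # decile with lowest expected return
rows = np.arange(months)
dynamic = realized[rows, long_ix] - realized[rows, short_ix]  # chapter's strategy
static = realized[:, -1] - realized[:, 0]                     # extreme-decile benchmark

def sharpe(r):
    """Annualized Sharpe ratio from monthly returns (zero risk-free assumed)."""
    return r.mean() / r.std() * np.sqrt(12)
```

Whenever the forecasts carry signal, the dynamic selection dominates the static one; the chapter reports this for forty-five of forty-eight anomalies.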
|
300 |
OBJECTIVE FLOW PATTERN IDENTIFICATION AND CLASSIFICATION IN INCLINED TWO-PHASE FLOWS USING MACHINE LEARNING METHODS
David H Kang Jr (15352852) 27 April 2023 (has links)
Two-phase modeling and simulation capabilities are strongly dependent on the accuracy of flow regime identification methods. Flow regimes have traditionally been determined through visual observation, resulting in subjective classifications that are susceptible to inconsistencies and disagreements between researchers. Since the majority of two-phase flow studies have been concentrated around vertical and horizontal pipe orientations, flow patterns in inclined pipes are not well-understood. Moreover, they may not be adequately described by conventional flow regimes which were conceptualized for vertical and horizontal flows. Recent work has explored applying machine learning methods to vertical and horizontal flow regime identification to help remedy the subjectivity of classification. Such methods have not, however, been successfully applied to inclined flow orientations. In this study, two novel unsupervised machine learning methods are proposed: a modular configuration of multiple machine learning algorithms that is adaptable to different pipe orientations, and a second universal approach consisting of several layered algorithms which is capable of performing flow regime classification for data spanning multiple orientations. To support this endeavor, an experimental database is established using a dual-ring impedance meter. The signals obtained by the impedance meter are capable of conveying distinct features of the various flow patterns observed in vertical, horizontal, and inclined pipes. Inputs to the unsupervised learning algorithms consist of statistical measures computed from these signals. A novel conceptualization for flow pattern classification is developed, which maps three statistical parameters from the data to red, green, and blue primary color intensities. By combining the three components, a flow pattern map can be developed wherein similar colors are produced by flow conditions with like statistics, transforming the way flow regimes are represented on a flow regime map. The resulting dynamic RGB flow pattern map provides a physical representation of gradual changes in flow patterns as they transition from one regime to another. By replacing the static transition boundaries with physically informed, dynamic gradients between flow patterns, transitional flow patterns may be described with far greater accuracy. This study demonstrates the effectiveness of the proposed method in generating objective flow regime maps, providing a basis for further research on the characterization of two-phase flow patterns in inclined pipes. The three proposed methods are compared and evaluated against flow regime maps found in literature.
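The RGB mapping described above reduces to min-max normalizing three signal statistics into color channels, so that flow conditions with similar statistics receive similar colors. The statistic choices (mean, standard deviation, skewness of the impedance signal) and the values below are illustrative assumptions, not the study's data:

```python
import numpy as np

def to_rgb(features):
    """Map (n_conditions, 3) signal statistics to RGB triples in [0, 1]
    by min-max normalizing each statistic over the set of flow conditions."""
    f = np.asarray(features, dtype=float)
    lo, hi = f.min(axis=0), f.max(axis=0)
    return (f - lo) / (hi - lo)

# Three hypothetical flow conditions with (mean, std, skewness) of the signal.
stats = np.array([
    [0.05, 0.01, 2.5],    # bubbly-like: low mean void, strong positive skew
    [0.45, 0.20, 0.1],    # slug/churn-like: intermediate mean, large fluctuations
    [0.85, 0.05, -1.8],   # annular-like: high mean void, negative skew
])
rgb = to_rgb(stats)       # one color per flow condition
```

Plotted over superficial-velocity coordinates, these colors blend continuously across regime transitions, which is exactly what replaces the static transition boundaries.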
|