  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
211

Faktorer som kan ha samband med företags lönsamhet : En empirisk studie på de 20 största bolagen på Stockholmsbörsen / Factors that can correlate with corporate profitability : An empirical study of the 20 largest public corporations on the Stockholm stock market

Karell-Holmgren, Kasper, Mirza, Pauline January 2009 (has links)
Syfte: Syftet med uppsatsen är att undersöka och analysera om det finns något samband mellan företags lönsamhet och dess kapitalstruktur, storlek eller branschtillhörigheten. Detta innebär att en empirisk studie kommer att ske på soliditet, omsättning samt branschtillhörigheten för att se hur och om det finns något samband mellan lönsamheten och dessa tre faktorer. Vidare är syftet även att undersöka om det kan finnas något samband mellan företags lönsamhet och företags standardavvikelse på räntabilitet.

Metod: Undersökningen är en empirisk studie med en deduktiv kvantitativ och kvalitativ ansats. Empirin undersöks med olika statistiska metoder såsom regressionsanalys och korrelationsberäkning.

Teori: Uppsatsen har utgått från teorier gällande kapitalstruktur och lönsamhet. Nyckeltalen som används från dessa teorier är soliditet respektive räntabilitet på eget kapital.

Empiri: Data från de 20 största börsnoterade företagen på Stockholmsbörsen har samlats in från företagens årsredovisningar 2003-2007. Den data som tagits fram är data på företagens lönsamhet (räntabilitet på eget kapital), kapitalstruktur (soliditet), storlek (omsättning), samt lönsamhet för företag utöver de 20 valda företagens, detta för att användas i analysen av branschtillhörigheten.

Resultat: Resultatet av undersökningen visar att det inte finns något signifikant samband mellan vare sig företags kapitalstruktur och lönsamheten, företagsstorlek och lönsamheten, branschtillhörigheten för ett företag och lönsamheten eller standardavvikelse på räntabilitet och lönsamhet.

/

Purpose: The purpose of the essay is to analyze the potential correlation between corporate profitability and corporate capital structure, corporate size, and line of business. An empirical study is performed on solidity, turnover and line of business to determine whether a correlation exists between profitability and these factors. The purpose is also to examine whether there is a correlation between corporate profitability and the standard deviation of the corporations' return on equity.

Method: The survey is an empirical study employing a deductive quantitative and qualitative approach. The empirical data are examined with statistical methods such as regression analysis and correlation calculation.

Theory: The essay draws on theories of capital structure and profitability. The key ratios taken from these theories are solidity (equity ratio) and return on equity.

Empirics: Data from the 20 largest public corporations on the Stockholm stock market were collected from their 2003-2007 annual reports: corporate profitability (return on equity), capital structure (solidity), size (turnover), and the profitability of corporations beyond the 20 chosen ones, the latter used in the analysis of line of business.

Result: The survey shows no significant correlation between profitability and any of capital structure, corporate size, line of business, or the standard deviation of return on equity.
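The correlation analysis the abstract describes can be sketched minimally as follows. The solidity and return-on-equity figures below are invented toy values, not the thesis data; the significance check is the standard t-statistic for a Pearson correlation, which may or may not be the exact test the authors used.

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for H0: no correlation, with n - 2 degrees of freedom."""
    return r * math.sqrt((n - 2) / (1 - r ** 2))

# Hypothetical equity ratios (soliditet, %) and returns on equity (%)
solidity = [32.0, 41.5, 28.0, 55.2, 38.9, 47.1, 30.4, 44.0]
roe = [14.2, 11.8, 16.5, 9.9, 13.1, 12.4, 15.0, 10.7]

r = pearson_r(solidity, roe)
t = t_statistic(r, len(solidity))
print(f"r = {r:.3f}, t = {t:.3f}")
```

Comparing |t| against a Student-t critical value with n - 2 degrees of freedom then decides significance, which is the step that led the thesis to its "no significant correlation" conclusion.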
213

Likvida tillgångars påverkan på lönsamhet och aktievärde : En studie av svenska företag på Nasdaq OMX Nordic Stockholm mellan 2008-2011

Nitschmann, Johanna, Norén, William January 2013 (has links)
Objective: The study investigates whether cash liquidity has a negative effect on the profitability and share value of companies listed on Nasdaq OMX Nordic Stockholm 2008-2011. A secondary aim is to examine whether industry risk is positively related to the cash liquidity of these companies. Method: The method of the study is key-ratio analysis by means of hypothesis testing and regression analysis. Conclusion: Cash liquidity affects profitability negatively across the entire sample. No other conclusions can be drawn. / Syfte: Studiens syfte är att undersöka om kassalikviditeten har en negativ påverkan på lönsamhet och aktievärde, hos företag noterade på Nasdaq OMX Nordic Stockholm mellan åren 2008 till 2011. Delsyftet är att undersöka om branschrisken har ett positivt samband med kassalikviditeten för dessa företag. Metod: Metoden för studien är nyckeltalsanalys med hjälp av hypotesprövning och regressionsanalys, för företag på Nasdaq OMX Nordic Stockholm. Slutsats: Kassalikviditeten påverkar lönsamheten i negativ riktning på hela urvalet. Inga andra slutsatser kan dras.
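The key-ratio regression the abstract mentions can be illustrated with an ordinary least squares fit of profitability on cash liquidity. The figures below are invented, not the study's data; a negative slope is the kind of relationship the conclusion reports.

```python
def ols_slope_intercept(x, y):
    """Ordinary least squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical cash-liquidity ratios and profitability figures (%), not the study's data
liquidity = [0.8, 1.1, 1.5, 2.0, 2.6, 3.1]
profit = [9.5, 8.9, 8.1, 7.2, 6.0, 5.4]

a, b = ols_slope_intercept(liquidity, profit)
print(f"intercept = {a:.2f}, slope = {b:.2f}")  # the slope is negative in this toy sample
```

A hypothesis test on the slope (H0: b = 0 against b < 0) then formalises the "negative direction" claim.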
214

Some questions in risk management and high-dimensional data analysis

Wang, Ruodu 04 May 2012 (has links)
This thesis addresses three topics in the area of statistics and probability, with applications in risk management. First, for testing problems in high-dimensional (HD) data analysis, we present a novel method to formulate empirical likelihood tests and jackknife empirical likelihood tests by splitting the sample into subgroups. New tests are constructed to test the equality of two HD means, the coefficients in HD linear models, and HD covariance matrices. Second, we propose jackknife empirical likelihood methods to formulate interval estimates for important quantities in actuarial science and risk management, such as risk-distortion measures, Spearman's rho and parametric copulas. Lastly, we introduce the theory of completely mixable (CM) distributions. We give properties of CM distributions, show that several classes of distributions are CM, and use the new technique to find bounds for the sum of individual risks with given marginal distributions but an unspecified dependence structure. The result partially solves a problem that had been a challenge for decades, and directly leads to bounds on quantities of interest in risk management, such as the variance, the stop-loss premium, the prices of European options and the Value-at-Risk associated with a joint portfolio.
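The jackknife resampling that underlies the jackknife empirical likelihood methods can be shown in its simplest classical form: leave-one-out pseudo-values and the jackknife variance estimate. This is a far simpler relative of the thesis's interval-estimation machinery, shown only to illustrate the resampling idea.

```python
def jackknife(stat, sample):
    """Leave-one-out pseudo-values and the jackknife variance estimate of `stat`."""
    n = len(sample)
    theta = stat(sample)
    loo = [stat(sample[:i] + sample[i + 1:]) for i in range(n)]  # leave-one-out statistics
    pseudo = [n * theta - (n - 1) * t for t in loo]              # jackknife pseudo-values
    mean_pseudo = sum(pseudo) / n
    var = sum((p - mean_pseudo) ** 2 for p in pseudo) / (n * (n - 1))
    return mean_pseudo, var

mean = lambda xs: sum(xs) / len(xs)
est, var = jackknife(mean, [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(est, var)  # for the sample mean this reproduces the mean and s^2 / n
```

Jackknife empirical likelihood applies the empirical likelihood construction to exactly these pseudo-values, which is what makes nonlinear statistics tractable.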
215

Evaluation of Ultra-Wideband Sensing Technology for Position Location in Indoor Construction Environments

Aryan, Afrooz January 2011 (has links)
Effective construction management involves real-time decisions regarding the progress of specific activities, the location of materials and equipment, and construction site safety. The decision-making process can be improved using real-time positioning technologies such as Radio Frequency Identification (RFID) systems, the Global Positioning System (GPS), and Ultra-Wideband (UWB) sensors. While GPS is not applicable to indoor positioning and RFID tags cannot provide a fully automated system for position location, the characteristics of UWB systems make this technology a strong candidate for a fully automated positioning system in an indoor construction environment. This thesis presents a comprehensive study of the performance of UWB systems in a controlled laboratory environment, in an institutional construction site in Waterloo, Canada, and for a particular safety application. A primary objective of the research was to establish the accuracy of real-time position location under various conditions, including the effect of different construction materials (e.g., wood and metal), and to analyze changes in the accuracy of position location as construction progresses and the indoor environment physically evolves. The challenges faced in implementing such a system in an active construction environment are addressed. Based on a statistical analysis of the laboratory data, and considering the construction site experience, the reliability of the UWB positioning system in the aforementioned environments is discussed. Furthermore, an automated safety system is proposed using the real-time UWB positioning technology. Based on error modeling of the UWB position location, an optimal alarm algorithm is designed for the proposed safety system and the reliability of such a system is evaluated through a statistical analysis.
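Range-based position location of the kind UWB systems perform can be sketched as trilateration from three fixed receivers. The anchor layout and tag position below are hypothetical, not the thesis setup, and real UWB ranges are noisy, which is what motivates the thesis's error modeling.

```python
import math

def trilaterate(anchors, ranges):
    """2-D position from ranges to three fixed anchors.
    Subtracting the first circle equation from the other two yields a
    linear 2x2 system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Hypothetical UWB receiver layout (metres); ranges here are noise-free,
# so the tag position is recovered almost exactly.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
tag = (4.0, 3.0)
ranges = [math.dist(tag, a) for a in anchors]
print(trilaterate(anchors, ranges))  # approximately (4.0, 3.0)
```

With more than three anchors or noisy ranges the same linearisation is solved by least squares instead of an exact 2x2 solve.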
216

Model Validation and Discovery for Complex Stochastic Systems

Jha, Sumit Kumar 02 July 2010 (has links)
In this thesis, we study two fundamental problems that arise in the modeling of stochastic systems: (i) Validation of stochastic models against behavioral specifications such as temporal logics, and (ii) Discovery of kinetic parameters of stochastic biochemical models from behavioral specifications. We present a new Bayesian algorithm for Statistical Model Checking of stochastic systems based on a sequential version of Jeffreys' Bayes Factor test. We argue that the Bayesian approach is more suited for application domains like systems biology modeling, where distributions on nuisance parameters and priors may be known. We prove that our Bayesian Statistical Model Checking algorithm terminates for a large subclass of prior probabilities. We also characterize the Type I/II errors associated with our algorithm. We experimentally demonstrate that this algorithm is suitable for the analysis of complex biochemical models like those written in the BioNetGen language. We then argue that i.i.d. sampling based Statistical Model Checking algorithms are not an effective way to study rare behaviors of stochastic models and present another Bayesian Statistical Model Checking algorithm that can incorporate non-i.i.d. sampling strategies. We also present algorithms for synthesis of chemical kinetic parameters of stochastic biochemical models from high level behavioral specifications. We consider the setting where a modeler knows facts that must hold on the stochastic model but is not confident about some of the kinetic parameters in her model. We suggest algorithms for discovering these kinetic parameters from facts stated in appropriate formal probabilistic specification languages. Our algorithms are based on our theoretical results characterizing the probability of a specification being true on a stochastic biochemical model. We have applied this algorithm to discover kinetic parameters for biochemical models with as many as six unknown parameters.
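A sequential Bayes-factor test of this flavour can be sketched for the simplest case: Bernoulli samples (each simulated trace either satisfies the property or not), a uniform prior, and the hypothesis that the satisfaction probability is at least a threshold theta. This is an illustrative reconstruction under those stated assumptions, not the thesis's algorithm; the posterior tail probability is computed by crude numerical integration rather than a Beta CDF.

```python
import random

def tail_prob(s, n, theta, steps=2000):
    """P(p >= theta) under the Beta(s+1, n-s+1) posterior (uniform prior),
    via midpoint integration of the unnormalised density p^s (1-p)^(n-s)."""
    def integ(lo, hi):
        h = (hi - lo) / steps
        return h * sum((lo + (i + 0.5) * h) ** s * (1 - lo - (i + 0.5) * h) ** (n - s)
                       for i in range(steps))
    return integ(theta, 1.0) / integ(0.0, 1.0)

def bayes_smc(sample_trace, theta=0.8, threshold=100.0, max_n=500):
    """Sequential Bayes-factor test of H1: p >= theta against H0: p < theta.
    `sample_trace()` returns 1 when a simulated trace satisfies the property."""
    prior_h1 = 1.0 - theta                       # uniform prior mass on [theta, 1]
    s = 0
    for n in range(1, max_n + 1):
        s += sample_trace()
        post_h1 = tail_prob(s, n, theta)
        bf = (post_h1 / (1.0 - post_h1)) * ((1.0 - prior_h1) / prior_h1)
        if bf > threshold:
            return "accept H1", n                # strong evidence that p >= theta
        if bf < 1.0 / threshold:
            return "accept H0", n
    return "undecided", max_n

random.seed(0)
# Simulated system satisfying the property with probability 0.95 > theta = 0.8
print(bayes_smc(lambda: 1 if random.random() < 0.95 else 0))
```

The appeal of the sequential form is that sampling stops as soon as the Bayes factor crosses either threshold, so clear-cut systems need few simulations.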
218

Theoretical and empirical essays on microeconometrics

Possebom, Vitor Augusto 17 February 2016 (has links)
This Master Thesis consists of one theoretical article and one empirical article in the field of Microeconometrics. The first chapter, called "Synthetic Control Estimator: A Generalized Inference Procedure and Confidence Sets", contributes to the literature on inference techniques for the Synthetic Control Method. (We also thank useful suggestions by Marinho Bertanha, Gabriel Cepaluni, Brigham Frandsen, Dalia Ghanem, Ricardo Masini, Marcela Mello, Áureo de Paula, Cristine Pinto, Edson Severnini and seminar participants at São Paulo School of Economics, the California Econometrics Conference 2015 and the 37th Brazilian Meeting of Econometrics.) This methodology was proposed to answer questions involving counterfactuals when only one treated unit and a few control units are observed. Although this method has been applied in many empirical works, the formal theory behind its inference procedure is still an open question. In order to fill this lacuna, we make clear the sufficient hypotheses that guarantee the adequacy of Fisher's Exact Hypothesis Testing Procedure for panel data, allowing us to test any sharp null hypothesis and, consequently, to propose a new way to estimate confidence sets for the Synthetic Control Estimator by inverting a test statistic; this is the first confidence set when we have access only to finite-sample, aggregate-level data whose cross-sectional dimension may be larger than its time dimension. Moreover, we analyze the size and the power of the proposed test with a Monte Carlo experiment and find that test statistics that use the synthetic control method outperform test statistics commonly used in the evaluation literature.
We also extend our framework to the cases when we observe more than one outcome of interest (simultaneous hypothesis testing) or more than one treated unit (pooled intervention effect), and when heteroskedasticity is present. The second chapter, called "Free Economic Area of Manaus: An Impact Evaluation using the Synthetic Control Method", is an empirical article. We apply the synthetic control method to Brazilian city-level data during the 20th century in order to evaluate the economic impact of the Free Economic Area of Manaus (FEAM). We find that this enterprise zone had significant positive effects on real GDP per capita and services total production per capita, but it also had significant negative effects on agriculture total production per capita. Our results suggest that this subsidy policy achieved its goal of promoting regional economic growth, even though it may have provoked misallocation of resources among economic sectors. / Esta dissertação de mestrado consiste em um artigo teórico e um artigo empírico no campo da Microeconometria. O primeiro capítulo contribui para a literatura sobre técnica de inferência do método de controle sintético. Essa metodologia foi proposta para responder a questões envolvendo contrafactuais quando apenas uma unidade tratada e poucas unidades controle são observadas. Apesar de esse método ter sido aplicado em muitos trabalhos empíricos, a teoria formal por trás de seu procedimento de inferência ainda é uma questão em aberto. Para preencher essa lacuna, nós deixamos claras hipóteses suficientes que garantem a validade do Procedimento Exato de Teste de Hipótese de Fisher para dados em painel, permitindo que nós testássemos qualquer hipótese nula do tipo sharp e, consequentemente, que nós propuséssemos uma nova forma de estimar conjuntos de confiança para o Estimador de Controle Sintético por meio da inversão de uma estatística de teste, o primeiro conjunto de confiança quando temos acesso apenas a dados agregados cuja dimensão de cross-section pode ser maior que a dimensão temporal. Ademais, nós analisamos o tamanho e o poder do teste proposto por meio de um experimento de Monte Carlo e encontramos que estatísticas de teste que usam o método de controle sintético apresentam uma performance superior àquela apresentada pelas estatísticas de teste comumente analisadas na literatura de avaliação de impacto. Nós também estendemos nosso procedimento para abarcar os casos em que observamos mais de uma variável de interesse (teste simultâneo de hipótese) ou mais de uma unidade tratada (efeito agregado da intervenção) e quando heterocedasticidade está presente. O segundo capítulo é um artigo empírico. Nós aplicamos o método de controle sintético a dados municipais brasileiros durante o século 20 com o intuito de avaliar o impacto econômico da Zona Franca de Manaus (ZFM). Nós encontramos que essa zona de empreendimento teve efeitos positivos significantes sobre o PIB Real per capita e sobre a Produção Total per capita do setor de Serviços, mas também teve um efeito negativo e significante sobre a Produção total per capita do setor Agrícola. Nossos resultados sugerem que essa política de subsídio alcançou seu objetivo de promover crescimento econômico regional, apesar de possivelmente ter provocado falhas de alocação de recursos entre setores econômicos.
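The Fisher-style placebo (permutation) inference commonly paired with the synthetic control method can be sketched via post/pre-treatment RMSPE ratios: the treatment is reassigned to each control unit in turn, and the p-value is the rank of the treated unit's ratio. All figures below are invented toy gaps, not the chapter's data or its exact test statistic.

```python
import math

def rmspe(errors):
    """Root mean squared prediction error of a gap series."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def placebo_p_value(gaps, treated):
    """Permutation p-value: share of units whose post/pre-treatment RMSPE
    ratio is at least as extreme as the treated unit's.
    `gaps` maps unit -> (pre_gaps, post_gaps), observed minus synthetic outcome."""
    ratios = {u: rmspe(post) / rmspe(pre) for u, (pre, post) in gaps.items()}
    as_extreme = sum(1 for v in ratios.values() if v >= ratios[treated])
    return as_extreme / len(ratios)

# Hypothetical outcome gaps for one treated city and four placebo cities
gaps = {
    "treated": ([0.2, -0.1, 0.3], [2.1, 2.5, 3.0]),  # large post-treatment gap
    "p1": ([0.3, 0.2, -0.4], [0.5, -0.2, 0.4]),
    "p2": ([-0.2, 0.1, 0.2], [0.3, 0.1, -0.3]),
    "p3": ([0.4, -0.3, 0.1], [0.2, 0.5, -0.1]),
    "p4": ([0.1, 0.2, -0.2], [-0.4, 0.2, 0.3]),
}
print(placebo_p_value(gaps, "treated"))  # 1/5 = 0.2 here
```

With only five units the smallest attainable p-value is 0.2, which is exactly the small-sample limitation that motivates the chapter's confidence sets obtained by test inversion.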
219

Statistical properties of barycenters in the Wasserstein space and fast algorithms for optimal transport of measures / Propriétés statistiques du barycentre dans l’espace de Wasserstein

Cazelles, Elsa 21 September 2018 (has links)
Cette thèse se concentre sur l'analyse de données présentées sous forme de mesures de probabilité sur R^d. L'objectif est alors de fournir une meilleure compréhension des outils statistiques usuels sur cet espace muni de la distance de Wasserstein. Une première notion naturelle est l'analyse statistique d'ordre un, consistant en l'étude de la moyenne de Fréchet (ou barycentre). En particulier, nous nous concentrons sur le cas de données (ou observations) discrètes échantillonnées à partir de mesures de probabilité absolument continues (a.c.) par rapport à la mesure de Lebesgue. Nous introduisons ainsi un estimateur du barycentre de mesures aléatoires, pénalisé par une fonction convexe, permettant ainsi d'imposer son a.c. Un autre estimateur est régularisé par l'ajout d'entropie lors du calcul de la distance de Wasserstein. Nous nous intéressons notamment au contrôle de la variance de ces estimateurs. Grâce à ces résultats, le principe de Goldenshluger et Lepski nous permet d'obtenir une calibration automatique des paramètres de régularisation. Nous appliquons ensuite ce travail au recalage de densités multivariées, notamment pour des données de cytométrie de flux. Nous proposons également un test d'adéquation de lois capable de comparer deux distributions multivariées, efficacement en terme de temps de calcul. Enfin, nous exécutons une analyse statistique d'ordre deux dans le but d'extraire les tendances géométriques globales d'un jeu de donnée, c'est-à-dire les principaux modes de variations. Pour cela nous proposons un algorithme permettant d'effectuer une analyse en composantes principales géodésiques dans l'espace de Wasserstein. / This thesis focuses on the analysis of data in the form of probability measures on R^d. The aim is to provide a better understanding of the usual statistical tools on this space endowed with the Wasserstein distance. 
The first-order statistical analysis is a natural notion to consider, consisting of the study of the Fréchet mean (or barycenter). In particular, we focus on the case of discrete data (or observations) sampled from probability measures that are absolutely continuous (a.c.) with respect to the Lebesgue measure. We thus introduce an estimator of the barycenter of random measures, penalized by a convex function, making it possible to enforce its absolute continuity. Another estimator is regularized by adding entropy when computing the Wasserstein distance. We are particularly interested in controlling the variance of these estimators. Thanks to these results, the principle of Goldenshluger and Lepski allows us to obtain an automatic calibration of the regularization parameters. We then apply this work to the registration of multivariate densities, in particular for flow cytometry data. We also propose a goodness-of-fit test statistic able to compare two multivariate distributions efficiently in terms of computational time. Finally, we perform a second-order statistical analysis to extract the global geometric tendencies of a dataset, that is, its main modes of variation. For that purpose, we propose an algorithm for carrying out geodesic principal component analysis in the Wasserstein space.
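In one dimension the Wasserstein barycenter has a closed form, since it amounts to averaging quantile functions; for empirical measures with equal sample sizes this reduces to averaging sorted samples. The sketch below illustrates only this elementary special case, not the penalized or entropy-regularized estimators of the thesis.

```python
def barycenter_1d(samples_list):
    """Wasserstein barycenter of 1-D empirical measures with equal sample
    sizes: average the sorted samples (i.e., average the quantile functions)."""
    sorted_samples = [sorted(s) for s in samples_list]
    n = len(sorted_samples[0])
    k = len(sorted_samples)
    return [sum(s[i] for s in sorted_samples) / k for i in range(n)]

a = [0.0, 1.0, 2.0, 3.0]
b = [10.0, 11.0, 12.0, 13.0]
print(barycenter_1d([a, b]))  # [5.0, 6.0, 7.0, 8.0]
```

Note how the barycenter sits "midway" in shape rather than being a mixture of the two samples; this displacement behaviour is what makes the Wasserstein mean attractive for density registration. In higher dimensions no such closed form exists, which is where the thesis's regularized estimators come in.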
220

Die kerk en die sorggewers van VIGS-weeskinders / The church and the caregivers of AIDS orphans

Strydom, Marina 01 January 2002 (has links)
Text in Afrikaans / Because of the demanding nature of caregiving to AIDS orphans, the caregivers often find themselves in a position where they themselves need care and support. The question arose as to how these caregivers can be supported. It became clear that the church, from its social responsibility, can offer care and support to the caregivers. Caregivers at one institution that took part in the research journey indeed did not receive enough care and support from the church. This lack of support has a direct influence on how the caregivers handle the demands of caregiving. Caregivers at the other two participating institutions receive enough support from church members, and this makes a great difference to how caregiving stress is experienced. In this study, the ways in which the church is involved, and can become further involved, with the caregivers of AIDS orphans were examined critically. / Philosophy, Practical and Systematic Theology / M.Th. (Praktiese Teologie)
