
Value at Risk in international finance: evaluation of model performance in developed and emerging countries

Luiz Eduardo Gaio, 01 April 2015
Given the requirements stipulated by regulatory bodies and international accords, and in light of the numerous financial crises of recent decades, financial institutions have developed a variety of tools to measure and control the risk inherent in their business. Despite the steady evolution of risk-measurement methodologies, Value at Risk (VaR) has become the reference tool for estimating market risk. In recent years, new techniques for computing VaR have been developed, yet none has emerged as the best fit across different markets and periods; the literature offers no single model that is both concise and consistent with the diversity of markets. This work therefore evaluates the market-risk estimates produced by VaR-based models applied to the main stock indices of developed and emerging countries, in both normal and crisis periods, in order to identify the most effective estimators. The study considers unconditional VaR, using the traditional models (Historical Simulation, Delta-Normal, and Student's t) and models based on Extreme Value Theory; conditional VaR, comparing the ARCH family of models with RiskMetrics; and multivariate VaR, using bivariate GARCH models (VECH, BEKK, and CCC), copula functions (Student's t, Clayton, Frank, and Gumbel), and Artificial Neural Networks. The data set consists of daily returns of the main stock indices of developed countries (Germany, the United States, France, the United Kingdom, and Japan) and emerging countries (Brazil, Russia, India, China, and South Africa) from 1995 to 2013, covering the crises of 1997 and 2008.

The results diverged somewhat from the premises established by the research hypotheses. Across more than a thousand fitted models, the conditional models outperformed the unconditional ones in most cases; in particular, the GARCH(1,1) model, a standard in the literature, produced adequate fits in 93% of cases. For the multivariate analysis, no single model was clearly superior: the VECH, BEKK, and Clayton-copula models performed similarly, with good fits in 100% of the tests. Contrary to expectations, no significant differences emerged between the fits for developed versus emerging countries, or between crisis and normal periods. The study suggests that the models commonly used by financial institutions, although recommended by renowned bodies, are not those that best estimate market risk. A deeper analysis of estimator performance, using simulations on each institution's actual portfolios, remains warranted.
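The thesis itself fits over a thousand models; purely as a rough illustration of three of the estimator families named above, the sketch below computes Historical Simulation VaR, Delta-Normal VaR, and a one-step-ahead GARCH(1,1) VaR. The return series and the GARCH parameters are toy values chosen for illustration, not the author's data or estimates.

```python
import statistics

def historical_var(returns, alpha=0.99):
    """Unconditional VaR by Historical Simulation: the empirical
    alpha-quantile of the loss (negative return) distribution."""
    losses = sorted(-r for r in returns)
    idx = min(int(alpha * len(losses)), len(losses) - 1)
    return losses[idx]

def delta_normal_var(returns, alpha=0.99):
    """Delta-Normal VaR: fits a Gaussian to the returns and reads
    the loss quantile off the fitted distribution."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    z = statistics.NormalDist().inv_cdf(alpha)
    return z * sigma - mu

def garch11_var(returns, omega=1e-6, a=0.1, b=0.85, alpha=0.99):
    """Conditional VaR from a GARCH(1,1) variance recursion with
    illustrative (not estimated) parameters:
        sigma2_t = omega + a * r_{t-1}**2 + b * sigma2_{t-1}
    The one-step-ahead VaR scales the volatility forecast by the
    normal quantile."""
    sigma2 = statistics.variance(returns)  # initialise at sample variance
    for r in returns:
        sigma2 = omega + a * r * r + b * sigma2
    z = statistics.NormalDist().inv_cdf(alpha)
    return z * sigma2 ** 0.5

# toy daily returns, for illustration only
rets = [0.010, -0.020, 0.005, -0.015, 0.003, -0.030, 0.012, -0.007,
        0.008, -0.011, 0.002, -0.004, 0.009, -0.018, 0.006, -0.001]
print(historical_var(rets, 0.95))   # here: the worst observed loss, 0.03
print(delta_normal_var(rets, 0.95))
print(garch11_var(rets))
```

The conditional estimator reacts to recent volatility through the variance recursion, which is why models of this family dominated the unconditional ones in the backtests reported above.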

Innovative Tessellation Algorithm for Generating More Uniform Temperature Distribution in the Powder-bed Fusion Process

Ehsan Maleki Pour, Purdue School of Engineering and Technology, Indianapolis, 16 January 2019
Powder Bed Fusion additive manufacturing enables the fabrication of metal parts with complex geometry and elaborate internal features, simplifies the assembly process, and reduces development time. However, the lack of consistent quality hinders its tremendous potential for widespread application in industry, limiting its viability as a manufacturing process particularly in the aerospace and medical industries, where high quality and repeatability are critical. A variety of defects, which may be initiated during the powder-bed fusion process, compromise the repeatability, precision, and resulting mechanical properties of the final part. The literature shows that a non-uniform temperature distribution across fabricated layers is a significant source of the majority of thermal defects. This work therefore introduces an online thermography methodology to study the temperature distribution, thermal evolution, and thermal characteristics of fabricated layers in the powder-bed fusion process, or any other inherently thermal AM process. The methodology combines infrared imaging with segmentation-based image processing to extract the temperature distribution and heat-affected zones (HAZs) of the layer under fabrication. Preliminary experiments in the FDM process were conducted to exercise the thermography technique and to gain the insight needed to propose a method for generating a more uniform temperature distribution. These experiments led to an innovative chessboard scanning strategy, the tessellation algorithm, which generates a more uniform temperature distribution and consequently reduces layer warpage, especially in layers with more complex geometry or relatively long dimensions. Next, this work develops a new technique in ABAQUS to verify the proposed scanning strategy.

This technique simulates the temperature distribution across a layer printed with chessboard printing patterns in the powder-bed fusion process, in a fraction of the time taken by current methods in the literature. It compares the temperature distribution across a designed layer printed with three chessboard scanning patterns: a rastering pattern, a helical pattern, and the tessellation pattern. The results confirm that the tessellation pattern generates a more uniform temperature distribution than the other two. Further research is in progress to use the thermography methodology to validate the simulation technique, and to pursue a hybrid closed-loop online monitoring and control methodology, based on the tessellation algorithm and online thermography introduced in this work together with Artificial Neural Networks (ANN), to generate the most uniform temperature distribution possible within a safe temperature range, layer by layer.
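The tessellation algorithm itself is not reproduced in this abstract. As a hypothetical sketch of the general chessboard idea it builds on, the toy function below orders the square "islands" of a layer so that no two consecutively scanned islands share an edge, spreading the heat input rather than concentrating it. The grid size and ordering rule here are assumptions for illustration, not the thesis's actual algorithm.

```python
def chessboard_order(n_rows, n_cols):
    """Toy chessboard scanning order: scan all islands of one
    checkerboard parity first, then the other. Two cells of the same
    parity are always an even Manhattan distance apart, so consecutive
    islands in the schedule never share an edge and local heat
    build-up is reduced."""
    cells = [(r, c) for r in range(n_rows) for c in range(n_cols)]
    evens = [cell for cell in cells if sum(cell) % 2 == 0]
    odds = [cell for cell in cells if sum(cell) % 2 == 1]
    return evens + odds

order = chessboard_order(4, 5)
for (r1, c1), (r2, c2) in zip(order, order[1:]):
    # verify: no consecutive pair of islands is edge-adjacent
    assert abs(r1 - r2) + abs(c1 - c2) != 1
```

A raster pattern, by contrast, scans neighbouring islands back to back, which is one intuition for why the simulations above find it produces a less uniform temperature field.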
