  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Mise en oeuvre d'un système normalisé optimisé par les démarches du Lean Management / How to optimize an organization system (ISO Standards) by the Lean Management methods

Bacoup, Pascal 13 October 2016
The research presented in this PhD thesis looks at an approach that attempts to merge Lean Management with the implementation of ISO-type certification. This method, which we have called "Lean Normalization", proposes a new way of approaching the implementation of ISO standards in a company, using six steps inspired by Lean Management concepts. After presenting a state of the art of ISO, its history and the successive versions of the standard, we analyze the different approaches and definitions of Lean Management. The combined bibliographic study of ISO and Lean enabled us to identify the following problem: existing attempts to merge the two approaches do not lead to any formal model bringing ISO and Lean Management into synergy. We therefore propose an innovative approach that we call "Lean Normalization". It is composed of six steps: Documentary Muda, Right Documents, Design of the Continuous Improvement Process, Due Quality, Visual Management, and Animation of the Quality Management System, deployed through a project approach of ten milestones. Applying this approach in several companies highlighted the ineffectiveness of conventional certification audits; therefore, in the last chapter, we propose a different vision of the audit, based on the Organizational Stress Test.
12

Recherches sur les coûts et les bénéfices de la nouvelle régulation bancaire. Applications au cas européen / Essays on the costs and the benefits of the new regulatory framework : An application to European banks

Toader, Oana 05 July 2016
The trade-off between financial stability and the efficiency of banking systems has always been a key issue in defining prudential regulation. This thesis analyses how this trade-off is conceived and the extent to which it allows the two objectives, namely the costs and the benefits of the new regulatory framework, to be reconciled. We first focus on the impact of the new capital and liquidity requirements on the cost of capital and on banking activities. Our findings indicate that, overall, the cost of the recent reform is relatively low and does not have a significant impact on lending. We also highlight a differentiated effect according to banks' size, systemic importance and business model. The existence of various distortions affecting the pricing of risk motivates the second part of the thesis, which is dedicated to the analysis of implicit guarantees. We also assess the impact of resolution regimes and practices in ending the too-big-to-fail anomaly. Although ambitious measures have been undertaken, there is still a way to go to eliminate these distortions. In the third part, we examine the contribution of solvency and liquidity requirements to strengthening the resilience of banks. The results indicate that setting up good incentives, through adequate prudential standards, could efficiently reduce financial risks (default probability, systemic risk, capital shortfall under an adverse scenario). The approach adopted in this thesis focuses on microeconomic aspects and is based on empirical studies applied to a sample of European banks.
13

Trois essais sur les mesures et déterminants du risque systémique / Three essays on the measures and determinants of systemic risk

El Amraoui, Sonia 08 November 2018
Systemic risk is a risk that can compromise the survival of the financial system; it refers to the spread of a single bank failure to other banks. What are the measures and determinants of systemic risk? This thesis investigates this transversal question in three chapters. The first chapter gives an overview of the various measures of systemic risk, identifies their commonalities and differences, and specifies the interest of each measure; the issue addressed is the correlation between stress test results and the various measures of systemic risk. The second chapter studies the concept of Asset Commonality as a new measure of systemic risk. The third chapter examines the relationship between the different measures of systemic risk and corporate social responsibility. The empirical results show that (1) stress test results should be supplemented by an evaluation of the systemic risk measures, (2) Asset Commonality could be considered as a complementary tool to assess systemic risk, and (3) the corporate social responsibility of financial institutions is important in order to reduce systemic risk.
14

Graded Exercise Stress Testing: Treadmill Protocols Comparison Of Peak Exercise Times In Cardiac Patients

Salameh, Ahlam 05 October 2009
No description available.
15

Bank capitalization and credit rating assessment : Evidence from the EBA stress test

Dimitrova, Evgenia January 2016
Banks face market pressure when determining their capital structures because they are subject to strict regulation, and CFOs are willing to adjust their companies' capital structures in order to obtain higher ratings. Credit ratings are highly valuable not only because they assess the creditworthiness of borrowers but also because rating agencies benefit from information asymmetry, having access to data that companies might not disclose publicly. The industry also gained much interest after the BIS proposals of 1999 and 2001 that the Basel Committee on Banking Supervision consider borrowers' credit ratings when examining banks' solvency and adequacy. Factors used to determine credit ratings include a bank's asset quality, which is a fundamental measure of creditworthiness; its capital, which relates asset quality to risk-weighted assets (RWA); its profitability; and liquidity measurements. The purpose of this paper is to investigate whether banks that keep excess equity to balance sheet receive better credit ratings, given the predictors capital, bank size and defaulted-to-total exposures. The European Banking Authority (EBA) stress test results are used as a benchmark for determining banks' capital adequacy and solvency, and the credit ratings are those issued shortly after the publication of the EBA's reports. The sample comprises 73 and 95 banks for the years 2011 and 2014, respectively. The multivariate ordinal regression does not show a significant relationship between excess equity to balance sheet and credit ratings, even though the estimated coefficient is negative, i.e. excess equity is associated with lower credit ratings. One explanation may lie in low-quality capital within banks' capital bases; also, banks that plan to implement riskier projects, or currently hold riskier assets, are subject to higher capital requirements. Moreover, banks currently rated low but with the potential to be upgraded would be more willing to issue equity than debt, in order to avoid the corresponding risk and achieve the higher rating. The equity ratio and the defaulted-to-total exposures ratio show significant correlation with banks' credit ratings. Overall, since the regression results are insignificant, we have no reason to believe that holding excess equity is not beneficial for banks: when banks change their leverage ratios they carry either the cost of being downgraded or the cost of issuing more equity, so in the end they balance the leverage ratio close to the optimum and keep as much capital as regulation requires.
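As a hedged illustration of the kind of rating/capital association examined in this abstract (the thesis itself fits a multivariate ordinal regression; the data below and the use of a simple Spearman rank correlation are invented stand-ins, not the author's method), one can check whether equity ratios move monotonically with numerically coded credit ratings:

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented example: equity ratios vs. ratings coded 1 (worst) .. 5 (best)
rho = spearman([0.04, 0.06, 0.08, 0.10], [2, 3, 3, 5])
```

Here `rho` comes out near +1 (about 0.95 for these made-up numbers), i.e. a monotone positive association; the thesis's ordinal regression asks a richer version of the same question while controlling for other predictors.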
16

Modelos para teste de estresse do sistema financeiro no Brasil / Stress test models for the Brazilian financial system

Zaniboni, Natália Cordeiro 05 June 2018
The literature on stress testing the financial system has been growing substantially in recent years, owing to the importance of these exercises, highlighted by the subprime financial crisis, the sequence of bank failures in many countries and the Brazilian economic crisis. This thesis proposes a stress test methodology, focused on credit risk, for the Brazilian financial system. After the scope definition, the second step of a stress test is the identification of the financial system's vulnerabilities, in which the relationship between macroeconomic factors and credit defaults is captured. Most papers use a limited set of macroeconomic factors; this work proposes the use of more than 300 variables and a factor analysis to obtain macroeconomic factors that consider a more comprehensive set of variables in an ARIMAX model. In addition, the literature commonly employs panel data models, VAR, time series or linear regression models. However, a change in one variable rarely affects another instantaneously, because the effect is distributed over time. This work therefore proposes the polynomial distributed lag model, which captures this effect by constraining the lagged parameters to follow a second-degree polynomial. The models were built using March 2007 to August 2016 as the modeling period and September 2016 to August 2017 as an out-of-time validation period. Over the validation period, the proposed models presented the smallest sum of squared errors. The third step is the calibration of an adverse yet plausible stress scenario, which can be obtained by historical, hypothetical or probabilistic methods. We note a gap in the Brazilian literature, filled in this work: no hypothetical or historical scenarios covering all of the 2002 crisis, the 2008 subprime crisis and the 2015/2017 crisis had been proposed for Brazil. Historical shocks were found to generate more severe values than hypothetical ones, and some variables are more sensitive to particular types of economic crises. When the resulting scenario was applied to the institutions, the estimated default rate under stress was 6.38%, an increase of 68% over the base scenario. This increase was similar to, though somewhat more severe than, the shocks obtained in the Brazilian literature and in the Financial Stability Report produced by the Central Bank of Brazil, which estimates that the banking system is prepared to absorb a macroeconomic stress scenario.
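The second-degree polynomial distributed lag (Almon lag) idea described above can be sketched as follows: constraining the lag weights to β_i = a0 + a1·i + a2·i² collapses a regression on L+1 lags of x into a regression on just three transformed regressors. This is a generic illustration with invented data and lag length, not the thesis's actual specification:

```python
def almon_z(x, L):
    """Transform lags 0..L of x into the 3 Almon regressors (degree-2 polynomial)."""
    rows = []
    for t in range(L, len(x)):
        lags = [x[t - i] for i in range(L + 1)]
        rows.append([sum(lags),
                     sum(i * v for i, v in enumerate(lags)),
                     sum(i * i * v for i, v in enumerate(lags))])
    return rows

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_pdl(x, y, L):
    """OLS on the Almon regressors; returns the implied lag weights beta_0..beta_L."""
    Z, Y = almon_z(x, L), y[L:]
    A = [[sum(r[i] * r[j] for r in Z) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * yi for r, yi in zip(Z, Y)) for i in range(3)]
    a0, a1, a2 = solve3(A, b)  # normal equations Z'Z a = Z'y
    return [a0 + a1 * i + a2 * i * i for i in range(L + 1)]
```

With noise-free data generated from weights that are exactly quadratic in the lag index, the fit recovers those weights up to floating-point error, which is the defining property of the constraint.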
17

Sistema para testes de stress em uma carteira de opções de moedas / System for stress test in an FX option portfolio

Mello, Moreno Siqueira e 17 October 2017
In financial resource management, visualizing and managing the risks inherent in an investment portfolio in real time is crucial to ensuring that the objective of generating profit is met, or at least that losses are mitigated. One way to perform this kind of management is to submit the portfolio to scenario simulations, in which factors that might affect the assets held in the portfolio are stressed. Depending on the class of these assets, a more sophisticated tool is needed, capable of handling complex pricing models. The main purpose of this work is to meet a real demand of the investment management company where the author works: the development of a tool capable of performing stress tests on an investment portfolio containing, specifically, foreign exchange (FX) options. An Excel add-in was developed with which managers can define scenarios with the desired bumps and, combined with real-time market data, analyze the impact of those bumps on the portfolio. Development proceeded in stages, and the current version of the tool brought execution-time gains on the order of tenfold compared with the previous version. This dissertation presents the system's implementation details as well as the theoretical basis used in its development, including a brief overview of the FX market and its instruments (FX options among them) and a model for pricing and risk measurement of these instruments.
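The kind of scenario repricing such a tool performs can be sketched as below. This is a minimal illustration assuming Garman-Kohlhagen pricing for European FX calls; the parameters, scenario names and flat-volatility treatment are invented for the sketch, and the dissertation's actual system is considerably richer:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gk_call(S, K, T, vol, rd, rf):
    """Garman-Kohlhagen price of a European FX call.
    S: spot, K: strike, T: years, rd/rf: domestic/foreign rates."""
    d1 = (math.log(S / K) + (rd - rf + 0.5 * vol * vol) * T) / (vol * math.sqrt(T))
    d2 = d1 - vol * math.sqrt(T)
    return S * math.exp(-rf * T) * norm_cdf(d1) - K * math.exp(-rd * T) * norm_cdf(d2)

def stress_pnl(position, scenarios):
    """Reprice one long-call position under (spot bump, vol bump) scenarios.
    Returns P&L per scenario relative to the base valuation."""
    S, K, T, vol, rd, rf, notional = position
    base = gk_call(S, K, T, vol, rd, rf) * notional
    return {name: gk_call(S * (1 + ds), K, T, vol + dv, rd, rf) * notional - base
            for name, (ds, dv) in scenarios.items()}

# Invented example position and scenarios (EURUSD-like numbers)
pnl = stress_pnl((1.10, 1.10, 0.5, 0.10, 0.02, 0.01, 1_000_000),
                 {"spot -10%": (-0.10, 0.0), "vol +5pts": (0.0, 0.05)})
```

For a long call the spot-down scenario produces a negative P&L and the vol-up scenario a positive one, which is the sanity check a risk manager would apply to such a tool.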
18

Teste de caminhada de seis minutos como preditor de morbidade e mortalidade cardiovascular em pacientes após infarto agudo do miocárdio / The six-minute walk test as a predictor of cardiovascular morbidity and mortality in patients after acute myocardial infarction

Umeda, Iracema Ioco Kikuchi 11 December 2014
Background: The six-minute walk test (6MWT) is widely used to assess the prognosis of patients with heart failure and chronic obstructive pulmonary disease and the health status of the elderly. However, there are few reports in the scientific literature on its use as a tool to assess prognosis after acute myocardial infarction (AMI). The aim of this study was to assess the prognostic value of the 6MWT in AMI patients, and to determine whether there is a minimum 6MWT distance defining a group with worse prognosis, i.e., occurrence of acute coronary syndrome, heart failure, re-hospitalization or death from cardiovascular causes. Methods: This is an observational study using medical records, telephone contact, mail and SIM (the Mortality Information System of the Department of Health) of patients with uncomplicated AMI who underwent the 6MWT before hospital discharge. Observed outcomes: acute coronary syndrome, heart failure, stroke, re-hospitalization and cardiovascular death. Data collection took place at the Dante Pazzanese Institute of Cardiology through analysis of medical records. Statistical analysis: Pearson or Spearman correlation, Student's t test or the Mann-Whitney test, and ANOVA or the Kruskal-Wallis test were used to analyze the effects of the patients' physical and clinical characteristics on 6MWT distance. These characteristics and the distance walked were evaluated against the outcomes over time using the Kaplan-Meier survival curve or Cox mean survival; the significance of the effects was tested by the log-rank test or the Cox proportional hazards model, respectively. A final Cox survival model was also fitted to assess the joint effect of all covariates on the outcomes, with forward variable selection used in the multivariate analysis to pick the variables most associated with survival. The size of significant effects was measured by the odds ratio (OR). Results: We included 234 patients: 173 (73.9 per cent) male, aged 57.18 (10.35) years, 103 (44 per cent) with anterior AMI, 182 (77.8 per cent) Killip I, 190 (81.2 per cent) given reperfusion therapy, and left ventricular ejection fraction of 49.99 (10.14) per cent. We observed 89 (38.03 per cent) patients with at least one adverse outcome, including 18 (8.1 per cent) cardiovascular deaths, over a mean follow-up of 1,355.47 (777.53) days. There was no association between 6MWT distance and the combined endpoints, but distance was associated with death, yielding two models: a) first-quartile distance (370.5 m) (OR = 2.737, p = 0.046), Borg scale of perceived exertion (SPE) (OR = 1.380, p = 0.020) and peripheral oxygen saturation (SpO2) < 90 per cent (OR = 2.326, p = 0.103); b) log-rank cut-off distance (232 m) (OR = 3.459, p = 0.036), Borg SPE (OR = 1.351, p = 0.044) and SpO2 < 90 per cent (OR = 2.936, p = 0.030). Distance and SpO2 were also associated with worse survival over time: model 1) Borg SPE (OR = 1.334, p = 0.041), SpO2 < 90 per cent (OR = 2.675, p = 0.067) and distance of 370.5 m (OR = 2.882, p = 0.042); model 2) SpO2 < 90 per cent (OR = 4.193, p = 0.004) and distance of 232 m (OR = 5.014, p = 0.005). Comparing the death and survival groups over the course of the 6MWT (HR, SpO2, systolic and diastolic pressure), significant differences were observed only in HR (p < 0.0001) and SpO2 (p < 0.0001). Conclusion: In the sample studied, 6MWT distance and SpO2 < 90 per cent were associated with death and worse survival in patients after uncomplicated AMI.
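The Kaplan-Meier survival estimate used in the analysis above can be sketched in a few lines. This is a minimal illustration with invented follow-up data; the study itself used standard statistical software:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: follow-up durations; events: True = event observed, False = censored.
    Returns [(event_time, S(t))] at each time where an event occurred."""
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    for t in sorted(set(times)):
        deaths = sum(1 for tt, e in zip(times, events) if tt == t and e)
        leaving = sum(1 for tt in times if tt == t)  # events + censorings at t
        if deaths:
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= leaving
    return curve

# Invented toy cohort: deaths at t=1, 3, 5; censorings at t=2, 4
curve = kaplan_meier([1, 2, 3, 4, 5], [True, False, True, False, True])
```

For this toy cohort the survival function steps to 0.8 at t=1, to 0.8·(2/3) at t=3 (the censoring at t=2 only shrinks the risk set), and to 0 at t=5; the log-rank test and Cox model in the study compare such curves between groups.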
19

An investigation of attentional bias in test anxiety

Buck, Robert January 2018
Test anxiety is an individual personality trait that results in elevated state anxiety in situations of performance evaluation. For school-age children, such evaluation is most frequently experienced in the high-stakes examinations that occur at the culmination of programmes of study. Alongside its impact on an individual's wellbeing, heightened test anxiety has been reliably linked to deficits in performance on examinations and assessments. Attentional bias has been shown to be an aspect of many forms of anxiety and is considered to have a role in the maintenance of state anxiety, though the mechanisms underlying this are not fully clear. Attentional Control Theory (Eysenck, Derakshan, Santos, & Calvo, 2007) implicates preferential allocation of attention to threat in its explanation of the performance deficits associated with test anxiety. The presence of attentional bias in test anxiety appears theoretically plausible and has some empirical support (e.g. Putwain, Langdale, Woods and Nicholson, 2011); however, its reliability is in question. This study aims to investigate the presence of attentional bias in test anxiety, with a view to further understanding its underlying mechanisms and informing the development of interventions to ameliorate its effects. To ensure ecological validity, the study was conducted in schools and colleges, with a sample of 16- to 18-year-olds following high-stakes programmes of study. Full investigation of test anxiety requires individuals to experience heightened state anxiety through performance evaluation threat; hence, the Trier Social Stress Test (TSST) was modified to make it applicable to this context and population. The study was conducted in two experimental phases, both of which adopted a mixed-methods approach to provide quantitative and qualitative data. The preliminary phase evaluated the materials and anxiety-manipulation protocols. The main phase employed the modified TSST in combination with a dot-probe task to investigate participants' attentional bias under high performance evaluation threat. No patterns of attentional bias were uncovered indicating a consistent relationship with either trait test anxiety or attentional control. However, there was some congruence between how individuals described themselves in evaluative situations and the attentional bias they displayed. Further investigation employing mixed-methods approaches such as Single Case Experimental Design is recommended to identify and address attentional bias in test anxiety.
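In the dot-probe paradigm mentioned above, attentional bias is typically scored from reaction times: probes replacing the threat stimulus (congruent trials) should be detected faster by threat-vigilant participants. A toy illustration with invented reaction times (the study's actual scoring follows the dot-probe literature and is not reproduced here):

```python
from statistics import mean

def attentional_bias(trials):
    """Bias index in ms from dot-probe trials.
    trials: list of (rt_ms, congruent) where congruent=True means the probe
    appeared at the threat word's location.
    Positive index = vigilance toward threat; negative = avoidance."""
    congruent = [rt for rt, c in trials if c]
    incongruent = [rt for rt, c in trials if not c]
    return mean(incongruent) - mean(congruent)

# Invented RTs: faster on congruent trials, suggesting vigilance toward threat
bias = attentional_bias([(500, True), (520, True), (540, False), (560, False)])
```

For these made-up trials the index is +40 ms; in practice, outlier RTs and error trials are filtered before averaging, a step this sketch omits.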
20

Characterization and Correlation Analysis of Pharmaceutical Gelatin

Felix, Pascal Georges 18 November 2003
The properties of the aged gel and subsequent softgels were examined using mechanical and chemical testing methods. Our hypothesis was that negligible variation would exist between aged gels of the same type, with the greater difference expected between the two types of gel, described as 150 Bloom (alkaline-treated collagen) and 195 Bloom (acid-treated collagen). The types of gelatin used were acid-processed (195 Acid Bone) and alkaline-processed (150 Lime Bone). Because of the differences arising from their manufacturing sequence (namely their molecular weights), it follows that physical attributes further contribute to their distinction. In addition to observing the different characteristics of the two gel types, we aged the gelatin and produced softgel capsules to qualify and quantify the changes that occur as a function of time. Two production lots of over 1 million softgel capsules were produced to yield a population amenable to statistical analysis. Softgel capsules were manufactured with gelatin aged for 0-8 hrs, 32-40 hrs, 66-72 hrs and 88-96 hrs. This strategy was applied to both the acid- and alkaline-treated gelatin, for a total of eight lots (4 acid and 4 alkaline). One hundred thousand softgels were manufactured per lot for the acid-processed gelatin, and one hundred and fifty thousand per lot for the alkaline-processed gelatin. The results of the different tests showed trends that were not solely a function of time. Gel extensibility for both gel types showed a decrease, over time, in the force needed to rupture the gelatin ribbon, while the resilience of the tested ribbon remained constant throughout the aging process. Burst strength was the only test showing an inverse relationship between the two gel types: the force needed to rupture the 150 Bloom softgels decreased with time, whereas the force needed to rupture the 195 Bloom softgels increased with time. Rheological behavior is described in the literature as being associated with molecular weight distribution; this association was seen in our research, and both the rheological and the molecular weight results decreased over the aging process.
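The correlation analysis named in this thesis's title can be illustrated with a plain Pearson coefficient relating aging time to a measured property. A generic sketch with invented aging/burst-strength numbers, not the thesis's data:

```python
def pearson(x, y):
    """Pearson product-moment correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented: midpoints of the aging intervals (hrs) vs. hypothetical rupture force
r = pearson([4, 36, 69, 92], [9.0, 8.1, 7.4, 6.6])
```

For these made-up numbers `r` is strongly negative, the pattern reported for the 150 Bloom softgels; the 195 Bloom softgels would show a positive coefficient against the same time axis.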
