About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

Conceito de embarcações adaptadas à via aplicado à navegação fluvial no Brasil / Waterway-adapted ship concept applied to Brazilian inland navigation

Carlos Daher Padovezi 03 October 2003 (has links)
A design procedure is proposed for river push-tow convoys adapted to the actual conditions of the waterways they serve, motivated by a broader view of the need to reduce transport costs while maintaining adequate levels of safety and environmental protection. The model was shaped by an analysis of the technical interrelations between waterways and vessels, and of the constraints and implications of the waterway-adapted vessel concept. It is structured in modules, each reproducing one of the most important factors influencing the efficiency, safety and environmental interference of cargo transport by push-tow convoys. A computer program was developed to apply the model, consolidating the proposed procedures for selecting the best design and operating alternatives. Full-scale trials and towing-tank model tests were used to validate the adopted procedures, and convoy accident data from waterways around the world served as the basis for risk assessment. The model was applied to soybean transport on the Tietê-Paraná waterway and the Araguaia River, illustrating how design alternatives are analysed and selected. The results confirm the value of a more comprehensive approach to the design of river push-tow convoys.
52

The quality of selected food products containing nanosilica additive (E551) in South Africa

Thakur, Rookmoney January 2017 (has links)
Submitted in fulfilment of the requirements for the Degree of Master of Philosophy: Quality Management, Durban University of Technology, 2017. / The proliferation of nanotechnology, whilst perceived to be positive for human advancement, introduces potential risks when applied to food. Silicon dioxide (E551), a common food additive made up of particles in the nano range, is found in spices, salt, sweets and some frozen foods, where it functions as an anti-caking agent that allows these products to flow and mix evenly. According to the Codex Alimentarius, E551 is generally recognised as safe (GRAS), provided that food manufacturers apply good manufacturing practice (GMP) principles and use the lowest amounts necessary. Smaller nanoparticles are more readily taken up by the human body than larger particles and could be injurious to human health. While the use of E551 is strictly regulated in some countries, there is growing debate regarding the health and safety implications for consumers and the quality of food. This study examined the quality of selected food products containing E551 (nanosilica) in South Africa (SA). A mixed-method paradigm (qualitative and quantitative) and an experimental research strategy were adopted. Respondents were purposively selected, their participation was voluntary and confidentiality was maintained. Pilot studies were conducted for the semi-structured interviews and the survey, with sample sizes of one food expert and three food technologists, respectively. The main study consisted of interviews, a survey and experimental work. The interviews, conducted with five food experts, were recorded and transcribed to ensure credibility, and the results were interpreted and analysed against the existing literature using thematic content analysis.
The findings suggest that it is critical for food manufacturers to demonstrate the safe use of products without posing safety risks to the consumer or the environment, and for the South African government to address and regulate the application of nanomaterials in food through legislation or guidelines. The survey was conducted with a sample of thirty food technologists, who reported that public awareness of nanotechnology is limited, as many consumers are not familiar with the technology. Descriptive and inferential statistics were used to analyse the quantitative data. Content validity ensured that the survey focused on concepts and constructs that emerged from the literature on the application of nanotechnology in food products. Cronbach's alpha was used to assess the reliability of the surveys, giving α = 0.862 and α = 0.809 for food additive awareness and nanosilica safety in food, respectively. Characterisation methods such as Fourier transform infrared spectroscopy (FT-IR), energy-dispersive X-ray spectroscopy (EDX) and X-ray diffraction (XRD) were used to determine the type, form and levels of silica in selected food brands available in SA, which were compared against similar products manufactured and packed in the European Union (EU) and Asia. The study benchmarked against the EU because of its more stringent guidelines and regulations in the field of nanotechnology. The results indicate that while the comparative EU food sample conformed to the European Food Safety Authority (EFSA) permissible level of 1%, the levels in the South African samples were higher. Even though the regulatory standards differ between the two jurisdictions, the potential health effects remain the same.
Significantly, the most prominent finding of this study is that the silica in some of the South African and Asian products was crystalline in nature, rather than the synthetic amorphous silica (SAS) that is indicative of E551. It therefore stands to reason that the generalised limit set by the Codex Alimentarius is inadequate to regulate and control the quantity and type of E551 used, as it varied across the selected samples. The identification of traces of crystalline silica is of concern, since studies in the literature show that exposure to and ingestion of crystalline silica that is not food grade is likely to induce serious health effects such as cancer and fibrosis in humans. In light of this finding, it is imperative that specific limits and regulations be put in place and enforced in SA to ensure that products sold meet acceptable standards, as in developed markets such as the United States of America (US) and the EU. To ensure proper monitoring and minimal risk exposure, a risk management framework, the 'Hazard identification, Assess the risks, Control the risks' (HAC) model, was developed and recommended to ensure that the correct form, type and limits of silica are used and the associated risk controls applied.
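For reference, the Cronbach's alpha values reported above (α = 0.862 and α = 0.809) are instances of the standard internal-consistency coefficient, which for a k-item scale is

```latex
\alpha \;=\; \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where \(\sigma^{2}_{Y_i}\) is the variance of item \(i\) and \(\sigma^{2}_{X}\) the variance of the total score; values above roughly 0.8, as found here, are conventionally taken to indicate good reliability.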
53

Semiparametrický model aditivního rizika / Semiparametric additive risk model

Zavřelová, Adéla January 2020 (has links)
The Cox proportional hazards model is often used to estimate the effect of covariates on the hazard for censored event times. In this thesis we study semiparametric additive risk models for censored data, in which the hazard is given as the sum of an unknown baseline hazard function and a linear combination of covariates and coefficients. A general additive-multiplicative model is also considered, in which the effect of a covariate can be multiplicative, additive, or both at the same time. We focus on determining the effect of a covariate in the general model, which can be used to test for a multiplicative or additive effect of a covariate on the hazard.
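As a sketch of the three model classes referred to above (the notation here is generic, not necessarily the thesis's own):

```latex
\begin{align*}
\text{Cox:} \quad & \lambda(t \mid Z) = \lambda_0(t)\,e^{\beta^{\top} Z} \\
\text{Additive risk:} \quad & \lambda(t \mid Z) = \lambda_0(t) + \beta^{\top} Z \\
\text{Additive--multiplicative:} \quad & \lambda(t \mid X, Z) = \lambda_0(t)\,e^{\beta^{\top} X} + \gamma^{\top} Z
\end{align*}
```

In the combined form, testing whether a given covariate acts multiplicatively, additively, or both amounts to testing the corresponding components of \(\beta\) and \(\gamma\).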
54

Distributional Dynamics of Fama-French Factors in European Markets / Tidsvarierande fördelningar för Fama-French-faktorer på europeiska marknader

Löfgren, Wilmer January 2020 (has links)
The three-factor model of Fama and French has proved to be a seminal contribution to asset pricing theory, and was recently extended with two more factors, yielding the Fama-French five-factor model. Other proposed augmentations of the three-factor model include the introduction of a momentum factor by Carhart. The extensive use of such factors in asset pricing theory and investing motivates the study of the distributional properties of their returns; however, previous studies have focused on subsets of these six factors in U.S. data. In this thesis, the distributional properties of daily log-returns of the five Fama-French factors and the Carhart momentum factor are examined in European data from 2009 to 2019. The univariate distributional dynamics of the factor log-returns are modelled as ARMA-NGARCH processes with skewed t-distributed driving noise sequences. Gaussian and t copulas are then used to model the joint distribution of the factor log-returns. The models developed are applied to estimate the one-day-ahead Value-at-Risk (VaR) in testing data, and the VaR estimates are backtested for correct unconditional coverage and for exponentially distributed durations between exceedances. The results suggest that the ARMA-NGARCH processes are a valid approximation of the factor log-returns and lead to good estimates of the VaR. The multivariate analysis suggests that constant Gaussian and t copulas may be insufficient to model the dependence structure of the factors, and that more flexible copula models with dynamic correlations between factor log-returns may be needed.
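A minimal sketch of the unconditional-coverage backtest mentioned above (Kupiec's proportion-of-failures test). This is a simplified illustration assuming i.i.d. heavy-tailed returns and an in-sample empirical-quantile VaR, not the thesis's ARMA-NGARCH forecasts; the function name and simulated data are illustrative.

```python
import numpy as np
from scipy import stats

def kupiec_lr(n_obs: int, n_exceed: int, alpha: float) -> float:
    """Kupiec proportion-of-failures likelihood-ratio statistic.

    Tests whether the observed VaR exceedance rate matches the nominal
    level alpha; under the null it is asymptotically chi-square(1)."""
    pi_hat = n_exceed / n_obs
    pi_hat = min(max(pi_hat, 1e-12), 1 - 1e-12)  # guard log(0) in edge cases
    ll_null = (n_obs - n_exceed) * np.log(1 - alpha) + n_exceed * np.log(alpha)
    ll_alt = (n_obs - n_exceed) * np.log(1 - pi_hat) + n_exceed * np.log(pi_hat)
    return -2.0 * (ll_null - ll_alt)

# Illustrative daily log-returns (heavy-tailed, as factor returns tend to be).
rng = np.random.default_rng(42)
returns = 0.01 * rng.standard_t(df=5, size=1000)

alpha = 0.05                                # 5% one-day VaR level
var_level = -np.quantile(returns, alpha)    # empirical-quantile VaR
exceed = returns < -var_level               # days the loss exceeded the VaR
n_exceed = int(exceed.sum())

lr = kupiec_lr(returns.size, n_exceed, alpha)
pval = stats.chi2.sf(lr, df=1)              # large p-value: coverage is correct
```

Because the VaR here is the in-sample empirical quantile, the exceedance rate matches the nominal level by construction and the test does not reject; with genuine out-of-sample forecasts, as in the thesis, a small p-value would flag incorrect coverage.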
55

Optimizing Care for Oncologic and Hematologic Patients with Febrile Neutropenia

Graham, Emily Nicole 08 August 2017 (has links)
No description available.
56

Prise en compte des erreurs de mesure dans l'analyse du risque associé à l'exposition aux rayonnements ionisants dans une cohorte professionnelle : application à la cohorte française des mineurs d'uranium / Taking into account measurement error in the analysis of risk associated with exposure to ionizing radiation in an occupational cohort: application to the French cohort of uranium miners

Allodji, Setcheou Rodrigue 09 December 2011 (has links)
In epidemiological studies, measurement error in the exposure of interest can bias the estimation of the risks associated with that exposure. A broad variety of correction methods has been developed, but they are rarely applied in practice, probably because their ability to correct measurement-error effects and their implementation are poorly understood. Another important reason is that, in the absence of repeated or validation data, these correction methods require detailed knowledge of the characteristics (size, nature, structure and distribution) of the measurement error. The main objective of this thesis is to study the impact of accounting for measurement error in analyses of the risk of death from lung cancer associated with radon exposure, based on the French cohort of uranium miners (which has neither repeated nor validation data). The specific aims were (1) to characterise the measurement error associated with radiological exposures (radon and its decay products, uranium ore dust, and gamma rays), (2) to study the impact of measurement error in radon exposure on the estimated excess relative risk (ERR) of lung cancer death, and (3) to assess and compare the performance of methods for correcting the effects of these errors.
The French cohort of uranium miners includes more than 5,000 individuals chronically exposed to radon and its decay products, followed for 30 years on average. Measurement errors were characterised taking into account the evolution of mining methods and of the radiological monitoring of miners over time. A simulation study based on the cohort was set up to study the impact of these errors on the ERR and to compare the performance of the correction methods. The results show that measurement error in radon exposure decreased over the years: it exceeded 45% before 1970 and was of the order of 10% after 1980. Its nature also changed over time, from mostly Berkson type to classical type after individual dosimeters were introduced in 1983. The simulation study showed that measurement error attenuates the ERR towards the null, with a substantial bias of the order of 60%. The three correction methods considered achieved a noticeable but partial reduction of this attenuation bias. The simulation-extrapolation method (SIMEX) appeared to have an advantage in our context, but the performance of all three methods depends strongly on an accurate determination of the characteristics of the measurement error. This work illustrates the importance of correcting for measurement error in order to obtain reliable estimates of the exposure-risk relationship between radon and lung cancer. Corrected risk estimates should prove of great interest in support of policies for protection against radon in radiation protection and public health.
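A toy illustration of the attenuation and SIMEX ideas discussed above, transplanted to a simple linear-regression setting. The thesis works with excess relative risk models rather than OLS, so the simulated data, variable names, and the quadratic extrapolant used here are illustrative assumptions, not the thesis's procedure.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20000
x = rng.normal(0.0, 1.0, n)               # true exposure (unobserved)
y = 2.0 * x + rng.normal(0.0, 0.5, n)     # outcome; true slope beta = 2
sigma_u = 1.0                             # classical-error standard deviation
w = x + rng.normal(0.0, sigma_u, n)       # observed, error-prone exposure

def slope(xv, yv):
    """OLS slope of yv on xv."""
    return np.polyfit(xv, yv, 1)[0]

# Naive fit on w: attenuated towards zero by var(x)/(var(x)+var(u)) = 0.5.
naive = slope(w, y)

# SIMEX: add extra error at levels zeta, average the refits,
# then extrapolate the trend back to zeta = -1 (i.e. no error).
zetas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
means = []
for z in zetas:
    fits = [slope(w + np.sqrt(z) * sigma_u * rng.normal(size=n), y)
            for _ in range(50)]
    means.append(np.mean(fits))

coef = np.polyfit(zetas, means, deg=2)    # quadratic extrapolant
simex = np.polyval(coef, -1.0)            # partially recovers the true slope
```

With these settings the naive slope sits near 1 (a roughly 50% attenuation of the true value 2), and the SIMEX estimate moves back towards 2 but does not fully reach it, mirroring the "noticeable but partial" bias reduction reported above.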
57

Pursuing Enterprise Risk Management: A Local Roadmap for Canadian Health Care Leaders

Haney, James 19 July 2012 (has links)
An in-depth analysis of organizational risk management in health care, and in particular the concepts of Enterprise Risk Management (ERM), has identified a five-part model that Canadian health care leaders can use as an evidence-supported approach to successful organizational risk management. The Model for Organizational Risk Management has been developed as a basis for linking the components of an ERM framework into a Canadian health organization in order to overcome the barriers that commonly disrupt strategic risk management. The Model addresses how an ERM framework can fit within an existing health organization by building on and enhancing existing processes and resources, ensuring familiarity, acceptance, and sustainability of the risk management program. By approaching the Model in a stepwise fashion, based on individual organizational context, health care leaders are provided with a roadmap from which to advance their own organizational risk management program.
59

Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks

Webster, Ronald A. January 2008 (has links)
The research in this thesis was undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk-adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk-adjusted CUSUM schemes for varying risk profiles of the monitored populations, the effects of uncertainty in the estimates of expected mortality probabilities on the performance of risk-adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk-adjusted CUSUM schemes obtained using the Markov chain approach.

The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, that risk factors could be classified into their "dimensions of risk", and that a risk score could not be generalised to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or could construct a customised risk score with risk factors that provide at least one measure for each dimension of risk.

The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticised in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and that the uncertainty associated with estimating institutions' outcome rates could be mitigated with trend analysis. A second analysis, of the mortality of patients admitted to coronary care units, demonstrated the use of notched box plots, fixed- and random-effect models, and risk-adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide.

A detailed review of the risk-adjusted CUSUM scheme was undertaken, and the use of average run lengths (ARLs) to assess the scheme as the risk profile of the monitored population changes was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk-adjusted CUSUM scheme, in which the step size between in-control and out-of-control outcome probabilities was constrained to no less than 0.05, was proposed; the ARLs of this "minimum effect" CUSUM scheme were found to be stable.

The earlier assessment of the risk-adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance when the estimates of the expected probability of patient mortality were uncertain showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided the risk score was well calibrated; uncertainty in the calibration of the risk model, however, appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk-adjusted CUSUM schemes were approximated using simulation, because the approximation method based on the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, in which probability is concentrated at the midpoint of each Markov state. If probability was instead assumed to be uniformly distributed over each Markov state, the ARLs were stabilised, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
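A sketch of the risk-adjusted CUSUM scheme of Steiner et al. (2000) discussed above, monitoring for a doubling of the odds of patient mortality. The signal threshold h and the simulated distribution of patient risks are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def ra_cusum(p, y, odds_ratio=2.0, h=4.5):
    """Risk-adjusted CUSUM (Steiner et al., 2000).

    p: predicted death probabilities per patient (from a risk model)
    y: observed outcomes (1 = death, 0 = survival)
    Accumulates the log-likelihood-ratio score for an odds-ratio shift,
    resetting at zero; returns the CUSUM path and the index of the
    first signal above h (or None if no signal)."""
    s, path, signal = 0.0, [], None
    for pt, yt in zip(p, y):
        denom = 1.0 - pt + odds_ratio * pt
        w = np.log(odds_ratio / denom) if yt == 1 else np.log(1.0 / denom)
        s = max(0.0, s + w)
        path.append(s)
        if signal is None and s > h:
            signal = len(path) - 1
    return np.array(path), signal

rng = np.random.default_rng(1)
p = rng.uniform(0.01, 0.2, size=2000)      # heterogeneous patient risks
y_in = rng.binomial(1, p)                  # in-control outcomes
odds = 2.0
p_out = odds * p / (1 - p + odds * p)      # risks under a doubled odds ratio
y_out = rng.binomial(1, p_out)             # out-of-control outcomes

path_in, sig_in = ra_cusum(p, y_in)
path_out, sig_out = ra_cusum(p, y_out)     # signals once evidence accumulates
```

Because each patient contributes a weight that depends on their own predicted risk, the scheme does not penalise a hospital for treating sicker patients; the average run length of exactly this statistic is what the thesis studies as the risk profile of the monitored population changes.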
60

Um modelo de risco proporcional dependente do tempo / A time-dependent proportional hazards model

Parreira, Daniela Ribeiro Martins 30 March 2007 (has links)
Survival analysis models are used to study experimental data in which the response variable is, typically, the time elapsed until an event of interest. Many authors prefer to model survival data in the presence of covariates through the hazard function, a preference related to its interpretation: it describes how the instantaneous probability of failure changes over time. In this context, one of the most widely used models is the Cox model (Cox, 1972), whose basic assumption is that the failure rates are proportional. The Cox proportional hazards model is very flexible and extensively used in survival analysis, and it can easily be extended to incorporate, for example, the effect of time-dependent covariates. In this study, we propose a proportional hazards model that incorporates a time-dependent parameter, called the time-dependent proportional hazards model.
A classical analysis based on the asymptotic properties of the maximum likelihood estimators of the parameters involved is developed, together with a simulation study using resampling techniques for interval estimation and hypothesis testing of the model parameters. The cost of estimating the covariate effect when the parameter measuring the time effect is included in the model is also studied. Finally, we present a Bayesian treatment of the model.
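For context, the Cox model referred to above, and one generic way a time-dependent parameter can enter the hazard. The thesis's exact parameterisation is not given in the abstract, so the second line is an illustrative form only:

```latex
\begin{align*}
\text{Cox (1972):} \quad & \lambda(t \mid x) = \lambda_0(t)\,\exp\{\beta^{\top} x\} \\
\text{Time-dependent extension:} \quad & \lambda(t \mid x) = \lambda_0(t)\,\exp\{\beta^{\top} x + \gamma\, g(t)\}
\end{align*}
```

where \(g(t)\) is a known function of time and \(\gamma\) is the additional time-dependent parameter; the "cost" studied in the thesis is the effect on the precision of \(\hat{\beta}\) when \(\gamma\) must be estimated as well.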
