1 |
Procedural volume in the radial Percutaneous Coronary Intervention era: an analysis of the British Cardiovascular Intervention Society registry. Hulme, William, January 2018.
Percutaneous Coronary Intervention (PCI) is a common treatment for obstructive coronary artery disease, in both planned and emergency settings. Its use in the United Kingdom and elsewhere has increased dramatically in recent years due to its efficacy in the management of Acute Coronary Syndromes, increased access to PCI services, and more permissive patient selection practices. Further, a patient undergoing PCI now is likely to be treated quite differently from the same patient ten years ago, with the emergence of new interventional techniques, devices, stent types, and drugs. The widespread adoption of transradial access in favour of transfemoral access in particular has marked a new era in the delivery of PCI in the UK. Because of these rapid changes in patient and treatment characteristics, evidence generated in settings that no longer reflect the radial era is increasingly irrelevant. This thesis addresses this evidence deficit using data from the British Cardiovascular Intervention Society national PCI registry to describe contemporary trends in PCI practice and to investigate the potential implications of these trends for the quality of PCI delivery. It focuses on the relationship between procedural volume, arterial access site, and short-term mortality, which has not been explored in radial-era UK practice. Broadly, three research questions were posed: (1) What are the qualities and limitations of the British Cardiovascular Intervention Society PCI Registry in answering questions about routine clinical practice in the United Kingdom? (2) What is the impact on PCI outcomes of changes to the underlying patient population, changes to the selection of these patients, and changes to the treatment of these selected patients? (3) What are the consequences of these changes for the relationship between procedural volume, access site, and outcomes?
This thesis has shown that centres adopting radial access more readily did not experience a decline in the quality of their femoral procedures, and in the most recent period were associated with better outcomes overall, particularly amongst the highest-volume centres. Operator volume itself, however, was not associated with improved outcomes, suggesting the organisation of PCI services is not leaving operators with too few, or too many, procedures to perform competently. The current trajectory in UK practice of increasing radial adoption should continue unabated, with radial access considered as the primary access route across all clinical settings wherever possible.
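As an illustration of the volume-outcome stratification underlying this analysis, the sketch below groups centres into volume tertiles and computes crude short-term mortality per stratum. All centre names and counts are invented, and no case-mix adjustment is attempted (the thesis's analyses are risk-adjusted):

```python
# Crude mortality by centre-volume tertile: a hedged sketch on invented data.
# Real analyses would risk-adjust for case mix; this only illustrates the
# stratification step.

# (centre, annual PCI volume, deaths, procedures) -- all hypothetical
centres = [
    ("A", 250, 6, 500), ("B", 400, 7, 800), ("C", 900, 12, 1800),
    ("D", 1500, 15, 3000), ("E", 600, 8, 1200), ("F", 2000, 18, 4000),
]

# Sort centres by volume and split into three roughly equal groups (tertiles).
ranked = sorted(centres, key=lambda c: c[1])
k = len(ranked) // 3
tertiles = [ranked[:k], ranked[k:2 * k], ranked[2 * k:]]

def mortality(group):
    # Pooled crude mortality: total deaths over total procedures.
    deaths = sum(c[2] for c in group)
    procs = sum(c[3] for c in group)
    return deaths / procs

rates = [mortality(g) for g in tertiles]
```

With these invented counts the crude rate falls as volume rises, but such a gradient could equally reflect differences in case mix, which is why the registry analyses adjust for patient characteristics.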
|
2 |
Human response to earthquake shaking: analysis of video footage of the 2010-2011 Canterbury earthquake sequence. Lambie, Emily Susan, January 2014.
Research on human behaviour during earthquake shaking has identified three main influences on
behaviour: the environment in which the individual is located immediately before and during the
earthquake, in terms of where the individual is and who the individual is with at the time of the
earthquake; individual characteristics, such as age, gender, and previous earthquake experience;
and the intensity and duration of earthquake shaking. However, little research to date has
systematically analysed immediate, observable human responses to earthquake shaking, mostly
due to data constraints and/or ethical considerations. Research on human behaviour during
earthquakes has instead relied on simulations or on post-event, reflective interviews and
questionnaire studies, often performed weeks to months or even years after the event. Such
studies are therefore subject to limitations such as the quality of participants' memories or the
(perceived) realism of a simulation.
The aim of this research was to develop a robust coding scheme to analyse human behaviour
during earthquake shaking using video footage captured during an earthquake event. This will
allow systematic analysis of individuals during real earthquakes using a previously unutilized
data source, thus helping to develop guidance on appropriate protective actions. The coding scheme
was developed in a two-part process, combining a deductive and inductive approach. Previous
research studies of human behavioural response during earthquake shaking provided the basis for
the coding scheme. This was then iteratively refined by applying the coding scheme to a broad
range of video footage of people exposed to strong shaking during the Canterbury earthquake
sequence. The aim of this was to optimise coding scheme content and application across a broad
range of scenarios, and to increase inter-coder reliability.
The methodology to code data will enhance objective observation of video footage, allowing
cross-event analysis and exploration of, among other factors, reaction time, patterns of
behaviour, and the social, environmental and situational influences on behaviour. This can
provide guidance for building configuration and design, and evidence-based recommendations for
public education about injury-preventing behavioural responses during earthquake shaking.
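Inter-coder reliability of such a coding scheme is commonly quantified with Cohen's kappa (one standard choice; the abstract does not state which statistic was used). A minimal sketch on invented coding labels from two hypothetical coders:

```python
# Cohen's kappa for inter-coder agreement on a categorical coding scheme.
# The two coders' labels below are invented for illustration.
from collections import Counter

def cohens_kappa(coder1, coder2):
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed agreement: proportion of items the coders label identically.
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Expected agreement if the coders labelled independently.
    c1, c2 = Counter(coder1), Counter(coder2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

coder1 = ["drop", "hold", "run", "drop", "hold", "drop", "run", "hold"]
coder2 = ["drop", "hold", "run", "hold", "hold", "drop", "run", "drop"]
kappa = cohens_kappa(coder1, coder2)
```

Values near 1 indicate agreement well beyond chance; iterative refinement of the coding scheme, as described above, aims to push kappa upward across coders.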
|
3 |
Utilisation de données observationnelles en réanimation / Use of observational data in intensive care settings. Pham, Tai Olivier, 28 November 2016.
Introduction: Intensive care is a recent medical specialty whose particularity is close, traceable monitoring of patients and their treatments. A huge amount of data on patient characteristics, management, and evolution is therefore generated and collected daily. Evidence-based medicine classically opposes observational and interventional studies and confers on the latter, particularly randomized controlled trials, a higher level of scientific evidence. Objectives: To describe the benefit of observational data in intensive care through recent works using different analysis methods, and to discuss the respective contributions of observational and interventional studies to scientific knowledge in intensive care. Methods: Four prospective observational multicentre studies conducted in intensive care units and published in peer-reviewed journals, illustrating the spectrum of tools available for designing observational studies: real-time data use, cohort description, and propensity-score matching to estimate a treatment effect. Results: The studies presented in this thesis describe the varied contributions of observational data to intensive care. As we demonstrated during the Influenza A H1N1 pandemic, observational data can be used for real-time monitoring of epidemics. They are also necessary for epidemiological studies such as the acute respiratory distress syndrome study presented here. We further show how observational data led to questioning the definition of mechanical ventilation weaning groups initially proposed by a consensus conference. Finally, innovative statistical techniques such as propensity scores allowed evaluation of the benefit of extracorporeal membrane oxygenation in the most severe cases of respiratory failure due to Influenza A H1N1. Conclusion: Observational studies should not be opposed to interventional studies: they provide complementary results and offer alternative options when an intervention cannot be tested. Knowing the benefits and limits of each method helps optimize study design and the interpretation of results. Observational data are fully part of the progress of knowledge in intensive care.
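The propensity-matched analysis mentioned above can be illustrated with a minimal 1:1 nearest-neighbour matching step on the score. The scores and outcomes below are invented; in practice the scores would come from a fitted model of treatment on baseline covariates:

```python
# 1:1 nearest-neighbour matching on the propensity score, without replacement.
# Scores are taken as given here (in practice they come from a logistic model
# of treatment on baseline covariates); all numbers are invented.

treated = [(0.81, 1), (0.62, 0), (0.55, 1)]              # (score, outcome)
controls = [(0.80, 1), (0.60, 1), (0.33, 0), (0.54, 0), (0.90, 0)]

available = list(controls)
pairs = []
for score, outcome in sorted(treated, reverse=True):     # match high scores first
    # Pick the unused control with the closest score, then retire it.
    nearest = min(available, key=lambda c: abs(c[0] - score))
    available.remove(nearest)
    pairs.append(((score, outcome), nearest))

# Matched risk difference: treated minus matched-control event rate.
rd = (sum(t[1] for t, c in pairs) - sum(c[1] for t, c in pairs)) / len(pairs)
```

Matching without replacement, as here, means each control supports at most one treated patient; a real analysis would also check score overlap and covariate balance after matching.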
|
4 |
Causal modelling in stratified and personalised health: developing methodology for analysis of primary care databases in stratified medicine. Marsden, Antonia, January 2016.
Personalised medicine describes the practice of tailoring medical care to the individual characteristics of each patient. Fundamental to this practice is the identification of markers associated with differential treatment response. Such markers can be identified through the assessment of treatment effect modification using statistical methods. Randomised controlled trials provide the optimal setting for evaluating differential response to treatment. Due to restrictions regarding sample size, study length and ethics, observational studies are more appropriate in many circumstances, particularly for the identification of markers associated with adverse side-effects and long-term response to treatments. However, the analysis of observational data raises some additional challenges. The overall aim of this thesis was to develop statistical methodology for the analysis of observational data, specifically primary care databases, to identify and evaluate markers associated with differential treatment response. Three aspects of the assessment of treatment effect modification in an observational setting were addressed. The first related to the assessment of treatment effect modification on the additive scale, which corresponds to a comparison of absolute treatment effects across patient subgroups. Various ways in which this can be assessed in an observational setting were reviewed, and a novel measure, the ratio of absolute effects, which can be calculated from certain multiplicative regression models, was proposed. The second concerned confounding adjustment: Monte Carlo simulations were used to investigate how interactions between the moderator and confounders, affecting both treatment receipt and outcome, can bias estimates of treatment effect modification if left unaccounted for.
It was found that the presence of bias differed across confounding adjustment methods and that, in the majority of settings, the bias was reduced when the interactions between the moderator and confounders were accounted for in the confounding adjustment model. Thirdly, it has been proposed that patient data in observational studies be organised into, and analysed as, a series of nested non-randomised trials. This thesis extended this study design to evaluate predictive markers of differential treatment response and explored the benefits of the methodology for this purpose. It was shown how absolute treatment effects can be estimated and compared across patient subgroups in this setting. A dataset comprising primary care medical records of adults with rheumatoid arthritis was used throughout this thesis. Interest lay in the identification of characteristics predictive of the onset of type II diabetes associated with steroid (glucocorticoid) therapy. The analysis in this thesis suggested that older age may be associated with a higher risk of steroid-associated type II diabetes, but this warrants further investigation. Overall, this thesis demonstrates how observational studies can be analysed such that accurate and meaningful conclusions are drawn within personalised medicine research.
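The bias mechanism investigated in the second aspect can be illustrated deterministically: when the moderator interacts with a confounder in treatment assignment, crude subgroup risk differences suggest effect modification even when none exists, while standardizing over the confounder recovers the constant true effect. All probabilities below are invented:

```python
# Deterministic illustration: an interaction between moderator M and
# confounder C in treatment assignment fakes effect modification; the
# g-formula over C removes it. All probabilities are invented.

p_c_given_m = {0: 0.3, 1: 0.7}                   # P(C=1 | M=m)
p_t_given_cm = {(0, 0): 0.5, (1, 0): 0.5,        # P(T=1 | C=c, M=m):
                (0, 1): 0.2, (1, 1): 0.8}        # confounding only when M=1

def risk(t, c):
    # True outcome model: no effect modification, true risk difference 0.05.
    return 0.10 + 0.10 * c + 0.05 * t

def crude_rd(m):
    # Observed (confounded) risk difference within subgroup M=m.
    pc = {1: p_c_given_m[m], 0: 1 - p_c_given_m[m]}
    means = []
    for t in (1, 0):
        # P(C=c, T=t | M=m), then the mean outcome among those with T=t.
        joint = {c: (p_t_given_cm[(c, m)] if t else 1 - p_t_given_cm[(c, m)]) * pc[c]
                 for c in (0, 1)}
        tot = joint[0] + joint[1]
        means.append(sum(risk(t, c) * joint[c] / tot for c in (0, 1)))
    return means[0] - means[1]

def standardized_rd(m):
    # Adjusted for C by standardization (g-formula over C within M=m).
    pc = {1: p_c_given_m[m], 0: 1 - p_c_given_m[m]}
    return sum((risk(1, c) - risk(0, c)) * pc[c] for c in (0, 1))
```

Here the crude risk difference in the M=1 subgroup roughly doubles the true 0.05, so a naive subgroup comparison would report spurious effect modification; the standardized estimates are 0.05 in both subgroups.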
|
5 |
Methodological Issues in Prediction Models and Data Analyses Using Observational and Clinical Trial Data. Li, Guowei, January 2016.
Background and objectives:
Prediction models are useful tools in clinical practice, providing predictive estimates of
outcome probabilities to aid in decision making. As biomedical research advances, concerns
have been raised regarding combined effectiveness (benefit) and safety (harm) outcomes in a
prediction model, while typically different prediction models only focus on predictions of
separate outcomes. A second issue is that evidence reveals poor predictive accuracy in
different populations and settings for some prediction models, requiring model calibration or
redevelopment. A third issue in data analyses is whether the treatment effect estimates could
be influenced by competing risk bias. If other events preclude the outcomes of interest, these
events would compete with the outcomes and thus fundamentally change the probability of
the outcomes of interest. Failure to recognize the existence of competing risk or to account
for it may result in misleading conclusions in health research. Therefore in this thesis, we
explored three methodological issues in prediction models and data analyses by: (1)
developing and externally validating a prediction model for patients’ individual combined
benefit and harm outcomes (stroke with no major bleeding, major bleeding with no stroke,
neither event, or both stroke and major bleeding) with and without warfarin therapy for atrial
fibrillation; (2) constructing a prediction model for hospital mortality in medical-surgical
critically ill patients; and (3) performing a competing risk analysis to assess the efficacy of
the low molecular weight heparin dalteparin versus unfractionated heparin in venous
thromboembolism in medical-surgical critically ill patients.
Methods:
Project 1: Using the Kaiser Permanente Colorado (KPCO) anticoagulation management
cohort in the Denver-Boulder metropolitan area of Colorado in the United States to include
patients with AF who were and were not prescribed warfarin therapy, we used a new
approach to build a prediction model of individual combined benefit and harm outcomes
related to warfarin therapy (stroke with no major bleeding, major bleeding with no stroke, neither event, or both stroke and major bleeding) in patients with AF. We utilized a
polytomous logistic regression (PLR) model to identify risk factors and then construct the
new prediction model. Model performances and validation were evaluated systematically in
the study.
Project 2: We used data from a multicenter randomized controlled trial named Prophylaxis for
Thromboembolism in Critical Care Trial (PROTECT) to develop a new prediction model for
hospital mortality in critically ill medical-surgical patients receiving heparin
thromboprophylaxis. We first identified risk factors independent of APACHE (Acute
Physiology and Chronic Health Evaluation) II score for hospital mortality, and then combined
the identified risk factors and APACHE II score to build the new prediction model. Model
performances were compared between the new prediction model and the APACHE II score.
Project 3: We re-analyzed the data from PROTECT to perform a sensitivity analysis based on
a competing risk analysis to investigate the efficacy of dalteparin versus unfractionated
heparin in preventing venous thromboembolism in medical-surgical critically ill patients,
taking all-cause death as a competing risk for venous thromboembolism. Results from the
competing risk analysis were compared with findings from the cause-specific analysis.
Results and Conclusions:
Project 1: The PLR model could simultaneously predict risk of individual combined benefit
and harm outcomes in patients with and without warfarin therapy for AF. The prediction
model was a good fit, had acceptable discrimination and calibration, and was internally and
externally validated. Should this approach be validated in other patient populations, it has
potential advantages over existing risk stratification approaches.
Project 2: The new model combining other risk factors and APACHE II score was a good fit,
well calibrated and internally validated. However, the discriminative ability of the prediction
model was not satisfactory. Compared with the APACHE II score alone, the new prediction
model increased data collection, was more complex but did not substantially improve discriminative ability.
Project 3: The competing risk analysis yielded no significant effect of dalteparin compared
with unfractionated heparin on proximal leg deep vein thromboses, but a lower risk of
pulmonary embolism in critically ill medical-surgical patients. Findings from the competing
risk analysis were similar to results from the cause-specific analysis. / Thesis / Doctor of Philosophy (PhD)
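The competing-risk analysis of Project 3 rests on the cumulative incidence function rather than the naive complement of Kaplan-Meier, which overstates risk when competing deaths are treated as censoring. A sketch on invented event times, contrasting the two:

```python
# Cumulative incidence of the event of interest under a competing risk
# (death), Aalen-Johansen style, versus the naive 1-KM that censors the
# competing deaths. All event times below are invented.

# (time, status): 1 = venous thromboembolism, 2 = death (competing), 0 = censored
data = [(2, 1), (3, 2), (4, 1), (5, 0), (6, 2), (7, 1), (8, 0)]

def cumulative_incidence(data, cause):
    times = sorted({t for t, s in data if s != 0})
    surv = 1.0          # overall event-free survival just before t
    cif = 0.0
    for t in times:
        at_risk = sum(1 for u, s in data if u >= t)
        d_cause = sum(1 for u, s in data if u == t and s == cause)
        d_all = sum(1 for u, s in data if u == t and s != 0)
        cif += surv * d_cause / at_risk        # only reachable if still event-free
        surv *= 1 - d_all / at_risk            # any event removes from risk set
    return cif

cif_vte = cumulative_incidence(data, cause=1)
# Naive alternative: recode competing deaths as censored, compute 1 - KM.
naive = cumulative_incidence([(t, 1 if s == 1 else 0) for t, s in data], cause=1)
```

On these data the naive estimate exceeds the cumulative incidence, illustrating why the sensitivity analysis treating death as a competing risk is worth running even when, as in Project 3, the substantive conclusions end up similar.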
|
6 |
Forward and Inverse Problems Under Uncertainty / Problèmes directs et inverses sous incertitude. Zhang, Wenlong, 27 June 2017.
This thesis contains two different subjects. In the first part, two cases are considered: the thin-plate-spline smoother model, and elliptic boundary-value equations with uncertain boundary data. Stochastic convergence of the finite element methods is proved for each problem. In the second part, we provide a mathematical analysis of the linearized inverse problem in multifrequency electrical impedance tomography. We present a mathematical and numerical framework for a procedure of imaging the anisotropic electrical conductivity tensor using a novel technique called Diffusion Tensor Magneto-acoustography, and propose an optimal control approach for reconstructing the cross-property factor relating the diffusion tensor to the anisotropic electrical conductivity tensor. We prove convergence and Lipschitz-type stability of the algorithm and present numerical examples to illustrate its accuracy. A cell model for electropermeabilization is also presented: we study effective parameters in a homogenization model and demonstrate numerically the sensitivity of these effective parameters to the critical microscopic parameters governing electropermeabilization.
|
7 |
Apports de la modélisation causale dans l'évaluation des immunothérapies à partir de données observationnelles / Contribution of the Causal Model in the Evaluation of Immunotherapy Based on Observational Data. Asvatourian, Vahé, 9 November 2018.
In oncology, new treatments such as immunotherapy, based on regulation of the immune system, have been proposed. However, not all treated patients derive a long-term benefit. To identify those patients who benefit most, immune markers expressed at treatment initiation and over time are measured, and their association with treatment response and toxicity is assessed. In an observational study, the lack of randomization makes the groups non-comparable, and the effect measured is merely an association. In this context, causal inference methods allow, in some cases, after all sources of bias have been identified by constructing a directed acyclic graph (DAG), conditional exchangeability between exposed and non-exposed subjects to be approached and causal effects to be estimated. In the simplest cases, where the number of variables is small, the DAG can be drawn from expert knowledge. When the number of variables grows, learning algorithms have been proposed to estimate the structure of the graph. Nevertheless, these algorithms assume that no a priori information about the markers is known, and they have mainly been developed for settings in which covariates are measured only once. The objective of this thesis is therefore to extend these graph-learning methods to repeated measures, and to reduce the search space using a priori expert knowledge. Based on the learned graphs, causal models are applied to the repeated immune markers to detect those associated with treatment response and/or toxicity.
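The DAG-based adjustment described above can be sketched on a toy graph. The variable names and edges are invented; in the thesis the graph itself would be learned from the repeated marker data:

```python
# A toy DAG over immune-marker variables, with the parents-of-exposure
# backdoor adjustment set. Variable names and edges are invented.

dag = {                      # child: set of parents
    "marker_baseline": set(),
    "treatment": {"marker_baseline"},
    "marker_followup": {"marker_baseline", "treatment"},
    "response": {"treatment", "marker_followup", "marker_baseline"},
}

def parents(v):
    return dag[v]

def descendants(v):
    # All nodes reachable from v along directed edges.
    out, stack = set(), [v]
    while stack:
        node = stack.pop()
        for child, pars in dag.items():
            if node in pars and child not in out:
                out.add(child)
                stack.append(child)
    return out

# With no unmeasured confounding, the parents of the exposure form a valid
# backdoor adjustment set: they block all backdoor paths and contain no
# descendant of the exposure.
adj = parents("treatment")
assert not (adj & descendants("treatment"))
```

Once the adjustment set is read off the graph, the causal effect of treatment on response can be estimated by standardization or weighting over those variables.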
|
8 |
Econometric Analysis of Social Interactions and Economic Incentives in Conservation Schemes / 環境保全制度における社会的相互作用と経済的インセンティブの計量経済研究. Shimada, Hideki, 23 March 2021.
Kyoto University / Doctor of Agricultural Science, awarded 23 March 2021 (degree no. 甲第23241号 / 農博第2448号). Graduate School of Agriculture, Division of Natural Resource Economics. Examination committee: Assoc. Prof. 三谷 羊平 (chair), Prof. 伊藤 順一, Prof. 梅津 千恵子. / DGAM
|
9 |
Econometric methods for evaluating the cost-effectiveness of health care interventions using observational data. Rovithis, Dimitrios, January 2014.
This thesis explores the use of observational microdata in cost-effectiveness analysis. The application of econometric methods adjusting for selection bias is first reviewed and critically appraised in the economic evaluation literature using a structured template. Limitations of identified studies include lack of good quality evidence regarding the performance of different analytical approaches; inadequate assessment of the sensitivity of their results to violations of fundamental assumptions or variations to crucial estimator parameters; failure to combine the cost and effectiveness outcomes in a summary measure; and no consideration of stochastic uncertainty for the purpose of evaluating cost-effectiveness. Data from the Birthplace national cohort study are used in an attempt to address these limitations in the context of an empirical comparison of estimators relying on regression, matching, as well as the propensity score. It is argued that although these methods cannot address the potential impact of unobservable confounding, a novel approach to bias-corrected matching, combining entropy balancing with seemingly unrelated regression, still has the potential to offer important advantages in terms of analytical robustness. The net economic benefit is proposed as a straightforward way to exploit the strengths of rigorous econometric methodology in the development of reliable and informative cost-effectiveness analyses.
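The entropy-balancing component of the proposed bias-corrected matching can be sketched for a single covariate moment: controls receive exponential-tilt weights chosen so that their weighted covariate mean equals the treated mean. Covariate values below are invented, and the full method (combined with seemingly unrelated regression) is not reproduced:

```python
# Entropy balancing, one moment constraint: find the exponential tilt
# lambda such that the weighted control mean of a covariate equals the
# treated mean. Covariate values are invented.
import math

x_control = [1.0, 2.0, 3.0, 4.0]
target = 3.0            # treated-group mean of the covariate (assumed)

def weighted_mean(lam):
    w = [math.exp(lam * x) for x in x_control]
    return sum(wi * xi for wi, xi in zip(w, x_control)) / sum(w)

# weighted_mean is strictly increasing in lam, so bisection finds the root.
lo, hi = -20.0, 20.0
for _ in range(200):
    mid = (lo + hi) / 2
    if weighted_mean(mid) < target:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

weights = [math.exp(lam * x) for x in x_control]
total = sum(weights)
weights = [w / total for w in weights]   # normalize to sum to one
```

After balancing, weighted outcome regressions on the controls stand in for the counterfactual untreated outcomes of the treated; with several covariates the same idea becomes a multivariate convex optimization.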
|
10 |
Contributions to the Multivariate Analysis of Marine Environmental Monitoring. Graffelman, Jan, 12 September 2000.
The thesis starts from the view that statistics begins with data, and accordingly opens by introducing the data sets studied: marine benthic species counts and chemical measurements made at a set of sites in the Norwegian Ekofisk oil field, with replicates, repeated annually. An introductory chapter details the sampling procedure and shows with reliability calculations that the (transformed) chemical variables have excellent reliability, whereas the biological variables have poor reliability, except for a small subset of abundant species. Transformed chemical variables are shown to be approximately normal. Bootstrap methods are used to assess whether the biological variables follow a Poisson distribution, and lead to the conclusion that the Poisson distribution must be rejected, except for rare species. A separate chapter details more work on the distribution of the species variables: truncated and zero-inflated Poisson distributions, as well as Poisson mixtures, are used to account for sparseness and overdispersion. Species are thought to respond to environmental variables, and regressions of the abundance of a few selected species onto chemical variables are reported. For rare species, logistic regression and Poisson regression are the tools considered, though there are problems of overdispersion. For abundant species, random coefficient models are needed to cope with intraclass correlation. The environmental variables, mainly heavy metals, are highly correlated, leading to multicollinearity problems. The next chapters use a multivariate approach, where all species data are now treated simultaneously. The theory of correspondence analysis is reviewed, and some theoretical results on this method are reported (bounds for singular values, centring matrices).
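The bootstrap assessment of the Poisson assumption described above can be sketched with a parametric bootstrap of the index of dispersion (one plausible implementation; the thesis's exact procedure may differ). The species counts below are invented:

```python
# Parametric bootstrap check of the Poisson assumption for a species count
# variable, via the index of dispersion (variance / mean), which is about 1
# under a Poisson model. Counts are invented.
import math
import random
random.seed(42)

counts = [0, 1, 0, 3, 8, 0, 2, 7, 0, 9, 1, 0, 5, 0, 4]

def dispersion(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return v / m

def rpois(lam):
    # Knuth's multiplication method; adequate for small lambda.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

obs = dispersion(counts)
lam = sum(counts) / len(counts)
boot = [dispersion([rpois(lam) for _ in counts]) for _ in range(500)]
# One-sided bootstrap p-value for overdispersion.
p_value = sum(d >= obs for d in boot) / len(boot)
```

A small p-value, as these overdispersed counts produce, points away from the plain Poisson model and toward the zero-inflated and mixture alternatives the thesis then develops.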
An applied chapter discusses the correspondence analysis of the species data in detail, detects outliers, addresses stability issues, and considers different ways of stacking data matrices to obtain an integrated analysis of several years of data and to decompose variation into within-site and between-site components. More than 40% of the total inertia is due to variation within sites. Principal component analysis is used to analyse the set of chemical variables. Attempts are made to integrate the analysis of the biological and chemical variables. A detailed theoretical development shows how continuous variables can be mapped in an optimal manner as supplementary vectors into a correspondence analysis biplot. Geometrical properties are worked out in detail, and measures for the quality of the display are given; artificial data and data from the monitoring survey are used to illustrate the theory developed. The theory of the display of supplementary variables in biplots is also worked out in detail for principal component analysis, with attention to the different types of scaling and the optimality of displayed correlations. A theoretical chapter follows that gives an in-depth treatment of canonical correspondence analysis (linearly constrained correspondence analysis, CCA for short), detailing many mathematical properties and aspects of this multivariate method, such as geometrical properties, biplots, use of generalized inverses, and relationships with other methods. Some applications of CCA to the survey data are dealt with in a separate chapter, with their interpretation and an indication of the quality of the display of the different matrices involved in the analysis. Weighted principal component analysis of weighted averages is proposed as an alternative to CCA.
This leads to a better display of the weighted averages of the species and, in the cases studied so far, also leads to biplots with a higher amount of explained variance for the environmental data. The thesis closes with a bibliography and outlines some suggestions for further research, such as the generalization of canonical correlation analysis for working with singular covariance matrices, the use of partial least squares methods to account for the excess of predictors, and data fusion problems to estimate missing biological data.
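The total inertia decomposed by correspondence analysis, and partitioned above into within-site and between-site components, is simply the table's chi-square statistic divided by its grand total. A minimal sketch on an invented species-by-site table:

```python
# Total inertia of a species-by-site contingency table (chi-square / n),
# the quantity that correspondence analysis decomposes along its axes.
# The small table of counts below is invented.

table = [
    [10, 4, 1],    # species A counts at three sites
    [2, 8, 6],     # species B
    [1, 3, 12],    # species C
]

n = sum(sum(row) for row in table)
row_tot = [sum(row) for row in table]
col_tot = [sum(col) for col in zip(*table)]

# Pearson chi-square against the independence model E_ij = r_i * c_j / n.
chi2 = sum(
    (table[i][j] - row_tot[i] * col_tot[j] / n) ** 2
    / (row_tot[i] * col_tot[j] / n)
    for i in range(len(table)) for j in range(len(col_tot))
)
total_inertia = chi2 / n
```

Stacking yearly tables, as the thesis does, lets the same chi-square decomposition attribute shares of this inertia to within-site and between-site variation.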
|