
Evaluating integrated Neglected Tropical Disease (NTD) control strategies : cost, effectiveness and sustainability of the NTD control programme in Uganda

Fleming, Fiona January 2014 (has links)
The predicted costs and savings of integrated preventive chemotherapy (IPCT) for the Neglected Tropical Diseases (NTDs) have been proposed by a number of prior studies. The principal aims of the research presented in this thesis were to determine the cost, effectiveness and sustainability of the Ugandan integrated NTD Control Programme (NTDCP). Annual programme costs were prospectively and retrospectively collected at all levels of programme implementation in twelve districts over three years. Overall costs per person treated under IPCT were: financial (excluding purchased drugs), US$0.17; economic (excluding purchased and donated drugs), US$0.65; and full economic cost (including the value of all drugs), US$12.06. When the costs of the NTDCP were compared to the combined costs of the pre-integration stand-alone (SA) control programmes in Uganda, cost savings of 54% to 72% were achieved. The IPCT was also effective, with over 43 million treatments delivered and 2,444,714 disability-adjusted life years (DALYs) averted, and highly cost-effective, at an estimated economic cost of US$16.50 per NTD case averted and US$10.19 per DALY averted. Using a novel approach to measure community drug distributors' (CDDs') involvement in IPCT, pictorial diaries collected prospective data on their time allocation. On average, a CDD spent 2.5 working weeks per year on NTDCP activities, a significant reduction in the time available for subsistence and income-generating engagements. With each incremental drug delivery required over a mass IPCT campaign, i.e. the number of times a CDD delivered an NTD drug package to communities, economic costs increased. Moreover, as CDDs took more time to complete NTDCP activities, their programme coverage performance decreased. Motivation for the programme was acknowledged as low and CDDs felt undervalued.
Finally, the applications and implications of the thesis results are discussed with regard to programme effectiveness and sustainability, in the context of NTD control and elimination.
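The cost-effectiveness figures reported above follow from simple ratios of programme cost to health outcomes. As a minimal sketch, the total economic cost is back-calculated here from the reported US$10.19 per DALY averted; the implied totals are illustrative arithmetic, not figures taken from the thesis:

```python
# Cost-effectiveness arithmetic for the Ugandan NTDCP (illustrative).
# The total economic cost is back-calculated from the reported cost
# per DALY averted -- an assumption for the sake of the example.
dalys_averted = 2_444_714
cost_per_daly = 10.19            # US$ per DALY averted (reported)
total_economic_cost = dalys_averted * cost_per_daly

cost_per_case = 16.50            # US$ per NTD case averted (reported)
cases_averted = total_economic_cost / cost_per_case

print(f"Implied total economic cost: US${total_economic_cost:,.0f}")
print(f"Implied NTD cases averted:   {cases_averted:,.0f}")
```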

Effect of the public release of performance information on patient numbers in the English NHS

Laverty, Anthony January 2014 (has links)
Introduction: This thesis examines the impact of two special cases of the release of quality information on patient utilisation: three NHS trusts heavily publicised as being poor providers of care (the scandal trusts) and NHS trusts highlighted as being the best or worst places for maternity care (the maternity trusts). It also draws on analyses of patient surveys of the information used in making healthcare choices, and of whether patient-reported quality improved in the maternity trusts. Methods: (1) Analysis of a survey of 2,181 patients recently referred for an outpatient appointment. Logistic regression examined socio-demographic factors associated with information use, as well as the likelihood of attending the 'local' hospital for treatment. (2) Difference-in-difference analysis of patient numbers between the scandal and maternity trusts and comparison groups using data from Hospital Episode Statistics. (3) Analysis of three patient-reported quality measures from two surveys of maternity patients. Results: High-profile reports into the quality of care in the scandal trusts had an impact on patient utilisation for only one of the three trusts, and this disappeared six months after report publication. In the maternity trusts there was no change in patient utilisation in trusts highlighted as the best or the worst providers of care. On two of three measures of patient-reported quality, trusts publicised as the worst providers did not improve at a faster rate than providers with similar scores at baseline. Discussion: Neither the scandal nor the maternity trusts experienced declines in levels of patient utilisation. This lack of effect from high-profile reports, which were unequivocal on the quality of care, casts doubt on whether other forms of information reporting will have an impact on patient utilisation.
The limited findings on patient reported quality suggest that in the absence of changes in patient utilisation, reporting on specific clinical areas is not associated with improvements.
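The difference-in-difference design used in the methods above compares the pre/post change in an exposed group against the same change in a comparison group. A minimal sketch on simulated monthly patient counts (the trust names, effect size and noise levels are all illustrative, not the thesis data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly patient counts: 24 months, report published at month 12.
months = np.arange(24)
post = (months >= 12).astype(float)

# Comparison trusts: flat trend; "scandal" trust: ~50 fewer patients post-report.
control = 1000 + rng.normal(0, 10, 24)
scandal = 980 + rng.normal(0, 10, 24) - 50 * post

# Difference-in-differences:
# (scandal post - scandal pre) - (control post - control pre)
did = (scandal[post == 1].mean() - scandal[post == 0].mean()) \
    - (control[post == 1].mean() - control[post == 0].mean())
print(f"DiD estimate of the report's effect: {did:.1f} patients/month")
```

The subtraction of the comparison-group change removes shared time trends, which is why the design isolates the effect of the report itself.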

Statistical methods in metabolomics

Muncey, Harriet Jane January 2014 (has links)
Metabolomics lies at the fulcrum of the systems biology 'omics'. Metabolic profiling offers researchers new insight into genetic and environmental interactions, responses to pathophysiological stimuli and novel biomarker discovery. Metabolomics lacks the simplicity of a single data-capturing technique; instead, increasingly sophisticated multivariate statistical techniques are required to tease out useful metabolic features from various complex datasets. In this work, two major metabolomics methods are examined: Nuclear Magnetic Resonance (NMR) Spectroscopy and Liquid Chromatography-Mass Spectrometry (LC-MS). MetAssimulo, a 1H-NMR metabolic-profile simulator, was developed in part by this author and is described in Chapter 2. Peak positional variation is a phenomenon occurring in NMR spectra that complicates metabolomic analysis, so Chapter 3 focuses on modelling the effect of pH on peak position. Analysis of LC-MS data is somewhat more complex given its 2-D structure, so I review existing pre-processing and feature detection techniques in Chapter 4 and then attempt to tackle the issue from a Bayesian viewpoint. A Bayesian Partition Model is developed to distinguish chromatographic peaks representing useful features from chemical and instrumental interference and noise. Another of the LC-MS pre-processing problems, data binning, is also explored as part of H-MS: a pre-processing algorithm incorporating wavelet smoothing and novel Gaussian and Exponentially Modified Gaussian peak detection. The performance of H-MS is compared alongside two existing pre-processing packages: apLC-MS and XCMS.
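The Exponentially Modified Gaussian mentioned above is a Gaussian convolved with an exponential decay, which captures the characteristic tailing of chromatographic peaks. A minimal sketch using SciPy's parameterisation (shape K = 1/(σλ)); the peak parameters here are illustrative, not values from the thesis:

```python
import numpy as np
from scipy.stats import norm, exponnorm

t = np.linspace(0, 10, 501)          # retention-time axis (arbitrary units)
mu, sigma = 4.0, 0.3                 # Gaussian centre and width
lam = 2.0                            # exponential decay rate (tailing)

gauss = norm.pdf(t, loc=mu, scale=sigma)
# SciPy's exponnorm uses the shape parameter K = 1 / (sigma * lambda)
emg = exponnorm.pdf(t, K=1.0 / (sigma * lam), loc=mu, scale=sigma)

# The EMG apex is shifted later and its right tail is heavier than the Gaussian's
print(f"Gaussian apex at t = {t[gauss.argmax()]:.2f}")
print(f"EMG apex at      t = {t[emg.argmax()]:.2f}")
```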

Statistical methods for elucidating copy number variation in high-throughput sequencing studies

Bellos, Evangelos January 2014 (has links)
Copy number variation (CNV) is pervasive in the human genome and has been shown to contribute significantly to phenotypic diversity and disease aetiology. High-throughput sequencing (HTS) technologies have allowed for the systematic investigation of CNV at an unprecedented resolution. HTS studies offer multiple distinct features that can provide evidence for the presence of CNV. We have developed an integrative statistical framework that jointly analyses multiple sequencing features at the population level to achieve sensitive and precise discovery of CNV. First, we applied our framework to low-coverage whole-genome sequencing experiments and used data from the 1000 Genomes Project to demonstrate a substantial improvement in CNV detection accuracy over existing methods. Next, we extended our approach to targeted HTS experiments, which offer improved cost-efficiency by focusing on a predetermined subset of the genome. Targeted HTS involves an enrichment step that introduces non-uniformity in sequencing coverage across target regions and thus hinders CNV identification. To that end, we designed a customized normalization procedure that counteracts the effects of enrichment bias and enhances the underlying CNV signal. Our extended framework was benchmarked on contiguous capture datasets, where it was shown to outperform competing strategies by a wide margin. Capture sequencing can also generate large amounts of data in untargeted genomic regions. Although these off-target results can be a valuable source of CNV evidence, they are subject to complex enrichment patterns that confound their interpretation. Therefore, we developed the first normalization strategy that can adapt to the highly heterogeneous nature of off-target capture and thus facilitate CNV investigation in untargeted regions. 
All in all, we present a generalized CNV detection toolset that has been shown to achieve robust performance across datasets and sequencing platforms and can therefore provide valuable insight into the prevalence and impact of CNV.
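Read depth is one of the sequencing features that can carry CNV evidence. A deliberately simplified, single-feature sketch of the idea, on simulated data: normalise per-window coverage against a population median and flag windows whose copy-number ratio departs from the diploid value of 2 (the thresholds and simulation are illustrative, not the thesis framework):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated read depth: 50 samples x 100 genomic windows, diploid baseline.
depth = rng.poisson(100, size=(50, 100)).astype(float)
depth[0, 40:45] *= 0.5           # sample 0 carries a heterozygous deletion

# Normalise each window by its population median to remove locus-specific
# bias, then scale so the diploid copy number is 2.
ratio = 2 * depth / np.median(depth, axis=0)

# Call windows in sample 0 whose copy number falls well below diploid.
deleted = np.where(ratio[0] < 1.5)[0]
print("Windows called as deleted in sample 0:", deleted)
```

Population-level normalisation is what makes the signal visible: each window's systematic coverage bias cancels out because it affects every sample at that locus.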

Space and time modelling of intra-urban air pollution

Tang, Ho Kin Robert January 2014 (has links)
Exposures to air pollution have adverse effects on health. Traditionally, epidemiological studies used monitoring data to investigate the relationship between air pollution and health. In recent decades, modelling tools have been developed to predict pollutant concentrations for population exposure assessments. Whilst gradual improvements have been made to these techniques, such as dispersion and land use regression (LUR), results have exhibited spatial inconsistencies at times. The processes involved are often time- and data-consuming, and outputs generally do not account for short-term variations in pollution. Improving model prediction capabilities can avoid exposure misclassification and provide better estimates for health risk assessment. The aim of this project is to increase the accuracy and efficiency of current exposure modelling techniques to capture the spatial and temporal variability of urban air pollution. As part of this study, air pollution models were developed in a GIS framework for London for PM10, NOX and NO2, using dispersion, LUR, hybrid and Bayesian statistical methods. Predictors derived from traffic, land use and population datasets were incorporated in a geographical information system for modelling. For the first time, newly available city-wide datasets were used to extract enhanced geographical variables, including building height/area, street canyons and detailed urban green space, which may have significant influence on pollution in the local dispersion environment. Developed models were cross-validated and compared to concentrations obtained from the routine monitoring network. LUR models were found to have higher prediction capabilities than other techniques, providing accurate explanations of spatial variability in urban air pollution. Significant improvements in model performance were seen with the addition of building and street configuration variables, particularly for traffic-related pollutants.
LUR models also make fewer computational demands than conventional dispersion methods and can therefore be easily applied over large urban areas. Introducing Bayesian statistical techniques enabled spatio-temporal predictions which account for uncertainties, allowing the detection of pollution trends and episodes.
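Land use regression, as described above, fits measured concentrations at monitoring sites against GIS-derived predictors and then predicts at unmonitored locations. A minimal least-squares sketch on simulated sites; the predictor names, coefficients and noise level are illustrative assumptions, not the thesis models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated monitoring sites: NO2 driven up by traffic, down by green space.
n = 60
traffic = rng.uniform(0, 1, n)        # scaled traffic intensity in buffer
green = rng.uniform(0, 1, n)          # green-space fraction in buffer
no2 = 20 + 30 * traffic - 10 * green + rng.normal(0, 2, n)

# Ordinary least squares: concentration ~ intercept + traffic + green
X = np.column_stack([np.ones(n), traffic, green])
beta, *_ = np.linalg.lstsq(X, no2, rcond=None)
print("Fitted coefficients (intercept, traffic, green):", beta.round(1))

# Predict at an unmonitored location: high traffic, little green space
site = np.array([1.0, 0.8, 0.1])
print(f"Predicted NO2: {site @ beta:.1f} ug/m3")
```

Once the coefficients are fitted, prediction is a single dot product per location, which is why LUR scales so cheaply across a whole city compared with dispersion modelling.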

The application of post-mortem computed tomography (PMCT) for the anthropological examination of juvenile remains

Brough, Alison Louise January 2014 (has links)
Post-mortem computed tomography (PMCT) is a non-invasive medical imaging technology that could be a valuable adjunct to traditional techniques in forensic practice. However, despite numerous theoretical advantages, integration of PMCT into forensic pathology, anthropology and odontology is currently restricted by the lack of scientific evidence. This thesis reviews the literature regarding the anthropological investigation of juvenile remains. The experimental chapters use PMCT images of the Scheuer Juvenile Skeletal Collection held in Dundee, a unique collection of remains that spans the full age range of the developing human, and cases from the PMCT image archive at the East Midlands Forensic Pathology Unit. Images were acquired using multi-detector CT scanners and analysed using OsiriX three-dimensional imaging software. This thesis considers 1) if anthropological measurements are reproducible using PMCT, 2) if PMCT-derived measurements are accurate compared with dry bone and orthopantomogram (OPT) examinations, 3) what images and data are required to conduct a full anthropological examination to determine an individual's biological profile using PMCT, and finally 4) how to format and display these images appropriately to facilitate data sharing, international interpretation and future development of this method. These techniques were also used in the anthropological investigation of Richard III. Using age as the principal parameter, and assessment of both long bones and dentition, I have shown that 1) measurements used in the most frequently applied forensic anthropology techniques can be extracted from PMCT data, 2) PMCT measurements are accurate and repeatable by multiple practitioners of various professional backgrounds and experience, and 3) the information required to conduct a comprehensive anthropological examination can be condensed into a concise two-page 'minimum data-set' form.
The results of this thesis provide new evidence to support the implementation of PMCT for anthropological examination in events requiring forensic investigation and disaster victim identification.
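Repeatability of measurements across practitioners, as tested above, is conventionally summarised in anthropometry with the technical error of measurement, TEM = sqrt(Σd²/2n) for two observers. A minimal sketch on illustrative paired measurements (the values are invented for the example, not thesis data):

```python
import math

# Illustrative femur-length measurements (mm) by two observers on 6 specimens.
obs1 = [412.0, 388.5, 401.2, 395.8, 420.3, 407.7]
obs2 = [411.4, 389.1, 400.8, 396.5, 419.8, 408.0]

n = len(obs1)
# Technical error of measurement for two observers: sqrt(sum(d^2) / 2n)
tem = math.sqrt(sum((a - b) ** 2 for a, b in zip(obs1, obs2)) / (2 * n))

# Relative TEM expresses the error as a percentage of the grand mean
grand_mean = (sum(obs1) + sum(obs2)) / (2 * n)
rtem = 100 * tem / grand_mean

print(f"TEM:  {tem:.2f} mm")
print(f"%TEM: {rtem:.3f} %")
```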

The triarchic conceptualisation of psychopathy

Evans, Lydia January 2014 (has links)
This thesis examines the triarchic conceptualisation of psychopathy, mainly through the operationalisation of a self-report tool, the Triarchic Psychopathy Measure (TriPM; Patrick, 2010). Following an introduction, chapter two presents a systematic review exploring how self-report measures compare against other well-validated measures of psychopathy: the Psychopathy Checklist-Revised (PCL-R; Hare, 2003) and the PCL:SV (Hart, Cox & Hare, 1995). The review demonstrates that whilst they have their strengths, self-report measures need further development. Chapter three details a critical review of the TriPM. This explores the reliability, validity and clinical application of the tool, and its strengths and limitations are discussed. Chapter four details an empirical research study testing the construct validity of the TriPM in a sample of personality disordered offenders. Analysis revealed significant relationships between the TriPM constructs and other conceptually relevant measures. Chapter five presents a case study detailing the assessment of personality and risk of intimate partner violence in an adult male community offender. A discussion of the work concludes the thesis.

New approaches to forensic analysis using Raman spectroscopy : portable instruments, gunshot residues and inks

Ho, Yen-Cheng January 2014 (has links)
In recent years, Raman spectroscopy has become popular in forensic science. However, most of the work in this area has used laboratory-based, research-grade instruments. The aim of this thesis was to establish new methods for forensic Raman analysis which would be suitable for use in laboratories and at crime scenes using commercial portable Raman instruments. Although these systems could potentially enable forensic field analysis, they have been designed with very limited sampling options, typically with very short focal length excitation/collection optics. This makes it difficult to actually use them as intended at crime scenes due to problems locating and focusing on the samples. Here, a new sampling arrangement, which involved changing the optical assemblies for excitation/collection and incorporating a webcam for sample viewing, was designed, constructed and tested. This gave a modified system that was actually usable as a handheld device. A new method for analysis of propellant and organic gunshot residue based on a combination of macroscopic observation of particle morphology and Raman spectroscopy for composition analysis was developed. The main obstacle in this work was fluorescence. However, washing with a 95:5 methanol:ethanol mixture removed the fluorescent interference from most propellants, so Raman spectroscopy could be used to identify the chemical components of propellant/organic GSR particles and distinguish them from background particles. Raman spectroscopy has also been successfully employed to identify the main pigments/dyes in pen inks and to discriminate between them. Blue gel and liquid ink pens were examined using 633 and 785 nm excitation; it was found that the discrimination given by either of these excitation wavelengths was similar to that given by the combination of both. In addition, the best way to discriminate between the pens was to combine normal Raman measurements on the pigment-based inks with SERS on the dye-based systems.
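Fluorescence of the kind described above appears as a broad, slowly varying baseline underneath the sharp Raman bands. Alongside chemical washing, a common computational remedy is iterative polynomial baseline subtraction; a minimal sketch on a simulated spectrum (band positions and the quadratic background are illustrative):

```python
import numpy as np

wavenumber = np.linspace(400, 1800, 700)

# Simulated spectrum: two sharp Raman bands on a broad fluorescent background.
peaks = (1000 * np.exp(-((wavenumber - 1004) / 6) ** 2)
         + 600 * np.exp(-((wavenumber - 1340) / 8) ** 2))
fluorescence = 5e-4 * (wavenumber - 400) ** 2 + 200
spectrum = peaks + fluorescence + np.random.default_rng(3).normal(0, 5, 700)

# Crude baseline estimate: fit a quadratic, then iteratively refit using only
# points at or below the current baseline, which suppresses the peaks' pull.
baseline = np.polyval(np.polyfit(wavenumber, spectrum, 2), wavenumber)
for _ in range(10):
    keep = spectrum <= baseline
    baseline = np.polyval(np.polyfit(wavenumber[keep], spectrum[keep], 2),
                          wavenumber)

corrected = spectrum - baseline
print(f"Strongest band after correction: "
      f"{wavenumber[corrected.argmax()]:.0f} cm^-1")
```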

Robust approaches for performing meta-analysis of genome-wide association studies to identify single nucleotide polymorphisms and copy number variations associated with complex traits

Charoen, Pimphen January 2013 (has links)
From 2007, there has been a huge proliferation in the discovery of genetic variants affecting human traits and diseases, achieved largely by the integration of multiple genome-wide association studies (GWAS) via meta-analysis. The principal objective of this thesis is to develop robust approaches for the meta-analysis of GWAS in order to reduce false positive findings and optimise statistical power. I consider both Single Nucleotide Polymorphism (SNP) and Copy Number Variant (CNV) GWAS. First, to gain background knowledge in GWAS and meta-analysis, I was involved, as the main statistical analyst, in a large-scale GWAS meta-analysis to identify genetic variants associated with alcohol consumption. This study provided me with the opportunity to investigate ways of reducing the probability of false positive findings via quality control procedures. The main discovery from the study was the identification of the Autism susceptibility candidate 2 gene (AUTS2) as associated with alcohol consumption at genome-wide significance. In the alcohol study, different phenotype transformations were applied to the data according to the inclusion or exclusion of non-drinkers, which led to the question of which transformation of skewed continuous phenotypes optimises statistical power in GWAS in general, forming the second major investigation in my thesis. It was shown that while the inverse normal transformation (INT) may not be the preferable choice of transformation in many epidemiological studies where effect sizes are large, its application to non-normal phenotypes in GWAS, where effect sizes are small and the priority is discovery over interpretability, may lead to an increase in the discovery of genetic variants affecting continuous traits. Finally, as knowledge about CNVs has accumulated in recent years, the meta-analysis of GWAS on CNVs has become a natural next step forward in the field.
Therefore, I investigated and developed an approach to enable CNV meta-analysis to proceed in a similar way to SNP meta-analysis. This approach was developed into a software package, cnvPipe, which was applied to investigate CNVs associated with height and weight in the meta-analysis setting.
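The rank-based inverse normal transformation discussed above maps a skewed phenotype onto standard normal quantiles before association testing. A minimal sketch; the offset constant c = 0.5 used here is one standard choice, and the simulated exponential trait is purely illustrative:

```python
import numpy as np
from scipy.stats import norm, skew

rng = np.random.default_rng(4)
phenotype = rng.exponential(scale=2.0, size=1000)   # right-skewed trait

def inverse_normal_transform(x, c=0.5):
    """Rank-based INT: map ranks 1..n to standard normal quantiles."""
    ranks = np.argsort(np.argsort(x)) + 1            # assumes no ties
    return norm.ppf((ranks - c) / len(x))

z = inverse_normal_transform(phenotype)
print(f"Skewness before INT: {skew(phenotype):.2f}")
print(f"Skewness after INT:  {skew(z):.2f}")
```

Because the transformed values depend only on ranks, the result is (near-)normal regardless of how skewed the original trait was, at the cost of interpretability of effect sizes.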

The relationship between adult anthropometry, morbidity and functional status : a longitudinal study among rural Ethiopians

Feyissa, Ferew Lemma January 2002 (has links)
No description available.
