  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.

Relationships between psychological factors and immune dysregulation in context : a life-course approach

Hammond, Catherine Campbell, January 2018
The thesis provides evidence about relationships between adverse exposures, psychological responses to them and immune dysregulation. The approach taken is informed by theories about the life-course, the stress process, the stress response and the inflammatory theory of depression. The first two empirical chapters provide evidence about the contribution of psychosocial factors to immune dysregulation. Immune dysregulation is measured by onsets of asthma and rheumatoid arthritis during adulthood. Comprehensive life-course data are used to provide valuable evidence about the epidemiology of each disease. More specifically, new evidence is provided about the psychosocial pathways that lead to disease onset. After adjustment for material adversities, social adversities predict onsets of each disease. Chronic as opposed to acute adversities are salient for rheumatoid arthritis onset, which is consistent with existing theory that chronic stress contributes to immune dysregulation. Depressive symptoms mediate an association between childhood adversity and asthma onset decades later. A small but consistent association between depressive symptoms and asthma onset soon afterwards may reflect psychological consequences of chronic inflammation preceding asthma diagnosis. The third empirical chapter tests prospective associations between chronic inflammation and depressive symptoms. It finds that chronic inflammation predicts depressive symptoms and provides new evidence that these associations are mediated by factors associated with sickness behaviours. Findings indicate the relevance of psychosocial pathways to the development of immune-mediated diseases and the potential involvement of immune behaviours in psychological symptoms. Practitioners and policy makers working with people who have conditions characterised by immune dysregulation should consider the psychological predictors and consequences of immune dysregulation. 
More research in this area is needed. It would be facilitated by developing, and including in surveys, well-validated measures of psychological and biological stress, and of the psychological and behavioural correlates of the sickness behaviours thought to be induced by inflammation.
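The mediation claim above (depressive symptoms mediating the path from childhood adversity to later asthma onset) can be illustrated with a toy product-of-coefficients calculation. This is a generic Baron-Kenny-style sketch with made-up numbers, not the thesis's models or data; the variables X (adversity), M (depressive symptoms) and Y (an asthma-onset risk score) are illustrative assumptions.

```python
# Product-of-coefficients mediation sketch: toy data, not the thesis data.
# X = childhood adversity, M = depressive symptoms, Y = asthma-onset risk score.

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def two_predictor_ols(y, x1, x2):
    """Slopes of y on x1 and x2 (with intercept), via the normal equations."""
    s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
    sy1, sy2 = cov(y, x1), cov(y, x2)
    det = s11 * s22 - s12 ** 2
    b1 = (sy1 * s22 - sy2 * s12) / det
    b2 = (sy2 * s11 - sy1 * s12) / det
    return b1, b2

X = [0, 1, 2, 3, 4, 5]
e = [1, -1, 1, -1, 1, -1]                    # deterministic "noise" so M and X are not collinear
M = [2 * x + n for x, n in zip(X, e)]        # a-path: adversity raises symptoms
Y = [3 * m + 0.5 * x for m, x in zip(M, X)]  # b-path plus a small direct effect

a = cov(X, M) / cov(X, X)                # exposure -> mediator
b, direct = two_predictor_ols(Y, M, X)   # mediator -> outcome, adjusting for exposure
indirect = a * b                         # mediated (indirect) effect
print(indirect, direct)
```

With exact linear toy data the b-path and direct effect are recovered exactly; in practice one would use survey data and add confounders to both regressions.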

Inequalities in mortality amenable to healthcare intervention in Scotland

Yates, Megan Amy, January 2018
Mortality amenable to healthcare intervention comprises premature deaths which, theoretically, should not occur in the presence of timely and effective health care. As Scotland has a universal health care system, in which care is provided free at the point of access to all residents, there should be no socioeconomic inequalities in rates of amenable mortality (AM). However, gradients in rates of AM have been found in many countries, using various measures of socioeconomic position. The routine monitoring of rates of AM, and of subgroups of amenable conditions, can contribute to an indicator of health care performance. Records of all deaths occurring between 1980 and 2013, records of hospitalisations for amenable conditions, and mid-year population estimates were used to calculate age-standardised rates of mortality and of incident hospitalisations. Absolute and relative inequalities in both rates for the total population were estimated using an area-based measure of material deprivation, the Carstairs index. Individual-level measures of socioeconomic position, such as educational attainment, were used to measure inequalities in death rates for a sample of the population, allowing some comparison with European countries. Rates of AM in Scotland and England were compared in two natural experiments in the final two chapters, which explore the direct and indirect effects of policy changes on health care systems' ability to prevent amenable deaths. Rates of AM in Scotland were found to be decreasing for both men and women. Mortality rates within two of the three subgroups of amenable conditions have also declined; the third had too few deaths to comment on trends. Rates of incident hospitalisations for amenable conditions between 1996 and 2013 remained relatively stable, suggesting that declining rates of AM may reflect improvements in the detection, treatment, and management of amenable conditions. Absolute and relative inequalities in mortality rates were largest when estimated using educational attainment, whilst occupational measures produced the smallest inequalities. The rate of decline in AM slowed in Scotland, relative to England, following devolution, although attempts to control adequately for differing levels of deprivation were unsuccessful. The final chapter found a step increase in rates of AM in England, compared to Scotland, following the publication of a White Paper for the Health and Social Care Act; however, this did not reach statistical significance. This thesis concludes that the continued study of amenable mortality in Scotland is worthwhile, given that mortality rates continued to decline against stable rates of incident hospitalisations, and that relative inequalities in mortality rates were increasing despite decreasing absolute inequalities. Monitoring inequalities in rates of AM offers the potential for weaknesses in the provision and delivery of care to be identified and corrected.
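The core calculations in this abstract (direct age standardisation, then absolute and relative inequality between deprivation groups) can be sketched in a few lines. The numbers below are made up for illustration, not Scottish data; the two age bands and the reference weights are assumptions.

```python
# Direct age standardisation, with absolute and relative inequality
# between a most- and least-deprived group. Toy numbers only.

def asr_per_100k(deaths, person_years, std_weights):
    """Directly age-standardised rate: weighted sum of age-specific rates."""
    assert abs(sum(std_weights) - 1.0) < 1e-12
    return 1e5 * sum(w * d / n for w, d, n in zip(std_weights, deaths, person_years))

std = [0.6, 0.4]   # reference population's age-band shares
most_deprived = asr_per_100k([30, 60], [100_000, 50_000], std)
least_deprived = asr_per_100k([10, 20], [100_000, 50_000], std)

rate_difference = most_deprived - least_deprived   # absolute inequality
rate_ratio = most_deprived / least_deprived        # relative inequality
print(most_deprived, least_deprived, rate_difference, rate_ratio)
```

Standardising both groups to the same reference age structure is what makes the difference and ratio comparable across areas with different age profiles.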

Combining statistical methods with dynamical insight to improve nonlinear estimation

Du, Hailiang, January 2009
Physical processes such as the weather are usually modelled using nonlinear dynamical systems, and statistical methods alone struggle to extract the dynamical information contained in observations of nonlinear dynamics. This thesis focuses on combining statistical methods with dynamical insight to improve nonlinear estimation of initial states, parameters and future states. In the perfect model scenario (PMS), a method based on Indistinguishable States theory is introduced to produce initial conditions that are consistent with both the observations and the model dynamics. Our methods are demonstrated to outperform the variational method Four-dimensional Variational Assimilation and the sequential method Ensemble Kalman Filter. The problem of parameter estimation for deterministic nonlinear models is considered within the perfect model scenario, where the mathematical structure of the model equations is correct but the true parameter values are unknown. Traditional methods such as least squares are known to be suboptimal, as they rest on the incorrect assumption that the distribution of forecast errors is Gaussian IID. We introduce two approaches to address the shortcomings of traditional methods. The first forms the cost function from probabilistic forecasting; the second focuses on the geometric properties of trajectories in the short term while noting the global behaviour of the model in the long term. Both methods are tested on a variety of nonlinear models, and the true parameter values are well identified. Outside the perfect model scenario, estimating the current state of the model requires accounting for uncertainty from both observational noise and model inadequacy. Methods that assume the model is perfect are either inapplicable or unable to produce optimal results. It is almost certain that no trajectory of the model is consistent with an infinite series of observations. There are pseudo-orbits, however, that are consistent with the observations, and these can be used to estimate the model states. The Indistinguishable States Gradient Descent algorithm, with certain stopping criteria, is introduced to find relevant pseudo-orbits. The difference between the Weakly Constrained Four-dimensional Variational Assimilation (WC4DVAR) method and the Indistinguishable States Gradient Descent method is discussed. Testing on two system-model pairs shows that our method produces more consistent results than the WC4DVAR method. An ensemble formed from the pseudo-orbit generated by the Indistinguishable States Gradient Descent method is shown to outperform the Inverse Noise ensemble in estimating current states. Outside the perfect model scenario, we demonstrate that forecasts with relevant adjustment can be better than forecasts that ignore the existence of model error and use the model directly. A measure based on probabilistic forecast skill is suggested for assessing predictability outside PMS.
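The pseudo-orbit idea above can be given a stylised illustration: take a noisy observed trajectory and relax it toward the model dynamics by gradient descent on the one-step mismatch cost C(u) = Σ_t (u_{t+1} − f(u_t))². The logistic map stands in for the nonlinear model here; this is a generic sketch of the relaxation step, not the thesis's Indistinguishable States Gradient Descent algorithm or its stopping criteria.

```python
# Gradient descent on the trajectory-mismatch cost of a candidate
# pseudo-orbit u, for the logistic map f(x) = A*x*(1-x). Illustrative only.

A = 3.7
def f(x):
    return A * x * (1.0 - x)
def fprime(x):
    return A * (1.0 - 2.0 * x)

def mismatch(u):
    return sum((u[t + 1] - f(u[t])) ** 2 for t in range(len(u) - 1))

def gd_step(u, lr):
    g = [0.0] * len(u)
    for t in range(len(u) - 1):
        r = u[t + 1] - f(u[t])           # one-step model mismatch
        g[t] += -2.0 * r * fprime(u[t])  # dC/du_t from the term it enters via f
        g[t + 1] += 2.0 * r              # dC/du_{t+1}
    return [x - lr * gx for x, gx in zip(u, g)]

# "Observations": a true orbit plus deterministic pseudo-noise.
truth = [0.4]
for _ in range(19):
    truth.append(f(truth[-1]))
obs = [x + 0.02 * ((-1) ** t) for t, x in enumerate(truth)]

u = obs[:]
c0 = mismatch(u)
for _ in range(500):
    u = gd_step(u, 0.01)
print(c0, mismatch(u))
```

The relaxed sequence u has much smaller dynamical mismatch than the raw observations, which is the sense in which a pseudo-orbit is "consistent with the model" while staying close to the data.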

Topics on statistical design and analysis of cDNA microarray experiment

Zhu, Ximin, January 2009
A microarray is a powerful tool for surveying the expression levels of many thousands of genes simultaneously. It belongs to the new genomics technologies, which have important applications in the biological, agricultural and pharmaceutical sciences. In this thesis, we focus on the dual-channel cDNA microarray, one of the most popular microarray technologies, and discuss three topics: optimal experimental design; dye effect normalization; and estimation of the proportion of true nulls, the local false discovery rate (lFDR) and the positive false discovery rate (pFDR). The first topic consists of four subtopics, each an independent and practical problem of cDNA microarray experimental design. In the first subtopic, we propose an optimization strategy based on simulated annealing to find optimal or near-optimal designs with both biological and technical replicates. In the second subtopic, we discuss how to apply the Q-criterion to factorial designs of microarray experiments. In the third subtopic, we suggest an optimal way of pooling samples, in effect a replication scheme that minimizes the variance of the experiment under the constraint of fixing the total cost at a certain level. In the fourth subtopic, we show that the existing criterion for distant-pair design is not proper and propose an alternative criterion. The second topic of this thesis is dye effect normalization. In cDNA microarray technology, each array compares two samples, usually labelled with the different dyes Cy3 and Cy5. The analysis assumes that, for a given gene (spot) on the array, if the Cy3-labelled sample has k times as much of a transcript as the Cy5-labelled sample, then the Cy3 signal should be k times as high as the Cy5 signal, and vice versa. This important assumption requires the dyes to have the same properties. In reality, however, the Cy3 and Cy5 dyes have slightly different properties, and the relative efficiency of the dyes varies across the intensity range in a "banana-shaped" way. To remove the dye effect, we propose a novel normalization method based on modelling dye response functions and the dye effect curve. Real and simulated microarray data sets are used to evaluate the method, and its performance is shown to be satisfactory. The third topic is the estimation of the proportion of true null hypotheses, lFDR and pFDR. In a typical microarray experiment, a large number of gene expression measurements are made. To find differentially expressed genes, these variables are usually screened simultaneously by a statistical test. Since this is a case of multiple hypothesis testing, some adjustment should be made to the p-values resulting from the test. Many multiple-testing error rates, such as FDR, lFDR and pFDR, have been proposed to address this issue. A key related problem is the estimation of the proportion of true null hypotheses (i.e. non-differentially-expressed genes). To model the distribution of the p-values, we propose three kinds of finite mixtures with an unknown number of components (the first component corresponds to differentially expressed genes and the remaining components to non-differentially expressed ones). We apply a new MCMC method called the allocation sampler to estimate the proportion of true nulls (i.e. the mixture weight of the first component). The method also provides a framework for estimating lFDR and pFDR. Two real microarray data studies plus a small simulation study are used to assess our method, and we show that its performance is satisfactory.
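The thesis estimates the proportion of true nulls with a Bayesian mixture model and an allocation sampler; as a simpler point of reference for the same quantity, here is a sketch of Storey's plug-in estimator. It relies only on the fact that null p-values are roughly uniform, so π0 ≈ #{p > λ} / ((1 − λ)·m); the toy p-values and λ = 0.5 are assumptions for illustration.

```python
# Storey-style plug-in estimators for the null proportion and the FDR
# at a rejection threshold. Toy p-values, not microarray data.

def storey_pi0(pvals, lam=0.5):
    """Null p-values are ~uniform, so those above lambda are mostly nulls."""
    m = len(pvals)
    return sum(1 for p in pvals if p > lam) / ((1.0 - lam) * m)

def fdr_at_threshold(pvals, t, pi0):
    """Estimated FDR when rejecting every p-value <= t."""
    m = len(pvals)
    rejected = max(1, sum(1 for p in pvals if p <= t))
    return min(1.0, pi0 * m * t / rejected)

# Toy data: 5 "signal" p-values plus 16 evenly spread "null" p-values.
pvals = [0.001] * 5 + [i / 16 for i in range(1, 17)]
pi0 = storey_pi0(pvals)   # true null share here is 16/21 ~ 0.76
print(pi0, fdr_at_threshold(pvals, 0.01, pi0))
```

The mixture-model approach in the thesis targets the same π0 but also yields component-wise quantities from which lFDR and pFDR estimates follow.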

Uncertainties in gender violence epidemiology

Andersson, Neil, January 2013
This thesis contains 11 papers published in peer-reviewed journals between 2006 and 2012. The papers focus on gender violence research methods, the prevalence of risk factors for gender violence, and its association with HIV and maternal morbidity. The accompanying commentary addresses three uncertainties that affect gender violence epidemiology: missing data, clustering and unrecognised causal relationships. In this thesis I ask: can we reduce these three uncertainties in gender violence epidemiology? A systematic review of the intimate partner violence literature over the last decade found that few epidemiological studies manage missing data in gender violence questionnaires satisfactorily. Focus groups in Zambia, Nigeria and Pakistan confirmed that missing data lead to underestimation of gender violence prevalence. A partial solution to this problem was to place greater emphasis on interviewer training. In a reanalysis of the data from the published papers I compared different approaches to dealing with clustering in gender violence epidemiology. Generalised linear mixed models and other methods showed that clustering potentially plays a causal role. This can be important in interventions that target a community at large and act throughout the cluster. In a reanalysis of several datasets I show how a history of gender violence influences the measurement of many associations related to HIV, possibly due to an unanticipated role of gender violence in the causal pathway to HIV. In conclusion, it is possible to reduce the uncertainties associated with missing data, clustering and unrecognised causality in gender violence epidemiology.
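One concrete way to see why clustering matters, as discussed above, is the intraclass correlation (ICC) and the design effect it implies for a cluster sample. This is a textbook one-way ANOVA sketch with toy clusters, not the thesis's generalised linear mixed models or its datasets; equal cluster sizes are assumed for simplicity.

```python
# ANOVA estimator of the intraclass correlation, and the design effect
# (variance inflation relative to simple random sampling). Toy data.

def icc_oneway(clusters):
    """ICC for equal-sized clusters: (MSB - MSW) / (MSB + (m-1)*MSW)."""
    k = len(clusters)        # number of clusters
    m = len(clusters[0])     # observations per cluster (equal sizes assumed)
    grand = sum(sum(c) for c in clusters) / (k * m)
    means = [sum(c) / m for c in clusters]
    msb = m * sum((cm - grand) ** 2 for cm in means) / (k - 1)
    msw = sum((x - cm) ** 2 for c, cm in zip(clusters, means) for x in c) / (k * (m - 1))
    return (msb - msw) / (msb + (m - 1) * msw)

clusters = [[1.0, 1.2, 0.8], [5.0, 5.2, 4.8], [9.0, 9.2, 8.8]]
icc = icc_oneway(clusters)
deff = 1 + (len(clusters[0]) - 1) * icc   # design effect for cluster size 3
print(icc, deff)
```

With clusters this homogeneous the ICC is near 1 and the design effect approaches the cluster size, which is why ignoring clustering badly understates standard errors.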

Incorporating value judgments in data envelopment analysis

Allen, Rachel, January 1997
Data Envelopment Analysis (DEA) is a linear programming technique for measuring the relative efficiencies of a set of Decision Making Units (DMUs). Each DMU uses the same set of inputs in differing amounts to produce the same set of outputs in differing quantities. Weights are freely allocated in order to reduce these multiple incommensurate inputs and outputs to a single measure of input and a single measure of output. The relative efficiency score of a DMU under Constant Returns to Scale is given by maximising the ratio of the sum of its weighted outputs to the sum of its weighted inputs, such that this ratio cannot exceed 1 for any DMU; the weights derived from the model are taken to represent the value attributed to the inputs and outputs of the assessment. It is well known in DEA that this free allocation of weights can lead to several problems in the analysis. Firstly, inputs and outputs can be virtually ignored in the assessment; secondly, relative relationships between the inputs or between the outputs can be ignored; and thirdly, relationships between the inputs and outputs can be violated. To avoid or overcome these problems, the Decision Maker's (DM's) value judgments are incorporated into the assessment. At present there is one main avenue for the inclusion of values, that of weights restrictions, whereby the sizes of the weights are explicitly restricted. Thus, to include the relative value of the inputs or outputs, the relative values of the weights for these related inputs or outputs are restricted. The popularity of this approach is mainly due to its simplicity and ease of use. The first aim of this thesis is therefore to demonstrate that, although the weights restrictions approach is appropriate for many DMs, for a variety of reasons some DMs may prefer an alternative form for the expression of their values, e.g. so that they can include local values in the assessment. With this in mind, the second aim of this thesis is to present a possible alternative approach for DMs to incorporate their values in a DEA assessment and, thirdly, it aims to utilise this alternative approach to improve envelopment. This alternative approach was derived by considering the basic concept of DEA: it relies solely on observed data to form the Production Possibility Set (PPS), and then uses the frontier of this PPS to derive a relative efficiency score for each DMU. It could be argued, therefore, that the reason some DMUs receive inappropriate relative efficiency scores is a lack of suitable DEA-efficient comparator DMUs. Thus, the proposed approach attempts to estimate suitable input-output levels for these missing DEA-efficient comparators, i.e. Unobserved DMUs. These Unobserved DMUs are based on the manipulation of observed input-output levels of specific DEA-efficient DMUs. The purpose of the Unobserved DMUs is to improve envelopment, and the specific DEA-efficient DMUs selected as a basis for them are those that delineate the DEA-efficient frontier from the DEA-inefficient frontier. Thus the proposed approach attempts to extend the observed PPS, while assuming that the values of the observed DEA-efficient DMUs are in line with the perceived views of the DM. The approach was successfully applied to a set of UK bank branches. To illustrate that no approach is all-purpose, and that each has its strengths and weaknesses and therefore its own areas of application, a brief comparison is made between the weights restrictions approach and the approach proposed in this thesis. This thesis is divided into three sections: A - Overview of the research area; B - An alternative perspective for incorporating values in DEA; C - The use of UDMUs to express the DM's values and improve envelopment.
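The CCR ratio model described above can be sketched for the simplest interesting case: one input and two outputs. With a single input, the optimal output weights matter only up to scale, so the LP optimum can be found by scanning weight directions (u, 1 − u) and normalising so that no DMU's ratio exceeds 1. The three toy DMUs are invented for illustration (not the thesis's bank-branch data), and a real assessment would solve the LP properly rather than scan a grid.

```python
# CCR ratio-model sketch, one input and two outputs, solved by scanning
# normalised weight directions. Toy DMUs only.

def ccr_efficiency(k, inputs, outputs, grid=1000):
    """Best ratio of weighted output to input, with no DMU's ratio above 1."""
    best = 0.0
    for i in range(grid + 1):
        w = i / grid
        u = (w, 1.0 - w)                      # candidate output weights (up to scale)
        ratios = [(u[0] * y1 + u[1] * y2) / x
                  for x, (y1, y2) in zip(inputs, outputs)]
        scale = max(ratios)                   # rescale so the best DMU scores exactly 1
        if scale > 0:
            best = max(best, ratios[k] / scale)
    return best

inputs = [1.0, 1.0, 1.0]
outputs = [(2.0, 1.0), (1.0, 2.0), (1.0, 1.0)]   # DMUs A, B, C
effs = [ccr_efficiency(k, inputs, outputs) for k in range(3)]
print(effs)   # A and B lie on the frontier; C is dominated
```

Note how each DMU is free to pick the weights that flatter it most: A loads everything onto its first output, B onto its second, while C can do no better than 2/3 under any weighting.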

Performance measurement in UK universities : bringing in the stakeholders' perspectives using data envelopment analysis

Sarrico, Claudia S., January 1998
This thesis is about performance measurement in higher education. It brings different stakeholders' perspectives to bear on performance measurement in UK universities using data envelopment analysis (DEA). The introduction gives the background and history of the higher education sector in the UK, introduces the drive for performance measurement in higher education, and sets out the motivation for the dissertation. The data envelopment analysis method is then described. The traditional use of performance indicators and peer assessment is reviewed, and the use of DEA, rather than parametric techniques, is justified. The opportunity to use DEA in a somewhat different way than previously is identified: a novel framework that integrates in the same analysis the perspectives of three different levels of stakeholders. These are, firstly, the perspective of the applicant choosing a university to apply to; secondly, the perspective of the State, which funds and evaluates university performance; and finally, the institutional perspective. From the applicant's perspective, the use of DEA in university selection is compared to existing methods. The new approach devised recognises the differing values of students and is empirically tested in a case study at a comprehensive school. This chapter clearly deals with a choice problem, and the link with MCDM is first approached here. Finally, a comprehensive decision support system for university selection that includes DEA is arrived at. The relationship between the State and higher education over time is then described, the current operational model explained, and future trends outlined. In order to measure performance according to the mission and objectives of the state/funding councils, a review of their three main remits is undertaken, and the contribution of DEA to informing the State/funding councils in their remit is discussed. The problem of taking account of subject mix in the measurement of performance is dealt with by linking the input/output divide by means of virtual weights restrictions. It is shown how institutions can turn performance measurement to their own benefit, using it as a formative exercise to understand the different expectations placed on them by the two external evaluations above. A methodology for institutional performance management is proposed that takes into account the external/internal interfaces: the applicant/institution and state/institution interfaces. The methodology is illustrated with an application to the University of Warwick. Virtual weights restrictions are widely used in this thesis, and a reflection on their uses is offered. The reasons for mainly using virtual rather than absolute weights restrictions are explained, the use of proportional weights restrictions is reviewed, and the reasons for using simple virtual weights and virtual assurance regions in this thesis are set out. Alternatives to virtual weights restrictions are considered, namely absolute weights restrictions with a virtual meaning. The relationship between DEA and MCDM in this domain is elaborated upon. Several conclusions are arrived at and novel contributions made to knowledge of the subject: the importance of bringing in the perspectives of different stakeholders in an integrated approach; the contribution of DEA to choice problems; handling subject mix by means of virtual assurance regions; the finding that data availability policy is inadequate; a more appropriate way of comparing departments within a university; and the superiority of virtual assurance regions for representing preference structures and linking the input-output divide.
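A stylised illustration of a virtual weights restriction: on top of a plain ratio-form efficiency scan (one input, two outputs), require that each output contribute at least 20% of the assessed DMU's virtual output, so no output can be effectively ignored. The two toy DMUs, the 20% floor, and the grid-scan solution method are all assumptions for illustration, not the thesis's formulation.

```python
# Ratio-form DEA sketch with a virtual weights restriction: each output
# must supply at least `min_share` of the assessed DMU's virtual output.

def efficiency(k, inputs, outputs, min_share=0.0, grid=1000):
    best = 0.0
    for i in range(grid + 1):
        w = i / grid
        u = (w, 1.0 - w)                      # candidate output weights (up to scale)
        virt = (u[0] * outputs[k][0], u[1] * outputs[k][1])
        total = virt[0] + virt[1]
        if total == 0 or min(virt) / total < min_share:
            continue                          # virtual weights restriction violated
        ratios = [(u[0] * y1 + u[1] * y2) / x
                  for x, (y1, y2) in zip(inputs, outputs)]
        best = max(best, ratios[k] / max(ratios))
    return best

inputs = [1.0, 1.0]
outputs = [(4.0, 0.1), (1.0, 1.0)]                # DMU 0 is strong on one output only
free = efficiency(0, inputs, outputs)             # can put ~all weight on output 1
restricted = efficiency(0, inputs, outputs, 0.2)  # must attach real value to both outputs
print(free, restricted)
```

The lopsided DMU scores 1.0 when weights are free but drops sharply once both outputs must carry virtual weight, which is exactly the behaviour such restrictions are meant to induce.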

Spatial multilevel modelling of cancer mortality in Europe

Davies, Carolyn A., January 2005
No description available.

An investigation of factors associated with traffic accident and casualty risk in Scotland

White, David Ian, January 2002
An investigation was conducted to identify factors associated with traffic accident involvement and traffic casualty involvement of road users in Scotland, to determine to what extent accident and casualty involvement are related, and so assist policy-makers in the allocation of scarce resources. Traffic accident involvement was identified for Scottish-resident vehicle drivers; traffic casualty involvement was identified for vulnerable road users, particularly child pedestrians. Traffic accident rates were determined from information provided by approximately one thousand Scottish-resident drivers who completed an extensive questionnaire on driving behaviours, together with their personal characteristics, socio-demographic data, and attitudes to road safety issues. This broad investigation revealed that traffic accident involvement is associated with personal characteristics, driving behaviour, and attitudes to road safety issues. There is no evidence of any area effect on the accident involvement of Scottish drivers, in terms of the administrative area in which they live, the relative affluence or deprivation of the area, or its population density. A detailed statistical analysis of STATS19 traffic accident data was conducted to determine casualty rates for different groups of road users in Lothian, Scotland, for the years 1991-97. This involved the development of a unique index of multiple deprivation suitable for both urban and rural areas. Traffic casualty rates were found to be positively associated with the level of deprivation and the population density at postcode-sector level. Analysis of injury-accident data identified that personal characteristics are also associated with casualty involvement for children aged 0-15 years. As with accident involvement, the influence of behavioural and attitudinal factors on casualty involvement needs to be examined. A significant finding from this study is that traffic accident risk and traffic casualty risk are not associated with the same factors: place of residence is significant in determining casualty risk, but has no significant effect on accident risk. Implications of this research are discussed and suitable recommendations are made.
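The index-construction step mentioned above can be sketched in its simplest form: z-score each deprivation indicator across areas and sum the scores, then rank areas by the total. The indicators, values and areas below are invented; the thesis's urban/rural index is more sophisticated than this generic sketch.

```python
# A minimal z-score deprivation index over toy areas. Higher indicator
# values are taken to mean more deprivation; all numbers are made up.

def zscores(values):
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - m) / sd for v in values]

areas = ["A", "B", "C", "D"]
unemployment = [12.0, 4.0, 8.0, 2.0]   # % rates
no_car = [55.0, 20.0, 35.0, 10.0]      # % households without a car
overcrowding = [9.0, 3.0, 6.0, 1.0]    # % overcrowded households

indicators = [unemployment, no_car, overcrowding]
z = [zscores(ind) for ind in indicators]
index = [sum(z[i][a] for i in range(len(indicators))) for a in range(len(areas))]
ranking = sorted(zip(index, areas), reverse=True)   # most deprived first
print(ranking)
```

Standardising before summing stops any one indicator's scale from dominating the index, the same reason the Carstairs-style indices combine standardised census variables.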

Four essays in international macroeconomics

Jiang, Shifu, January 2018
Chapter 1: We propose an integral correction mechanism to model real exchange rate dynamics. In estimation, we also allow for a Harrod-Balassa-Samuelson effect on the long-run equilibrium real exchange rate. Using data from 19 OECD countries, we find that the integral correction mechanism fits in-sample data significantly better than the popular smooth transition autoregression model. The special dynamics of the integral correction mechanism help explain the PPP puzzle by distinguishing mean-reversion speeds in the long and short run. However, the integral correction mechanism shows a significant out-of-sample forecast gain over the random walk in only a few cases, though the gain is robust across forecast horizons and quite large at long horizons. Chapter 2: This chapter evaluates the ability of a standard IRBC model, augmented with an input adjustment cost on imported goods, to explain different aspects of the real exchange rate: the standard deviation, the autocorrelation function, the spectrum and the integral correction mechanism. I find that the simple IRBC model with an appropriate calibration can capture all of these features well, with the input adjustment cost playing the key role. Compared to the standard model, it implies a reversed impulse response of the real exchange rate, with a fast return to steady state, and introduces a long-run cyclical movement in most macroeconomic variables. I find that this particular impulse response helps explain the PPP puzzle. Chapter 3: I study optimal unconventional monetary policy under commitment in a two-country model where domestic policy entails large spillovers to foreign countries. Equity injections into financial intermediaries turn out to be more efficient than discount window lending and the large-scale asset purchases that have been employed in many countries. Owing to precautionary effects of future crises, a central bank should exit from its policy more slowly than the speed of deleveraging in the financial sector. The optimal policy can change considerably if cross-country policy cooperation is not imposed: interventions then tend to be too strong in one country and too weak in the other. Gains from cooperation become positive if unconventional monetary policy is costly enough, and then correlate positively with the cost. Chapter 4: I consider the implementation of the optimal unconventional monetary policy outlined in Chapter 3. I find that the Ramsey policy is characterised by a simple rule responding to gaps in asset prices. However, it requires knowledge of the asset prices that would be realised in a world free of financial frictions, so it cannot be used to guide unconventional monetary policy in practice. The best practical simple rule responds to the credit spread with inertia.
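The integral correction idea in Chapter 1 can be given a stylised simulation: deviations of the (log) real exchange rate q_t from equilibrium q* are corrected both by the most recent deviation and by the accumulated (integral of) past deviations, producing damped, oscillatory mean reversion. The functional form, parameter values and starting point below are assumptions for illustration only, not the chapter's estimated specification.

```python
# Stylised integral correction mechanism: q reverts via a proportional
# term (alpha) and an integral term (gamma) on accumulated disequilibrium.

def simulate_icm(q0, qstar=0.0, alpha=-0.2, gamma=-0.02, periods=100):
    q, integral, path = q0, 0.0, [q0]
    for _ in range(periods):
        dev = q - qstar
        integral += dev                     # accumulated past disequilibrium
        q = q + alpha * dev + gamma * integral
        path.append(q)
    return path

path = simulate_icm(1.0)
print(path[0], path[-1])
```

With these parameters the deviation decays geometrically with mild oscillation, which is the sense in which such a mechanism can separate short-run from long-run mean-reversion speeds.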
