151

Benscaffold / Bonescaffold

From, Anna, Elo, Robin, Selldén, Emmy, Rasmussen, Liv, Nordh, Tim, Stadig, David, Westbom, Rickard January 2010 (has links)
The healing process of human bone can be aided by the presence of different materials. A material that is transformed into bone when in contact with bone is called a boneconductive material. By transforming into bone, the material reduces the time it takes to heal a fracture. This report is a study of existing boneconductive materials on the market to determine whether a material is available that would be suitable to test on larger fractures. The intended application is the healing of fractures up to 50 mm, a defect too large for the body to heal on its own.
152

ʼEmet ʼw ʼemwnah : Țabenqiyn mh̦ahek h̦alwșiym /

Kafkafi, Eyal, January 1900 (has links)
Revised text of: Diss. Ph. D.--History--Tel-Aviv University--Tel-Aviv, 1986. / Parallel title or statement of responsibility: Truth or faith : Yitzhak Tabenkin as mentor of Hakibbutz hameuhad / Eyal Kafkafi. Includes a summary in English. The half-title reads: "HaMerkaz lh̦eqer ʼEreș Yiśraʼel wYyišwbah šel Yad Yișh̦aq Ben-Șbiy wʼWniybersiyțat Tel-ʼAbiyb" Index.
153

An ecological survey of the scleractinian coral community at Hoi Ha Wan, Hong Kong /

Cope, Margaret Anne. January 1984 (has links)
Thesis (M. Phil.)--University of Hong Kong, 1985.
154

Combining statistical methods with dynamical insight to improve nonlinear estimation

Du, Hailiang January 2009 (has links)
Physical processes such as the weather are usually modelled using nonlinear dynamical systems. Statistical methods alone struggle to extract dynamical information from observations of nonlinear dynamics. This thesis focuses on combining statistical methods with dynamical insight to improve the nonlinear estimation of initial states, parameters and future states. In the perfect model scenario (PMS), a method based on the Indistinguishable States theory is introduced to produce initial conditions that are consistent with both the observations and the model dynamics. Our methods are demonstrated to outperform the variational method, Four-dimensional Variational Assimilation, and the sequential method, the Ensemble Kalman Filter. The problem of parameter estimation for deterministic nonlinear models is considered within the perfect model scenario, where the mathematical structure of the model equations is correct but the true parameter values are unknown. Traditional methods such as least squares are known to be suboptimal, as they rest on the incorrect assumption that the distribution of forecast errors is Gaussian IID. We introduce two approaches to address the shortcomings of traditional methods. The first approach forms the cost function based on probabilistic forecasting; the second focuses on the geometric properties of trajectories in the short term while noting the global behaviour of the model in the long term. Both methods are tested on a variety of nonlinear models, and the true parameter values are well identified. Outside the perfect model scenario, estimating the current state of the model requires accounting for uncertainty from both observational noise and model inadequacy. Methods that assume the model is perfect are either inapplicable or unable to produce optimal results. It is almost certain that no trajectory of the model is consistent with an infinite series of observations. There are pseudo-orbits, however, that are consistent with the observations, and these can be used to estimate the model states. The Indistinguishable States Gradient Descent algorithm, with certain stopping criteria, is introduced to find relevant pseudo-orbits. The difference between the Weak Constraint Four-dimensional Variational Assimilation (WC4DVAR) method and the Indistinguishable States Gradient Descent method is discussed. By testing on two system-model pairs, our method is shown to produce more consistent results than the WC4DVAR method. An ensemble formed from the pseudo-orbit generated by the Indistinguishable States Gradient Descent method is shown to outperform the Inverse Noise ensemble in estimating the current states. Outside the perfect model scenario, we also demonstrate that forecasting with relevant adjustment can produce better forecasts than ignoring the existence of model error and using the model directly. A measure based on probabilistic forecast skill is suggested for assessing predictability outside PMS.
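To make the pseudo-orbit idea above concrete, the following minimal Python sketch (not taken from the thesis; the logistic map, noise level, step size and stopping rule are all illustrative assumptions) applies gradient descent to the one-step dynamical mismatch of a noisy observed sequence, producing a pseudo-orbit more consistent with the model dynamics than the raw observations.

# Toy sketch of the pseudo-orbit idea: gradient descent on the mismatch
# C(u) = sum_t (u_{t+1} - f(u_t))^2, initialised at the observations.
import numpy as np

def f(x, r=3.7):
    # Logistic map, standing in for "the model".
    return r * x * (1.0 - x)

def df(x, r=3.7):
    # Derivative of the map, needed for the gradient of the mismatch.
    return r * (1.0 - 2.0 * x)

def mismatch(u):
    # Total squared one-step model error along the sequence u.
    return np.sum((u[1:] - f(u[:-1])) ** 2)

def grad_mismatch(u):
    e = u[1:] - f(u[:-1])                # one-step model errors
    g = np.zeros_like(u)
    g[1:] += 2.0 * e                     # d C / d u_{t+1}
    g[:-1] += -2.0 * e * df(u[:-1])      # d C / d u_t
    return g

rng = np.random.default_rng(0)
truth = np.empty(50)
truth[0] = 0.4
for t in range(49):
    truth[t + 1] = f(truth[t])
obs = truth + rng.normal(0.0, 0.01, size=50)   # noisy observations

u = obs.copy()                  # initialise the pseudo-orbit at the observations
for _ in range(500):            # early stopping keeps u close to the observations
    u -= 1e-3 * grad_mismatch(u)

print(f"mismatch: observations {mismatch(obs):.4f} -> pseudo-orbit {mismatch(u):.4f}")

Stopping the descent early, in the spirit of the stopping criteria mentioned in the abstract, keeps the pseudo-orbit close to the observations while reducing the dynamical mismatch.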
155

Topics on statistical design and analysis of cDNA microarray experiment

Zhu, Ximin January 2009 (has links)
A microarray is a powerful tool for surveying the expression levels of many thousands of genes simultaneously. It is one of the new genomics technologies with important applications in the biological, agricultural and pharmaceutical sciences. In this thesis, we focus on the dual-channel cDNA microarray, one of the most popular microarray technologies, and discuss three topics: optimal experimental design; dye effect normalization; and estimation of the proportion of true nulls, the local false discovery rate (lFDR) and the positive false discovery rate (pFDR). The first topic consists of four subtopics, each addressing an independent, practical problem in cDNA microarray experimental design. In the first subtopic, we propose an optimization strategy based on simulated annealing to find optimal or near-optimal designs with both biological and technical replicates. In the second subtopic, we discuss how to apply the Q-criterion to the factorial design of microarray experiments. In the third subtopic, we suggest an optimal way of pooling samples, which is in effect a replication scheme that minimizes the variance of the experiment while holding the total cost fixed. In the fourth subtopic, we show that the existing criterion for distant-pair design is not appropriate and propose an alternative criterion. The second topic of this thesis is dye effect normalization. In cDNA microarray technology, each array compares two samples, which are usually labelled with different dyes, Cy3 and Cy5. The technology assumes that, for a given gene (spot) on the array, if the Cy3-labelled sample has k times as much of a transcript as the Cy5-labelled sample, then the Cy3 signal should be k times as high as the Cy5 signal, and vice versa. This important assumption requires that the dyes have the same properties. In reality, however, the Cy3 and Cy5 dyes have slightly different properties, and the relative efficiency of the dyes varies across the intensity range in a "banana-shaped" way. In order to remove the dye effect, we propose a novel dye effect normalization method based on modelling dye response functions and the dye effect curve. Real and simulated microarray data sets are used to evaluate the method, and the results show that its performance is satisfactory. The third topic is the estimation of the proportion of true null hypotheses, lFDR and pFDR. In a typical microarray experiment, a large number of gene expression measurements are made. In order to find differentially expressed genes, these variables are usually screened simultaneously with a statistical test. Since this is a case of multiple hypothesis testing, some adjustment should be made to the p-values resulting from the test. Many multiple-testing error rates, such as FDR, lFDR and pFDR, have been proposed to address this issue. A key related problem is the estimation of the proportion of true null hypotheses (i.e. non-expressed genes). To model the distribution of the p-values, we propose three kinds of finite mixture with an unknown number of components (the first component corresponds to differentially expressed genes and the remaining components to non-differentially expressed ones). We apply a new MCMC method called the allocation sampler to estimate the proportion of true nulls (i.e. the mixture weight of the first component). The method also provides a framework for estimating lFDR and pFDR.
Two real microarray data studies plus a small simulation study are used to assess our method. We show that the performance of the proposed method is satisfactory.
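For readers unfamiliar with the quantities named above, the standard two-group mixture representation of the p-value distribution (background notation from the multiple-testing literature, not the thesis's specific models) is

\[
  f(p) \;=\; \pi_0 \cdot 1 \;+\; (1 - \pi_0)\, f_1(p), \qquad 0 \le p \le 1,
\]

where \(\pi_0\) is the proportion of true nulls (whose p-values are uniform on \([0,1]\)) and \(f_1\) is the density of p-values from differentially expressed genes. In this notation the local and positive false discovery rates are

\[
  \mathrm{lFDR}(p) \;=\; \frac{\pi_0}{f(p)}, \qquad
  \mathrm{pFDR}(\gamma) \;=\; \frac{\pi_0\, \gamma}{\Pr(P \le \gamma)}
\]

for a rejection region \([0, \gamma]\), the pFDR form assuming independent, identically distributed tests.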
156

Uncertainties in gender violence epidemiology

Andersson, Neil January 2013 (has links)
This thesis contains 11 papers published in peer reviewed journals between 2006 and 2012. The papers focused on gender violence research methods, the prevalence of risk factors for gender violence, and its association with HIV and maternal morbidity. The accompanying commentary addresses three uncertainties that affect gender violence epidemiology. These are missing data, clustering and unrecognised causal relationships. In this thesis I ask: Can we reduce these three uncertainties in gender violence epidemiology? A systematic review of the intimate partner violence literature over the last decade found that few epidemiological studies manage missing data in gender violence questionnaires in a satisfactory way. Focus groups in Zambia, Nigeria and Pakistan confirmed that missing data lead to underestimation of gender violence prevalence. A partial solution to this problem was to place greater emphasis on interviewer training. In a reanalysis of the data from the published papers I compared different approaches to dealing with clustering in gender violence epidemiology. Generalised linear mixed models and other methods found that clustering potentially plays a causal role. This can be important in interventions that target a community at large, and act throughout the cluster. In a reanalysis of several datasets I show how a history of gender violence influences measurement of many associations related to HIV, possibly due to an unanticipated role of gender violence in the causal pathway with HIV. In conclusion, it is possible to reduce the uncertainties associated with missing data, clustering, and unrecognised causality in gender violence epidemiology.
157

Incorporating value judgments in data envelopment analysis

Allen, Rachel January 1997 (has links)
Data Envelopment Analysis (DEA) is a linear programming technique for measuring the relative efficiencies of a set of Decision Making Units (DMUs). Each DMU uses the same set of inputs in differing amounts to produce the same set of outputs in differing quantities. Weights are freely allocated in order to allow these multiple incommensurate inputs and outputs to be reduced to a single measure of input and a single measure of output. The relative efficiency score of a DMU under Constant Returns to Scale is given by maximising the ratio of the sum of its weighted outputs to the sum of its weighted inputs, such that this ratio cannot exceed 1 for any DMU; the weights derived from the model are taken to represent the value attributed to the inputs and outputs of the assessment. It is well known in DEA that this free allocation of weights can lead to several problems in the analysis: firstly, inputs and outputs can be virtually ignored in the assessment; secondly, any relative relationships between the inputs or between the outputs can be ignored; and thirdly, any relationships between the inputs and outputs can be violated. To overcome these problems, the Decision Maker's (DM) value judgments are incorporated into the assessment. At present there is one main avenue for the inclusion of values, that of weights restrictions, whereby the sizes of the weights are explicitly restricted. Thus, to include the relative value of the inputs or outputs, the relative values of the weights for these related inputs or outputs are restricted. The popularity of this approach is mainly due to its simplicity and ease of use. The first aim of this thesis is, therefore, to demonstrate that, although the weights restrictions approach is appropriate for many DMs, for a variety of reasons some DMs may prefer an alternative form for the expression of their values, e.g. so that they can include local values in the assessment. With this in mind, the second aim of this thesis is to present a possible alternative approach for DMs to incorporate their values in a DEA assessment; the third aim is to utilise this alternative approach to improve envelopment. This alternative approach was derived by considering the basic concept of DEA, which relies solely on observed data to form the Production Possibility Set (PPS) and then uses the frontier of this PPS to derive a relative efficiency score for each DMU. It could be argued, therefore, that the reason some DMUs receive inappropriate relative efficiency scores is the lack of suitable DEA-efficient comparator DMUs. Thus, the proposed approach attempts to estimate suitable input-output levels for these missing DEA-efficient comparator DMUs, i.e. Unobserved DMUs. These Unobserved DMUs are based on the manipulation of observed input-output levels of specific DEA-efficient DMUs. The aim of using these Unobserved DMUs is to improve envelopment, and the specific DEA-efficient DMUs selected as a basis for the Unobserved DMUs are those that delineate the DEA-efficient frontier from the DEA-inefficient frontier. So, the proposed approach attempts to extend the observed PPS, while assuming that the values of the observed DEA-efficient DMUs are in line with the perceived views of the DM. The approach was successfully applied to a set of UK bank branches.
To illustrate that no approach is all-purpose, and that each has its own strengths, weaknesses and, therefore, areas of application, a brief comparison is made between the weights restrictions approach and the approach proposed in this thesis. The thesis is divided into three sections: A - Overview of the research area; B - An alternative perspective for incorporating values in DEA; C - The use of Unobserved DMUs (UDMUs) to express the DM's values to improve envelopment.
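For reference, the ratio model sketched in the abstract is the standard constant-returns-to-scale (CCR) formulation of DEA; in conventional notation (a background restatement, not an excerpt from the thesis), the relative efficiency of the assessed unit \(j_0\), with inputs \(x_{ij}\) and outputs \(y_{rj}\), is

\[
  h_{j_0} \;=\; \max_{u, v}\;
  \frac{\sum_{r} u_r\, y_{r j_0}}{\sum_{i} v_i\, x_{i j_0}}
  \quad \text{subject to} \quad
  \frac{\sum_{r} u_r\, y_{r j}}{\sum_{i} v_i\, x_{i j}} \;\le\; 1
  \;\;\text{for every DMU } j,
  \qquad u_r,\, v_i \ge 0 .
\]

The free choice of the weights \(u_r\) and \(v_i\) is what permits the problems listed in the abstract; weights restrictions bound these weights directly, whereas the Unobserved-DMU approach instead extends the Production Possibility Set with constructed units.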
158

Performance measurement in UK universities : bringing in the stakeholders' perspectives using data envelopment analysis

Sarrico, Claudia S. January 1998 (has links)
This thesis is about performance measurement in higher education. It brings in different stakeholders' perspectives on performance measurement in UK universities using data envelopment analysis (DEA). The introduction gives the background and history of the higher education sector in the UK, introduces the drive for performance measurement in higher education, and sets out the motivation for the dissertation. The DEA method is then described: the traditional use of performance indicators and peer assessment is reviewed, the use of DEA instead of parametric techniques is justified, and the opportunity to use DEA in a somewhat different way than previously is identified. The proposed framework is novel in integrating the perspectives of three different levels of stakeholders in the same analysis: firstly, the perspective of the applicant in the process of choosing a university to apply to; secondly, the perspective of the State that funds and evaluates university performance; and finally, the institutional perspective. In the applicant's perspective, the use of DEA in university selection is compared to existing methods. The new approach devised recognises the different values of students and is empirically tested in a case study at a comprehensive school. This chapter clearly deals with a choice problem, and the link with MCDM is first approached here. Finally, a comprehensive decision support system that includes DEA for university selection is arrived at. The relationship between the State and higher education over time is then described, the current operational model explained and future trends outlined. In order to measure performance according to the mission and objectives of the State/funding councils, a review of their three main remits is undertaken, and the contribution of DEA to informing the State/funding councils in their remit is discussed. The problem of taking the subject-mix factor into account in the measurement of performance is dealt with by linking the input/output divide by means of virtual weights restrictions. It is shown how institutions can turn performance measurement to their own benefit, by using it as a formative exercise to understand the different expectations placed on them by the two previous external evaluations. A methodology for institutional performance management is proposed that takes into account the external/internal interfaces: the applicant/institution and State/institution interfaces. The methodology is illustrated with an application to the University of Warwick. Since virtual weights restrictions are used widely in this thesis, a reflection on their uses is offered: the reasons for mainly using virtual rather than absolute weights restrictions are explained, the use of proportional weights restrictions is reviewed, and the reasons for using simple virtual weights and virtual assurance regions in this thesis are set out. Alternatives to using virtual weights restrictions are considered, namely using absolute weights restrictions with a virtual meaning. The relationship between DEA and MCDM in this domain is elaborated upon.
Several conclusions are arrived at and novel contributions are made to knowledge of the subject: the importance of bringing in the perspectives of different stakeholders in an integrated approach; the contribution of DEA to choice problems; the handling of subject mix by means of virtual assurance regions; the finding that current data availability policy is inadequate; a more appropriate way of comparing departments within a university; and the superiority of virtual assurance regions for representing preference structures and linking the input-output divide.
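As a rough illustration of the kind of constraint being discussed (a standard virtual weights restriction from the DEA literature, not a formula quoted from the thesis), a virtual assurance region bounds the share of total virtual output attributed to output \(r\) for the assessed unit \(j_0\):

\[
  \phi_r^{L} \;\le\; \frac{u_r\, y_{r j_0}}{\sum_{s} u_s\, y_{s j_0}} \;\le\; \phi_r^{U},
\]

where \(\phi_r^{L}\) and \(\phi_r^{U}\) are bounds elicited from the stakeholders; analogous constraints on virtual inputs, or constraints relating virtual inputs to virtual outputs, are one way such restrictions can link the input/output divide.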
159

Spatial multilevel modelling of cancer mortality in Europe

Davies, Carolyn A. January 2005 (has links)
No description available.
160

Language academies, language planning, and the case of the Hebrew revival

Nahir, Moshe. January 1900 (has links)
Thesis (Ph. D.)--University of Pittsburgh, 1973. / Includes bibliographical references (leaves 215-226).
