  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
461

Development Of A GIS Software For Evaluating Network Reliability Of Lifelines Under Seismic Hazard

Oduncucuoglu, Lutfi 01 December 2010 (has links) (PDF)
Lifelines are vital networks, and it is important that they remain functional after major natural disasters such as earthquakes. The goal of this study is to develop GIS software for evaluating the network reliability of lifelines under seismic hazard. In this study, GIS, statistics and facility management are used together, and a GIS software module that constructs GIS-based reliability maps of lifeline networks is developed using GeoTools. The developed module imports seismic hazard and lifeline network layers in GIS formats through the GeoTools libraries and, after creating a gridded network structure, applies a network reliability algorithm, initially developed by Yoo and Deo (1988), to calculate the upper and lower bounds of lifeline network reliability under seismic hazard. It can also display the results graphically and save them in shapefile format. To validate the developed application, results were compared with the earlier case study of Selcuk (2000) and found to be satisfactorily close. As a result of this study, an easy-to-use, GIS-based software module that creates GIS-based reliability maps of lifelines under seismic hazard was developed.
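The Yoo and Deo bound algorithm itself is not reproduced in the abstract. As a rough illustration of what "network reliability under seismic hazard" computes, here is a minimal Monte Carlo sketch of s-t connectivity reliability; the toy network and the per-link survival probabilities (which in the thesis would come from the imported seismic hazard layer) are assumptions of this sketch, not the thesis's data.

```python
import random

def network_reliability_mc(nodes, edges, p_survive, s, t, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that nodes s and t stay connected.

    edges: list of (u, v) links; p_survive[i]: survival probability of edge i
    (in a seismic setting these would be derived from the hazard/fragility model).
    """
    rng = random.Random(seed)
    connected = 0
    for _ in range(trials):
        # Sample which links survive this trial
        alive = [e for e, p in zip(edges, p_survive) if rng.random() < p]
        # Depth-first search from s over surviving links
        adj = {n: [] for n in nodes}
        for u, v in alive:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = {s}, [s]
        while stack:
            u = stack.pop()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        connected += t in seen
    return connected / trials

# Tiny 4-node grid with two independent s-t paths, each link surviving with 0.9
nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 3), (0, 2), (2, 3)]
r = network_reliability_mc(nodes, edges, [0.9] * 4, s=0, t=3)
```

For this two-path toy grid the exact reliability is 1 − (1 − 0.9²)² ≈ 0.964, which makes a handy sanity check on the simulated estimate.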
462

An Effectiveness Evaluation Method For Airburst Projectiles

Saygin, Oktay 01 May 2011 (has links) (PDF)
Airburst projectiles increase the effectiveness of air defense by forming clouds of small pellets. In this work, in order to evaluate the effectiveness of airburst projectiles, the Single Shot Kill Probability (SSKP) is computed at different burst distances using three lethality functions defined from different measures of effectiveness. These measures are target coverage, the number of sub-projectile hits on the target, and the kinetic energy of the sub-projectiles after burst. Computations are carried out for two different sub-projectile distribution patterns, namely circular and ring patterns. For the determination of miss distance, a Monte Carlo simulation is implemented that uses Modified Point Mass Model (MPMM) trajectory equations. Based on the results obtained, the two distribution patterns are compared in terms of effectiveness, and the optimum burst distance of each pattern is determined at different ranges.
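The abstract's SSKP machinery (MPMM trajectories, coverage and kinetic-energy lethality functions, circular vs. ring patterns) is not spelled out. The sketch below shows only the skeleton of the idea with the simplest of the three measures, hit count: sample a miss distance, scatter pellets around it, and count the fraction of shots with at least one pellet on target. Every number (target radius, aim dispersion, the linear growth of pellet spread with burst distance) is a hypothetical stand-in, not a value from the thesis.

```python
import math
import random

def sskp(burst_dist, n_pellets, trials=20000, seed=2):
    """Illustrative single-shot kill probability for an airburst round.

    Miss distance is sampled from a Gaussian (a stand-in for the MPMM
    trajectory Monte Carlo in the thesis); a pellet 'kills' if it lands
    inside the target radius. All parameters are hypothetical.
    """
    rng = random.Random(seed)
    target_radius = 1.0          # m, hypothetical target size
    aim_sigma = 0.5              # m, dispersion of the aim point
    spread = 0.3 * burst_dist    # pellet cloud grows with burst distance
    kills = 0
    for _ in range(trials):
        # Miss distance of the burst point relative to the target center
        mx, my = rng.gauss(0, aim_sigma), rng.gauss(0, aim_sigma)
        hit = False
        for _ in range(n_pellets):
            px = mx + rng.gauss(0, spread)
            py = my + rng.gauss(0, spread)
            if math.hypot(px, py) <= target_radius:
                hit = True
                break
        kills += hit
    return kills / trials

# Bursting close to the target concentrates the cloud; bursting far disperses it
p_near = sskp(burst_dist=2.0, n_pellets=10)
p_far = sskp(burst_dist=20.0, n_pellets=10)
```

Sweeping `burst_dist` over a grid and taking the argmax of the resulting SSKP curve mirrors, in miniature, the optimum-burst-distance determination described in the abstract.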
463

Mixed Effects Models For Time Series Gene Expression Data

Erkan, Ibrahim 01 December 2011 (has links) (PDF)
Experimental factors such as cell type and treatment may have different impacts on the expression levels of individual genes, which are quantitative measurements from microarrays. The measurements can be collected at a few unevenly spaced time points with replicates. The aim of this study is to consider cell type, treatment and short time series attributes and to infer their effects on individual genes. A mixed effects model (LME) was proposed for the gene expression data, and the performance of the model was validated by a simulation study. Realistic data sets were generated preserving the structure of the sample real-life data studied by Nymark et al. (2007). Predictive performance of the model was evaluated by measures such as accuracy, sensitivity and specificity, and compared to the competing method of Smyth (2004), namely Limma. Both methods were also compared on real-life data. Simulation results showed that the predictive performance of LME is as high as 99%, with a False Discovery Rate (FDR) as low as 0.4%, whereas Limma has an FDR of at least 32%. Moreover, LME has almost 99% predictive capability on the continuous time parameter, where Limma achieves only about 67% and cannot even handle continuous independent variables.
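The performance measures used to compare LME and Limma (accuracy, sensitivity, specificity, FDR) all come from the same confusion-matrix counts of truly/falsely flagged genes. A minimal helper makes the definitions concrete; the counts in the example are hypothetical, not the thesis's simulation results.

```python
def performance(tp, fp, tn, fn):
    """Confusion-matrix summaries for a differential-expression gene list."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "fdr": fp / (tp + fp),           # false discovery rate
    }

# Hypothetical counts for one simulated gene list of 1000 genes
m = performance(tp=198, fp=2, tn=780, fn=20)
```

Note that FDR is a property of the *called* gene list (false positives among positives called), which is why a method can have high accuracy yet a large FDR when few genes are truly differentially expressed.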
464

A method to establish non-informative prior probabilities for risk-based decision analysis

Min, Namhong 28 April 2014 (has links)
In Bayesian decision analysis, uncertainty and risk are accounted for with probabilities for the possible states, or states of nature, that affect the outcome of a decision. Application of Bayes’ theorem requires non-informative prior probabilities, which represent the probabilities of states of nature for a decision maker under complete ignorance. These prior probabilities are then updated with any and all available information in assessing probabilities for making decisions. The conventional approach to the non-informative probability distribution is based on Bernoulli’s principle of insufficient reason. This principle assigns a uniform distribution to uncertain states when a decision maker has no information about the states of nature. The principle of insufficient reason has three difficulties: it may inadvertently provide a biased starting point for decision making, it does not provide a consistent set of probabilities, and it violates reasonable axioms of decision theory. The first objective of this study is to propose and describe a new method to establish non-informative prior probabilities for decision making under uncertainty. The proposed decision-based method focuses on decision outcomes, which include preferences among decision alternatives and decision consequences. The second objective is to evaluate the logical basis and rationality of the proposed decision-based method. The decision-based method overcomes the three weaknesses associated with the principle of insufficient reason and provides an unbiased starting point for decision making. It also produces consistent non-informative probabilities. Finally, the decision-based method satisfies axioms of decision theory that characterize the case of no information (or complete ignorance). The third and final objective is to demonstrate the application of the decision-based method to practical decision making problems in engineering.
Four major practical implications are illustrated and discussed with these examples. First, the method is practical because it is feasible in decisions with a large number of decision alternatives and states of nature, and it is applicable to both continuous and discrete random variables of finite and infinite ranges. Second, the method provides an objective way to establish non-informative prior probabilities that capture a highly nonlinear relationship between states of nature. Third, any available information can be included through Bayes’ theorem by updating the non-informative probabilities, without assuming more than is actually contained in the information. Lastly, two different decision making problems with the same states of nature may have different non-informative probabilities.
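The updating step the abstract refers to is plain Bayes' theorem over a discrete set of states of nature. The sketch below shows that step starting from the uniform prior of the principle of insufficient reason (the very starting point the dissertation argues can be biased); the three states and the likelihood values are hypothetical.

```python
def bayes_update(prior, likelihood):
    """Posterior over states via Bayes' theorem: p(s | d) ∝ p(d | s) p(s)."""
    joint = [p * l for p, l in zip(prior, likelihood)]
    z = sum(joint)  # normalizing constant p(d)
    return [j / z for j in joint]

# Three states of nature; insufficient reason assigns the uniform prior
uniform = [1 / 3, 1 / 3, 1 / 3]
likelihood = [0.7, 0.2, 0.1]   # hypothetical p(data | state)
posterior = bayes_update(uniform, likelihood)
```

Starting from a uniform prior, the posterior is simply the normalized likelihood; a decision-based non-informative prior would generally weight the states unequally, and the same update rule applies unchanged.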
465

Beam-Foil Studies of Atomic Mean-Lives in the Vacuum Ultraviolet

Rathmann, Peter Walden January 1981 (has links)
The beam-foil method was used to determine the mean-lives of excited atomic states. Initial studies were done on states of the helium- and hydrogen-like ions B IV and B V, with the mean-lives determined by fitting the decay curves to sums of exponential terms. Since theoretical values of the mean-lives are very precise in these simple atomic systems, our results indicate the accuracy of the experimental method. A series of measurements was made of the low-lying 2s 2p⁴ states in nitrogen-like Ne IV, Na V, Mg VI, Al VII, and Si VIII. The experimental results were compared to theoretical calculations of Fawcett and Sinanoglu, and showed excellent agreement with Sinanoglu's nonclosed-shell many-electron theory. The lifetimes of the 4p ²P₁/₂ and 4p ²P₃/₂ states in copper-like Br VII were determined by measuring decay curves for both the primary and cascade decays and then analyzing the curves jointly. Our resulting mean-life values are considerably shorter than those of previous experiments, which analyzed only the primary decay curve. Comparison with theoretical calculations showed excellent agreement with those that include core polarization effects.
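The thesis fits decay curves to sums of exponentials to separate primary from cascade decays; a one-term version is enough to show how a mean-life falls out of a decay curve. The sketch below fits ln(counts) = ln(A) − t/τ by ordinary least squares on synthetic, noise-free data; the true analysis handles multiple exponential terms and counting noise, so this is only the core idea.

```python
import math

def mean_life_fit(times, counts):
    """Least-squares fit of ln(counts) = ln(A) - t/tau for a single-exponential
    decay curve; returns the mean-life tau."""
    n = len(times)
    ys = [math.log(c) for c in counts]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return -1.0 / slope  # slope of the log-counts line is -1/tau

# Synthetic decay curve with tau = 2.0 (arbitrary time units)
tau_true = 2.0
ts = [0.5 * i for i in range(10)]
cs = [1000.0 * math.exp(-t / tau_true) for t in ts]
tau = mean_life_fit(ts, cs)
```

With cascade feeding, the single-exponential fit is biased, which is exactly why the Br VII analysis described above fits the primary and cascade decay curves jointly.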
466

Distributions of some random volumes and their connection to multivariate analysis

Jairu, Desiderio N. January 1987 (has links)
No description available.
467

New formulae for higher order derivatives and a new algorithm for numerical integration

Slevinsky, Richard Unknown Date
No description available.
468

Likelihood based statistical methods for estimating HIV incidence rate.

Gabaitiri, Lesego. January 2013 (has links)
Estimation of current levels of human immunodeficiency virus (HIV) incidence is essential for monitoring the impact of an epidemic, determining public health priorities, assessing the impact of interventions, and for planning purposes. However, there is often insufficient data on incidence compared to prevalence. A direct approach is to estimate incidence from longitudinal cohort studies. Although this approach can provide a direct and unbiased measure of incidence for the settings where the study is conducted, it is often too expensive and time consuming. An alternative approach is to estimate incidence from a cross-sectional survey using biomarkers that distinguish between recent and non-recent/longstanding infections. The original biomarker-based approach proposes the detection of HIV-1 p24 antigen in the pre-seroconversion period to identify persons with acute infection for estimating HIV incidence. However, this approach requires large sample sizes in order to obtain reliable estimates of HIV incidence because the duration of antigenemia before antibody detection is short, about 22.5 days. Subsequently, another method was developed that involves a dual antibody testing system. In stage one, a sensitive test is used to diagnose HIV infection, and a less sensitive test is used in the second stage to distinguish long-standing from recent infections among those who tested positive for HIV in stage one. The question is: how do we combine these data with other relevant information, such as the period an individual takes to go from being undetectable to detectable by the less sensitive test, to estimate incidence? The main objective of this thesis is therefore to develop likelihood-based methods that can be used to estimate HIV incidence when data are derived from cross-sectional surveys and disease classification is achieved by combining two biomarker or assay tests.
The thesis builds on the dual antibody testing approach and extends a statistical framework that uses the multinomial distribution to derive maximum likelihood estimators of HIV incidence for different settings. In order to improve incidence estimation, we develop a model that incorporates information on previous or past prevalence and derive maximum likelihood estimators of incidence assuming the incidence density is constant over a specified period. Later, we extend the method to settings where a proportion of subjects remain non-reactive to the less sensitive test long after seroconversion. Diagnostic tests used to determine recent infection are prone to errors. To address this problem, we consider a method that simultaneously adjusts for sensitivity and specificity. In addition, we show that sensitivity is similar to the proportion of subjects who eventually transit the “recent infection” state. We also relax the assumption of constant incidence density by proposing a linear incidence density to accommodate settings where incidence might be declining or increasing. We extend the standard adjusted model for estimating incidence to settings where some subjects who tested positive for HIV antibodies were not tested by the less sensitive test, resulting in missing outcome data. Models for the risk factors (covariates) of HIV incidence are considered in the penultimate chapter. We use data from the Botswana AIDS Impact Survey (BAIS) III of 2008 to illustrate the proposed methods. General conclusions and recommendations for future work are provided in the final chapter. / Theses (Ph.D.)-University of KwaZulu-Natal, Pietermaritzburg, 2013.
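Before any likelihood machinery, the intuition behind recency-based estimation can be shown in a few lines: survey respondents who are HIV-positive on the sensitive test but non-reactive on the less sensitive test are "recent", and incidence is roughly recents per susceptible person-year. This moment-style snapshot estimator is only for intuition, not one of the thesis's maximum likelihood estimators, and the survey counts and window length below are hypothetical.

```python
def incidence_snapshot(n_recent, n_negative, mean_window_years):
    """Illustrative snapshot HIV incidence estimate from a cross-sectional survey.

    n_recent: respondents HIV+ on the sensitive test but non-reactive on the
              less sensitive test ('recent' infections)
    n_negative: HIV-negative (susceptible) respondents
    mean_window_years: mean time spent in the 'recent' state, i.e. the period
              from seroconversion to reactivity on the less sensitive test
    """
    person_years_at_risk = n_negative * mean_window_years
    return n_recent / person_years_at_risk

# Hypothetical survey: 40 recent infections among 8000 HIV-negative
# respondents, with a 0.5-year mean recency window
inc = incidence_snapshot(n_recent=40, n_negative=8000, mean_window_years=0.5)
# i.e. roughly 1 new infection per 100 susceptible person-years
```

The complications the thesis addresses (imperfect assay sensitivity and specificity, subjects who never become reactive on the less sensitive test, missing second-stage results) all enter as corrections to the counts in this naive ratio, which is why a full multinomial likelihood treatment is needed.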
469

A Disaster risk management approach to seismic risk on Vancouver Island, British Columbia

Seemann, Mark R. 02 January 2013 (has links)
Communities on Vancouver Island, British Columbia face significant exposure to damaging earthquakes. This seismic risk arises not only from the Island’s proximity to crustal, sub-crustal and subduction earthquake sources in the Cascadia Subduction Zone and from their associated aftershock sequences, but also from environmental (natural and human-made) and social vulnerabilities in Vancouver Island communities and their current capacity to respond to and recover from a large seismic event. Seeking to 1) assist community officials and the general public in better understanding the scope of the earthquake risk on Vancouver Island; 2) raise awareness of the gaps in Vancouver Island’s risk assessment; 3) encourage and facilitate comprehensive seismic risk discussions at all levels of governance; and 4) offer quantitative data on which to base sound funding and policy decisions, this dissertation offers three new studies, presented in paper format, toward the comprehensive management of seismic risk on Vancouver Island. The first paper reviews the components of risk and, building on international risk management standards and best practices, develops a new, comprehensive Disaster Risk Management (DRM) Framework for practitioners. This DRM Framework is then used to review existing knowledge of Vancouver Island’s seismic risk. A number of information gaps are identified, and two in particular, mainshock and aftershock hazard assessment, are targeted for further analysis.
470

Experimental Design With Short-tailed And Long-tailed Symmetric Error Distributions

Yilmaz, Yildiz Elif 01 September 2004 (has links) (PDF)
One-way and two-way classification models in experimental design are considered for both balanced and unbalanced cases when the errors have a Generalized Secant Hyperbolic distribution. Efficient and robust estimators for main and interaction effects are obtained by using the modified maximum likelihood (MML) estimation technique. Test statistics analogous to the normal-theory F statistics are defined to test main and interaction effects, and a test statistic for testing linear contrasts is defined. It is shown that the test statistics based on MML estimators are efficient and robust. The methodology is also generalized to situations where the error distributions are non-identical from block to block.
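The benchmark against which the abstract's MML-based tests are defined is the normal-theory F statistic. For reference, here is the classical one-way version (between-group over within-group mean square), computed on hypothetical data; the thesis's statistics replace the least-squares quantities with MML estimators to gain robustness under the Generalized Secant Hyperbolic error distribution.

```python
def one_way_f(groups):
    """Classical one-way ANOVA F statistic for a list of treatment groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical balanced design with three treatment groups
f = one_way_f([[4.1, 3.9, 4.3], [5.0, 5.2, 4.8], [6.1, 5.9, 6.0]])
```

Under normal errors this statistic follows an F distribution with (k − 1, n − k) degrees of freedom; under long- or short-tailed errors its least-squares ingredients lose efficiency, which is the gap the MML-based analogues are designed to close.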
