1

Investigation Of Students' Probabilistic Misconceptions

Mut, Ali Ihsan 01 December 2003 (has links) (PDF)
M.S. in Secondary Science and Mathematics Education. Supervisor: Assoc. Prof. Dr. Safure Bulut. December 2003, 86 pages. The purpose of the study was to investigate students' probabilistic misconceptions with respect to grade level, previous instruction on probability, and gender. The sample of the study was 885 students from different types of schools (general high schools, private colleges, Anatolian high schools, vocational high schools, and elementary schools) and from grade levels 5, 6, 7, 8, 9 and 10. The sample represented a range of students with respect to socio-economic level and cultural background. To collect data for the study, the Probabilistic Misconception Test (PMT) and a questionnaire were administered. The test consisted of 14 problems covering 8 probabilistic misconception types, and its content validity was tested. The data of the study were analyzed by means of SPSS, and each misconception type was investigated with respect to all variables. The results of the study revealed that: (a) the frequencies of all misconception types varied across grade levels; (b) the percentages of students who had received instruction on probability in school were higher than those of students who had not for the misconceptions Effect of Sample Size and Time Axis Fallacy, whereas the other misconception types were more frequent among students who had not received instruction on probability before the study than among those who had; (c) the frequencies of all misconception types varied across gender.
2

Reliability Based Safety Assessment Of Buried Continuous Pipelines Subjected To Earthquake Effects

Yavuz, Ercan Aykan 01 January 2013 (has links) (PDF)
Lifelines provide vital utilities for human beings in modern life, conveying a great variety of products to meet general needs. Buried continuous pipelines, in particular, are generally used to transmit energy sources such as natural gas and crude oil from production sources to target places. To sustain this energy corridor efficiently and safely, interruption of the flow should be prevented as much as possible. This can be achieved by specifying a target reliability index standing for the desired level of performance and reliability. For natural gas transmission, the reliability-based assessment of earthquake threats to buried continuous pipelines is the primary concern of this thesis. Operating loads due to internal pressure and temperature changes are also discussed. The earthquake effects to which buried continuous pipelines are subjected are explained in detail: seismic wave propagation effects, liquefaction-induced lateral spreading (including longitudinal and transverse permanent ground deformation effects), liquefaction-induced buoyancy effects, and fault crossing effects. Limit state functions are presented for each of these earthquake effects combined with operating loads. The Advanced First Order Second Moment method is used in the reliability calculations. Two case studies are presented. In the first, considering only the load effect due to internal pressure, the reliability of an existing natural gas pipeline is evaluated, and safety factors are recommended for achieving the specified target reliability indexes. In the second, the reliability of another existing natural gas pipeline subjected to the above-mentioned earthquake effects is evaluated in detail.
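As a rough illustration of the first-order reliability ideas this abstract refers to (the thesis itself applies the Advanced First Order Second Moment method with earthquake-specific limit states), the sketch below computes a reliability index and failure probability for the simple limit state g = R − S with independent normal resistance and load effect; all numerical values are assumptions, not figures from the thesis.

```python
import math

def fosm_reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    """First-order reliability index beta for the limit state g = R - S,
    assuming independent, normally distributed resistance R and load effect S."""
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    """Pf = Phi(-beta), written with the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

# Illustrative numbers only (not from the thesis): hoop-stress capacity vs.
# demand from internal pressure, both in MPa.
beta = fosm_reliability_index(mu_r=415.0, sigma_r=30.0, mu_s=290.0, sigma_s=25.0)
print(f"beta = {beta:.2f}, Pf = {failure_probability(beta):.2e}")
```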
3

Bayesian Inference In ANOVA Models

Ozbozkurt, Pelin 01 January 2010 (has links) (PDF)
Estimation of location and scale parameters from a random sample of size n is of paramount importance in statistics. An estimator is called fully efficient if it attains the Cramer-Rao minimum variance bound besides being unbiased. The method that yields such estimators, at any rate for large n, is the method of modified maximum likelihood estimation. Apparently, such estimators cannot be made more efficient by using sample-based classical methods. That makes room for the Bayesian method of estimation, which engages prior distributions and likelihood functions. A formal combination of the prior knowledge and the sample information is called the posterior distribution. The posterior distribution is maximized with respect to the unknown parameter(s); that gives the HPD (highest probability density) estimator(s). Locating the maximum of the posterior distribution is, however, enormously difficult (computationally and analytically) in most situations. To alleviate these difficulties, we use the modified likelihood function in the posterior distribution instead of the likelihood function. We derived the HPD estimators of location and scale parameters of distributions in the Generalized Logistic family. We extended the work to experimental design, namely one-way ANOVA, and obtained the HPD estimators of the block effects and the scale parameter (in the distribution of errors); they have beautiful algebraic forms. We showed that they are highly efficient and gave real-life examples to illustrate the usefulness of our results. Thus, the enormous computational and analytical difficulties with the traditional Bayesian method of estimation are circumvented, at any rate in the context of experimental design.
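The following is a minimal sketch of the posterior-mode (HPD) idea described above, assuming a logistic likelihood with known scale and a weak normal prior rather than the Generalized Logistic family and modified likelihood actually used in the thesis; the data and prior settings are purely illustrative.

```python
import numpy as np
from scipy import optimize, stats

# Illustrative sample only; the thesis works with the Generalized Logistic
# family and a modified likelihood, which is not reproduced here.
rng = np.random.default_rng(0)
sample = rng.logistic(loc=2.0, scale=1.0, size=50)

def neg_log_posterior(mu):
    """Negative log posterior for the location parameter mu."""
    log_likelihood = stats.logistic.logpdf(sample, loc=mu, scale=1.0).sum()
    log_prior = stats.norm.logpdf(mu, loc=0.0, scale=10.0)  # weak normal prior
    return -(log_likelihood + log_prior)

# The HPD (posterior-mode) estimate is the maximizer of the posterior,
# located here by numerical optimization rather than in closed form.
result = optimize.minimize_scalar(neg_log_posterior, bounds=(-10, 10), method="bounded")
print(f"HPD estimate of the location parameter: {result.x:.3f}")
```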
4

Probabilistic Latent Semantic Analysis Based Framework For Hybrid Social Recommender Systems

Eryol, Erkin 01 June 2010 (has links) (PDF)
Today, user-annotated internet sites, user interaction logs, and online user communities are valuable sources of information for the personalized recommendation problem. In the literature, hybrid social recommender systems have been proposed to reduce the sparsity of the usage data by integrating these user-related information sources. In this thesis, a method based on probabilistic latent semantic analysis is used as a framework for a hybrid social recommendation system, and different data hybridization approaches on top of probabilistic latent semantic analysis are evaluated experimentally. Based on this flexible probabilistic model, network regularization and model blending approaches are applied to the probabilistic latent semantic analysis model as a way of using the social trust network throughout the collaborative filtering process. The proposed model outperformed the baseline methods in our experiments. As a result of the research, it is shown that the proposed methods successfully model the rating and social trust data together in a theoretically principled way.
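Below is a small sketch of the plain PLSA aspect model fitted by EM on a toy user-item count matrix; it omits the social trust hybridization, network regularization, and model blending that the thesis adds, and all data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy interaction matrix n(u, i): counts of user-item events (illustrative only).
N = np.array([
    [3, 0, 1, 0],
    [2, 1, 0, 0],
    [0, 0, 4, 2],
    [0, 1, 3, 1],
], dtype=float)
n_users, n_items = N.shape
K = 2  # number of latent aspects

# Random initialization of P(z), P(u|z), P(i|z).
p_z = np.full(K, 1.0 / K)
p_u_z = rng.random((n_users, K))
p_u_z /= p_u_z.sum(axis=0)
p_i_z = rng.random((n_items, K))
p_i_z /= p_i_z.sum(axis=0)

for _ in range(200):
    # E-step: responsibilities P(z | u, i), array of shape (users, items, aspects).
    joint = p_z[None, None, :] * p_u_z[:, None, :] * p_i_z[None, :, :]
    resp = joint / joint.sum(axis=2, keepdims=True)

    # M-step: re-estimate the multinomial parameters from weighted counts.
    weighted = N[:, :, None] * resp
    p_u_z = weighted.sum(axis=1)
    p_u_z /= p_u_z.sum(axis=0)
    p_i_z = weighted.sum(axis=0)
    p_i_z /= p_i_z.sum(axis=0)
    p_z = weighted.sum(axis=(0, 1))
    p_z /= p_z.sum()

# Reconstructed co-occurrence probabilities P(u, i), usable for ranking items.
p_ui = (p_z[None, None, :] * p_u_z[:, None, :] * p_i_z[None, :, :]).sum(axis=2)
print(np.round(p_ui, 3))
```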
5

Development Of GIS Software For Evaluating Network Reliability Of Lifelines Under Seismic Hazard

Oduncucuoglu, Lutfi 01 December 2010 (has links) (PDF)
Lifelines are vital networks, and it is important that those networks remain functional after major natural disasters such as earthquakes. The goal of this study is to develop GIS software for evaluating the network reliability of lifelines under seismic hazard. In this study, GIS, statistics, and facility management are used together, and a GIS software module that constructs GIS-based reliability maps of lifeline networks is developed using geoTools. The developed GIS module imports seismic hazard and lifeline network layers in GIS formats using geoTools libraries and, after creating a gridded network structure, uses a network reliability algorithm, initially developed by Yoo and Deo (1988), to calculate the upper and lower bounds of lifeline network reliability under seismic hazard. It can also display the results in graphical form and save them in shapefile format. In order to validate the developed application, results are compared with a former case study by Selcuk (2000) and are satisfactorily close to those of the previous study. As a result of this study, an easy-to-use, GIS-based software module that creates GIS-based reliability maps of lifelines under seismic hazard was developed.
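As an illustrative stand-in for the Yoo and Deo (1988) bounding algorithm and the geoTools-based module described above, the sketch below estimates source-sink connectivity reliability of a toy lifeline graph by Monte Carlo simulation with networkx; the edge survival probabilities are assumed values, not outputs of a seismic hazard layer.

```python
import random
import networkx as nx

# Toy lifeline graph: nodes are stations, the edge attribute "p_survive" is the
# assumed probability that the segment survives the scenario earthquake.
G = nx.Graph()
G.add_edge("source", "A", p_survive=0.95)
G.add_edge("A", "B", p_survive=0.90)
G.add_edge("source", "B", p_survive=0.85)
G.add_edge("B", "sink", p_survive=0.92)
G.add_edge("A", "sink", p_survive=0.80)

def monte_carlo_connectivity(graph, s, t, trials=20000, seed=42):
    """Estimate the probability that s and t stay connected when each edge
    fails independently; a simulation stand-in for the analytic Yoo-Deo bounds."""
    rng = random.Random(seed)
    connected = 0
    for _ in range(trials):
        surviving = [
            (u, v) for u, v, data in graph.edges(data=True)
            if rng.random() < data["p_survive"]
        ]
        H = nx.Graph(surviving)
        if s in H and t in H and nx.has_path(H, s, t):
            connected += 1
    return connected / trials

print(f"Estimated source-sink reliability: {monte_carlo_connectivity(G, 'source', 'sink'):.3f}")
```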
6

An Effectiveness Evaluation Method For Airburst Projectiles

Saygin, Oktay 01 May 2011 (has links) (PDF)
Airburst projectiles increase the effectiveness of air defense by forming clouds of small pellets. In this work, in order to evaluate the effectiveness of airburst projectiles, the Single Shot Kill Probability (SSKP) is computed at different burst distances using three lethality functions defined from different measures of effectiveness: target coverage, the number of sub-projectile hits on the target, and the kinetic energy of the sub-projectiles after burst. Computations are carried out for two different sub-projectile distribution patterns, namely circular and ring patterns. For the determination of miss distance, a Monte Carlo simulation is implemented which uses Modified Point Mass Model (MPMM) trajectory equations. According to the results obtained, the two distribution patterns are compared in terms of effectiveness, and the optimum burst distance of each distribution pattern is determined at different ranges.
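The sketch below gives a heavily simplified Monte Carlo estimate of SSKP versus burst distance for a notional circular pellet pattern; it replaces the MPMM trajectory simulation and the three lethality functions of the thesis with a crude cone-spread, binomial-hit model, and every parameter value is an assumption.

```python
import numpy as np

rng = np.random.default_rng(7)

def sskp(burst_distance, n_shots=50_000, aim_error_std=2.0, pellet_count=200,
         lethal_hits=3, target_radius=1.5, cone_half_angle=0.05):
    """Crude Monte Carlo estimate of single shot kill probability for an
    airburst round with a circular pellet pattern.  All parameter values are
    assumptions; the thesis uses MPMM trajectory equations and three distinct
    lethality functions instead of this simplified geometry."""
    # Radial miss distance of the burst point in the plane of the target.
    miss = rng.normal(0.0, aim_error_std, size=(n_shots, 2))
    miss_radius = np.linalg.norm(miss, axis=1)

    # Pellet pattern radius grows linearly with burst distance (cone spread).
    pattern_radius = cone_half_angle * burst_distance

    # Per-pellet hit probability: ratio of target area to pattern area when the
    # target centre falls inside the pattern, zero otherwise.
    coverage = np.clip((target_radius / pattern_radius) ** 2, 0.0, 1.0)
    p_hit = np.where(miss_radius < pattern_radius, coverage, 0.0)

    # A shot is counted as a kill if at least `lethal_hits` pellets hit.
    hits = rng.binomial(pellet_count, p_hit)
    return float(np.mean(hits >= lethal_hits))

for distance in (20.0, 80.0, 320.0):
    print(f"burst distance {distance:6.1f} m -> SSKP ~ {sskp(distance):.3f}")
```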
7

Mixed Effects Models For Time Series Gene Expression Data

Erkan, Ibrahim 01 December 2011 (has links) (PDF)
Experimental factors such as cell type and treatment may have different impacts on the expression levels of individual genes, which are quantitative measurements from microarrays. The measurements can be collected at a few unevenly spaced time points with replicates. The aim of this study is to consider cell type, treatment, and short time series attributes and to infer their effects on individual genes. A mixed effects model (LME) was proposed to model the gene expression data, and the performance of the model was validated by a simulation study. Realistic data sets were generated preserving the structure of the real-life sample data studied by Nymark et al. (2007). The predictive performance of the model was evaluated by performance measures such as accuracy, sensitivity, and specificity, and compared to the competing method of Smyth (2004), namely Limma. Both methods were also compared on real-life data. Simulation results showed that the predictive performance of LME is as high as 99% and that it produces a False Discovery Rate (FDR) as low as 0.4%, whereas Limma has an FDR value of at least 32%. Moreover, LME has almost 99% predictive capability on the continuous time parameter, whereas Limma has only about 67% and cannot even handle continuous independent variables.
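A minimal sketch of fitting a linear mixed effects model to simulated expression values for a single gene is shown below, using statsmodels; the simulated design (cell type, treatment, unevenly spaced times, replicate random intercepts) only mimics the structure described above and is not the Nymark et al. (2007) data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated expression values for one gene: 2 cell types x 2 treatments,
# a few unevenly spaced time points, with a random intercept per replicate.
times = [0.0, 1.0, 4.0, 24.0]
rows = []
for replicate in range(1, 7):
    cell = "A" if replicate <= 3 else "B"
    rep_effect = rng.normal(0.0, 0.2)  # replicate-level random intercept
    for treatment in ("control", "exposed"):
        for t in times:
            expr = (0.5 * (cell == "B") + 0.8 * (treatment == "exposed")
                    + 0.05 * t + rep_effect + rng.normal(0.0, 0.3))
            rows.append({"replicate": replicate, "cell": cell,
                         "treatment": treatment, "time": t, "expr": expr})
df = pd.DataFrame(rows)

# Mixed effects model: fixed effects for cell type, treatment and continuous
# time, random intercept for each replicate.
model = smf.mixedlm("expr ~ cell + treatment + time", data=df, groups=df["replicate"])
result = model.fit()
print(result.summary())
```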
8

Experimental Design With Short-tailed And Long-tailed Symmetric Error Distributions

Yilmaz, Yildiz Elif 01 September 2004 (has links) (PDF)
One-way and two-way classification models in experimental design are considered, for both balanced and unbalanced cases, when the errors have a Generalized Secant Hyperbolic distribution. Efficient and robust estimators for main and interaction effects are obtained by using the modified maximum likelihood (MML) estimation technique. Test statistics analogous to the normal-theory F statistics are defined to test main and interaction effects, and a test statistic for testing linear contrasts is defined. It is shown that the test statistics based on MML estimators are efficient and robust. The methodology obtained is also generalized to situations where the error distributions from block to block are non-identical.
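For orientation, the sketch below computes the classical normal-theory one-way F statistic on heavy-tailed simulated data; the thesis replaces the sample means and variances in this construction with MML estimators under the Generalized Secant Hyperbolic distribution, a refinement not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Three treatment groups with heavier-tailed (Student t) errors, standing in
# for the long-tailed symmetric error distributions considered in the thesis.
groups = [1.0 + rng.standard_t(df=3, size=10),
          1.5 + rng.standard_t(df=3, size=10),
          2.2 + rng.standard_t(df=3, size=10)]

def one_way_f(groups):
    """Classical normal-theory F statistic for a balanced one-way layout."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    f_stat = (ss_between / (k - 1)) / (ss_within / (n - k))
    p_value = stats.f.sf(f_stat, k - 1, n - k)
    return f_stat, p_value

f_stat, p_value = one_way_f(groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```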
9

One Factor Interest Rate Models: Analytic Solutions And Approximations

Yolcu, Yeliz 01 January 2005 (has links) (PDF)
The uncertainty attached to future movements of interest rates is an essential part of financial decision theory and requires an awareness of the stochastic movement of these rates. Several approaches have been proposed for modeling one-factor short rate models, some of which lead to arbitrage-free term structures. However, no definite consensus has been reached with regard to the best approach for interest rate modeling. In this work, we briefly examine the existing one-factor interest rate models and calibrate the Vasicek and Hull-White (Extended Vasicek) models using Turkey's term structure. Moreover, a trinomial interest rate tree is constructed to represent the evolution of Turkey's zero-coupon rates.
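The following sketch simulates Vasicek short-rate paths by Euler discretization and evaluates the standard closed-form Vasicek zero-coupon bond price; the parameter values are illustrative and are not the Turkish term-structure calibration reported in the thesis.

```python
import math
import numpy as np

def vasicek_paths(r0, a, b, sigma, horizon, steps, n_paths, seed=11):
    """Euler-Maruyama simulation of the Vasicek short rate
    dr = a (b - r) dt + sigma dW."""
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    rates = np.empty((n_paths, steps + 1))
    rates[:, 0] = r0
    for k in range(steps):
        dw = rng.normal(0.0, math.sqrt(dt), size=n_paths)
        rates[:, k + 1] = rates[:, k] + a * (b - rates[:, k]) * dt + sigma * dw
    return rates

def vasicek_zero_coupon_price(r0, a, b, sigma, maturity):
    """Closed-form Vasicek zero-coupon bond price P(0, T)."""
    B = (1.0 - math.exp(-a * maturity)) / a
    A = math.exp((B - maturity) * (a * a * b - sigma * sigma / 2.0) / (a * a)
                 - sigma * sigma * B * B / (4.0 * a))
    return A * math.exp(-B * r0)

r0, a, b, sigma = 0.18, 0.8, 0.12, 0.02   # illustrative parameters only
paths = vasicek_paths(r0, a, b, sigma, horizon=1.0, steps=250, n_paths=5)
print("Simulated 1y short rates:", np.round(paths[:, -1], 4))
print("P(0, 1y) =", round(vasicek_zero_coupon_price(r0, a, b, sigma, 1.0), 6))
```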
10

Yield Curve Modelling Via Two Parameter Processes

Pekerten, Uygar 01 February 2005 (has links) (PDF)
Random field models have provided a flexible environment in which the properties of the term structure of interest rates are captured almost as observed. In this study we provide an overview of forward rate random field models and propose an extension in which the forward rates fluctuate along with a two-parameter process represented by a random field. We then provide a mathematical expression for the yield curve under this model and sketch the prospective utilities and applications of this model for interest rate management.
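As a rough sketch of the two-parameter idea, the code below perturbs an initial forward curve with a discretized Brownian sheet in calendar time and maturity; the no-arbitrage drift of a genuine forward-rate random field model is omitted, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(13)

def brownian_sheet(n_t, n_x, dt, dx, seed_rng):
    """Discrete Brownian sheet W(t, x): cumulative sums in both parameters of
    i.i.d. normal increments with variance dt * dx."""
    increments = seed_rng.normal(0.0, np.sqrt(dt * dx), size=(n_t, n_x))
    return increments.cumsum(axis=0).cumsum(axis=1)

# Illustrative grid: calendar time t on [0, 1] and maturity T on (0, 5].
n_t, n_x = 50, 60
dt, dx = 1.0 / n_t, 5.0 / n_x
maturities = np.linspace(dx, 5.0, n_x)

# Flat-ish initial forward curve plus a random-field fluctuation (no drift term).
f0 = 0.05 + 0.005 * np.log1p(maturities)
sigma = 0.01
field = brownian_sheet(n_t, n_x, dt, dx, rng)
forward_surface = f0[None, :] + sigma * field

# Crude yield-curve proxy at the final calendar time: running average of the
# simulated forward rates up to each maturity.
yields = np.cumsum(forward_surface[-1] * dx) / maturities
print(np.round(yields[:5], 4))
```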
